hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
71bff0eb58fc830ed2a328902fb23bbab102454f | 34 | py | Python | lab/lab-02/lab2/a.py | MikenzieAlasca/F21-1010 | a7c15b8d9bf84f316aa6921f6d8a588c513a22b8 | ["MIT"] | 5 | 2021-09-09T21:08:14.000Z | 2021-12-14T02:30:52.000Z | lab/lab-02/lab2/a.py | MikenzieAlasca/F21-1010 | a7c15b8d9bf84f316aa6921f6d8a588c513a22b8 | ["MIT"] | null | null | null | lab/lab-02/lab2/a.py | MikenzieAlasca/F21-1010 | a7c15b8d9bf84f316aa6921f6d8a588c513a22b8 | ["MIT"] | 8 | 2021-09-09T17:46:07.000Z | 2022-02-08T22:41:35.000Z |
print (3*5_878_625_352_016_794)
| 11.333333 | 31 | 0.794118 | 8 | 34 | 2.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.566667 | 0.117647 | 34 | 2 | 32 | 17 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
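Several of the surface-level columns are simple deterministic functions of `content`. For the first row (content `print (3*5_878_625_352_016_794)`, `size` 34), the definitions below are assumptions about how the pipeline computes them, but they reproduce the row's stated `avg_line_length`, `max_line_length`, and `alphanum_fraction` exactly:

```python
import re

# Assumed reconstruction of the file: the 31-char line plus three trailing
# newlines gives the stated size of 34 bytes.
content = "print (3*5_878_625_352_016_794)\n\n\n"

size = len(content.encode("utf-8"))                 # 34
lines = content.split("\n")[:-1]                    # drop the empty final split -> 3 lines
avg_line_length = size / len(lines)                 # 34 / 3
max_line_length = max(len(line) for line in lines)  # the 31-char print line
# "alphanum" here appears to count word characters (letters, digits,
# underscore) over all bytes: 27 / 34 matches the row's 0.794118.
alphanum_fraction = len(re.findall(r"\w", content)) / size
```

Under these definitions the three computed values round to 11.333333, 31, and 0.794118, matching the row.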
e0b4c477c96439dc46480b93b2b4e813e0ffd421 | 644 | py | Python | morse/tests/test_encrypt.py | vitorueno/morse-translater | e20269e4826c7f7608cf044fe8307b8cb1a7f9a4 | ["MIT"] | 3 | 2020-05-28T15:49:04.000Z | 2021-05-20T19:35:19.000Z | morse/tests/test_encrypt.py | vitorueno/morse-translater | e20269e4826c7f7608cf044fe8307b8cb1a7f9a4 | ["MIT"] | null | null | null | morse/tests/test_encrypt.py | vitorueno/morse-translater | e20269e4826c7f7608cf044fe8307b8cb1a7f9a4 | ["MIT"] | null | null | null | from ..encrypt import encrypt
def test_encrypt():
assert encrypt('oi') == '--- .. '
assert encrypt('OI') == '--- .. '
assert encrypt('@') == '.--.-. '
assert encrypt('!') == '-.-.-- '
assert encrypt('?') == '..--.. '
assert encrypt('$') == '...-..- '
assert encrypt(
'1234567890') == '.---- ..--- ...-- ....- ..... -.... --... ---.. ----. ----- '
assert encrypt('\'') == '.----. '
assert encrypt('"') == '.-..-. '
assert encrypt('(') == '-.--. '
assert encrypt(')') == '-.--.- '
assert encrypt('.') == '.-.-.- '
assert encrypt('oi pessoal') == '--- .. .--. . ... ... --- .- .-.. ' | 35.777778 | 87 | 0.358696 | 38 | 644 | 6.052632 | 0.236842 | 0.734783 | 0.869565 | 1.017391 | 0.752174 | 0.621739 | 0.621739 | 0.621739 | 0.621739 | 0.621739 | 0 | 0.02045 | 0.240683 | 644 | 18 | 88 | 35.777778 | 0.449898 | 0 | 0 | 0 | 0 | 0 | 0.316279 | 0 | 0 | 0 | 0 | 0 | 0.8125 | 1 | 0.0625 | true | 0 | 0.0625 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
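The test file above fixes the expected Morse code for every character it exercises, but the repo's actual `encrypt` implementation is not part of this row. A minimal sketch consistent with those assertions (the table is restricted to the characters the tests use; characters not in the table, including word spaces, are simply skipped):

```python
# Morse table covering only the characters exercised by test_encrypt.
MORSE = {
    'o': '---', 'i': '..', 'p': '.--.', 'e': '.', 's': '...', 'a': '.-', 'l': '.-..',
    '1': '.----', '2': '..---', '3': '...--', '4': '....-', '5': '.....',
    '6': '-....', '7': '--...', '8': '---..', '9': '----.', '0': '-----',
    '@': '.--.-.', '!': '-.-.--', '?': '..--..', '$': '...-..-',
    "'": ".----.", '"': '.-..-.', '(': '-.--.', ')': '-.--.-', '.': '.-.-.-',
}

def encrypt(text):
    # Lowercase the input, emit each known character's code followed by a
    # space, and drop anything not in the table (including word spaces).
    return ''.join(MORSE[c] + ' ' for c in text.lower() if c in MORSE)
```

This reproduces the expected strings in the tests, e.g. `encrypt('oi pessoal')` yields `'--- .. .--. . ... ... --- .- .-.. '` with only the trailing separator space between words.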
4616cf8a570b3407d2b409b5f3c01faee307b245 | 124,778 | py | Python | integration/python/integration_api/api/card_api.py | ShekharPaatni/SDK | 6534ffdb63af87c02c431df9add05a90370183cb | ["Apache-2.0"] | 11 | 2019-04-16T02:11:17.000Z | 2021-12-16T22:51:40.000Z | integration/python/integration_api/api/card_api.py | ShekharPaatni/SDK | 6534ffdb63af87c02c431df9add05a90370183cb | ["Apache-2.0"] | 81 | 2019-11-19T23:24:28.000Z | 2022-03-28T11:35:47.000Z | integration/python/integration_api/api/card_api.py | ShekharPaatni/SDK | 6534ffdb63af87c02c431df9add05a90370183cb | ["Apache-2.0"] | 11 | 2020-07-08T02:29:56.000Z | 2022-03-28T10:05:33.000Z | # coding: utf-8
"""
Hydrogen Integration API
The Hydrogen Integration API # noqa: E501
OpenAPI spec version: 1.3.1
Contact: info@hydrogenplatform.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from integration_api.api_client import ApiClient
class CardApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_auto_reload_using_post(self, request, **kwargs): # noqa: E501
"""Card auto reload # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_auto_reload_using_post(request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardAutoReloadRequestCO request: request (required)
:return: CardAutoReloadResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_auto_reload_using_post_with_http_info(request, **kwargs) # noqa: E501
else:
(data) = self.create_auto_reload_using_post_with_http_info(request, **kwargs) # noqa: E501
return data
def create_auto_reload_using_post_with_http_info(self, request, **kwargs): # noqa: E501
"""Card auto reload # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_auto_reload_using_post_with_http_info(request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardAutoReloadRequestCO request: request (required)
:return: CardAutoReloadResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_auto_reload_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'request' is set
if self.api_client.client_side_validation and ('request' not in params or
params['request'] is None): # noqa: E501
raise ValueError("Missing the required parameter `request` when calling `create_auto_reload_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'request' in params:
body_params = params['request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/auto_reload', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CardAutoReloadResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_card_activate_using_post(self, activate_request, **kwargs): # noqa: E501
"""Activate card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_activate_using_post(activate_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBaseRequestCO activate_request: activateRequest (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_card_activate_using_post_with_http_info(activate_request, **kwargs) # noqa: E501
else:
(data) = self.create_card_activate_using_post_with_http_info(activate_request, **kwargs) # noqa: E501
return data
def create_card_activate_using_post_with_http_info(self, activate_request, **kwargs): # noqa: E501
"""Activate card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_activate_using_post_with_http_info(activate_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBaseRequestCO activate_request: activateRequest (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['activate_request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_card_activate_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'activate_request' is set
if self.api_client.client_side_validation and ('activate_request' not in params or
params['activate_request'] is None): # noqa: E501
raise ValueError("Missing the required parameter `activate_request` when calling `create_card_activate_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'activate_request' in params:
body_params = params['activate_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/activate', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='BaseResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_card_business_using_post(self, card_business_request_co, **kwargs): # noqa: E501
"""Create a card business # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_business_using_post(card_business_request_co, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBusinessRequestCO card_business_request_co: cardBusinessRequestCO (required)
:return: CreateBusinessResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_card_business_using_post_with_http_info(card_business_request_co, **kwargs) # noqa: E501
else:
(data) = self.create_card_business_using_post_with_http_info(card_business_request_co, **kwargs) # noqa: E501
return data
def create_card_business_using_post_with_http_info(self, card_business_request_co, **kwargs): # noqa: E501
"""Create a card business # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_business_using_post_with_http_info(card_business_request_co, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBusinessRequestCO card_business_request_co: cardBusinessRequestCO (required)
:return: CreateBusinessResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['card_business_request_co'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_card_business_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'card_business_request_co' is set
if self.api_client.client_side_validation and ('card_business_request_co' not in params or
params['card_business_request_co'] is None): # noqa: E501
raise ValueError("Missing the required parameter `card_business_request_co` when calling `create_card_business_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'card_business_request_co' in params:
body_params = params['card_business_request_co']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/business', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CreateBusinessResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_card_close_using_post(self, close_request, **kwargs): # noqa: E501
"""close a card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_close_using_post(close_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBaseRequestCO close_request: closeRequest (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_card_close_using_post_with_http_info(close_request, **kwargs) # noqa: E501
else:
(data) = self.create_card_close_using_post_with_http_info(close_request, **kwargs) # noqa: E501
return data
def create_card_close_using_post_with_http_info(self, close_request, **kwargs): # noqa: E501
"""close a card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_close_using_post_with_http_info(close_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBaseRequestCO close_request: closeRequest (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['close_request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_card_close_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'close_request' is set
if self.api_client.client_side_validation and ('close_request' not in params or
params['close_request'] is None): # noqa: E501
raise ValueError("Missing the required parameter `close_request` when calling `create_card_close_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'close_request' in params:
body_params = params['close_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/close', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='BaseResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_card_issue_using_post(self, issue_request, **kwargs): # noqa: E501
"""issue a card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_issue_using_post(issue_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBaseRequestCO issue_request: issueRequest (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_card_issue_using_post_with_http_info(issue_request, **kwargs) # noqa: E501
else:
(data) = self.create_card_issue_using_post_with_http_info(issue_request, **kwargs) # noqa: E501
return data
def create_card_issue_using_post_with_http_info(self, issue_request, **kwargs): # noqa: E501
"""issue a card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_issue_using_post_with_http_info(issue_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardBaseRequestCO issue_request: issueRequest (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['issue_request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_card_issue_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'issue_request' is set
if self.api_client.client_side_validation and ('issue_request' not in params or
params['issue_request'] is None): # noqa: E501
raise ValueError("Missing the required parameter `issue_request` when calling `create_card_issue_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'issue_request' in params:
body_params = params['issue_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/issue', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='BaseResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_card_load_using_post(self, load_request, **kwargs): # noqa: E501
"""Create a card load # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_load_using_post(load_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardLoadRequestCO load_request: loadRequest (required)
:return: CardLoadUnloadResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_card_load_using_post_with_http_info(load_request, **kwargs) # noqa: E501
else:
(data) = self.create_card_load_using_post_with_http_info(load_request, **kwargs) # noqa: E501
return data
def create_card_load_using_post_with_http_info(self, load_request, **kwargs): # noqa: E501
"""Create a card load # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_load_using_post_with_http_info(load_request, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardLoadRequestCO load_request: loadRequest (required)
:return: CardLoadUnloadResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['load_request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_card_load_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'load_request' is set
if self.api_client.client_side_validation and ('load_request' not in params or
params['load_request'] is None): # noqa: E501
raise ValueError("Missing the required parameter `load_request` when calling `create_card_load_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'load_request' in params:
body_params = params['load_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/load', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CardLoadUnloadResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_card_pin_using_post(self, card_pin_request_co, **kwargs): # noqa: E501
"""pin card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_pin_using_post(card_pin_request_co, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardPinRequestCO card_pin_request_co: cardPinRequestCO (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_card_pin_using_post_with_http_info(card_pin_request_co, **kwargs) # noqa: E501
else:
(data) = self.create_card_pin_using_post_with_http_info(card_pin_request_co, **kwargs) # noqa: E501
return data
def create_card_pin_using_post_with_http_info(self, card_pin_request_co, **kwargs): # noqa: E501
"""pin card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_card_pin_using_post_with_http_info(card_pin_request_co, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardPinRequestCO card_pin_request_co: cardPinRequestCO (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['card_pin_request_co'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_card_pin_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'card_pin_request_co' is set
if self.api_client.client_side_validation and ('card_pin_request_co' not in params or
params['card_pin_request_co'] is None): # noqa: E501
raise ValueError("Missing the required parameter `card_pin_request_co` when calling `create_card_pin_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'card_pin_request_co' in params:
body_params = params['card_pin_request_co']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/pin', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='BaseResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
    def create_card_reactivate_using_post(self, reactivate_request, **kwargs):  # noqa: E501
        """reactivate card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_reactivate_using_post(reactivate_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO reactivate_request: reactivateRequest (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_reactivate_using_post_with_http_info(reactivate_request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_reactivate_using_post_with_http_info(reactivate_request, **kwargs)  # noqa: E501
            return data

    def create_card_reactivate_using_post_with_http_info(self, reactivate_request, **kwargs):  # noqa: E501
        """reactivate card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_reactivate_using_post_with_http_info(reactivate_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO reactivate_request: reactivateRequest (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['reactivate_request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_reactivate_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'reactivate_request' is set
        if self.api_client.client_side_validation and ('reactivate_request' not in params or
                                                       params['reactivate_request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `reactivate_request` when calling `create_card_reactivate_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'reactivate_request' in params:
            body_params = params['reactivate_request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/reactivate', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='BaseResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
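    # The sync/async convention above (call inline, or return a thread-like
    # handle when async_req=True) can be sketched standalone. `FakeClient` is a
    # hypothetical stand-in for the generated ApiClient, not part of this module.

    # ```python
    # from multiprocessing.pool import ThreadPool
    #
    #
    # class FakeClient:
    #     """Stand-in for the generated ApiClient: runs a call inline, or on a
    #     worker thread (returning an AsyncResult) when async_req=True."""
    #
    #     def __init__(self):
    #         self.pool = ThreadPool(1)
    #
    #     def call_api(self, resource_path, method, async_req=False, **kwargs):
    #         if async_req:
    #             # Asynchronous path: caller blocks later via .get()
    #             return self.pool.apply_async(lambda: {"path": resource_path})
    #         return {"path": resource_path}
    #
    #
    # client = FakeClient()
    # sync_result = client.call_api('/card/reactivate', 'POST')
    # thread = client.call_api('/card/reactivate', 'POST', async_req=True)
    # async_result = thread.get()  # blocks until the worker finishes
    # ```

```python
from multiprocessing.pool import ThreadPool


class FakeClient:
    """Stand-in for the generated ApiClient: runs a call inline, or on a
    worker thread (returning an AsyncResult) when async_req=True."""

    def __init__(self):
        self.pool = ThreadPool(1)

    def call_api(self, resource_path, method, async_req=False, **kwargs):
        if async_req:
            # Asynchronous path: caller blocks later via .get()
            return self.pool.apply_async(lambda: {"path": resource_path})
        return {"path": resource_path}


client = FakeClient()
sync_result = client.call_api('/card/reactivate', 'POST')
thread = client.call_api('/card/reactivate', 'POST', async_req=True)
async_result = thread.get()  # blocks until the worker finishes
```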
    def create_card_reissue_using_post(self, request, **kwargs):  # noqa: E501
        """Reissue a card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_reissue_using_post(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO request: request (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_reissue_using_post_with_http_info(request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_reissue_using_post_with_http_info(request, **kwargs)  # noqa: E501
            return data

    def create_card_reissue_using_post_with_http_info(self, request, **kwargs):  # noqa: E501
        """Reissue a card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_reissue_using_post_with_http_info(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO request: request (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_reissue_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'request' is set
        if self.api_client.client_side_validation and ('request' not in params or
                                                       params['request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `request` when calling `create_card_reissue_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'request' in params:
            body_params = params['request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/reissue', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='BaseResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def create_card_replace_using_post(self, request, **kwargs):  # noqa: E501
        """Create card replace  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_replace_using_post(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO request: request (required)
        :return: CardReplaceResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_replace_using_post_with_http_info(request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_replace_using_post_with_http_info(request, **kwargs)  # noqa: E501
            return data

    def create_card_replace_using_post_with_http_info(self, request, **kwargs):  # noqa: E501
        """Create card replace  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_replace_using_post_with_http_info(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO request: request (required)
        :return: CardReplaceResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_replace_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'request' is set
        if self.api_client.client_side_validation and ('request' not in params or
                                                       params['request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `request` when calling `create_card_replace_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'request' in params:
            body_params = params['request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/replace', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CardReplaceResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def create_card_reserve_transfer_using_post(self, request, **kwargs):  # noqa: E501
        """Card reserve transfer  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_reserve_transfer_using_post(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardReserveTransferRequestCO request: request (required)
        :return: CardReserveTransferResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_reserve_transfer_using_post_with_http_info(request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_reserve_transfer_using_post_with_http_info(request, **kwargs)  # noqa: E501
            return data

    def create_card_reserve_transfer_using_post_with_http_info(self, request, **kwargs):  # noqa: E501
        """Card reserve transfer  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_reserve_transfer_using_post_with_http_info(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardReserveTransferRequestCO request: request (required)
        :return: CardReserveTransferResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_reserve_transfer_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'request' is set
        if self.api_client.client_side_validation and ('request' not in params or
                                                       params['request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `request` when calling `create_card_reserve_transfer_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'request' in params:
            body_params = params['request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/reserve_transfer', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CardReserveTransferResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def create_card_spending_control_using_post(self, request, **kwargs):  # noqa: E501
        """Create card spending control  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_spending_control_using_post(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardSpendingControlRequestCO request: request (required)
        :return: CardSpendingControlResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_spending_control_using_post_with_http_info(request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_spending_control_using_post_with_http_info(request, **kwargs)  # noqa: E501
            return data

    def create_card_spending_control_using_post_with_http_info(self, request, **kwargs):  # noqa: E501
        """Create card spending control  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_spending_control_using_post_with_http_info(request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardSpendingControlRequestCO request: request (required)
        :return: CardSpendingControlResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_spending_control_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'request' is set
        if self.api_client.client_side_validation and ('request' not in params or
                                                       params['request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `request` when calling `create_card_spending_control_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'request' in params:
            body_params = params['request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/spending_control', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CardSpendingControlResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def create_card_suspend_using_post(self, suspend_request, **kwargs):  # noqa: E501
        """suspend card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_suspend_using_post(suspend_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO suspend_request: suspendRequest (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_suspend_using_post_with_http_info(suspend_request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_suspend_using_post_with_http_info(suspend_request, **kwargs)  # noqa: E501
            return data

    def create_card_suspend_using_post_with_http_info(self, suspend_request, **kwargs):  # noqa: E501
        """suspend card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_suspend_using_post_with_http_info(suspend_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardBaseRequestCO suspend_request: suspendRequest (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['suspend_request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_suspend_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'suspend_request' is set
        if self.api_client.client_side_validation and ('suspend_request' not in params or
                                                       params['suspend_request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `suspend_request` when calling `create_card_suspend_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'suspend_request' in params:
            body_params = params['suspend_request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/suspend', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='BaseResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def create_card_token_using_post(self, tokenize_request, **kwargs):  # noqa: E501
        """Tokenize a card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_token_using_post(tokenize_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardTokenRequestCO tokenize_request: tokenizeRequest (required)
        :return: CardTokenResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_token_using_post_with_http_info(tokenize_request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_token_using_post_with_http_info(tokenize_request, **kwargs)  # noqa: E501
            return data

    def create_card_token_using_post_with_http_info(self, tokenize_request, **kwargs):  # noqa: E501
        """Tokenize a card  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_token_using_post_with_http_info(tokenize_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardTokenRequestCO tokenize_request: tokenizeRequest (required)
        :return: CardTokenResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['tokenize_request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_token_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'tokenize_request' is set
        if self.api_client.client_side_validation and ('tokenize_request' not in params or
                                                       params['tokenize_request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `tokenize_request` when calling `create_card_token_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'tokenize_request' in params:
            body_params = params['tokenize_request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/token', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CardTokenResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
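    # Every method above repeats the same keyword-argument and required-parameter
    # validation before dispatching. A standalone sketch of that pattern follows;
    # `demo_method` is a hypothetical stand-in, not a method of this class, and
    # it uses plain dict iteration where the generated code uses six.iteritems
    # for Python 2 compatibility.

```python
def demo_method(request, **kwargs):
    # Keyword arguments every generated method accepts besides its body param.
    all_params = ['async_req', '_return_http_data_only',
                  '_preload_content', '_request_timeout']
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method demo_method" % key
            )
    # Client-side validation: the required body parameter must be present.
    if request is None:
        raise ValueError(
            "Missing the required parameter `request` when calling `demo_method`")
    return {'body': request, 'async_req': kwargs.get('async_req')}


result = demo_method({'card_id': 'abc'}, async_req=True)
```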
    def create_card_unload_using_post(self, reload_request, **kwargs):  # noqa: E501
        """Create a card unload  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_unload_using_post(reload_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardUnloadRequestCO reload_request: reloadRequest (required)
        :return: CardLoadUnloadResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_card_unload_using_post_with_http_info(reload_request, **kwargs)  # noqa: E501
        else:
            (data) = self.create_card_unload_using_post_with_http_info(reload_request, **kwargs)  # noqa: E501
            return data

    def create_card_unload_using_post_with_http_info(self, reload_request, **kwargs):  # noqa: E501
        """Create a card unload  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_card_unload_using_post_with_http_info(reload_request, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardUnloadRequestCO reload_request: reloadRequest (required)
        :return: CardLoadUnloadResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['reload_request']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_card_unload_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'reload_request' is set
        if self.api_client.client_side_validation and ('reload_request' not in params or
                                                       params['reload_request'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `reload_request` when calling `create_card_unload_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'reload_request' in params:
            body_params = params['reload_request']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/unload', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CardLoadUnloadResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def create_client_card_using_post(self, card_client_request_co, **kwargs):  # noqa: E501
        """Create a card client  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_client_card_using_post(card_client_request_co, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardClientRequestCO card_client_request_co: cardClientRequestCO (required)
        :return: CreateCardClientResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_client_card_using_post_with_http_info(card_client_request_co, **kwargs)  # noqa: E501
        else:
            (data) = self.create_client_card_using_post_with_http_info(card_client_request_co, **kwargs)  # noqa: E501
            return data

    def create_client_card_using_post_with_http_info(self, card_client_request_co, **kwargs):  # noqa: E501
        """Create a card client  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_client_card_using_post_with_http_info(card_client_request_co, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardClientRequestCO card_client_request_co: cardClientRequestCO (required)
        :return: CreateCardClientResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['card_client_request_co']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_client_card_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'card_client_request_co' is set
        if self.api_client.client_side_validation and ('card_client_request_co' not in params or
                                                       params['card_client_request_co'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `card_client_request_co` when calling `create_client_card_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'card_client_request_co' in params:
            body_params = params['card_client_request_co']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/client', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CreateCardClientResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_business_status_using_get(self, nucleus_business_id, **kwargs):  # noqa: E501
        """Get a business status  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_business_status_using_get(nucleus_business_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str nucleus_business_id: nucleus_business_id (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_business_status_using_get_with_http_info(nucleus_business_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_business_status_using_get_with_http_info(nucleus_business_id, **kwargs)  # noqa: E501
            return data

    def get_business_status_using_get_with_http_info(self, nucleus_business_id, **kwargs):  # noqa: E501
        """Get a business status  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_business_status_using_get_with_http_info(nucleus_business_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str nucleus_business_id: nucleus_business_id (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['nucleus_business_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_business_status_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'nucleus_business_id' is set
        if self.api_client.client_side_validation and ('nucleus_business_id' not in params or
                                                       params['nucleus_business_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `nucleus_business_id` when calling `get_business_status_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'nucleus_business_id' in params:
            query_params.append(('nucleus_business_id', params['nucleus_business_id']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/status', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='BaseResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_card_balance_using_get(self, id, **kwargs):  # noqa: E501
        """Get a Card Balance  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_card_balance_using_get(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id (required)
        :param date end_date: end_date
        :param date start_date: start_date
        :return: CardBalanceResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_card_balance_using_get_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_card_balance_using_get_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_card_balance_using_get_with_http_info(self, id, **kwargs):  # noqa: E501
        """Get a Card Balance  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_card_balance_using_get_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id (required)
        :param date end_date: end_date
        :param date start_date: start_date
        :return: CardBalanceResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'end_date', 'start_date']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_card_balance_using_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `get_card_balance_using_get`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'end_date' in params:
            query_params.append(('end_date', params['end_date']))  # noqa: E501
        if 'start_date' in params:
            query_params.append(('start_date', params['start_date']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/balance/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CardBalanceResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
def get_card_image(self, card_id, **kwargs): # noqa: E501
"""Get card image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_image(card_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str card_id: card_id (required)
:return: GetCardImageResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_card_image_with_http_info(card_id, **kwargs) # noqa: E501
else:
(data) = self.get_card_image_with_http_info(card_id, **kwargs) # noqa: E501
return data
def get_card_image_with_http_info(self, card_id, **kwargs): # noqa: E501
"""Get card image # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_image_with_http_info(card_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str card_id: card_id (required)
:return: GetCardImageResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['card_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_card_image" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'card_id' is set
if self.api_client.client_side_validation and ('card_id' not in params or
params['card_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `card_id` when calling `get_card_image`") # noqa: E501
collection_formats = {}
path_params = {}
if 'card_id' in params:
path_params['card_id'] = params['card_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/image/{card_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetCardImageResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_card_pci_details(self, card_id, **kwargs): # noqa: E501
"""Get card pci details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_pci_details(card_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str card_id: card_id (required)
:return: GetCardPciDetailsResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_card_pci_details_with_http_info(card_id, **kwargs) # noqa: E501
else:
(data) = self.get_card_pci_details_with_http_info(card_id, **kwargs) # noqa: E501
return data
def get_card_pci_details_with_http_info(self, card_id, **kwargs): # noqa: E501
"""Get card pci details # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_pci_details_with_http_info(card_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str card_id: card_id (required)
:return: GetCardPciDetailsResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['card_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_card_pci_details" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'card_id' is set
if self.api_client.client_side_validation and ('card_id' not in params or
params['card_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `card_id` when calling `get_card_pci_details`") # noqa: E501
collection_formats = {}
path_params = {}
if 'card_id' in params:
path_params['card_id'] = params['card_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/pci_details/{card_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetCardPciDetailsResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_card_reserve_account_details_using_get(self, **kwargs): # noqa: E501
"""Card reserve account # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_reserve_account_details_using_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: CardReserveAccountResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_card_reserve_account_details_using_get_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_card_reserve_account_details_using_get_with_http_info(**kwargs) # noqa: E501
return data
def get_card_reserve_account_details_using_get_with_http_info(self, **kwargs): # noqa: E501
"""Card reserve account # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_reserve_account_details_using_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: CardReserveAccountResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_card_reserve_account_details_using_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/reserve', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CardReserveAccountResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_card_statement_using_get(self, card_id, **kwargs): # noqa: E501
"""Get card statement # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_statement_using_get(card_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str card_id: card_id (required)
:param date end_date: end_date
:param date start_date: start_date
:return: GetCardStatementResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_card_statement_using_get_with_http_info(card_id, **kwargs) # noqa: E501
else:
(data) = self.get_card_statement_using_get_with_http_info(card_id, **kwargs) # noqa: E501
return data
def get_card_statement_using_get_with_http_info(self, card_id, **kwargs): # noqa: E501
"""Get card statement # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_statement_using_get_with_http_info(card_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str card_id: card_id (required)
:param date end_date: end_date
:param date start_date: start_date
:return: GetCardStatementResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['card_id', 'end_date', 'start_date'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_card_statement_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'card_id' is set
if self.api_client.client_side_validation and ('card_id' not in params or
params['card_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `card_id` when calling `get_card_statement_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'card_id' in params:
path_params['card_id'] = params['card_id'] # noqa: E501
query_params = []
if 'end_date' in params:
query_params.append(('end_date', params['end_date'])) # noqa: E501
if 'start_date' in params:
query_params.append(('start_date', params['start_date'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/statement/{card_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GetCardStatementResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_card_token_using_token(self, id, **kwargs): # noqa: E501
"""Get a card token # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_token_using_token(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id (required)
:param str device_id: device_id
:param str device_type: device_type
:param str wallet: wallet
:return: list[GetCardTokenResponseVO]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_card_token_using_token_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_card_token_using_token_with_http_info(id, **kwargs) # noqa: E501
return data
def get_card_token_using_token_with_http_info(self, id, **kwargs): # noqa: E501
"""Get a card token # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_token_using_token_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id (required)
:param str device_id: device_id
:param str device_type: device_type
:param str wallet: wallet
:return: list[GetCardTokenResponseVO]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'device_id', 'device_type', 'wallet'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_card_token_using_token" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `get_card_token_using_token`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'device_id' in params:
query_params.append(('device_id', params['device_id'])) # noqa: E501
if 'device_type' in params:
query_params.append(('device_type', params['device_type'])) # noqa: E501
if 'wallet' in params:
query_params.append(('wallet', params['wallet'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/token/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[GetCardTokenResponseVO]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_card_transaction_using_get(self, id, **kwargs): # noqa: E501
"""Get a card transaction # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_transaction_using_get(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id (required)
:param date end_date: end_date
:param date start_date: start_date
:return: CardTransactionResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_card_transaction_using_get_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_card_transaction_using_get_with_http_info(id, **kwargs) # noqa: E501
return data
def get_card_transaction_using_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get a card transaction # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_transaction_using_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id (required)
:param date end_date: end_date
:param date start_date: start_date
:return: CardTransactionResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'end_date', 'start_date'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_card_transaction_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `get_card_transaction_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'end_date' in params:
query_params.append(('end_date', params['end_date'])) # noqa: E501
if 'start_date' in params:
query_params.append(('start_date', params['start_date'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/transaction/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CardTransactionResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_card_using_get(self, id, **kwargs): # noqa: E501
"""Get a card information # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_using_get(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_card_using_get_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_card_using_get_with_http_info(id, **kwargs) # noqa: E501
return data
def get_card_using_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get a card information # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_card_using_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: id (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_card_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `get_card_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='BaseResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_card_business_using_put(self, nucleus_business_id, **kwargs): # noqa: E501
"""Update a card business # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_card_business_using_put(nucleus_business_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str nucleus_business_id: nucleus_business_id (required)
:return: UpdateBusinessResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_card_business_using_put_with_http_info(nucleus_business_id, **kwargs) # noqa: E501
else:
(data) = self.update_card_business_using_put_with_http_info(nucleus_business_id, **kwargs) # noqa: E501
return data
def update_card_business_using_put_with_http_info(self, nucleus_business_id, **kwargs): # noqa: E501
"""Update a card business # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_card_business_using_put_with_http_info(nucleus_business_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str nucleus_business_id: nucleus_business_id (required)
:return: UpdateBusinessResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['nucleus_business_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_card_business_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'nucleus_business_id' is set
if self.api_client.client_side_validation and ('nucleus_business_id' not in params or
params['nucleus_business_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `nucleus_business_id` when calling `update_card_business_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'nucleus_business_id' in params:
path_params['nucleus_business_id'] = params['nucleus_business_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/business/{nucleus_business_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UpdateBusinessResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_card_pin_using_put(self, card_pin_request_co, id, **kwargs): # noqa: E501
"""update a pin card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_card_pin_using_put(card_pin_request_co, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardUpdatePinRequestCO card_pin_request_co: cardPinRequestCO (required)
:param str id: id (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_card_pin_using_put_with_http_info(card_pin_request_co, id, **kwargs) # noqa: E501
else:
(data) = self.update_card_pin_using_put_with_http_info(card_pin_request_co, id, **kwargs) # noqa: E501
return data
def update_card_pin_using_put_with_http_info(self, card_pin_request_co, id, **kwargs): # noqa: E501
"""update a pin card # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_card_pin_using_put_with_http_info(card_pin_request_co, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CardUpdatePinRequestCO card_pin_request_co: cardPinRequestCO (required)
:param str id: id (required)
:return: BaseResponseVO
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['card_pin_request_co', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_card_pin_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'card_pin_request_co' is set
if self.api_client.client_side_validation and ('card_pin_request_co' not in params or
params['card_pin_request_co'] is None): # noqa: E501
raise ValueError("Missing the required parameter `card_pin_request_co` when calling `update_card_pin_using_put`") # noqa: E501
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `update_card_pin_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'card_pin_request_co' in params:
body_params = params['card_pin_request_co']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/card/pin/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='BaseResponseVO', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_card_spending_control_using_put(self, nucleus_spending_control_id, **kwargs): # noqa: E501
"""Update a card spending control # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_card_spending_control_using_put(nucleus_spending_control_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str nucleus_spending_control_id: nucleus_spending_control_id (required)
:return: CardSpendingControlResponseVO
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_card_spending_control_using_put_with_http_info(nucleus_spending_control_id, **kwargs) # noqa: E501
else:
            (data) = self.update_card_spending_control_using_put_with_http_info(nucleus_spending_control_id, **kwargs)  # noqa: E501
            return data

    def update_card_spending_control_using_put_with_http_info(self, nucleus_spending_control_id, **kwargs):  # noqa: E501
        """Update a card spending control  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_card_spending_control_using_put_with_http_info(nucleus_spending_control_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str nucleus_spending_control_id: nucleus_spending_control_id (required)
        :return: CardSpendingControlResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['nucleus_spending_control_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_card_spending_control_using_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'nucleus_spending_control_id' is set
        if self.api_client.client_side_validation and ('nucleus_spending_control_id' not in params or
                                                       params['nucleus_spending_control_id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `nucleus_spending_control_id` when calling `update_card_spending_control_using_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'nucleus_spending_control_id' in params:
            path_params['nucleus_spending_control_id'] = params['nucleus_spending_control_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/spending_control/{nucleus_spending_control_id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='CardSpendingControlResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
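The kwargs guard each generated method runs before dispatching can be sketched in isolation. `check_kwargs` below is a hypothetical helper, not part of the SDK; it mirrors the `TypeError` the `*_with_http_info` methods raise for undeclared keyword arguments:

```python
def check_kwargs(all_params, **kwargs):
    # Reject any keyword argument the generated method does not declare,
    # mirroring the loop over six.iteritems(params['kwargs']) above.
    params = {}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key
            )
        params[key] = val
    return params

# Declared kwargs pass through unchanged; unknown ones raise immediately.
assert check_kwargs(['async_req', '_request_timeout'], async_req=True) == {'async_req': True}
try:
    check_kwargs(['async_req'], retries=3)
    raise AssertionError("expected TypeError")
except TypeError as exc:
    assert "retries" in str(exc)
```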
    def update_card_using_put(self, id, **kwargs):  # noqa: E501
        """Update a card information  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_card_using_put(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_card_using_put_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.update_card_using_put_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def update_card_using_put_with_http_info(self, id, **kwargs):  # noqa: E501
        """Update a card information  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_card_using_put_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_card_using_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `update_card_using_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='BaseResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_client_card_using_put(self, id, **kwargs):  # noqa: E501
        """Update a card client  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_client_card_using_put(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id (required)
        :return: UpdateCardClientResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_client_card_using_put_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.update_client_card_using_put_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def update_client_card_using_put_with_http_info(self, id, **kwargs):  # noqa: E501
        """Update a card client  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_client_card_using_put_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: id (required)
        :return: UpdateCardClientResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_client_card_using_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `update_client_card_using_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/client/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='UpdateCardClientResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def verify_card_pin_using_post(self, card_pin_request_co, **kwargs):  # noqa: E501
        """verify card pin  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.verify_card_pin_using_post(card_pin_request_co, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardPinRequestCO card_pin_request_co: cardPinRequestCO (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.verify_card_pin_using_post_with_http_info(card_pin_request_co, **kwargs)  # noqa: E501
        else:
            (data) = self.verify_card_pin_using_post_with_http_info(card_pin_request_co, **kwargs)  # noqa: E501
            return data

    def verify_card_pin_using_post_with_http_info(self, card_pin_request_co, **kwargs):  # noqa: E501
        """verify card pin  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.verify_card_pin_using_post_with_http_info(card_pin_request_co, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param CardPinRequestCO card_pin_request_co: cardPinRequestCO (required)
        :return: BaseResponseVO
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['card_pin_request_co']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method verify_card_pin_using_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'card_pin_request_co' is set
        if self.api_client.client_side_validation and ('card_pin_request_co' not in params or
                                                       params['card_pin_request_co'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `card_pin_request_co` when calling `verify_card_pin_using_post`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'card_pin_request_co' in params:
            body_params = params['card_pin_request_co']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['oauth2']  # noqa: E501

        return self.api_client.call_api(
            '/card/pin/verify', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='BaseResponseVO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
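Every wrapper above dispatches on `async_req`: synchronous calls return the deserialized value, while `async_req=True` hands back a thread-like handle whose `.get()` blocks for the result. `MiniApi` below is a toy stand-in built on a plain thread pool, not the real client, sketching that contract:

```python
from multiprocessing.pool import ThreadPool


class MiniApi:
    """Toy stand-in mimicking the generated client's async_req pattern."""

    def __init__(self):
        self._pool = ThreadPool(1)

    def update_card(self, card_id, async_req=False):
        if async_req:
            # Returns an AsyncResult; call .get() to block for the value,
            # just like the thread returned by the generated methods.
            return self._pool.apply_async(self._do_update, (card_id,))
        return self._do_update(card_id)

    def _do_update(self, card_id):
        return {"id": card_id, "status": "updated"}


api = MiniApi()
assert api.update_card("42") == {"id": "42", "status": "updated"}
thread = api.update_card("42", async_req=True)
assert thread.get()["status"] == "updated"
```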

# --- bfs/tests/test_bfs_client.py | NordFk/bfs-soap-api-wrapper | license: Apache-2.0 ---

from unittest import TestCase
import json

from bfs import Bfs


class TestBfs(TestCase):

    def test_bricknode_configuration_warning(self):
        with self.assertRaises(ValueError) as cm:
            bfs = Bfs({})
        self.assertTrue(isinstance(cm.exception, ValueError))
        self.assertEqual(cm.exception.args[0], '"bricknode" element missing from configuration')

    def test_wsdl_configuration_warning(self):
        with self.assertRaises(ValueError) as cm:
            bfs = Bfs({'bricknode': {}})
        self.assertTrue(isinstance(cm.exception, ValueError))
        self.assertEqual(cm.exception.args[0], '"wsdl" element missing from "bricknode" configuration')

    def test_no_url_in_wsdl_configuration_warning(self):
        with self.assertRaises(ValueError) as cm:
            bfs = Bfs({'bricknode': {'wsdl': ''}})
        self.assertTrue(isinstance(cm.exception, ValueError))
        self.assertEqual(cm.exception.args[0], 'No URL given for the wsdl')
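The three error messages above imply a layered check of the configuration dict. `validate_config` is a hypothetical re-creation of that logic (the real checks live inside `Bfs.__init__`), shown only to make the expected layering explicit:

```python
def validate_config(config):
    # Hypothetical validator reproducing the checks the tests exercise,
    # from the outermost key down to the wsdl URL itself.
    if 'bricknode' not in config:
        raise ValueError('"bricknode" element missing from configuration')
    if 'wsdl' not in config['bricknode']:
        raise ValueError('"wsdl" element missing from "bricknode" configuration')
    if not config['bricknode']['wsdl']:
        raise ValueError('No URL given for the wsdl')


for cfg, msg in [
    ({}, '"bricknode" element missing from configuration'),
    ({'bricknode': {}}, '"wsdl" element missing from "bricknode" configuration'),
    ({'bricknode': {'wsdl': ''}}, 'No URL given for the wsdl'),
]:
    try:
        validate_config(cfg)
        raise AssertionError("expected ValueError")
    except ValueError as exc:
        assert exc.args[0] == msg
```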

# --- colormaps/cmaps.py | pratiman-91/colormaps | license: MIT ---

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import os
import re
from glob import glob

import matplotlib.cm
import numpy as np

# from ._version import __version__
from .colormap import Colormap

CMAPSFILE_DIR = os.path.join(
    os.path.dirname(os.path.abspath(__file__)), 'colormaps')
USER_CMAPFILE_DIR = os.environ.get('CMAP_DIR')


class Cmaps(object):
    """colormaps"""

    def __init__(self):
        self._parse_cmaps()
        # self.__version__ = __version__

    def _coltbl(self, cmap_file):
        pattern = re.compile(r'(\d\.?\d*)\s+(\d\.?\d*)\s+(\d\.?\d*).*')
        with open(cmap_file) as cmap:
            cmap_buff = cmap.read()
        cmap_buff = re.compile('ncolors.*\n').sub('', cmap_buff)
        if re.search(r'\s*\d\.\d*', cmap_buff):
            return np.asarray(pattern.findall(cmap_buff), 'f4')
        else:
            return np.asarray(pattern.findall(cmap_buff), 'u1') / 255.
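`_coltbl` strips any `ncolors` header line, then distinguishes float tables (values already on 0-1) from integer tables (0-255, rescaled) by searching for a decimal point. The same branch can be exercised on an in-memory buffer instead of a file:

```python
import re

import numpy as np

pattern = re.compile(r'(\d\.?\d*)\s+(\d\.?\d*)\s+(\d\.?\d*).*')

# An integer-valued table with an 'ncolors' header, as found in .rgb files.
cmap_buff = "ncolors = 2\n255   0   0\n  0 128 255\n"
cmap_buff = re.compile('ncolors.*\n').sub('', cmap_buff)
if re.search(r'\s*\d\.\d*', cmap_buff):
    tbl = np.asarray(pattern.findall(cmap_buff), 'f4')         # already 0-1 floats
else:
    tbl = np.asarray(pattern.findall(cmap_buff), 'u1') / 255.  # rescale 0-255 ints

assert tbl.shape == (2, 3)
assert tbl[0, 0] == 1.0  # 255 / 255
```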
    @property
    def colors(self):
        """
        Colors expressed on the range 0-1 as used by matplotlib.
        """
        # NOTE: iterating over ``self.colors`` re-enters this property and
        # recurses; the loop presumably meant to read a stored 0-255 table.
        mc = []
        for color in self.colors:
            mc.append(tuple([x / 255. for x in color]))
        return mc

    def _parse_cmaps(self):
        if USER_CMAPFILE_DIR is not None:
            cmapsflist = sorted(glob(os.path.join(USER_CMAPFILE_DIR, '*.rgb')))
            for cmap_file in cmapsflist:
                cname = os.path.basename(cmap_file).split('.rgb')[0]
                # a name starting with a digit would be an illegal attribute
                if cname[0].isdigit() or cname.startswith('_'):
                    cname = 'C' + cname
                if '-' in cname:
                    cname = cname.replace('-', '_')
                if '+' in cname:
                    cname = cname.replace('+', '_')
                cmap = Colormap(self._coltbl(cmap_file), name=cname)
                matplotlib.cm.register_cmap(name=cname, cmap=cmap)
                setattr(self, cname, cmap)
                cname = cname + '_r'
                cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
                matplotlib.cm.register_cmap(name=cname, cmap=cmap)
                setattr(self, cname, cmap)
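`_parse_cmaps` must turn arbitrary `.rgb` filenames into legal Python attribute names before calling `setattr`. The renaming rules can be pulled out into a standalone function; `sanitize_cname` is a hypothetical helper mirroring that logic, not part of the package:

```python
def sanitize_cname(basename):
    # Mirror the renaming rules in Cmaps._parse_cmaps: prefix names that
    # start with a digit or underscore with 'C', and replace '-'/'+' with '_'.
    cname = basename.split('.rgb')[0]
    if cname[0].isdigit() or cname.startswith('_'):
        cname = 'C' + cname
    if '-' in cname:
        cname = cname.replace('-', '_')
    if '+' in cname:
        cname = cname.replace('+', '_')
    return cname


assert sanitize_cname('3gauss.rgb') == 'C3gauss'
assert sanitize_cname('temp-c.rgb') == 'temp_c'
assert sanitize_cname('mint.rgb') == 'mint'
```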
    @property
    def aggrnyl(self):
        cname = "aggrnyl"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "aggrnyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def aggrnyl_r(self):
        cname = "aggrnyl_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "aggrnyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap
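Every property in this class follows the same look-up-then-register shape: return the colormap from matplotlib's registry if present, otherwise build it from the `.rgb` file and register it. Stripped of matplotlib, that reduces to a memoizing registry; the sketch below is generic, not the package's API:

```python
registry = {}


def get_or_register(name, build):
    # Caching pattern used by each cmap property: look up first,
    # build and register only on a miss.
    if name in registry:
        return registry[name]
    cmap = build(name)
    registry[name] = cmap
    return cmap


calls = []


def build(name):
    calls.append(name)
    return "cmap:" + name


assert get_or_register("aggrnyl", build) == "cmap:aggrnyl"
assert get_or_register("aggrnyl", build) == "cmap:aggrnyl"
assert calls == ["aggrnyl"]  # built only once
```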
    @property
    def agsunset(self):
        cname = "agsunset"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "agsunset.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def agsunset_r(self):
        cname = "agsunset_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "agsunset.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def antique(self):
        cname = "antique"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "antique.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def antique_r(self):
        cname = "antique_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "antique.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def armyrose(self):
        cname = "armyrose"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "armyrose.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def armyrose_r(self):
        cname = "armyrose_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "armyrose.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blugrn(self):
        cname = "blugrn"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "blugrn.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blugrn_r(self):
        cname = "blugrn_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "blugrn.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bluyl(self):
        cname = "bluyl"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "bluyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bluyl_r(self):
        cname = "bluyl_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "bluyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap
    @property
    def bold(self):
        cname = "bold"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "bold.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bold_r(self):
        cname = "bold_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "bold.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def brwnyl(self):
        cname = "brwnyl"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "brwnyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def brwnyl_r(self):
        cname = "brwnyl_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "brwnyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def burg(self):
        cname = "burg"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "burg.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def burg_r(self):
        cname = "burg_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "burg.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def burgyl(self):
        cname = "burgyl"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "burgyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def burgyl_r(self):
        cname = "burgyl_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "burgyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def darkmint(self):
        cname = "darkmint"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "darkmint.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def darkmint_r(self):
        cname = "darkmint_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "darkmint.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap
    @property
    def earth(self):
        cname = "earth"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "earth.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def earth_r(self):
        cname = "earth_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "earth.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def emrld(self):
        cname = "emrld"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "emrld.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def emrld_r(self):
        cname = "emrld_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "emrld.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def fall(self):
        cname = "fall"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "fall.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def fall_r(self):
        cname = "fall_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "fall.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def geyser(self):
        cname = "geyser"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "geyser.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def geyser_r(self):
        cname = "geyser_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "geyser.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def magenta(self):
        cname = "magenta"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "magenta.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def magenta_r(self):
        cname = "magenta_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "magenta.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap
    @property
    def mint(self):
        cname = "mint"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "mint.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def mint_r(self):
        cname = "mint_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "mint.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def oryel(self):
        cname = "oryel"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "oryel.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def oryel_r(self):
        cname = "oryel_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "oryel.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def pastel(self):
        cname = "pastel"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "pastel.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def pastel_r(self):
        cname = "pastel_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "pastel.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def peach(self):
        cname = "peach"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "peach.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def peach_r(self):
        cname = "peach_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "peach.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def pinkyl(self):
        cname = "pinkyl"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "pinkyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def pinkyl_r(self):
        cname = "pinkyl_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "pinkyl.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap
    @property
    def prism(self):
        cname = "prism"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "prism.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def prism_r(self):
        cname = "prism_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "prism.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purp(self):
        cname = "purp"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "purp.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purp_r(self):
        cname = "purp_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "purp.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purpor(self):
        cname = "purpor"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "purpor.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purpor_r(self):
        cname = "purpor_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "purpor.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def redor(self):
        cname = "redor"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "redor.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def redor_r(self):
        cname = "redor_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "redor.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def safe(self):
        cname = "safe"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "safe.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def safe_r(self):
        cname = "safe_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "safe.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap
    @property
    def sunset(self):
        cname = "sunset"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "sunset.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def sunset_r(self):
        cname = "sunset_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "sunset.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def sunsetdark(self):
        cname = "sunsetdark"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "sunsetdark.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def sunsetdark_r(self):
        cname = "sunsetdark_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "sunsetdark.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def teal(self):
        cname = "teal"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "teal.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def teal_r(self):
        cname = "teal_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "teal.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tealgrn(self):
        cname = "tealgrn"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "tealgrn.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tealgrn_r(self):
        cname = "tealgrn_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "tealgrn.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tealrose(self):
        cname = "tealrose"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "tealrose.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tealrose_r(self):
        cname = "tealrose_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "tealrose.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap
@property
def temps(self):
cname = "temps"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "temps.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def temps_r(self):
cname = "temps_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "temps.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tropic(self):
cname = "tropic"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "tropic.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tropic_r(self):
cname = "tropic_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "tropic.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def vivid(self):
cname = "vivid"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "vivid.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def vivid_r(self):
cname = "vivid_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cartocolors", "vivid.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def algae(self):
cname = "algae"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "algae.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def algae_r(self):
cname = "algae_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "algae.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amp(self):
cname = "amp"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "amp.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amp_r(self):
cname = "amp_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "amp.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def balance(self):
cname = "balance"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "balance.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def balance_r(self):
cname = "balance_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "balance.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def curl(self):
cname = "curl"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "curl.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def curl_r(self):
cname = "curl_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "curl.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def deep(self):
cname = "deep"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "deep.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def deep_r(self):
cname = "deep_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "deep.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def delta(self):
cname = "delta"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "delta.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def delta_r(self):
cname = "delta_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "delta.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dense(self):
cname = "dense"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "dense.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dense_r(self):
cname = "dense_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "dense.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gray(self):
cname = "gray"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "gray.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gray_r(self):
cname = "gray_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "gray.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def haline(self):
cname = "haline"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "haline.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def haline_r(self):
cname = "haline_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "haline.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ice(self):
cname = "ice"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "ice.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ice_r(self):
cname = "ice_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "ice.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matter(self):
cname = "matter"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "matter.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matter_r(self):
cname = "matter_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "matter.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oxy(self):
cname = "oxy"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "oxy.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oxy_r(self):
cname = "oxy_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "oxy.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def phase(self):
cname = "phase"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "phase.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def phase_r(self):
cname = "phase_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "phase.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def solar(self):
cname = "solar"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "solar.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def solar_r(self):
cname = "solar_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "solar.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def speed(self):
cname = "speed"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "speed.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def speed_r(self):
cname = "speed_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "speed.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tempo(self):
cname = "tempo"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "tempo.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tempo_r(self):
cname = "tempo_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "tempo.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def thermal(self):
cname = "thermal"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "thermal.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def thermal_r(self):
cname = "thermal_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "thermal.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def turbid(self):
cname = "turbid"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "turbid.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def turbid_r(self):
cname = "turbid_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cmocean", "turbid.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent(self):
cname = "accent"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_r(self):
cname = "accent_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_3(self):
cname = "accent_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_3_r(self):
cname = "accent_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_4(self):
cname = "accent_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_4_r(self):
cname = "accent_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_5(self):
cname = "accent_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_5_r(self):
cname = "accent_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_6(self):
cname = "accent_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_6_r(self):
cname = "accent_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_7(self):
cname = "accent_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_7_r(self):
cname = "accent_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_8(self):
cname = "accent_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def accent_8_r(self):
cname = "accent_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "accent_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues(self):
cname = "blues"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_r(self):
cname = "blues_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_3(self):
cname = "blues_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_3_r(self):
cname = "blues_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_4(self):
cname = "blues_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_4_r(self):
cname = "blues_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_5(self):
cname = "blues_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_5_r(self):
cname = "blues_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_6(self):
cname = "blues_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_6_r(self):
cname = "blues_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_7(self):
cname = "blues_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_7_r(self):
cname = "blues_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_8(self):
cname = "blues_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_8_r(self):
cname = "blues_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_9(self):
cname = "blues_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blues_9_r(self):
cname = "blues_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "blues_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg(self):
cname = "brbg"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_r(self):
cname = "brbg_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_10(self):
cname = "brbg_10"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_10.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_10_r(self):
cname = "brbg_10_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_10.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_11(self):
cname = "brbg_11"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_11.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_11_r(self):
cname = "brbg_11_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_11.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_3(self):
cname = "brbg_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_3_r(self):
cname = "brbg_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_4(self):
cname = "brbg_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_4_r(self):
cname = "brbg_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_5(self):
cname = "brbg_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_5_r(self):
cname = "brbg_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_6(self):
cname = "brbg_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_6_r(self):
cname = "brbg_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_7(self):
cname = "brbg_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_7_r(self):
cname = "brbg_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_8(self):
cname = "brbg_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_8_r(self):
cname = "brbg_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_9(self):
cname = "brbg_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brbg_9_r(self):
cname = "brbg_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "brbg_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn(self):
cname = "bugn"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_r(self):
cname = "bugn_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_3(self):
cname = "bugn_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_3_r(self):
cname = "bugn_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_4(self):
cname = "bugn_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_4_r(self):
cname = "bugn_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_5(self):
cname = "bugn_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_5_r(self):
cname = "bugn_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_6(self):
cname = "bugn_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_6_r(self):
cname = "bugn_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_7(self):
cname = "bugn_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_7_r(self):
cname = "bugn_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_8(self):
cname = "bugn_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_8_r(self):
cname = "bugn_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_9(self):
cname = "bugn_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bugn_9_r(self):
cname = "bugn_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bugn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
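Every property in this file builds its table via `self._coltbl(cmap_file)`, whose implementation is outside this chunk. As a minimal sketch of what such a loader might look like, the helper below reads whitespace-separated `r g b` rows and normalizes 0-255 tables to the 0-1 range matplotlib expects. The function name `load_rgb_table` and the exact `.rgb` layout (possible header lines, integer vs. float triples) are assumptions, not the package's actual format.

```python
import numpy as np

def load_rgb_table(path):
    """Read whitespace-separated r g b rows, skipping non-numeric lines.

    Hypothetical stand-in for the _coltbl() helper this file assumes;
    real .rgb files may differ (headers, comments, column counts).
    """
    rows = []
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if len(parts) < 3:
                continue  # blank line or too few columns
            try:
                rgb = [float(v) for v in parts[:3]]
            except ValueError:
                continue  # header line such as "ncolors=9"
            rows.append(rgb)
    table = np.asarray(rows)
    # Normalize 0-255 integer tables to the 0-1 floats matplotlib expects.
    if table.size and table.max() > 1.0:
        table = table / 255.0
    return table
```

A table loaded this way can be passed straight to a `ListedColormap`, and reversing it for the `*_r` variants is the same `[::-1]` slice used throughout this file.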
@property
def bupu(self):
cname = "bupu"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_r(self):
cname = "bupu_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_3(self):
cname = "bupu_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_3_r(self):
cname = "bupu_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_4(self):
cname = "bupu_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_4_r(self):
cname = "bupu_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_5(self):
cname = "bupu_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_5_r(self):
cname = "bupu_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_6(self):
cname = "bupu_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_6_r(self):
cname = "bupu_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_7(self):
cname = "bupu_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_7_r(self):
cname = "bupu_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_8(self):
cname = "bupu_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_8_r(self):
cname = "bupu_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_9(self):
cname = "bupu_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bupu_9_r(self):
cname = "bupu_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "bupu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2(self):
cname = "dark2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_r(self):
cname = "dark2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_3(self):
cname = "dark2_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_3_r(self):
cname = "dark2_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_4(self):
cname = "dark2_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_4_r(self):
cname = "dark2_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_5(self):
cname = "dark2_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_5_r(self):
cname = "dark2_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_6(self):
cname = "dark2_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_6_r(self):
cname = "dark2_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_7(self):
cname = "dark2_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_7_r(self):
cname = "dark2_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_8(self):
cname = "dark2_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dark2_8_r(self):
cname = "dark2_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "dark2_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu(self):
cname = "gnbu"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_r(self):
cname = "gnbu_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_3(self):
cname = "gnbu_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_3_r(self):
cname = "gnbu_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_4(self):
cname = "gnbu_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_4_r(self):
cname = "gnbu_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_5(self):
cname = "gnbu_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_5_r(self):
cname = "gnbu_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_6(self):
cname = "gnbu_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_6_r(self):
cname = "gnbu_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_7(self):
cname = "gnbu_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_7_r(self):
cname = "gnbu_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_8(self):
cname = "gnbu_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_8_r(self):
cname = "gnbu_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_9(self):
cname = "gnbu_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gnbu_9_r(self):
cname = "gnbu_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "gnbu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens(self):
cname = "greens"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_r(self):
cname = "greens_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_3(self):
cname = "greens_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_3_r(self):
cname = "greens_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_4(self):
cname = "greens_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_4_r(self):
cname = "greens_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_5(self):
cname = "greens_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_5_r(self):
cname = "greens_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_6(self):
cname = "greens_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_6_r(self):
cname = "greens_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_7(self):
cname = "greens_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_7_r(self):
cname = "greens_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_8(self):
cname = "greens_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_8_r(self):
cname = "greens_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_9(self):
cname = "greens_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greens_9_r(self):
cname = "greens_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greens_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys(self):
cname = "greys"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_r(self):
cname = "greys_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_3(self):
cname = "greys_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_3_r(self):
cname = "greys_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_4(self):
cname = "greys_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_4_r(self):
cname = "greys_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_5(self):
cname = "greys_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_5_r(self):
cname = "greys_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_6(self):
cname = "greys_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_6_r(self):
cname = "greys_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_7(self):
cname = "greys_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_7_r(self):
cname = "greys_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_8(self):
cname = "greys_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_8_r(self):
cname = "greys_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_9(self):
cname = "greys_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def greys_9_r(self):
cname = "greys_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "greys_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oranges(self):
cname = "oranges"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "oranges.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oranges_r(self):
cname = "oranges_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "oranges.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oranges_3(self):
cname = "oranges_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "oranges_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oranges_3_r(self):
cname = "oranges_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "oranges_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oranges_4(self):
cname = "oranges_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "oranges_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oranges_4_r(self):
cname = "oranges_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "oranges_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
    def _cmap_property(cname):
        """Return a lazily-built, cached colormap property named ``cname``.

        Names ending in ``_r`` read the forward map's ``.rgb`` table and
        reverse it. The first access builds the colormap and registers it
        with matplotlib; every later access returns the registered instance.
        """
        reverse = cname.endswith("_r")
        fname = cname[:-2] if reverse else cname

        def getter(self):
            try:
                # Already registered: fetch through the public API rather
                # than touching the private matplotlib.cm._cmap_registry.
                return matplotlib.cm.get_cmap(cname)
            except ValueError:
                pass
            cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", fname + ".rgb")
            colors = self._coltbl(cmap_file)
            if reverse:
                colors = colors[::-1]
            cmap = Colormap(colors, name=cname)
            matplotlib.cm.register_cmap(name=cname, cmap=cmap)
            return cmap

        getter.__name__ = cname
        return property(getter)

    oranges_5 = _cmap_property("oranges_5")
    oranges_5_r = _cmap_property("oranges_5_r")
    oranges_6 = _cmap_property("oranges_6")
    oranges_6_r = _cmap_property("oranges_6_r")
    oranges_7 = _cmap_property("oranges_7")
    oranges_7_r = _cmap_property("oranges_7_r")
    oranges_8 = _cmap_property("oranges_8")
    oranges_8_r = _cmap_property("oranges_8_r")
    oranges_9 = _cmap_property("oranges_9")
    oranges_9_r = _cmap_property("oranges_9_r")
    orrd = _cmap_property("orrd")
    orrd_r = _cmap_property("orrd_r")
    orrd_3 = _cmap_property("orrd_3")
    orrd_3_r = _cmap_property("orrd_3_r")
    orrd_4 = _cmap_property("orrd_4")
    orrd_4_r = _cmap_property("orrd_4_r")
    orrd_5 = _cmap_property("orrd_5")
    orrd_5_r = _cmap_property("orrd_5_r")
    orrd_6 = _cmap_property("orrd_6")
    orrd_6_r = _cmap_property("orrd_6_r")
    orrd_7 = _cmap_property("orrd_7")
    orrd_7_r = _cmap_property("orrd_7_r")
    orrd_8 = _cmap_property("orrd_8")
    orrd_8_r = _cmap_property("orrd_8_r")
    orrd_9 = _cmap_property("orrd_9")
    orrd_9_r = _cmap_property("orrd_9_r")
    paired = _cmap_property("paired")
    paired_r = _cmap_property("paired_r")
    paired_10 = _cmap_property("paired_10")
    paired_10_r = _cmap_property("paired_10_r")
    paired_11 = _cmap_property("paired_11")
    paired_11_r = _cmap_property("paired_11_r")
    paired_12 = _cmap_property("paired_12")
    paired_12_r = _cmap_property("paired_12_r")
    paired_3 = _cmap_property("paired_3")
    paired_3_r = _cmap_property("paired_3_r")
    paired_4 = _cmap_property("paired_4")
    paired_4_r = _cmap_property("paired_4_r")
    paired_5 = _cmap_property("paired_5")
    paired_5_r = _cmap_property("paired_5_r")
    paired_6 = _cmap_property("paired_6")
    paired_6_r = _cmap_property("paired_6_r")
    paired_7 = _cmap_property("paired_7")
    paired_7_r = _cmap_property("paired_7_r")
    paired_8 = _cmap_property("paired_8")
    paired_8_r = _cmap_property("paired_8_r")
    paired_9 = _cmap_property("paired_9")
    paired_9_r = _cmap_property("paired_9_r")
    pastel1 = _cmap_property("pastel1")
    pastel1_r = _cmap_property("pastel1_r")
    pastel1_3 = _cmap_property("pastel1_3")
    pastel1_3_r = _cmap_property("pastel1_3_r")
    pastel1_4 = _cmap_property("pastel1_4")
    pastel1_4_r = _cmap_property("pastel1_4_r")
    pastel1_5 = _cmap_property("pastel1_5")
    pastel1_5_r = _cmap_property("pastel1_5_r")
    pastel1_6 = _cmap_property("pastel1_6")
    pastel1_6_r = _cmap_property("pastel1_6_r")
    pastel1_7 = _cmap_property("pastel1_7")
    pastel1_7_r = _cmap_property("pastel1_7_r")
    pastel1_8 = _cmap_property("pastel1_8")
    pastel1_8_r = _cmap_property("pastel1_8_r")
    pastel1_9 = _cmap_property("pastel1_9")
    pastel1_9_r = _cmap_property("pastel1_9_r")
    pastel2 = _cmap_property("pastel2")
    pastel2_r = _cmap_property("pastel2_r")
    pastel2_3 = _cmap_property("pastel2_3")
    pastel2_3_r = _cmap_property("pastel2_3_r")
    pastel2_4 = _cmap_property("pastel2_4")
    pastel2_4_r = _cmap_property("pastel2_4_r")
    pastel2_5 = _cmap_property("pastel2_5")
    pastel2_5_r = _cmap_property("pastel2_5_r")
    pastel2_6 = _cmap_property("pastel2_6")
    pastel2_6_r = _cmap_property("pastel2_6_r")
    pastel2_7 = _cmap_property("pastel2_7")
    pastel2_7_r = _cmap_property("pastel2_7_r")
    pastel2_8 = _cmap_property("pastel2_8")
    pastel2_8_r = _cmap_property("pastel2_8_r")
    piyg = _cmap_property("piyg")
    piyg_r = _cmap_property("piyg_r")
    piyg_10 = _cmap_property("piyg_10")
    piyg_10_r = _cmap_property("piyg_10_r")
    piyg_11 = _cmap_property("piyg_11")
    piyg_11_r = _cmap_property("piyg_11_r")
    piyg_3 = _cmap_property("piyg_3")
    piyg_3_r = _cmap_property("piyg_3_r")
    piyg_4 = _cmap_property("piyg_4")
    piyg_4_r = _cmap_property("piyg_4_r")
    piyg_5 = _cmap_property("piyg_5")
    piyg_5_r = _cmap_property("piyg_5_r")
    piyg_6 = _cmap_property("piyg_6")
    piyg_6_r = _cmap_property("piyg_6_r")
    piyg_7 = _cmap_property("piyg_7")
    piyg_7_r = _cmap_property("piyg_7_r")
    piyg_8 = _cmap_property("piyg_8")
    piyg_8_r = _cmap_property("piyg_8_r")
    piyg_9 = _cmap_property("piyg_9")
    piyg_9_r = _cmap_property("piyg_9_r")
    prgn = _cmap_property("prgn")
    prgn_r = _cmap_property("prgn_r")
    prgn_10 = _cmap_property("prgn_10")
    prgn_10_r = _cmap_property("prgn_10_r")
    prgn_11 = _cmap_property("prgn_11")
    prgn_11_r = _cmap_property("prgn_11_r")
    prgn_3 = _cmap_property("prgn_3")
    prgn_3_r = _cmap_property("prgn_3_r")
    prgn_4 = _cmap_property("prgn_4")
    prgn_4_r = _cmap_property("prgn_4_r")
    prgn_5 = _cmap_property("prgn_5")
    prgn_5_r = _cmap_property("prgn_5_r")
    prgn_6 = _cmap_property("prgn_6")
    prgn_6_r = _cmap_property("prgn_6_r")

    del _cmap_property  # class-creation helper only; not part of the API

@property
def prgn_7(self):
cname = "prgn_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "prgn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prgn_7_r(self):
cname = "prgn_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "prgn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prgn_8(self):
cname = "prgn_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "prgn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prgn_8_r(self):
cname = "prgn_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "prgn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prgn_9(self):
cname = "prgn_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "prgn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prgn_9_r(self):
cname = "prgn_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "prgn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu(self):
cname = "pubu"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_r(self):
cname = "pubu_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_3(self):
cname = "pubu_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_3_r(self):
cname = "pubu_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_4(self):
cname = "pubu_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_4_r(self):
cname = "pubu_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_5(self):
cname = "pubu_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_5_r(self):
cname = "pubu_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_6(self):
cname = "pubu_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_6_r(self):
cname = "pubu_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_7(self):
cname = "pubu_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_7_r(self):
cname = "pubu_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_8(self):
cname = "pubu_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_8_r(self):
cname = "pubu_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_9(self):
cname = "pubu_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubu_9_r(self):
cname = "pubu_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn(self):
cname = "pubugn"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_r(self):
cname = "pubugn_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_3(self):
cname = "pubugn_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_3_r(self):
cname = "pubugn_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_4(self):
cname = "pubugn_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_4_r(self):
cname = "pubugn_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_5(self):
cname = "pubugn_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_5_r(self):
cname = "pubugn_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_6(self):
cname = "pubugn_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_6_r(self):
cname = "pubugn_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_7(self):
cname = "pubugn_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_7_r(self):
cname = "pubugn_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_8(self):
cname = "pubugn_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_8_r(self):
cname = "pubugn_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_9(self):
cname = "pubugn_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pubugn_9_r(self):
cname = "pubugn_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "pubugn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor(self):
cname = "puor"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_r(self):
cname = "puor_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_10(self):
cname = "puor_10"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_10.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_10_r(self):
cname = "puor_10_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_10.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_11(self):
cname = "puor_11"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_11.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_11_r(self):
cname = "puor_11_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_11.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_3(self):
cname = "puor_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_3_r(self):
cname = "puor_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_4(self):
cname = "puor_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_4_r(self):
cname = "puor_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_5(self):
cname = "puor_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_5_r(self):
cname = "puor_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_6(self):
cname = "puor_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_6_r(self):
cname = "puor_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_7(self):
cname = "puor_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_7_r(self):
cname = "puor_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_8(self):
cname = "puor_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_8_r(self):
cname = "puor_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_9(self):
cname = "puor_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def puor_9_r(self):
cname = "puor_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "puor_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd(self):
cname = "purd"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_r(self):
cname = "purd_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_3(self):
cname = "purd_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_3_r(self):
cname = "purd_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_4(self):
cname = "purd_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_4_r(self):
cname = "purd_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_5(self):
cname = "purd_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_5_r(self):
cname = "purd_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_6(self):
cname = "purd_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_6_r(self):
cname = "purd_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_7(self):
cname = "purd_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_7_r(self):
cname = "purd_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_8(self):
cname = "purd_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_8_r(self):
cname = "purd_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_9(self):
cname = "purd_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purd_9_r(self):
cname = "purd_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purd_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples(self):
cname = "purples"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_r(self):
cname = "purples_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_3(self):
cname = "purples_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_3_r(self):
cname = "purples_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_4(self):
cname = "purples_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_4_r(self):
cname = "purples_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_5(self):
cname = "purples_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_5_r(self):
cname = "purples_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_6(self):
cname = "purples_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_6_r(self):
cname = "purples_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_7(self):
cname = "purples_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_7_r(self):
cname = "purples_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_8(self):
cname = "purples_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_8_r(self):
cname = "purples_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_9(self):
cname = "purples_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purples_9_r(self):
cname = "purples_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "purples_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
    # ------------------------------------------------------------------ #
    # ColorBrewer tables (rdbu, rdgy, rdpu, rdylbu, rdylgn, reds, set1).
    # Every accessor below followed the same load-cache-register pattern,
    # so it is factored through one helper.  ``_brewer`` is a private
    # helper introduced by this refactor, not part of the public API.
    # ------------------------------------------------------------------ #
    def _brewer(self, cname):
        """Return the ColorBrewer colormap ``cname``, registering it once.

        The name maps directly onto an ``.rgb`` table under
        ``CMAPSFILE_DIR/colorbrewer``: a trailing ``_r`` selects the
        reversed table.  ``matplotlib.cm._cmap_registry`` and
        ``matplotlib.cm.register_cmap`` are the pre-3.7 matplotlib
        interfaces used throughout this module.
        """
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        reverse = cname.endswith("_r")
        base = cname[:-2] if reverse else cname
        cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", base + ".rgb")
        coltbl = self._coltbl(cmap_file)
        cmap = Colormap(coltbl[::-1] if reverse else coltbl, name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    # Generated one-line properties; each is equivalent to the original
    # eight-line accessor of the same name.
    rdbu = property(lambda self: self._brewer("rdbu"))
    rdbu_r = property(lambda self: self._brewer("rdbu_r"))
    rdbu_10 = property(lambda self: self._brewer("rdbu_10"))
    rdbu_10_r = property(lambda self: self._brewer("rdbu_10_r"))
    rdbu_11 = property(lambda self: self._brewer("rdbu_11"))
    rdbu_11_r = property(lambda self: self._brewer("rdbu_11_r"))
    rdbu_3 = property(lambda self: self._brewer("rdbu_3"))
    rdbu_3_r = property(lambda self: self._brewer("rdbu_3_r"))
    rdbu_4 = property(lambda self: self._brewer("rdbu_4"))
    rdbu_4_r = property(lambda self: self._brewer("rdbu_4_r"))
    rdbu_5 = property(lambda self: self._brewer("rdbu_5"))
    rdbu_5_r = property(lambda self: self._brewer("rdbu_5_r"))
    rdbu_6 = property(lambda self: self._brewer("rdbu_6"))
    rdbu_6_r = property(lambda self: self._brewer("rdbu_6_r"))
    rdbu_7 = property(lambda self: self._brewer("rdbu_7"))
    rdbu_7_r = property(lambda self: self._brewer("rdbu_7_r"))
    rdbu_8 = property(lambda self: self._brewer("rdbu_8"))
    rdbu_8_r = property(lambda self: self._brewer("rdbu_8_r"))
    rdbu_9 = property(lambda self: self._brewer("rdbu_9"))
    rdbu_9_r = property(lambda self: self._brewer("rdbu_9_r"))
    rdgy = property(lambda self: self._brewer("rdgy"))
    rdgy_r = property(lambda self: self._brewer("rdgy_r"))
    rdgy_10 = property(lambda self: self._brewer("rdgy_10"))
    rdgy_10_r = property(lambda self: self._brewer("rdgy_10_r"))
    rdgy_11 = property(lambda self: self._brewer("rdgy_11"))
    rdgy_11_r = property(lambda self: self._brewer("rdgy_11_r"))
    rdgy_3 = property(lambda self: self._brewer("rdgy_3"))
    rdgy_3_r = property(lambda self: self._brewer("rdgy_3_r"))
    rdgy_4 = property(lambda self: self._brewer("rdgy_4"))
    rdgy_4_r = property(lambda self: self._brewer("rdgy_4_r"))
    rdgy_5 = property(lambda self: self._brewer("rdgy_5"))
    rdgy_5_r = property(lambda self: self._brewer("rdgy_5_r"))
    rdgy_6 = property(lambda self: self._brewer("rdgy_6"))
    rdgy_6_r = property(lambda self: self._brewer("rdgy_6_r"))
    rdgy_7 = property(lambda self: self._brewer("rdgy_7"))
    rdgy_7_r = property(lambda self: self._brewer("rdgy_7_r"))
    rdgy_8 = property(lambda self: self._brewer("rdgy_8"))
    rdgy_8_r = property(lambda self: self._brewer("rdgy_8_r"))
    rdgy_9 = property(lambda self: self._brewer("rdgy_9"))
    rdgy_9_r = property(lambda self: self._brewer("rdgy_9_r"))
    rdpu = property(lambda self: self._brewer("rdpu"))
    rdpu_r = property(lambda self: self._brewer("rdpu_r"))
    rdpu_3 = property(lambda self: self._brewer("rdpu_3"))
    rdpu_3_r = property(lambda self: self._brewer("rdpu_3_r"))
    rdpu_4 = property(lambda self: self._brewer("rdpu_4"))
    rdpu_4_r = property(lambda self: self._brewer("rdpu_4_r"))
    rdpu_5 = property(lambda self: self._brewer("rdpu_5"))
    rdpu_5_r = property(lambda self: self._brewer("rdpu_5_r"))
    rdpu_6 = property(lambda self: self._brewer("rdpu_6"))
    rdpu_6_r = property(lambda self: self._brewer("rdpu_6_r"))
    rdpu_7 = property(lambda self: self._brewer("rdpu_7"))
    rdpu_7_r = property(lambda self: self._brewer("rdpu_7_r"))
    rdpu_8 = property(lambda self: self._brewer("rdpu_8"))
    rdpu_8_r = property(lambda self: self._brewer("rdpu_8_r"))
    rdpu_9 = property(lambda self: self._brewer("rdpu_9"))
    rdpu_9_r = property(lambda self: self._brewer("rdpu_9_r"))
    rdylbu = property(lambda self: self._brewer("rdylbu"))
    rdylbu_r = property(lambda self: self._brewer("rdylbu_r"))
    rdylbu_10 = property(lambda self: self._brewer("rdylbu_10"))
    rdylbu_10_r = property(lambda self: self._brewer("rdylbu_10_r"))
    rdylbu_11 = property(lambda self: self._brewer("rdylbu_11"))
    rdylbu_11_r = property(lambda self: self._brewer("rdylbu_11_r"))
    rdylbu_3 = property(lambda self: self._brewer("rdylbu_3"))
    rdylbu_3_r = property(lambda self: self._brewer("rdylbu_3_r"))
    rdylbu_4 = property(lambda self: self._brewer("rdylbu_4"))
    rdylbu_4_r = property(lambda self: self._brewer("rdylbu_4_r"))
    rdylbu_5 = property(lambda self: self._brewer("rdylbu_5"))
    rdylbu_5_r = property(lambda self: self._brewer("rdylbu_5_r"))
    rdylbu_6 = property(lambda self: self._brewer("rdylbu_6"))
    rdylbu_6_r = property(lambda self: self._brewer("rdylbu_6_r"))
    rdylbu_7 = property(lambda self: self._brewer("rdylbu_7"))
    rdylbu_7_r = property(lambda self: self._brewer("rdylbu_7_r"))
    rdylbu_8 = property(lambda self: self._brewer("rdylbu_8"))
    rdylbu_8_r = property(lambda self: self._brewer("rdylbu_8_r"))
    rdylbu_9 = property(lambda self: self._brewer("rdylbu_9"))
    rdylbu_9_r = property(lambda self: self._brewer("rdylbu_9_r"))
    rdylgn = property(lambda self: self._brewer("rdylgn"))
    rdylgn_r = property(lambda self: self._brewer("rdylgn_r"))
    rdylgn_10 = property(lambda self: self._brewer("rdylgn_10"))
    rdylgn_10_r = property(lambda self: self._brewer("rdylgn_10_r"))
    rdylgn_11 = property(lambda self: self._brewer("rdylgn_11"))
    rdylgn_11_r = property(lambda self: self._brewer("rdylgn_11_r"))
    rdylgn_3 = property(lambda self: self._brewer("rdylgn_3"))
    rdylgn_3_r = property(lambda self: self._brewer("rdylgn_3_r"))
    rdylgn_4 = property(lambda self: self._brewer("rdylgn_4"))
    rdylgn_4_r = property(lambda self: self._brewer("rdylgn_4_r"))
    rdylgn_5 = property(lambda self: self._brewer("rdylgn_5"))
    rdylgn_5_r = property(lambda self: self._brewer("rdylgn_5_r"))
    rdylgn_6 = property(lambda self: self._brewer("rdylgn_6"))
    rdylgn_6_r = property(lambda self: self._brewer("rdylgn_6_r"))
    rdylgn_7 = property(lambda self: self._brewer("rdylgn_7"))
    rdylgn_7_r = property(lambda self: self._brewer("rdylgn_7_r"))
    rdylgn_8 = property(lambda self: self._brewer("rdylgn_8"))
    rdylgn_8_r = property(lambda self: self._brewer("rdylgn_8_r"))
    rdylgn_9 = property(lambda self: self._brewer("rdylgn_9"))
    rdylgn_9_r = property(lambda self: self._brewer("rdylgn_9_r"))
    reds = property(lambda self: self._brewer("reds"))
    reds_r = property(lambda self: self._brewer("reds_r"))
    reds_3 = property(lambda self: self._brewer("reds_3"))
    reds_3_r = property(lambda self: self._brewer("reds_3_r"))
    reds_4 = property(lambda self: self._brewer("reds_4"))
    reds_4_r = property(lambda self: self._brewer("reds_4_r"))
    reds_5 = property(lambda self: self._brewer("reds_5"))
    reds_5_r = property(lambda self: self._brewer("reds_5_r"))
    reds_6 = property(lambda self: self._brewer("reds_6"))
    reds_6_r = property(lambda self: self._brewer("reds_6_r"))
    reds_7 = property(lambda self: self._brewer("reds_7"))
    reds_7_r = property(lambda self: self._brewer("reds_7_r"))
    reds_8 = property(lambda self: self._brewer("reds_8"))
    reds_8_r = property(lambda self: self._brewer("reds_8_r"))
    reds_9 = property(lambda self: self._brewer("reds_9"))
    reds_9_r = property(lambda self: self._brewer("reds_9_r"))
    set1 = property(lambda self: self._brewer("set1"))
@property
def set1_r(self):
cname = "set1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_3(self):
cname = "set1_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_3_r(self):
cname = "set1_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_4(self):
cname = "set1_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_4_r(self):
cname = "set1_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_5(self):
cname = "set1_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_5_r(self):
cname = "set1_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_6(self):
cname = "set1_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_6_r(self):
cname = "set1_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_7(self):
cname = "set1_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_7_r(self):
cname = "set1_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_8(self):
cname = "set1_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_8_r(self):
cname = "set1_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_9(self):
cname = "set1_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set1_9_r(self):
cname = "set1_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set1_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2(self):
cname = "set2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_r(self):
cname = "set2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_3(self):
cname = "set2_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_3_r(self):
cname = "set2_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_4(self):
cname = "set2_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_4_r(self):
cname = "set2_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_5(self):
cname = "set2_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_5_r(self):
cname = "set2_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_6(self):
cname = "set2_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_6_r(self):
cname = "set2_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_7(self):
cname = "set2_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_7_r(self):
cname = "set2_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_8(self):
cname = "set2_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set2_8_r(self):
cname = "set2_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set2_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3(self):
cname = "set3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_r(self):
cname = "set3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_10(self):
cname = "set3_10"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_10.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_10_r(self):
cname = "set3_10_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_10.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_11(self):
cname = "set3_11"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_11.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_11_r(self):
cname = "set3_11_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_11.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_12(self):
cname = "set3_12"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_12.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_12_r(self):
cname = "set3_12_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_12.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_3(self):
cname = "set3_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_3_r(self):
cname = "set3_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_4(self):
cname = "set3_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_4_r(self):
cname = "set3_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_5(self):
cname = "set3_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_5_r(self):
cname = "set3_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_6(self):
cname = "set3_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_6_r(self):
cname = "set3_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_7(self):
cname = "set3_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_7_r(self):
cname = "set3_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_8(self):
cname = "set3_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_8_r(self):
cname = "set3_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_9(self):
cname = "set3_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def set3_9_r(self):
cname = "set3_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "set3_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral(self):
cname = "spectral"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_r(self):
cname = "spectral_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_10(self):
cname = "spectral_10"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_10.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_10_r(self):
cname = "spectral_10_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_10.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_11(self):
cname = "spectral_11"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_11.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_11_r(self):
cname = "spectral_11_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_11.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_3(self):
cname = "spectral_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_3_r(self):
cname = "spectral_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_4(self):
cname = "spectral_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_4_r(self):
cname = "spectral_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_5(self):
cname = "spectral_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_5_r(self):
cname = "spectral_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_6(self):
cname = "spectral_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_6_r(self):
cname = "spectral_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_7(self):
cname = "spectral_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_7_r(self):
cname = "spectral_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_8(self):
cname = "spectral_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_8_r(self):
cname = "spectral_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_9(self):
cname = "spectral_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spectral_9_r(self):
cname = "spectral_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "spectral_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn(self):
cname = "ylgn"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_r(self):
cname = "ylgn_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_3(self):
cname = "ylgn_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_3_r(self):
cname = "ylgn_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_4(self):
cname = "ylgn_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_4_r(self):
cname = "ylgn_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_5(self):
cname = "ylgn_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_5_r(self):
cname = "ylgn_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_6(self):
cname = "ylgn_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_6_r(self):
cname = "ylgn_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_7(self):
cname = "ylgn_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_7_r(self):
cname = "ylgn_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_8(self):
cname = "ylgn_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_8_r(self):
cname = "ylgn_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_9(self):
cname = "ylgn_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgn_9_r(self):
cname = "ylgn_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgn_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu(self):
cname = "ylgnbu"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_r(self):
cname = "ylgnbu_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_3(self):
cname = "ylgnbu_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_3_r(self):
cname = "ylgnbu_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_4(self):
cname = "ylgnbu_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_4_r(self):
cname = "ylgnbu_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_5(self):
cname = "ylgnbu_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_5_r(self):
cname = "ylgnbu_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_6(self):
cname = "ylgnbu_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_6_r(self):
cname = "ylgnbu_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_7(self):
cname = "ylgnbu_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_7_r(self):
cname = "ylgnbu_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_8(self):
cname = "ylgnbu_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_8_r(self):
cname = "ylgnbu_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_9(self):
cname = "ylgnbu_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylgnbu_9_r(self):
cname = "ylgnbu_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylgnbu_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr(self):
cname = "ylorbr"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_r(self):
cname = "ylorbr_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_3(self):
cname = "ylorbr_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_3_r(self):
cname = "ylorbr_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_4(self):
cname = "ylorbr_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_4_r(self):
cname = "ylorbr_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_5(self):
cname = "ylorbr_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_5_r(self):
cname = "ylorbr_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_6(self):
cname = "ylorbr_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_6_r(self):
cname = "ylorbr_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_7(self):
cname = "ylorbr_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_7_r(self):
cname = "ylorbr_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_8(self):
cname = "ylorbr_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_8_r(self):
cname = "ylorbr_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_9(self):
cname = "ylorbr_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorbr_9_r(self):
cname = "ylorbr_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorbr_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd(self):
cname = "ylorrd"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_r(self):
cname = "ylorrd_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_3(self):
cname = "ylorrd_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_3_r(self):
cname = "ylorrd_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_4(self):
cname = "ylorrd_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_4_r(self):
cname = "ylorrd_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_5(self):
cname = "ylorrd_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_5_r(self):
cname = "ylorrd_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_6(self):
cname = "ylorrd_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_6_r(self):
cname = "ylorrd_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_7(self):
cname = "ylorrd_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_7_r(self):
cname = "ylorrd_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_8(self):
cname = "ylorrd_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_8_r(self):
cname = "ylorrd_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_9(self):
cname = "ylorrd_9"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_9.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ylorrd_9_r(self):
cname = "ylorrd_9_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorbrewer", "ylorrd_9.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def classic_16(self):
cname = "classic_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "classic_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def classic_16_r(self):
cname = "classic_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "classic_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cubehelix1_16(self):
cname = "cubehelix1_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "cubehelix1_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cubehelix1_16_r(self):
cname = "cubehelix1_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "cubehelix1_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cubehelix2_16(self):
cname = "cubehelix2_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "cubehelix2_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cubehelix2_16_r(self):
cname = "cubehelix2_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "cubehelix2_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cubehelix3_16(self):
cname = "cubehelix3_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "cubehelix3_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cubehelix3_16_r(self):
cname = "cubehelix3_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "cubehelix3_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def jim_special_16(self):
cname = "jim_special_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "jim_special_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def jim_special_16_r(self):
cname = "jim_special_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "jim_special_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def perceptual_rainbow_16(self):
cname = "perceptual_rainbow_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "perceptual_rainbow_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def perceptual_rainbow_16_r(self):
cname = "perceptual_rainbow_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "perceptual_rainbow_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purple_16(self):
cname = "purple_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "purple_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purple_16_r(self):
cname = "purple_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "purple_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def red_16(self):
cname = "red_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "red_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def red_16_r(self):
cname = "red_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "cubehelix", "red_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BkBlAqGrYeOrReViWh200(self):
cname = "BkBlAqGrYeOrReViWh200"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BkBlAqGrYeOrReViWh200.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BkBlAqGrYeOrReViWh200_r(self):
cname = "BkBlAqGrYeOrReViWh200_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BkBlAqGrYeOrReViWh200.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlAqGrWh2YeOrReVi22(self):
cname = "BlAqGrWh2YeOrReVi22"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlAqGrWh2YeOrReVi22.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlAqGrWh2YeOrReVi22_r(self):
cname = "BlAqGrWh2YeOrReVi22_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlAqGrWh2YeOrReVi22.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlAqGrYeOrRe(self):
cname = "BlAqGrYeOrRe"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlAqGrYeOrRe.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlAqGrYeOrRe_r(self):
cname = "BlAqGrYeOrRe_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlAqGrYeOrRe.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlAqGrYeOrReVi200(self):
cname = "BlAqGrYeOrReVi200"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlAqGrYeOrReVi200.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlAqGrYeOrReVi200_r(self):
cname = "BlAqGrYeOrReVi200_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlAqGrYeOrReVi200.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlGrYeOrReVi200(self):
cname = "BlGrYeOrReVi200"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlGrYeOrReVi200.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlGrYeOrReVi200_r(self):
cname = "BlGrYeOrReVi200_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlGrYeOrReVi200.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlRe(self):
cname = "BlRe"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlRe.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlRe_r(self):
cname = "BlRe_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlRe.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlWhRe(self):
cname = "BlWhRe"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlWhRe.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlWhRe_r(self):
cname = "BlWhRe_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlWhRe.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueDarkOrange18(self):
cname = "BlueDarkOrange18"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueDarkOrange18.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueDarkOrange18_r(self):
cname = "BlueDarkOrange18_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueDarkOrange18.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueDarkRed18(self):
cname = "BlueDarkRed18"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueDarkRed18.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueDarkRed18_r(self):
cname = "BlueDarkRed18_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueDarkRed18.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueGreen14(self):
cname = "BlueGreen14"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueGreen14.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueGreen14_r(self):
cname = "BlueGreen14_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueGreen14.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueRed(self):
cname = "BlueRed"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueRed.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueRed_r(self):
cname = "BlueRed_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueRed.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueRedGray(self):
cname = "BlueRedGray"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueRedGray.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueRedGray_r(self):
cname = "BlueRedGray_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueRedGray.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueWhiteOrangeRed(self):
cname = "BlueWhiteOrangeRed"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueWhiteOrangeRed.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueWhiteOrangeRed_r(self):
cname = "BlueWhiteOrangeRed_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueWhiteOrangeRed.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueYellowRed(self):
cname = "BlueYellowRed"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueYellowRed.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BlueYellowRed_r(self):
cname = "BlueYellowRed_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BlueYellowRed.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BrownBlue12(self):
cname = "BrownBlue12"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BrownBlue12.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def BrownBlue12_r(self):
cname = "BrownBlue12_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "BrownBlue12.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def Cat12(self):
cname = "Cat12"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "Cat12.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def Cat12_r(self):
cname = "Cat12_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "Cat12.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def GHRSST_anomaly(self):
cname = "GHRSST_anomaly"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GHRSST_anomaly.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def GHRSST_anomaly_r(self):
cname = "GHRSST_anomaly_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GHRSST_anomaly.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def GMT_cool(self):
cname = "GMT_cool"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_cool.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def GMT_cool_r(self):
cname = "GMT_cool_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_cool.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def GMT_copper(self):
cname = "GMT_copper"
if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_copper.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_copper_r(self):
        cname = "GMT_copper_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_copper.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_drywet(self):
        cname = "GMT_drywet"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_drywet.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_drywet_r(self):
        cname = "GMT_drywet_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_drywet.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_gebco(self):
        cname = "GMT_gebco"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_gebco.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_gebco_r(self):
        cname = "GMT_gebco_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_gebco.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_globe(self):
        cname = "GMT_globe"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_globe.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_globe_r(self):
        cname = "GMT_globe_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_globe.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_gray(self):
        cname = "GMT_gray"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_gray.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_gray_r(self):
        cname = "GMT_gray_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_gray.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_haxby(self):
        cname = "GMT_haxby"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_haxby.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_haxby_r(self):
        cname = "GMT_haxby_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_haxby.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_hot(self):
        cname = "GMT_hot"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_hot.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_hot_r(self):
        cname = "GMT_hot_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_hot.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_jet(self):
        cname = "GMT_jet"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_jet.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_jet_r(self):
        cname = "GMT_jet_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_jet.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_nighttime(self):
        cname = "GMT_nighttime"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_nighttime.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_nighttime_r(self):
        cname = "GMT_nighttime_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_nighttime.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_no_green(self):
        cname = "GMT_no_green"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_no_green.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_no_green_r(self):
        cname = "GMT_no_green_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_no_green.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_ocean(self):
        cname = "GMT_ocean"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_ocean.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_ocean_r(self):
        cname = "GMT_ocean_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_ocean.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_paired(self):
        cname = "GMT_paired"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_paired.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_paired_r(self):
        cname = "GMT_paired_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_paired.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_panoply(self):
        cname = "GMT_panoply"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_panoply.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_panoply_r(self):
        cname = "GMT_panoply_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_panoply.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_polar(self):
        cname = "GMT_polar"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_polar.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_polar_r(self):
        cname = "GMT_polar_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_polar.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_red2green(self):
        cname = "GMT_red2green"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_red2green.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_red2green_r(self):
        cname = "GMT_red2green_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_red2green.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_relief(self):
        cname = "GMT_relief"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_relief.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_relief_r(self):
        cname = "GMT_relief_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_relief.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_relief_oceanonly(self):
        cname = "GMT_relief_oceanonly"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_relief_oceanonly.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_relief_oceanonly_r(self):
        cname = "GMT_relief_oceanonly_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_relief_oceanonly.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_seis(self):
        cname = "GMT_seis"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_seis.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_seis_r(self):
        cname = "GMT_seis_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_seis.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_split(self):
        cname = "GMT_split"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_split.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_split_r(self):
        cname = "GMT_split_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_split.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_topo(self):
        cname = "GMT_topo"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_topo.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_topo_r(self):
        cname = "GMT_topo_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_topo.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_wysiwyg(self):
        cname = "GMT_wysiwyg"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_wysiwyg.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_wysiwyg_r(self):
        cname = "GMT_wysiwyg_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_wysiwyg.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_wysiwygcont(self):
        cname = "GMT_wysiwygcont"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_wysiwygcont.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GMT_wysiwygcont_r(self):
        cname = "GMT_wysiwygcont_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GMT_wysiwygcont.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GSFC_landsat_udf_density(self):
        cname = "GSFC_landsat_udf_density"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GSFC_landsat_udf_density.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GSFC_landsat_udf_density_r(self):
        cname = "GSFC_landsat_udf_density_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GSFC_landsat_udf_density.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GrayWhiteGray(self):
        cname = "GrayWhiteGray"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GrayWhiteGray.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GrayWhiteGray_r(self):
        cname = "GrayWhiteGray_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GrayWhiteGray.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GreenMagenta16(self):
        cname = "GreenMagenta16"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GreenMagenta16.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GreenMagenta16_r(self):
        cname = "GreenMagenta16_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GreenMagenta16.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GreenYellow(self):
        cname = "GreenYellow"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GreenYellow.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def GreenYellow_r(self):
        cname = "GreenYellow_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "GreenYellow.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_banded(self):
        cname = "NCV_banded"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_banded.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_banded_r(self):
        cname = "NCV_banded_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_banded.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_blu_red(self):
        cname = "NCV_blu_red"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_blu_red.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_blu_red_r(self):
        cname = "NCV_blu_red_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_blu_red.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_blue_red(self):
        cname = "NCV_blue_red"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_blue_red.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_blue_red_r(self):
        cname = "NCV_blue_red_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_blue_red.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_bright(self):
        cname = "NCV_bright"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_bright.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_bright_r(self):
        cname = "NCV_bright_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_bright.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_gebco(self):
        cname = "NCV_gebco"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_gebco.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_gebco_r(self):
        cname = "NCV_gebco_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_gebco.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_jaisnd(self):
        cname = "NCV_jaisnd"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_jaisnd.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_jaisnd_r(self):
        cname = "NCV_jaisnd_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_jaisnd.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_jet(self):
        cname = "NCV_jet"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_jet.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_jet_r(self):
        cname = "NCV_jet_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_jet.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_manga(self):
        cname = "NCV_manga"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_manga.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_manga_r(self):
        cname = "NCV_manga_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_manga.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_rainbow2(self):
        cname = "NCV_rainbow2"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_rainbow2.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_rainbow2_r(self):
        cname = "NCV_rainbow2_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_rainbow2.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_roullet(self):
        cname = "NCV_roullet"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_roullet.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NCV_roullet_r(self):
        cname = "NCV_roullet_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NCV_roullet.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_div_vegetation_a(self):
        cname = "NEO_div_vegetation_a"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_div_vegetation_a.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_div_vegetation_a_r(self):
        cname = "NEO_div_vegetation_a_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_div_vegetation_a.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_div_vegetation_b(self):
        cname = "NEO_div_vegetation_b"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_div_vegetation_b.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_div_vegetation_b_r(self):
        cname = "NEO_div_vegetation_b_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_div_vegetation_b.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_div_vegetation_c(self):
        cname = "NEO_div_vegetation_c"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_div_vegetation_c.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_div_vegetation_c_r(self):
        cname = "NEO_div_vegetation_c_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_div_vegetation_c.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_modis_ndvi(self):
        cname = "NEO_modis_ndvi"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_modis_ndvi.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NEO_modis_ndvi_r(self):
        cname = "NEO_modis_ndvi_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NEO_modis_ndvi.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NMCRef(self):
        cname = "NMCRef"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NMCRef.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NMCRef_r(self):
        cname = "NMCRef_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NMCRef.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NMCVel(self):
        cname = "NMCVel"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NMCVel.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NMCVel_r(self):
        cname = "NMCVel_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NMCVel.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NOC_ndvi(self):
        cname = "NOC_ndvi"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NOC_ndvi.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def NOC_ndvi_r(self):
        cname = "NOC_ndvi_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "NOC_ndvi.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def OceanLakeLandSnow(self):
        cname = "OceanLakeLandSnow"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "OceanLakeLandSnow.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def OceanLakeLandSnow_r(self):
        cname = "OceanLakeLandSnow_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "OceanLakeLandSnow.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_Gallet13(self):
        cname = "SVG_Gallet13"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_Gallet13.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_Gallet13_r(self):
        cname = "SVG_Gallet13_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_Gallet13.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_Lindaa06(self):
        cname = "SVG_Lindaa06"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_Lindaa06.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_Lindaa06_r(self):
        cname = "SVG_Lindaa06_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_Lindaa06.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_Lindaa07(self):
        cname = "SVG_Lindaa07"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_Lindaa07.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_Lindaa07_r(self):
        cname = "SVG_Lindaa07_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_Lindaa07.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_bhw3_22(self):
        cname = "SVG_bhw3_22"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_bhw3_22.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_bhw3_22_r(self):
        cname = "SVG_bhw3_22_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_bhw3_22.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_es_landscape_79(self):
        cname = "SVG_es_landscape_79"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_es_landscape_79.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_es_landscape_79_r(self):
        cname = "SVG_es_landscape_79_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_es_landscape_79.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_feb_sunrise(self):
        cname = "SVG_feb_sunrise"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_feb_sunrise.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_feb_sunrise_r(self):
        cname = "SVG_feb_sunrise_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_feb_sunrise.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_foggy_sunrise(self):
        cname = "SVG_foggy_sunrise"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_foggy_sunrise.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_foggy_sunrise_r(self):
        cname = "SVG_foggy_sunrise_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_foggy_sunrise.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_fs2006(self):
        cname = "SVG_fs2006"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_fs2006.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def SVG_fs2006_r(self):
        cname = "SVG_fs2006_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "SVG_fs2006.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def StepSeq25(self):
        cname = "StepSeq25"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "StepSeq25.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def StepSeq25_r(self):
        cname = "StepSeq25_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "StepSeq25.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def UKM_hadcrut(self):
        cname = "UKM_hadcrut"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "UKM_hadcrut.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def UKM_hadcrut_r(self):
        cname = "UKM_hadcrut_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "UKM_hadcrut.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def ViBlGrWhYeOrRe(self):
        cname = "ViBlGrWhYeOrRe"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "ViBlGrWhYeOrRe.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def ViBlGrWhYeOrRe_r(self):
        cname = "ViBlGrWhYeOrRe_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "ViBlGrWhYeOrRe.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def WhBlGrYeRe(self):
        cname = "WhBlGrYeRe"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhBlGrYeRe.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhBlGrYeRe_r(self):
cname = "WhBlGrYeRe_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhBlGrYeRe.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhBlReWh(self):
cname = "WhBlReWh"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhBlReWh.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhBlReWh_r(self):
cname = "WhBlReWh_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhBlReWh.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhViBlGrYeOrRe(self):
cname = "WhViBlGrYeOrRe"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhViBlGrYeOrRe.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhViBlGrYeOrRe_r(self):
cname = "WhViBlGrYeOrRe_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhViBlGrYeOrRe.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhViBlGrYeOrReWh(self):
cname = "WhViBlGrYeOrReWh"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhViBlGrYeOrReWh.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhViBlGrYeOrReWh_r(self):
cname = "WhViBlGrYeOrReWh_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhViBlGrYeOrReWh.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteBlue(self):
cname = "WhiteBlue"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteBlue.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteBlue_r(self):
cname = "WhiteBlue_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteBlue.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteBlueGreenYellowRed(self):
cname = "WhiteBlueGreenYellowRed"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteBlueGreenYellowRed.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteBlueGreenYellowRed_r(self):
cname = "WhiteBlueGreenYellowRed_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteBlueGreenYellowRed.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteGreen(self):
cname = "WhiteGreen"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteGreen.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteGreen_r(self):
cname = "WhiteGreen_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteGreen.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteYellowOrangeRed(self):
cname = "WhiteYellowOrangeRed"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteYellowOrangeRed.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def WhiteYellowOrangeRed_r(self):
cname = "WhiteYellowOrangeRed_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "WhiteYellowOrangeRed.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amwg(self):
cname = "amwg"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "amwg.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amwg_r(self):
cname = "amwg_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "amwg.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amwg256(self):
cname = "amwg256"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "amwg256.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amwg256_r(self):
cname = "amwg256_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "amwg256.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amwg_blueyellowred(self):
cname = "amwg_blueyellowred"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "amwg_blueyellowred.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def amwg_blueyellowred_r(self):
cname = "amwg_blueyellowred_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "amwg_blueyellowred.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cb_9step(self):
cname = "cb_9step"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cb_9step.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cb_9step_r(self):
cname = "cb_9step_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cb_9step.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cb_rainbow(self):
cname = "cb_rainbow"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cb_rainbow.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cb_rainbow_r(self):
cname = "cb_rainbow_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cb_rainbow.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cb_rainbow_inv(self):
cname = "cb_rainbow_inv"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cb_rainbow_inv.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cb_rainbow_inv_r(self):
cname = "cb_rainbow_inv_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cb_rainbow_inv.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def circular_0(self):
cname = "circular_0"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "circular_0.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def circular_0_r(self):
cname = "circular_0_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "circular_0.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def circular_1(self):
cname = "circular_1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "circular_1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def circular_1_r(self):
cname = "circular_1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "circular_1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def circular_2(self):
cname = "circular_2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "circular_2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def circular_2_r(self):
cname = "circular_2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "circular_2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cividis(self):
cname = "cividis"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cividis.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cividis_r(self):
cname = "cividis_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cividis.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cmp_b2r(self):
cname = "cmp_b2r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cmp_b2r.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cmp_b2r_r(self):
cname = "cmp_b2r_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cmp_b2r.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cmp_flux(self):
cname = "cmp_flux"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cmp_flux.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cmp_flux_r(self):
cname = "cmp_flux_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cmp_flux.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cmp_haxby(self):
cname = "cmp_haxby"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cmp_haxby.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cmp_haxby_r(self):
cname = "cmp_haxby_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cmp_haxby.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cosam(self):
cname = "cosam"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cosam.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cosam_r(self):
cname = "cosam_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cosam.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cosam12(self):
cname = "cosam12"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cosam12.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cosam12_r(self):
cname = "cosam12_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cosam12.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cyclic(self):
cname = "cyclic"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cyclic.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cyclic_r(self):
cname = "cyclic_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "cyclic.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def default(self):
cname = "default"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "default.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def default_r(self):
cname = "default_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "default.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def detail(self):
cname = "detail"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "detail.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def detail_r(self):
cname = "detail_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "detail.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def drought_severity(self):
cname = "drought_severity"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "drought_severity.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def drought_severity_r(self):
cname = "drought_severity_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "drought_severity.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def example(self):
cname = "example"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "example.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def example_r(self):
cname = "example_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "example.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def extrema(self):
cname = "extrema"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "extrema.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def extrema_r(self):
cname = "extrema_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "extrema.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gauss3(self):
cname = "gauss3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gauss3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gauss3_r(self):
cname = "gauss3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gauss3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def grads_default(self):
cname = "grads_default"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "grads_default.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def grads_default_r(self):
cname = "grads_default_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "grads_default.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def grads_rainbow(self):
cname = "grads_rainbow"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "grads_rainbow.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def grads_rainbow_r(self):
cname = "grads_rainbow_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "grads_rainbow.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gscyclic(self):
cname = "gscyclic"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gscyclic.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gscyclic_r(self):
cname = "gscyclic_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gscyclic.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gsdtol(self):
cname = "gsdtol"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gsdtol.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gsdtol_r(self):
cname = "gsdtol_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gsdtol.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gsltod(self):
cname = "gsltod"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gsltod.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gsltod_r(self):
cname = "gsltod_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gsltod.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gui_default(self):
cname = "gui_default"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gui_default.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def gui_default_r(self):
cname = "gui_default_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "gui_default.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def helix(self):
cname = "helix"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "helix.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def helix_r(self):
cname = "helix_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "helix.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def helix1(self):
cname = "helix1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "helix1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def helix1_r(self):
cname = "helix1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "helix1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hlu_default(self):
cname = "hlu_default"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hlu_default.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hlu_default_r(self):
cname = "hlu_default_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hlu_default.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hotcold_18lev(self):
cname = "hotcold_18lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hotcold_18lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hotcold_18lev_r(self):
cname = "hotcold_18lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hotcold_18lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hotcolr_19lev(self):
cname = "hotcolr_19lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hotcolr_19lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hotcolr_19lev_r(self):
cname = "hotcolr_19lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hotcolr_19lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hotres(self):
cname = "hotres"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hotres.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hotres_r(self):
cname = "hotres_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "hotres.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def lithology(self):
cname = "lithology"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "lithology.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def lithology_r(self):
cname = "lithology_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "lithology.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_hot(self):
cname = "matlab_hot"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_hot.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_hot_r(self):
cname = "matlab_hot_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_hot.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_hsv(self):
cname = "matlab_hsv"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_hsv.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_hsv_r(self):
cname = "matlab_hsv_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_hsv.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_jet(self):
cname = "matlab_jet"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_jet.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_jet_r(self):
cname = "matlab_jet_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_jet.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_lines(self):
cname = "matlab_lines"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_lines.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def matlab_lines_r(self):
cname = "matlab_lines_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "matlab_lines.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def mch_default(self):
cname = "mch_default"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "mch_default.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def mch_default_r(self):
cname = "mch_default_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "mch_default.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ncl_default(self):
cname = "ncl_default"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "ncl_default.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ncl_default_r(self):
cname = "ncl_default_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "ncl_default.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ncview_default(self):
cname = "ncview_default"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "ncview_default.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def ncview_default_r(self):
cname = "ncview_default_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "ncview_default.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def nice_gfdl(self):
cname = "nice_gfdl"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "nice_gfdl.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def nice_gfdl_r(self):
cname = "nice_gfdl_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "nice_gfdl.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def nrl_sirkes(self):
cname = "nrl_sirkes"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "nrl_sirkes.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def nrl_sirkes_r(self):
cname = "nrl_sirkes_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "nrl_sirkes.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def nrl_sirkes_nowhite(self):
cname = "nrl_sirkes_nowhite"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "nrl_sirkes_nowhite.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def nrl_sirkes_nowhite_r(self):
cname = "nrl_sirkes_nowhite_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "nrl_sirkes_nowhite.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def perc2_9lev(self):
cname = "perc2_9lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "perc2_9lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def perc2_9lev_r(self):
cname = "perc2_9lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "perc2_9lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def percent_11lev(self):
cname = "percent_11lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "percent_11lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def percent_11lev_r(self):
cname = "percent_11lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "percent_11lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def posneg_1(self):
cname = "posneg_1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "posneg_1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def posneg_1_r(self):
cname = "posneg_1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "posneg_1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def posneg_2(self):
cname = "posneg_2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "posneg_2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def posneg_2_r(self):
cname = "posneg_2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "posneg_2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prcp_1(self):
cname = "prcp_1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "prcp_1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prcp_1_r(self):
cname = "prcp_1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "prcp_1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prcp_2(self):
cname = "prcp_2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "prcp_2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prcp_2_r(self):
cname = "prcp_2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "prcp_2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prcp_3(self):
cname = "prcp_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "prcp_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def prcp_3_r(self):
cname = "prcp_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "prcp_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip2_15lev(self):
cname = "precip2_15lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip2_15lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip2_15lev_r(self):
cname = "precip2_15lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip2_15lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip2_17lev(self):
cname = "precip2_17lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip2_17lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip2_17lev_r(self):
cname = "precip2_17lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip2_17lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip3_16lev(self):
cname = "precip3_16lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip3_16lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip3_16lev_r(self):
cname = "precip3_16lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip3_16lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip4_11lev(self):
cname = "precip4_11lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip4_11lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip4_11lev_r(self):
cname = "precip4_11lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip4_11lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip4_diff_19lev(self):
cname = "precip4_diff_19lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip4_diff_19lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip4_diff_19lev_r(self):
cname = "precip4_diff_19lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip4_diff_19lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip_11lev(self):
cname = "precip_11lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip_11lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip_11lev_r(self):
cname = "precip_11lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip_11lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip_diff_12lev(self):
cname = "precip_diff_12lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip_diff_12lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip_diff_12lev_r(self):
cname = "precip_diff_12lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip_diff_12lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip_diff_1lev(self):
cname = "precip_diff_1lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip_diff_1lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def precip_diff_1lev_r(self):
cname = "precip_diff_1lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "precip_diff_1lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def psgcap(self):
cname = "psgcap"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "psgcap.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def psgcap_r(self):
cname = "psgcap_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "psgcap.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def radar(self):
cname = "radar"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "radar.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def radar_r(self):
cname = "radar_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "radar.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def radar_1(self):
cname = "radar_1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "radar_1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def radar_1_r(self):
cname = "radar_1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "radar_1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow(self):
cname = "rainbow"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow_r(self):
cname = "rainbow_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow_gray(self):
cname = "rainbow_gray"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow_gray.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow_gray_r(self):
cname = "rainbow_gray_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow_gray.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow_white(self):
cname = "rainbow_white"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow_white.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow_white_r(self):
cname = "rainbow_white_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow_white.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow_white_gray(self):
cname = "rainbow_white_gray"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow_white_gray.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rainbow_white_gray_r(self):
cname = "rainbow_white_gray_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rainbow_white_gray.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rh_19lev(self):
cname = "rh_19lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rh_19lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rh_19lev_r(self):
cname = "rh_19lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "rh_19lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def saw3(self):
cname = "saw3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "saw3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def saw3_r(self):
cname = "saw3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "saw3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def seaice_1(self):
cname = "seaice_1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "seaice_1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def seaice_1_r(self):
cname = "seaice_1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "seaice_1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def seaice_2(self):
cname = "seaice_2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "seaice_2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def seaice_2_r(self):
cname = "seaice_2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "seaice_2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def so4_21(self):
cname = "so4_21"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "so4_21.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def so4_21_r(self):
cname = "so4_21_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "so4_21.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def so4_23(self):
cname = "so4_23"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "so4_23.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def so4_23_r(self):
cname = "so4_23_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "so4_23.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spread_15lev(self):
cname = "spread_15lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "spread_15lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def spread_15lev_r(self):
cname = "spread_15lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "spread_15lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def srip_reanalysis(self):
cname = "srip_reanalysis"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "srip_reanalysis.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def srip_reanalysis_r(self):
cname = "srip_reanalysis_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "srip_reanalysis.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def sunshine_9lev(self):
cname = "sunshine_9lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "sunshine_9lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def sunshine_9lev_r(self):
cname = "sunshine_9lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "sunshine_9lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def sunshine_diff_12lev(self):
cname = "sunshine_diff_12lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "sunshine_diff_12lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def sunshine_diff_12lev_r(self):
cname = "sunshine_diff_12lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "sunshine_diff_12lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def t2m_29lev(self):
cname = "t2m_29lev"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "t2m_29lev.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def t2m_29lev_r(self):
cname = "t2m_29lev_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "t2m_29lev.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbrAvg1(self):
cname = "tbrAvg1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbrAvg1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbrAvg1_r(self):
cname = "tbrAvg1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbrAvg1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbrStd1(self):
cname = "tbrStd1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbrStd1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbrStd1_r(self):
cname = "tbrStd1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbrStd1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbrVar1(self):
cname = "tbrVar1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbrVar1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbrVar1_r(self):
cname = "tbrVar1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbrVar1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbr_240_300(self):
cname = "tbr_240_300"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbr_240_300.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbr_240_300_r(self):
cname = "tbr_240_300_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbr_240_300.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbr_stdev_0_30(self):
cname = "tbr_stdev_0_30"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbr_stdev_0_30.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbr_stdev_0_30_r(self):
cname = "tbr_stdev_0_30_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbr_stdev_0_30.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbr_var_0_500(self):
cname = "tbr_var_0_500"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbr_var_0_500.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tbr_var_0_500_r(self):
cname = "tbr_var_0_500_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "tbr_var_0_500.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def temp1(self):
cname = "temp1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def temp1_r(self):
cname = "temp1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp1.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def temp_19lev(self):
        cname = "temp_19lev"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp_19lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def temp_19lev_r(self):
        cname = "temp_19lev_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp_19lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def temp_diff_18lev(self):
        cname = "temp_diff_18lev"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp_diff_18lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def temp_diff_18lev_r(self):
        cname = "temp_diff_18lev_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp_diff_18lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def temp_diff_1lev(self):
        cname = "temp_diff_1lev"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp_diff_1lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def temp_diff_1lev_r(self):
        cname = "temp_diff_1lev_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "temp_diff_1lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def testcmap(self):
        cname = "testcmap"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "testcmap.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def testcmap_r(self):
        cname = "testcmap_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "testcmap.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def thelix(self):
        cname = "thelix"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "thelix.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def thelix_r(self):
        cname = "thelix_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "thelix.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def topo_15lev(self):
        cname = "topo_15lev"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "topo_15lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def topo_15lev_r(self):
        cname = "topo_15lev_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "topo_15lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def uniform(self):
        cname = "uniform"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "uniform.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def uniform_r(self):
        cname = "uniform_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "uniform.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def vegetation_ClarkU(self):
        cname = "vegetation_ClarkU"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "vegetation_ClarkU.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def vegetation_ClarkU_r(self):
        cname = "vegetation_ClarkU_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "vegetation_ClarkU.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def vegetation_modis(self):
        cname = "vegetation_modis"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "vegetation_modis.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def vegetation_modis_r(self):
        cname = "vegetation_modis_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "vegetation_modis.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wgne15(self):
        cname = "wgne15"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wgne15.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wgne15_r(self):
        cname = "wgne15_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wgne15.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wh_bl_gr_ye_re(self):
        cname = "wh_bl_gr_ye_re"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wh_bl_gr_ye_re.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wh_bl_gr_ye_re_r(self):
        cname = "wh_bl_gr_ye_re_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wh_bl_gr_ye_re.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wind_17lev(self):
        cname = "wind_17lev"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wind_17lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wind_17lev_r(self):
        cname = "wind_17lev_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wind_17lev.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wxpEnIR(self):
        cname = "wxpEnIR"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wxpEnIR.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def wxpEnIR_r(self):
        cname = "wxpEnIR_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "ncar_ncl", "wxpEnIR.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def acton(self):
        cname = "acton"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "acton.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def acton_r(self):
        cname = "acton_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "acton.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bamako(self):
        cname = "bamako"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "bamako.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bamako_r(self):
        cname = "bamako_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "bamako.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def batlow(self):
        cname = "batlow"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "batlow.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def batlow_r(self):
        cname = "batlow_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "batlow.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def berlin(self):
        cname = "berlin"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "berlin.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def berlin_r(self):
        cname = "berlin_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "berlin.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bilbao(self):
        cname = "bilbao"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "bilbao.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bilbao_r(self):
        cname = "bilbao_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "bilbao.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def broc(self):
        cname = "broc"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "broc.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def broc_r(self):
        cname = "broc_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "broc.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def buda(self):
        cname = "buda"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "buda.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def buda_r(self):
        cname = "buda_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "buda.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def cork(self):
        cname = "cork"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "cork.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def cork_r(self):
        cname = "cork_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "cork.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def davos(self):
        cname = "davos"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "davos.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def davos_r(self):
        cname = "davos_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "davos.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def devon(self):
        cname = "devon"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "devon.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def devon_r(self):
        cname = "devon_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "devon.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def grayc(self):
        cname = "grayc"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "grayc.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def grayc_r(self):
        cname = "grayc_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "grayc.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def hawaii(self):
        cname = "hawaii"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "hawaii.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def hawaii_r(self):
        cname = "hawaii_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "hawaii.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def imola(self):
        cname = "imola"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "imola.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def imola_r(self):
        cname = "imola_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "imola.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def lajolla(self):
        cname = "lajolla"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "lajolla.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def lajolla_r(self):
        cname = "lajolla_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "lajolla.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def lapaz(self):
        cname = "lapaz"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "lapaz.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def lapaz_r(self):
        cname = "lapaz_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "lapaz.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def lisbon(self):
        cname = "lisbon"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "lisbon.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def lisbon_r(self):
        cname = "lisbon_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "lisbon.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def nuuk(self):
        cname = "nuuk"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "nuuk.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def nuuk_r(self):
        cname = "nuuk_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "nuuk.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def oleron(self):
        cname = "oleron"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "oleron.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def oleron_r(self):
        cname = "oleron_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "oleron.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def oslo(self):
        cname = "oslo"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "oslo.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def oslo_r(self):
        cname = "oslo_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "oslo.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def roma(self):
        cname = "roma"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "roma.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def roma_r(self):
        cname = "roma_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "roma.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tofino(self):
        cname = "tofino"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "tofino.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tofino_r(self):
        cname = "tofino_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "tofino.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tokyo(self):
        cname = "tokyo"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "tokyo.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tokyo_r(self):
        cname = "tokyo_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "tokyo.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def turku(self):
        cname = "turku"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "turku.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def turku_r(self):
        cname = "turku_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "turku.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def vik(self):
        cname = "vik"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "vik.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def vik_r(self):
        cname = "vik_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "scientific", "vik.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bluered_12(self):
        cname = "bluered_12"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "bluered_12.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bluered_12_r(self):
        cname = "bluered_12_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "bluered_12.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bluered_6(self):
        cname = "bluered_6"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "bluered_6.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bluered_6_r(self):
        cname = "bluered_6_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "bluered_6.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def colorblind_10(self):
        cname = "colorblind_10"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "colorblind_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def colorblind_10_r(self):
        cname = "colorblind_10_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "colorblind_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def gray_5(self):
        cname = "gray_5"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "gray_5.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def gray_5_r(self):
        cname = "gray_5_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "gray_5.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def greenorange_12(self):
        cname = "greenorange_12"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "greenorange_12.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def greenorange_12_r(self):
        cname = "greenorange_12_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "greenorange_12.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def greenorange_6(self):
        cname = "greenorange_6"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "greenorange_6.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def greenorange_6_r(self):
        cname = "greenorange_6_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "greenorange_6.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purplegray_12(self):
        cname = "purplegray_12"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "purplegray_12.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purplegray_12_r(self):
        cname = "purplegray_12_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "purplegray_12.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purplegray_6(self):
        cname = "purplegray_6"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "purplegray_6.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def purplegray_6_r(self):
        cname = "purplegray_6_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "purplegray_6.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableau_10(self):
        cname = "tableau_10"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableau_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableau_10_r(self):
        cname = "tableau_10_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableau_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableau_20(self):
        cname = "tableau_20"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableau_20.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableau_20_r(self):
        cname = "tableau_20_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableau_20.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableaulight_10(self):
        cname = "tableaulight_10"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableaulight_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableaulight_10_r(self):
        cname = "tableaulight_10_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableaulight_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableaumedium_10(self):
        cname = "tableaumedium_10"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableaumedium_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def tableaumedium_10_r(self):
        cname = "tableaumedium_10_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "tableaumedium_10.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def trafficlight_9(self):
        cname = "trafficlight_9"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "trafficlight_9.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def trafficlight_9_r(self):
        cname = "trafficlight_9_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "tableau", "trafficlight_9.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def b2_31(self):
        cname = "b2_31"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "b2_31.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def b2_31_r(self):
        cname = "b2_31_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "b2_31.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bl_11(self):
        cname = "bl_11"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "bl_11.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def bl_11_r(self):
        cname = "bl_11_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "bl_11.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_8_5g2(self):
        cname = "blue_8_5g2"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_8_5g2.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_8_5g2_r(self):
        cname = "blue_8_5g2_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_8_5g2.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_bl111(self):
        cname = "blue_bl111"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_bl111.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_bl111_r(self):
        cname = "blue_bl111_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_bl111.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_blgra2(self):
        cname = "blue_blgra2"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_blgra2.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_blgra2_r(self):
        cname = "blue_blgra2_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_blgra2.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_blue3(self):
        cname = "blue_blue3"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_blue3.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_blue3_r(self):
        cname = "blue_blue3_r"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_blue3.rgb")
        cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
        return cmap

    @property
    def blue_blueb1(self):
        cname = "blue_blueb1"
        if cname in matplotlib.cm._cmap_registry:
            return matplotlib.cm.get_cmap(cname)
        cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_blueb1.rgb")
        cmap = Colormap(self._coltbl(cmap_file), name=cname)
        matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_blueb1_r(self):
cname = "blue_blueb1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_blueb1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_c16adjw(self):
cname = "blue_c16adjw"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_c16adjw.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_c16adjw_r(self):
cname = "blue_c16adjw_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_c16adjw.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_gblue(self):
cname = "blue_gblue"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_gblue.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_gblue_r(self):
cname = "blue_gblue_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_gblue.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_lbluec1(self):
cname = "blue_lbluec1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_lbluec1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_lbluec1_r(self):
cname = "blue_lbluec1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_lbluec1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_medb717b(self):
cname = "blue_medb717b"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_medb717b.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def blue_medb717b_r(self):
cname = "blue_medb717b_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "blue_medb717b.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def br4div(self):
cname = "br4div"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "br4div.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def br4div_r(self):
cname = "br4div_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "br4div.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_br119a(self):
cname = "brown_br119a"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_br119a.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_br119a_r(self):
cname = "brown_br119a_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_br119a.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_browns(self):
cname = "brown_browns"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_browns.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_browns_r(self):
cname = "brown_browns_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_browns.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_brye1b(self):
cname = "brown_brye1b"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_brye1b.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_brye1b_r(self):
cname = "brown_brye1b_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_brye1b.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_grawarm1(self):
cname = "brown_grawarm1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_grawarm1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_grawarm1_r(self):
cname = "brown_grawarm1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_grawarm1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_orange1(self):
cname = "brown_orange1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_orange1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_orange1_r(self):
cname = "brown_orange1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_orange1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_ortanish1(self):
cname = "brown_ortanish1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_ortanish1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_ortanish1_r(self):
cname = "brown_ortanish1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_ortanish1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_peachy(self):
cname = "brown_peachy"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_peachy.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_peachy_r(self):
cname = "brown_peachy_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_peachy.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_red3b(self):
cname = "brown_red3b"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_red3b.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_red3b_r(self):
cname = "brown_red3b_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_red3b.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_sable(self):
cname = "brown_sable"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_sable.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_sable_r(self):
cname = "brown_sable_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_sable.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_vbrown1(self):
cname = "brown_vbrown1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_vbrown1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def brown_vbrown1_r(self):
cname = "brown_vbrown1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "brown_vbrown1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def browngray(self):
cname = "browngray"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "browngray.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def browngray_r(self):
cname = "browngray_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "browngray.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bruce2(self):
cname = "bruce2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "bruce2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def bruce2_r(self):
cname = "bruce2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "bruce2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_7_16(self):
cname = "c_7_16"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_7_16.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_7_16_r(self):
cname = "c_7_16_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_7_16.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_blgr1(self):
cname = "c_blgr1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_blgr1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_blgr1_r(self):
cname = "c_blgr1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_blgr1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_pch1(self):
cname = "c_pch1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_pch1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_pch1_r(self):
cname = "c_pch1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_pch1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_violet1(self):
cname = "c_violet1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_violet1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_violet1_r(self):
cname = "c_violet1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_violet1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_yelpch1(self):
cname = "c_yelpch1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_yelpch1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def c_yelpch1_r(self):
cname = "c_yelpch1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "c_yelpch1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def colormap66(self):
cname = "colormap66"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "colormap66.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def colormap66_r(self):
cname = "colormap66_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "colormap66.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def d_blgr3(self):
cname = "d_blgr3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "d_blgr3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def d_blgr3_r(self):
cname = "d_blgr3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "d_blgr3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def d_seteq2(self):
cname = "d_seteq2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "d_seteq2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def d_seteq2_r(self):
cname = "d_seteq2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "d_seteq2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dasy_grbr1(self):
cname = "dasy_grbr1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "dasy_grbr1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def dasy_grbr1_r(self):
cname = "dasy_grbr1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "dasy_grbr1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_Bg(self):
cname = "discrete_Bg"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_Bg.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_Bg_r(self):
cname = "discrete_Bg_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_Bg.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_Bo(self):
cname = "discrete_Bo"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_Bo.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_Bo_r(self):
cname = "discrete_Bo_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_Bo.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_autumn(self):
cname = "discrete_autumn"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_autumn.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_autumn_r(self):
cname = "discrete_autumn_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_autumn.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_dark(self):
cname = "discrete_dark"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_dark.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_dark_r(self):
cname = "discrete_dark_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_dark.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_light_aut(self):
cname = "discrete_light_aut"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_light_aut.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_light_aut_r(self):
cname = "discrete_light_aut_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_light_aut.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_muted(self):
cname = "discrete_muted"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_muted.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_muted_r(self):
cname = "discrete_muted_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_muted.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_vaneyck(self):
cname = "discrete_vaneyck"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_vaneyck.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def discrete_vaneyck_r(self):
cname = "discrete_vaneyck_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "discrete_vaneyck.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div1_blue_orange(self):
cname = "div1_blue_orange"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div1_blue_orange.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div1_blue_orange_r(self):
cname = "div1_blue_orange_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div1_blue_orange.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div2_gray_gold(self):
cname = "div2_gray_gold"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div2_gray_gold.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div2_gray_gold_r(self):
cname = "div2_gray_gold_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div2_gray_gold.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div3_green_brown(self):
cname = "div3_green_brown"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div3_green_brown.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div3_green_brown_r(self):
cname = "div3_green_brown_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div3_green_brown.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div5_asym_Ob(self):
cname = "div5_asym_Ob"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div5_asym_Ob.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def div5_asym_Ob_r(self):
cname = "div5_asym_Ob_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "div5_asym_Ob.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def fushia_red_pink1(self):
cname = "fushia_red_pink1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "fushia_red_pink1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def fushia_red_pink1_r(self):
cname = "fushia_red_pink1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "fushia_red_pink1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_9_17e(self):
cname = "green_9_17e"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_9_17e.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_9_17e_r(self):
cname = "green_9_17e_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_9_17e.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_c_gry1(self):
cname = "green_c_gry1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_c_gry1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_c_gry1_r(self):
cname = "green_c_gry1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_c_gry1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_gr1214b(self):
cname = "green_gr1214b"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_gr1214b.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_gr1214b_r(self):
cname = "green_gr1214b_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_gr1214b.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_green1(self):
cname = "green_green1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_green1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_green1_r(self):
cname = "green_green1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_green1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_green6(self):
cname = "green_green6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_green6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_green6_r(self):
cname = "green_green6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_green6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_mistyteal(self):
cname = "green_mistyteal"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_mistyteal.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_mistyteal_r(self):
cname = "green_mistyteal_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_mistyteal.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_mustard(self):
cname = "green_mustard"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_mustard.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_mustard_r(self):
cname = "green_mustard_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_mustard.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_rox(self):
cname = "green_rox"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_rox.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def green_rox_r(self):
cname = "green_rox_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "green_rox.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier1p(self):
cname = "hier1p"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier1p.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier1p_r(self):
cname = "hier1p_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier1p.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier2p(self):
cname = "hier2p"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier2p.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier2p_r(self):
cname = "hier2p_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier2p.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier4w(self):
cname = "hier4w"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier4w.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier4w_r(self):
cname = "hier4w_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier4w.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier5(self):
cname = "hier5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hier5_r(self):
cname = "hier5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hier5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high2ml(self):
cname = "high2ml"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high2ml.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high2ml_r(self):
cname = "high2ml_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high2ml.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high3(self):
cname = "high3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high3_r(self):
cname = "high3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high4(self):
cname = "high4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high4_r(self):
cname = "high4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high5(self):
cname = "high5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def high5_r(self):
cname = "high5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "high5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hs_orange2(self):
cname = "hs_orange2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hs_orange2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def hs_orange2_r(self):
cname = "hs_orange2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "hs_orange2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_oragnemed1(self):
cname = "l_oragnemed1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_oragnemed1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_oragnemed1_r(self):
cname = "l_oragnemed1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_oragnemed1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_orangemute1(self):
cname = "l_orangemute1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_orangemute1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_orangemute1_r(self):
cname = "l_orangemute1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_orangemute1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_orangesat1(self):
cname = "l_orangesat1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_orangesat1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_orangesat1_r(self):
cname = "l_orangesat1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_orangesat1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_purplowsat3(self):
cname = "l_purplowsat3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_purplowsat3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_purplowsat3_r(self):
cname = "l_purplowsat3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_purplowsat3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_purpmedsat3(self):
cname = "l_purpmedsat3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_purpmedsat3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_purpmedsat3_r(self):
cname = "l_purpmedsat3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_purpmedsat3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_purpwarm2(self):
cname = "l_purpwarm2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_purpwarm2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_purpwarm2_r(self):
cname = "l_purpwarm2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_purpwarm2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_red_warm1(self):
cname = "l_red_warm1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_red_warm1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_red_warm1_r(self):
cname = "l_red_warm1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_red_warm1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_redmuted4(self):
cname = "l_redmuted4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_redmuted4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_redmuted4_r(self):
cname = "l_redmuted4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_redmuted4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_redsat1(self):
cname = "l_redsat1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_redsat1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_redsat1_r(self):
cname = "l_redsat1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_redsat1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_turqmed2(self):
cname = "l_turqmed2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_turqmed2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_turqmed2_r(self):
cname = "l_turqmed2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_turqmed2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_turqsat1(self):
cname = "l_turqsat1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_turqsat1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_turqsat1_r(self):
cname = "l_turqsat1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_turqsat1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_yellowmed2(self):
cname = "l_yellowmed2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_yellowmed2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def l_yellowmed2_r(self):
cname = "l_yellowmed2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "l_yellowmed2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def mauve1(self):
cname = "mauve1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "mauve1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def mauve1_r(self):
cname = "mauve1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "mauve1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oryell2(self):
cname = "oryell2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "oryell2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def oryell2_r(self):
cname = "oryell2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "oryell2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_1(self):
cname = "other_outl_1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_1_r(self):
cname = "other_outl_1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_2(self):
cname = "other_outl_2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_2_r(self):
cname = "other_outl_2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_3(self):
cname = "other_outl_3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_3_r(self):
cname = "other_outl_3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_4(self):
cname = "other_outl_4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_4_r(self):
cname = "other_outl_4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_5(self):
cname = "other_outl_5"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_5.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_5_r(self):
cname = "other_outl_5_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_5.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_6(self):
cname = "other_outl_6"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_6.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_6_r(self):
cname = "other_outl_6_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_6.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_7(self):
cname = "other_outl_7"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_7.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_7_r(self):
cname = "other_outl_7_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_7.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_8(self):
cname = "other_outl_8"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_8.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def other_outl_8_r(self):
cname = "other_outl_8_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "other_outl_8.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pr_mist(self):
cname = "pr_mist"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "pr_mist.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def pr_mist_r(self):
cname = "pr_mist_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "pr_mist.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purp2(self):
cname = "purp2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "purp2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purp2_r(self):
cname = "purp2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "purp2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purp_pink(self):
cname = "purp_pink"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "purp_pink.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def purp_pink_r(self):
cname = "purp_pink_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "purp_pink.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def red2b(self):
cname = "red2b"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "red2b.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def red2b_r(self):
cname = "red2b_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "red2b.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def red_1lt(self):
cname = "red_1lt"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "red_1lt.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def red_1lt_r(self):
cname = "red_1lt_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "red_1lt.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def redp1(self):
cname = "redp1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "redp1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def redp1_r(self):
cname = "redp1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "redp1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def redsun1(self):
cname = "redsun1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "redsun1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def redsun1_r(self):
cname = "redsun1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "redsun1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def redy3(self):
cname = "redy3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "redy3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def redy3_r(self):
cname = "redy3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "redy3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rpinky(self):
cname = "rpinky"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "rpinky.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def rpinky_r(self):
cname = "rpinky_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "rpinky.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def speed_yel(self):
cname = "speed_yel"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "speed_yel.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def speed_yel_r(self):
cname = "speed_yel_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "speed_yel.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tempm1(self):
cname = "tempm1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "tempm1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tempm1_r(self):
cname = "tempm1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "tempm1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def test(self):
cname = "test"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "test.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def test_r(self):
cname = "test_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "test.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tr4(self):
cname = "tr4"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "tr4.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def tr4_r(self):
cname = "tr4_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "tr4.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def turq1lt(self):
cname = "turq1lt"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "turq1lt.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def turq1lt_r(self):
cname = "turq1lt_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "turq1lt.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def turqw1(self):
cname = "turqw1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "turqw1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def turqw1_r(self):
cname = "turqw1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "turqw1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def vvlt_turq3(self):
cname = "vvlt_turq3"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "vvlt_turq3.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def vvlt_turq3_r(self):
cname = "vvlt_turq3_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "sciviz", "vvlt_turq3.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
    # --- sciviz / colorcet colormap properties --------------------------
    # Every property below follows the same pattern: read an .rgb colour
    # table, wrap it in a Colormap, cache it in matplotlib's registry, and
    # return it (the "_r" variants reverse the table).  A small factory
    # removes the per-property boilerplate without changing behaviour.
    # NOTE: matplotlib.cm._cmap_registry is private (and register_cmap /
    # get_cmap are deprecated); on matplotlib >= 3.6 prefer the public
    # `matplotlib.colormaps` registry instead.
    def _cmap_property(subdir, stem, reverse=False):
        cname = stem + "_r" if reverse else stem

        def getter(self):
            # Return the cached colormap if it is already registered.
            if cname in matplotlib.cm._cmap_registry:
                return matplotlib.cm.get_cmap(cname)
            cmap_file = os.path.join(CMAPSFILE_DIR, subdir, stem + ".rgb")
            colors = self._coltbl(cmap_file)
            cmap = Colormap(colors[::-1] if reverse else colors, name=cname)
            matplotlib.cm.register_cmap(name=cname, cmap=cmap)
            return cmap

        getter.__name__ = cname
        return property(getter)

    w5m4 = _cmap_property("sciviz", "w5m4")
    w5m4_r = _cmap_property("sciviz", "w5m4", reverse=True)
    w_ymiddle1 = _cmap_property("sciviz", "w_ymiddle1")
    w_ymiddle1_r = _cmap_property("sciviz", "w_ymiddle1", reverse=True)
    wlteqcool = _cmap_property("sciviz", "wlteqcool")
    wlteqcool_r = _cmap_property("sciviz", "wlteqcool", reverse=True)
    wmutedset = _cmap_property("sciviz", "wmutedset")
    wmutedset_r = _cmap_property("sciviz", "wmutedset", reverse=True)
    yel15 = _cmap_property("sciviz", "yel15")
    yel15_r = _cmap_property("sciviz", "yel15", reverse=True)
    yel3 = _cmap_property("sciviz", "yel3")
    yel3_r = _cmap_property("sciviz", "yel3", reverse=True)
    yel_peach_br = _cmap_property("sciviz", "yel_peach_br")
    yel_peach_br_r = _cmap_property("sciviz", "yel_peach_br", reverse=True)
    yellowsun = _cmap_property("sciviz", "yellowsun")
    yellowsun_r = _cmap_property("sciviz", "yellowsun", reverse=True)
    yelsat100 = _cmap_property("sciviz", "yelsat100")
    yelsat100_r = _cmap_property("sciviz", "yelsat100", reverse=True)
    yg1 = _cmap_property("sciviz", "yg1")
    yg1_r = _cmap_property("sciviz", "yg1", reverse=True)
    yg3 = _cmap_property("sciviz", "yg3")
    yg3_r = _cmap_property("sciviz", "yg3", reverse=True)
    cet_c_grey = _cmap_property("colorcet", "cet_c_grey")
    cet_c_grey_r = _cmap_property("colorcet", "cet_c_grey", reverse=True)
    cet_c_grey_15 = _cmap_property("colorcet", "cet_c_grey_15")
    cet_c_grey_15_r = _cmap_property("colorcet", "cet_c_grey_15", reverse=True)
    cet_c_mrybm = _cmap_property("colorcet", "cet_c_mrybm")
    cet_c_mrybm_r = _cmap_property("colorcet", "cet_c_mrybm", reverse=True)
    cet_c_mrybm_35 = _cmap_property("colorcet", "cet_c_mrybm_35")
    cet_c_mrybm_35_r = _cmap_property("colorcet", "cet_c_mrybm_35", reverse=True)
    cet_c_mygbm = _cmap_property("colorcet", "cet_c_mygbm")
    cet_c_mygbm_r = _cmap_property("colorcet", "cet_c_mygbm", reverse=True)
    cet_c_mygbm_30 = _cmap_property("colorcet", "cet_c_mygbm_30")
    cet_c_mygbm_30_r = _cmap_property("colorcet", "cet_c_mygbm_30", reverse=True)
    cet_c_protanopic_deuteranopic_bwyk = _cmap_property("colorcet", "cet_c_protanopic_deuteranopic_bwyk")
    cet_c_protanopic_deuteranopic_bwyk_r = _cmap_property("colorcet", "cet_c_protanopic_deuteranopic_bwyk", reverse=True)
    cet_c_protanopic_deuteranopic_wywb = _cmap_property("colorcet", "cet_c_protanopic_deuteranopic_wywb")
    cet_c_protanopic_deuteranopic_wywb_r = _cmap_property("colorcet", "cet_c_protanopic_deuteranopic_wywb", reverse=True)
    cet_c_tritanopic_cwrk_4 = _cmap_property("colorcet", "cet_c_tritanopic_cwrk_4")
    cet_c_tritanopic_cwrk_4_r = _cmap_property("colorcet", "cet_c_tritanopic_cwrk_4", reverse=True)
    cet_c_tritanopic_wrwc_7 = _cmap_property("colorcet", "cet_c_tritanopic_wrwc_7")
    cet_c_tritanopic_wrwc_7_r = _cmap_property("colorcet", "cet_c_tritanopic_wrwc_7", reverse=True)
    cet_c_wrwbw = _cmap_property("colorcet", "cet_c_wrwbw")
    cet_c_wrwbw_r = _cmap_property("colorcet", "cet_c_wrwbw", reverse=True)
    cet_c_wrwbw_40 = _cmap_property("colorcet", "cet_c_wrwbw_40")
    cet_c_wrwbw_40_r = _cmap_property("colorcet", "cet_c_wrwbw_40", reverse=True)
    cet_d_bkr = _cmap_property("colorcet", "cet_d_bkr")
    cet_d_bkr_r = _cmap_property("colorcet", "cet_d_bkr", reverse=True)
    cet_d_bky = _cmap_property("colorcet", "cet_d_bky")
    cet_d_bky_r = _cmap_property("colorcet", "cet_d_bky", reverse=True)
    cet_d_bwg = _cmap_property("colorcet", "cet_d_bwg")
    cet_d_bwg_r = _cmap_property("colorcet", "cet_d_bwg", reverse=True)
    cet_d_bwr = _cmap_property("colorcet", "cet_d_bwr")
    cet_d_bwr_r = _cmap_property("colorcet", "cet_d_bwr", reverse=True)
    cet_d_cwm_8 = _cmap_property("colorcet", "cet_d_cwm_8")
    cet_d_cwm_8_r = _cmap_property("colorcet", "cet_d_cwm_8", reverse=True)
    cet_d_gkr = _cmap_property("colorcet", "cet_d_gkr")
    cet_d_gkr_r = _cmap_property("colorcet", "cet_d_gkr", reverse=True)
    cet_d_gwr = _cmap_property("colorcet", "cet_d_gwr")
    cet_d_gwr_r = _cmap_property("colorcet", "cet_d_gwr", reverse=True)
    cet_d_gwv = _cmap_property("colorcet", "cet_d_gwv")
    cet_d_gwv_r = _cmap_property("colorcet", "cet_d_gwv", reverse=True)
    cet_d_isoluminant_cjm = _cmap_property("colorcet", "cet_d_isoluminant_cjm")
    cet_d_isoluminant_cjm_r = _cmap_property("colorcet", "cet_d_isoluminant_cjm", reverse=True)
    cet_d_isoluminant_cjm1 = _cmap_property("colorcet", "cet_d_isoluminant_cjm1")
    cet_d_isoluminant_cjm1_r = _cmap_property("colorcet", "cet_d_isoluminant_cjm1", reverse=True)
    cet_d_isoluminant_cjo = _cmap_property("colorcet", "cet_d_isoluminant_cjo")
    cet_d_isoluminant_cjo_r = _cmap_property("colorcet", "cet_d_isoluminant_cjo", reverse=True)
    cet_d_linear_bjr = _cmap_property("colorcet", "cet_d_linear_bjr")
    cet_d_linear_bjr_r = _cmap_property("colorcet", "cet_d_linear_bjr", reverse=True)
    cet_d_linear_bjy = _cmap_property("colorcet", "cet_d_linear_bjy")
    cet_d_linear_bjy_r = _cmap_property("colorcet", "cet_d_linear_bjy", reverse=True)
    cet_d_protanopic_deuteranopic_bwy = _cmap_property("colorcet", "cet_d_protanopic_deuteranopic_bwy")
    cet_d_protanopic_deuteranopic_bwy_r = _cmap_property("colorcet", "cet_d_protanopic_deuteranopic_bwy", reverse=True)
    cet_d_rainbow_bgymr = _cmap_property("colorcet", "cet_d_rainbow_bgymr")
    cet_d_rainbow_bgymr_r = _cmap_property("colorcet", "cet_d_rainbow_bgymr", reverse=True)
    cet_d_tritanopic_cwr = _cmap_property("colorcet", "cet_d_tritanopic_cwr")
    cet_d_tritanopic_cwr_r = _cmap_property("colorcet", "cet_d_tritanopic_cwr", reverse=True)
    cet_g_bw = _cmap_property("colorcet", "cet_g_bw")
    cet_g_bw_r = _cmap_property("colorcet", "cet_g_bw", reverse=True)
    cet_g_bw_minc = _cmap_property("colorcet", "cet_g_bw_minc")
    cet_g_bw_minc_r = _cmap_property("colorcet", "cet_g_bw_minc", reverse=True)
    cet_g_bw_minc1 = _cmap_property("colorcet", "cet_g_bw_minc1")
    cet_g_bw_minc1_r = _cmap_property("colorcet", "cet_g_bw_minc1", reverse=True)
    cet_g_bw_minc_maxl = _cmap_property("colorcet", "cet_g_bw_minc_maxl")
    cet_g_bw_minc_maxl_r = _cmap_property("colorcet", "cet_g_bw_minc_maxl", reverse=True)
    cet_g_bw_minc_minl = _cmap_property("colorcet", "cet_g_bw_minc_minl")
    cet_g_bw_minc_minl_r = _cmap_property("colorcet", "cet_g_bw_minc_minl", reverse=True)
    cet_g_category10 = _cmap_property("colorcet", "cet_g_category10")
    cet_g_category10_r = _cmap_property("colorcet", "cet_g_category10", reverse=True)
    cet_g_hv = _cmap_property("colorcet", "cet_g_hv")
    cet_g_hv_r = _cmap_property("colorcet", "cet_g_hv", reverse=True)
    cet_i = _cmap_property("colorcet", "cet_i")
    cet_i_r = _cmap_property("colorcet", "cet_i", reverse=True)
    cet_i_cgo = _cmap_property("colorcet", "cet_i_cgo")
    cet_i_cgo_r = _cmap_property("colorcet", "cet_i_cgo", reverse=True)
    cet_i_cgo1 = _cmap_property("colorcet", "cet_i_cgo1")
    cet_i_cgo1_r = _cmap_property("colorcet", "cet_i_cgo1", reverse=True)
    cet_l_bgy = _cmap_property("colorcet", "cet_l_bgy")
    cet_l_bgy_r = _cmap_property("colorcet", "cet_l_bgy", reverse=True)
    cet_l_bgyw = _cmap_property("colorcet", "cet_l_bgyw")
    cet_l_bgyw_r = _cmap_property("colorcet", "cet_l_bgyw", reverse=True)
    cet_l_bgyw1 = _cmap_property("colorcet", "cet_l_bgyw1")
    cet_l_bgyw1_r = _cmap_property("colorcet", "cet_l_bgyw1", reverse=True)
    cet_l_blue = _cmap_property("colorcet", "cet_l_blue")
    cet_l_blue_r = _cmap_property("colorcet", "cet_l_blue", reverse=True)
    cet_l_blue1 = _cmap_property("colorcet", "cet_l_blue1")
    cet_l_blue1_r = _cmap_property("colorcet", "cet_l_blue1", reverse=True)
@property
def cet_l_bmw(self):
cname = "cet_l_bmw"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmw.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_bmw_r(self):
cname = "cet_l_bmw_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmw.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_bmw1(self):
cname = "cet_l_bmw1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmw1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_bmw1_r(self):
cname = "cet_l_bmw1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmw1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_bmy(self):
cname = "cet_l_bmy"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmy.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_bmy_r(self):
cname = "cet_l_bmy_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmy.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_bmy1(self):
cname = "cet_l_bmy1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmy1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_bmy1_r(self):
cname = "cet_l_bmy1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_bmy1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_gow(self):
cname = "cet_l_gow"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_gow.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_gow_r(self):
cname = "cet_l_gow_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_gow.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_gow1(self):
cname = "cet_l_gow1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_gow1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_gow1_r(self):
cname = "cet_l_gow1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_gow1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_green(self):
cname = "cet_l_green"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_green.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_green_r(self):
cname = "cet_l_green_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_green.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_grey(self):
cname = "cet_l_grey"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_grey.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_grey_r(self):
cname = "cet_l_grey_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_grey.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_grey1(self):
cname = "cet_l_grey1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_grey1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_grey1_r(self):
cname = "cet_l_grey1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_grey1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kbc(self):
cname = "cet_l_kbc"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kbc.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kbc_r(self):
cname = "cet_l_kbc_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kbc.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kbgyw(self):
cname = "cet_l_kbgyw"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kbgyw.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kbgyw_r(self):
cname = "cet_l_kbgyw_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kbgyw.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kgy(self):
cname = "cet_l_kgy"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kgy.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kgy_r(self):
cname = "cet_l_kgy_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kgy.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kry(self):
cname = "cet_l_kry"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kry.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kry_r(self):
cname = "cet_l_kry_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kry.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kry1(self):
cname = "cet_l_kry1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kry1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kry1_r(self):
cname = "cet_l_kry1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kry1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kry2(self):
cname = "cet_l_kry2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kry2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kry2_r(self):
cname = "cet_l_kry2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kry2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kryw(self):
cname = "cet_l_kryw"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kryw.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kryw_r(self):
cname = "cet_l_kryw_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kryw.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kryw1(self):
cname = "cet_l_kryw1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kryw1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kryw1_r(self):
cname = "cet_l_kryw1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kryw1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kryw2(self):
cname = "cet_l_kryw2"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kryw2.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_kryw2_r(self):
cname = "cet_l_kryw2_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_kryw2.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_protanopic_deuteranopic_kbjyw(self):
cname = "cet_l_protanopic_deuteranopic_kbjyw"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_protanopic_deuteranopic_kbjyw.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_protanopic_deuteranopic_kbjyw_r(self):
cname = "cet_l_protanopic_deuteranopic_kbjyw_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_protanopic_deuteranopic_kbjyw.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_protanopic_deuteranopic_kbw(self):
cname = "cet_l_protanopic_deuteranopic_kbw"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_protanopic_deuteranopic_kbw.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_protanopic_deuteranopic_kbw_r(self):
cname = "cet_l_protanopic_deuteranopic_kbw_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_protanopic_deuteranopic_kbw.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_ternary_blue(self):
cname = "cet_l_ternary_blue"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_ternary_blue.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_ternary_blue_r(self):
cname = "cet_l_ternary_blue_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_ternary_blue.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_ternary_green(self):
cname = "cet_l_ternary_green"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_ternary_green.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_ternary_green_r(self):
cname = "cet_l_ternary_green_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_ternary_green.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_ternary_red(self):
cname = "cet_l_ternary_red"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_ternary_red.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_ternary_red_r(self):
cname = "cet_l_ternary_red_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_ternary_red.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_tritanopic_krjcw(self):
cname = "cet_l_tritanopic_krjcw"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_tritanopic_krjcw.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_tritanopic_krjcw_r(self):
cname = "cet_l_tritanopic_krjcw_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_tritanopic_krjcw.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_tritanopic_krjcw1(self):
cname = "cet_l_tritanopic_krjcw1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_tritanopic_krjcw1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_tritanopic_krjcw1_r(self):
cname = "cet_l_tritanopic_krjcw1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_tritanopic_krjcw1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_wcmr(self):
cname = "cet_l_wcmr"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_wcmr.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_wcmr_r(self):
cname = "cet_l_wcmr_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_wcmr.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_worb(self):
cname = "cet_l_worb"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_worb.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_worb_r(self):
cname = "cet_l_worb_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_worb.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_wyor(self):
cname = "cet_l_wyor"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_wyor.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_l_wyor_r(self):
cname = "cet_l_wyor_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_l_wyor.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyr(self):
cname = "cet_r_bgyr"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyr.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyr_r(self):
cname = "cet_r_bgyr_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyr.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyr1(self):
cname = "cet_r_bgyr1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyr1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyr1_r(self):
cname = "cet_r_bgyr1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyr1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyrm(self):
cname = "cet_r_bgyrm"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyrm.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyrm_r(self):
cname = "cet_r_bgyrm_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyrm.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyrm1(self):
cname = "cet_r_bgyrm1"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyrm1.rgb")
cmap = Colormap(self._coltbl(cmap_file), name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
@property
def cet_r_bgyrm1_r(self):
cname = "cet_r_bgyrm1_r"
if cname in matplotlib.cm._cmap_registry:
return matplotlib.cm.get_cmap(cname)
cmap_file = os.path.join(CMAPSFILE_DIR, "colorcet", "cet_r_bgyrm1.rgb")
cmap = Colormap(self._coltbl(cmap_file)[::-1], name=cname)
matplotlib.cm.register_cmap(name=cname, cmap=cmap)
return cmap
"""Find the legal moves for all pieces."""
from rules import white_pieces, black_pieces
# Functions for finding the legal moves of each piece
def pawn_moves(board, colour_string, square_list):
"""Find all legal moves for a pawn."""
    # create an empty list to collect legal moves
legal_moves = []
# consider moves if pawn is white
if colour_string == "w":
# one square forward
move_1 = [square_list[0] + 1, square_list[1]]
if 0 <= move_1[0] <= 7:
if board[move_1[0]][move_1[1]] == "":
legal_moves.append(move_1)
if square_list[0] == 1:
# two squares forward
move_2 = [square_list[0] + 2, square_list[1]]
if 0 <= move_2[0] <= 7:
if board[move_2[0]][move_2[1]] == "":
legal_moves.append(move_2)
        # diagonal captures (check both row and column bounds to avoid indexing off the board)
        move_3 = [square_list[0] + 1, square_list[1] + 1]
        if 0 <= move_3[0] <= 7 and 0 <= move_3[1] <= 7:
            if board[move_3[0]][move_3[1]] in black_pieces:
                legal_moves.append(move_3)
        move_4 = [square_list[0] + 1, square_list[1] - 1]
        if 0 <= move_4[0] <= 7 and 0 <= move_4[1] <= 7:
            if board[move_4[0]][move_4[1]] in black_pieces:
                legal_moves.append(move_4)
# consider moves if pawn is black
if colour_string == "b":
# one square forward
move_1 = [square_list[0] - 1, square_list[1]]
if 0 <= move_1[0] <= 7:
if board[move_1[0]][move_1[1]] == "":
legal_moves.append(move_1)
if square_list[0] == 6:
# two squares forward
move_2 = [square_list[0] - 2, square_list[1]]
if 0 <= move_2[0] <= 7:
if board[move_2[0]][move_2[1]] == "":
legal_moves.append(move_2)
        # diagonal captures (check both row and column bounds to avoid indexing off the board)
        move_3 = [square_list[0] - 1, square_list[1] + 1]
        if 0 <= move_3[0] <= 7 and 0 <= move_3[1] <= 7:
            if board[move_3[0]][move_3[1]] in white_pieces:
                legal_moves.append(move_3)
        move_4 = [square_list[0] - 1, square_list[1] - 1]
        if 0 <= move_4[0] <= 7 and 0 <= move_4[1] <= 7:
            if board[move_4[0]][move_4[1]] in white_pieces:
                legal_moves.append(move_4)
return legal_moves
def rook_moves(board, colour_string, square_list):
"""Find all legal moves for a rook."""
legal_moves = []
    # set the colour of the opponent's pieces
    if colour_string == "w":
        opponent_pieces = black_pieces
    else:
        opponent_pieces = white_pieces
# going up
go_up = True
up_index = 1
while go_up:
move = [square_list[0] + up_index, square_list[1]]
up_index += 1
if 0 <= move[0] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_up = False
else:
go_up = False
else:
go_up = False
# going down
go_down = True
down_index = 1
while go_down:
move = [square_list[0] - down_index, square_list[1]]
down_index += 1
if 0 <= move[0] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_down = False
else:
go_down = False
else:
go_down = False
# going right
go_right = True
right_index = 1
while go_right:
move = [square_list[0], square_list[1] + right_index]
right_index += 1
if 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_right = False
else:
go_right = False
else:
go_right = False
# going left
go_left = True
left_index = 1
while go_left:
move = [square_list[0], square_list[1] - left_index]
left_index += 1
if 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_left = False
else:
go_left = False
else:
go_left = False
return legal_moves
def knight_moves(board, colour_string, square_list):
"""Find all legal moves for a knight."""
legal_moves = []
    # set the colour of the opponent's pieces
    if colour_string == "w":
        opponent_pieces = black_pieces
    else:
        opponent_pieces = white_pieces
move_1 = [square_list[0] - 2, square_list[1] + 1]
if 0 <= move_1[0] <= 7 and 0 <= move_1[1] <= 7:
if (
board[move_1[0]][move_1[1]] == ""
or board[move_1[0]][move_1[1]] in opponent_pieces
):
legal_moves.append(move_1)
move_2 = [square_list[0] - 1, square_list[1] + 2]
if 0 <= move_2[0] <= 7 and 0 <= move_2[1] <= 7:
if (
board[move_2[0]][move_2[1]] == ""
or board[move_2[0]][move_2[1]] in opponent_pieces
):
legal_moves.append(move_2)
move_3 = [square_list[0] + 1, square_list[1] + 2]
if 0 <= move_3[0] <= 7 and 0 <= move_3[1] <= 7:
if (
board[move_3[0]][move_3[1]] == ""
or board[move_3[0]][move_3[1]] in opponent_pieces
):
legal_moves.append(move_3)
move_4 = [square_list[0] + 2, square_list[1] + 1]
if 0 <= move_4[0] <= 7 and 0 <= move_4[1] <= 7:
if (
board[move_4[0]][move_4[1]] == ""
or board[move_4[0]][move_4[1]] in opponent_pieces
):
legal_moves.append(move_4)
move_5 = [square_list[0] + 2, square_list[1] - 1]
if 0 <= move_5[0] <= 7 and 0 <= move_5[1] <= 7:
if (
board[move_5[0]][move_5[1]] == ""
or board[move_5[0]][move_5[1]] in opponent_pieces
):
legal_moves.append(move_5)
move_6 = [square_list[0] + 1, square_list[1] - 2]
if 0 <= move_6[0] <= 7 and 0 <= move_6[1] <= 7:
if (
board[move_6[0]][move_6[1]] == ""
or board[move_6[0]][move_6[1]] in opponent_pieces
):
legal_moves.append(move_6)
move_7 = [square_list[0] - 1, square_list[1] - 2]
if 0 <= move_7[0] <= 7 and 0 <= move_7[1] <= 7:
if (
board[move_7[0]][move_7[1]] == ""
or board[move_7[0]][move_7[1]] in opponent_pieces
):
legal_moves.append(move_7)
move_8 = [square_list[0] - 2, square_list[1] - 1]
if 0 <= move_8[0] <= 7 and 0 <= move_8[1] <= 7:
if (
board[move_8[0]][move_8[1]] == ""
or board[move_8[0]][move_8[1]] in opponent_pieces
):
legal_moves.append(move_8)
return legal_moves
def bishop_moves(board, colour_string, square_list):
"""Find all legal moves for a bishop."""
legal_moves = []
    # set the colour of the opponent's pieces
    if colour_string == "w":
        opponent_pieces = black_pieces
    else:
        opponent_pieces = white_pieces
# going up-right
go_upright = True
upright_index = 1
while go_upright:
move = [square_list[0] + upright_index, square_list[1] + upright_index]
upright_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_upright = False
else:
go_upright = False
else:
go_upright = False
# going down-right
go_downright = True
downright_index = 1
while go_downright:
move = [square_list[0] - downright_index, square_list[1] + downright_index]
downright_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_downright = False
else:
go_downright = False
else:
go_downright = False
# going down-left
go_downleft = True
downleft_index = 1
while go_downleft:
move = [square_list[0] - downleft_index, square_list[1] - downleft_index]
downleft_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_downleft = False
else:
go_downleft = False
else:
go_downleft = False
# going up-left
go_upleft = True
upleft_index = 1
while go_upleft:
move = [square_list[0] + upleft_index, square_list[1] - upleft_index]
upleft_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_upleft = False
else:
go_upleft = False
else:
go_upleft = False
return legal_moves
def queen_moves(board, colour_string, square_list):
"""Find all legal moves for a queen."""
legal_moves = []
    # set the colour of the opponent's pieces
    if colour_string == "w":
        opponent_pieces = black_pieces
    else:
        opponent_pieces = white_pieces
# going up
go_up = True
up_index = 1
while go_up:
move = [square_list[0] + up_index, square_list[1]]
up_index += 1
if 0 <= move[0] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_up = False
else:
go_up = False
else:
go_up = False
# going down
go_down = True
down_index = 1
while go_down:
move = [square_list[0] - down_index, square_list[1]]
down_index += 1
if 0 <= move[0] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_down = False
else:
go_down = False
else:
go_down = False
# going right
go_right = True
right_index = 1
while go_right:
move = [square_list[0], square_list[1] + right_index]
right_index += 1
if 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_right = False
else:
go_right = False
else:
go_right = False
# going left
go_left = True
left_index = 1
while go_left:
move = [square_list[0], square_list[1] - left_index]
left_index += 1
if 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_left = False
else:
go_left = False
else:
go_left = False
# going up-right
go_upright = True
upright_index = 1
while go_upright:
move = [square_list[0] + upright_index, square_list[1] + upright_index]
upright_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_upright = False
else:
go_upright = False
else:
go_upright = False
# going down-right
go_downright = True
downright_index = 1
while go_downright:
move = [square_list[0] - downright_index, square_list[1] + downright_index]
downright_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_downright = False
else:
go_downright = False
else:
go_downright = False
# going down-left
go_downleft = True
downleft_index = 1
while go_downleft:
move = [square_list[0] - downleft_index, square_list[1] - downleft_index]
downleft_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_downleft = False
else:
go_downleft = False
else:
go_downleft = False
# going up-left
go_upleft = True
upleft_index = 1
while go_upleft:
move = [square_list[0] + upleft_index, square_list[1] - upleft_index]
upleft_index += 1
if 0 <= move[0] <= 7 and 0 <= move[1] <= 7:
if board[move[0]][move[1]] == "":
legal_moves.append(move)
elif board[move[0]][move[1]] in opponent_pieces:
legal_moves.append(move)
go_upleft = False
else:
go_upleft = False
else:
go_upleft = False
return legal_moves
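The eight near-identical while loops above all walk a ray from the piece's square until they hit the board edge, a friendly piece, or capture an opponent and stop. A minimal self-contained sketch of that shared pattern (the `slide_moves` helper and `QUEEN_DIRECTIONS` name are hypothetical, not part of this module; an empty `set()` stands in for the module's opponent-piece collections):

```python
def slide_moves(board, opponent_pieces, square, directions):
    """Walk each (row_step, col_step) ray: empty squares keep the ray
    going, an opponent piece is captured and stops it, anything else
    (a friendly piece) blocks it immediately."""
    moves = []
    for d_row, d_col in directions:
        row, col = square[0] + d_row, square[1] + d_col
        while 0 <= row <= 7 and 0 <= col <= 7:
            piece = board[row][col]
            if piece == "":
                moves.append([row, col])       # empty square: keep sliding
            elif piece in opponent_pieces:
                moves.append([row, col])       # capture, then stop
                break
            else:
                break                          # friendly piece blocks the ray
            row += d_row
            col += d_col
    return moves

# Queen rays = the four rook rays plus the four bishop rays.
QUEEN_DIRECTIONS = [(1, 0), (-1, 0), (0, 1), (0, -1),
                    (1, 1), (1, -1), (-1, 1), (-1, -1)]

# Tiny demo: a lone queen on row 3, column 3 of an empty board.
empty = [[""] * 8 for _ in range(8)]
print(len(slide_moves(empty, set(), [3, 3], QUEEN_DIRECTIONS)))  # 27
```

Each directional block in the original is one `(d_row, d_col)` entry here, so the rook, bishop, and queen generators differ only in which direction list they pass.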

def king_moves(board, colour_string, square_list):
    """Find all legal moves for a king."""
    legal_moves = []
    if colour_string == "w":
        opponent_pieces = black_pieces
    elif colour_string == "b":
        opponent_pieces = white_pieces
    move_1 = [square_list[0] + 1, square_list[1]]
    if 0 <= move_1[0] <= 7 and 0 <= move_1[1] <= 7:
        if (
            board[move_1[0]][move_1[1]] == ""
            or board[move_1[0]][move_1[1]] in opponent_pieces
        ):
            legal_moves.append(move_1)
    move_2 = [square_list[0] + 1, square_list[1] + 1]
    if 0 <= move_2[0] <= 7 and 0 <= move_2[1] <= 7:
        if (
            board[move_2[0]][move_2[1]] == ""
            or board[move_2[0]][move_2[1]] in opponent_pieces
        ):
            legal_moves.append(move_2)
    move_3 = [square_list[0], square_list[1] + 1]
    if 0 <= move_3[0] <= 7 and 0 <= move_3[1] <= 7:
        if (
            board[move_3[0]][move_3[1]] == ""
            or board[move_3[0]][move_3[1]] in opponent_pieces
        ):
            legal_moves.append(move_3)
    move_4 = [square_list[0] - 1, square_list[1] + 1]
    if 0 <= move_4[0] <= 7 and 0 <= move_4[1] <= 7:
        if (
            board[move_4[0]][move_4[1]] == ""
            or board[move_4[0]][move_4[1]] in opponent_pieces
        ):
            legal_moves.append(move_4)
    move_5 = [square_list[0] - 1, square_list[1]]
    if 0 <= move_5[0] <= 7 and 0 <= move_5[1] <= 7:
        if (
            board[move_5[0]][move_5[1]] == ""
            or board[move_5[0]][move_5[1]] in opponent_pieces
        ):
            legal_moves.append(move_5)
    move_6 = [square_list[0] - 1, square_list[1] - 1]
    if 0 <= move_6[0] <= 7 and 0 <= move_6[1] <= 7:
        if (
            board[move_6[0]][move_6[1]] == ""
            or board[move_6[0]][move_6[1]] in opponent_pieces
        ):
            legal_moves.append(move_6)
    move_7 = [square_list[0], square_list[1] - 1]
    if 0 <= move_7[0] <= 7 and 0 <= move_7[1] <= 7:
        if (
            board[move_7[0]][move_7[1]] == ""
            or board[move_7[0]][move_7[1]] in opponent_pieces
        ):
            legal_moves.append(move_7)
    move_8 = [square_list[0] + 1, square_list[1] - 1]
    if 0 <= move_8[0] <= 7 and 0 <= move_8[1] <= 7:
        if (
            board[move_8[0]][move_8[1]] == ""
            or board[move_8[0]][move_8[1]] in opponent_pieces
        ):
            legal_moves.append(move_8)
    return legal_moves
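Unlike the sliding pieces, the king takes a single step, so the eight `move_1`…`move_8` blocks above are one bounds-and-occupancy check per fixed offset. A self-contained sketch of that pattern (the `step_moves` helper and `KING_OFFSETS` name are hypothetical; the same shape would cover knight moves with a different offset list):

```python
# The eight king offsets, in the same order as move_1..move_8 above.
KING_OFFSETS = [(1, 0), (1, 1), (0, 1), (-1, 1),
                (-1, 0), (-1, -1), (0, -1), (1, -1)]

def step_moves(board, opponent_pieces, square, offsets):
    """One-step generator: keep an offset if it stays on the board and
    lands on an empty or capturable square."""
    moves = []
    for d_row, d_col in offsets:
        row, col = square[0] + d_row, square[1] + d_col
        if 0 <= row <= 7 and 0 <= col <= 7:
            target = board[row][col]
            if target == "" or target in opponent_pieces:
                moves.append([row, col])
    return moves

empty = [[""] * 8 for _ in range(8)]
print(len(step_moves(empty, set(), [0, 0], KING_OFFSETS)))  # 3 (corner)
print(len(step_moves(empty, set(), [3, 3], KING_OFFSETS)))  # 8 (centre)
```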

# Function for outputting all legal moves
def find_moves(board, colour_string):
    """Find all the legal moves for a player."""
    all_moves = []
    player_pieces = white_pieces if colour_string == "w" else black_pieces
    for row_index in range(8):
        for column_index in range(8):
            piece = board[row_index][column_index]
            if piece in player_pieces:
                square_list = [row_index, column_index]
                if piece in ("P", "p"):
                    for move in pawn_moves(board, colour_string, square_list):
                        all_moves.append((square_list, move))
                if piece in ("R", "r"):
                    for move in rook_moves(board, colour_string, square_list):
                        all_moves.append((square_list, move))
                if piece in ("N", "n"):
                    for move in knight_moves(board, colour_string, square_list):
                        all_moves.append((square_list, move))
                if piece in ("B", "b"):
                    for move in bishop_moves(board, colour_string, square_list):
                        all_moves.append((square_list, move))
                if piece in ("Q", "q"):
                    for move in queen_moves(board, colour_string, square_list):
                        all_moves.append((square_list, move))
                if piece in ("K", "k"):
                    for move in king_moves(board, colour_string, square_list):
                        all_moves.append((square_list, move))
    legal_moves = []
    for move in all_moves:
        if move[1] != []:
            legal_moves.append(move)
    return legal_moves
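The chain of per-piece `if` tests above can also be driven by a dispatch table mapping piece letters to move generators. A self-contained sketch (the `stub_moves` placeholder, `MOVE_FUNCTIONS`, and `find_moves_sketch` are hypothetical illustrations; real code would map to the `pawn_moves`…`king_moves` functions defined earlier):

```python
def stub_moves(board, colour_string, square_list):
    # Placeholder generator standing in for pawn_moves..king_moves:
    # just echoes the piece's own square so the sketch runs on its own.
    return [[square_list[0], square_list[1]]]

# One entry per piece letter; look up with piece.upper() to cover both cases.
MOVE_FUNCTIONS = {letter: stub_moves for letter in "PRNBQK"}

def find_moves_sketch(board, colour_string, player_pieces):
    legal_moves = []
    for row_index in range(8):
        for column_index in range(8):
            piece = board[row_index][column_index]
            if piece in player_pieces:
                generator = MOVE_FUNCTIONS[piece.upper()]
                for move in generator(board, colour_string, [row_index, column_index]):
                    legal_moves.append(([row_index, column_index], move))
    return legal_moves

demo_board = [[""] * 8 for _ in range(8)]
demo_board[0][0] = "R"
print(find_moves_sketch(demo_board, "w", {"R"}))  # [([0, 0], [0, 0])]
```

This keeps the scan over the board in one place and makes adding or changing a piece type a one-line table edit rather than another `if` block.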
| 32.065292 | 83 | 0.512057 | 2,527 | 18,662 | 3.547685 | 0.035615 | 0.075851 | 0.101729 | 0.127161 | 0.931846 | 0.922699 | 0.913887 | 0.907418 | 0.899387 | 0.899387 | 0 | 0.055794 | 0.363252 | 18,662 | 581 | 84 | 32.120482 | 0.698645 | 0.046619 | 0 | 0.875546 | 0 | 0 | 0.001411 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.015284 | false | 0 | 0.002183 | 0 | 0.032751 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e820996a5f1fde785f883366be233f3edd9f6929 | 107 | py | Python | __init__.py | Aayog/processing_py | d0b879d536fa707aaed8d0921dee04c5deedc09a | [
"MIT"
] | 30 | 2020-07-02T01:36:01.000Z | 2022-03-26T02:29:10.000Z | __init__.py | Aayog/processing_py | d0b879d536fa707aaed8d0921dee04c5deedc09a | [
"MIT"
] | 10 | 2020-09-10T15:49:10.000Z | 2021-12-14T09:50:50.000Z | __init__.py | Aayog/processing_py | d0b879d536fa707aaed8d0921dee04c5deedc09a | [
"MIT"
] | 6 | 2021-02-25T03:55:32.000Z | 2022-01-16T10:03:05.000Z | from processing_py.app import App
from processing_py.color import *
from processing_py.constants import *
| 26.75 | 37 | 0.831776 | 16 | 107 | 5.375 | 0.4375 | 0.488372 | 0.55814 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121495 | 107 | 3 | 38 | 35.666667 | 0.914894 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1c15c1f6dd7ef550d938f7ea2bbdf703dbf621b4 | 30,565 | py | Python | custom/icds_reports/migrations/0016_aggawcmonthly_aggccsrecordmonthly_aggchildhealthmonthly_aggdailyusageview_aggthrmonthly_awclocation_.py | dannyroberts/commcare-hq | 4b0b8ecbe851e46307d3a0e635d6d5d6e31c3598 | [
"BSD-3-Clause"
] | null | null | null | custom/icds_reports/migrations/0016_aggawcmonthly_aggccsrecordmonthly_aggchildhealthmonthly_aggdailyusageview_aggthrmonthly_awclocation_.py | dannyroberts/commcare-hq | 4b0b8ecbe851e46307d3a0e635d6d5d6e31c3598 | [
"BSD-3-Clause"
] | null | null | null | custom/icds_reports/migrations/0016_aggawcmonthly_aggccsrecordmonthly_aggchildhealthmonthly_aggdailyusageview_aggthrmonthly_awclocation_.py | dannyroberts/commcare-hq | 4b0b8ecbe851e46307d3a0e635d6d5d6e31c3598 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.10.7 on 2017-05-11 09:50
from __future__ import unicode_literals
from __future__ import absolute_import
from django.db import migrations, models
class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('icds_reports', '0015_add_aggregation_level'),
    ]

    operations = [
        migrations.CreateModel(
            name='AggAwcMonthly',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('awc_id', models.TextField(blank=True, null=True)),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField(blank=True, null=True)),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField(blank=True, null=True)),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField(blank=True, null=True)),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField(blank=True, null=True)),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
                ('month', models.DateField(blank=True, null=True)),
                ('is_launched', models.TextField(blank=True, null=True)),
                ('num_awcs', models.IntegerField(blank=True, null=True)),
                ('awc_days_open', models.IntegerField(blank=True, null=True)),
                ('total_eligible_children', models.IntegerField(blank=True, null=True)),
                ('total_attended_children', models.IntegerField(blank=True, null=True)),
                ('pse_avg_attendance_percent', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('pse_full', models.IntegerField(blank=True, null=True)),
                ('pse_partial', models.IntegerField(blank=True, null=True)),
                ('pse_non', models.IntegerField(blank=True, null=True)),
                ('pse_score', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('awc_days_provided_breakfast', models.IntegerField(blank=True, null=True)),
                ('awc_days_provided_hotmeal', models.IntegerField(blank=True, null=True)),
                ('awc_days_provided_thr', models.IntegerField(blank=True, null=True)),
                ('awc_days_provided_pse', models.IntegerField(blank=True, null=True)),
                ('awc_not_open_holiday', models.IntegerField(blank=True, null=True)),
                ('awc_not_open_festival', models.IntegerField(blank=True, null=True)),
                ('awc_not_open_no_help', models.IntegerField(blank=True, null=True)),
                ('awc_not_open_department_work', models.IntegerField(blank=True, null=True)),
                ('awc_not_open_other', models.IntegerField(blank=True, null=True)),
                ('awc_num_open', models.IntegerField(blank=True, null=True)),
                ('awc_not_open_no_data', models.IntegerField(blank=True, null=True)),
                ('wer_weighed', models.IntegerField(blank=True, null=True)),
                ('wer_eligible', models.IntegerField(blank=True, null=True)),
                ('wer_score', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('thr_eligible_child', models.IntegerField(blank=True, null=True)),
                ('thr_rations_21_plus_distributed_child', models.IntegerField(blank=True, null=True)),
                ('thr_eligible_ccs', models.IntegerField(blank=True, null=True)),
                ('thr_rations_21_plus_distributed_ccs', models.IntegerField(blank=True, null=True)),
                ('thr_score', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('awc_score', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('num_awc_rank_functional', models.IntegerField(blank=True, null=True)),
                ('num_awc_rank_semi', models.IntegerField(blank=True, null=True)),
                ('num_awc_rank_non', models.IntegerField(blank=True, null=True)),
                ('cases_ccs_pregnant', models.IntegerField(blank=True, null=True)),
                ('cases_ccs_lactating', models.IntegerField(blank=True, null=True)),
                ('cases_child_health', models.IntegerField(blank=True, null=True)),
                ('usage_num_pse', models.IntegerField(blank=True, null=True)),
                ('usage_num_gmp', models.IntegerField(blank=True, null=True)),
                ('usage_num_thr', models.IntegerField(blank=True, null=True)),
                ('usage_num_home_visit', models.IntegerField(blank=True, null=True)),
                ('usage_num_bp_tri1', models.IntegerField(blank=True, null=True)),
                ('usage_num_bp_tri2', models.IntegerField(blank=True, null=True)),
                ('usage_num_bp_tri3', models.IntegerField(blank=True, null=True)),
                ('usage_num_pnc', models.IntegerField(blank=True, null=True)),
                ('usage_num_ebf', models.IntegerField(blank=True, null=True)),
                ('usage_num_cf', models.IntegerField(blank=True, null=True)),
                ('usage_num_delivery', models.IntegerField(blank=True, null=True)),
                ('usage_num_due_list_ccs', models.IntegerField(blank=True, null=True)),
                ('usage_num_due_list_child_health', models.IntegerField(blank=True, null=True)),
                ('usage_awc_num_active', models.IntegerField(blank=True, null=True)),
                ('usage_time_pse', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('usage_time_gmp', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('usage_time_bp', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('usage_time_pnc', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('usage_time_ebf', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('usage_time_cf', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('usage_time_of_day_pse', models.TimeField(blank=True, null=True)),
                ('usage_time_of_day_home_visit', models.TimeField(blank=True, null=True)),
                ('vhnd_immunization', models.IntegerField(blank=True, null=True)),
                ('vhnd_anc', models.IntegerField(blank=True, null=True)),
                ('vhnd_gmp', models.IntegerField(blank=True, null=True)),
                ('vhnd_num_pregnancy', models.IntegerField(blank=True, null=True)),
                ('vhnd_num_lactating', models.IntegerField(blank=True, null=True)),
                ('vhnd_num_mothers_6_12', models.IntegerField(blank=True, null=True)),
                ('vhnd_num_mothers_12', models.IntegerField(blank=True, null=True)),
                ('vhnd_num_fathers', models.IntegerField(blank=True, null=True)),
                ('ls_supervision_visit', models.IntegerField(blank=True, null=True)),
                ('ls_num_supervised', models.IntegerField(blank=True, null=True)),
                ('ls_awc_location_long', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('ls_awc_location_lat', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('ls_awc_present', models.IntegerField(blank=True, null=True)),
                ('ls_awc_open', models.IntegerField(blank=True, null=True)),
                ('ls_awc_not_open_aww_not_available', models.IntegerField(blank=True, null=True)),
                ('ls_awc_not_open_closed_early', models.IntegerField(blank=True, null=True)),
                ('ls_awc_not_open_holiday', models.IntegerField(blank=True, null=True)),
                ('ls_awc_not_open_unknown', models.IntegerField(blank=True, null=True)),
                ('ls_awc_not_open_other', models.IntegerField(blank=True, null=True)),
                ('infra_last_update_date', models.DateField(blank=True, null=True)),
                ('infra_type_of_building', models.TextField(blank=True, null=True)),
                ('infra_type_of_building_pucca', models.IntegerField(blank=True, null=True)),
                ('infra_type_of_building_semi_pucca', models.IntegerField(blank=True, null=True)),
                ('infra_type_of_building_kuccha', models.IntegerField(blank=True, null=True)),
                ('infra_type_of_building_partial_covered_space', models.IntegerField(blank=True, null=True)),
                ('infra_clean_water', models.IntegerField(blank=True, null=True)),
                ('infra_functional_toilet', models.IntegerField(blank=True, null=True)),
                ('infra_baby_weighing_scale', models.IntegerField(blank=True, null=True)),
                ('infra_flat_weighing_scale', models.IntegerField(blank=True, null=True)),
                ('infra_adult_weighing_scale', models.IntegerField(blank=True, null=True)),
                ('infra_cooking_utensils', models.IntegerField(blank=True, null=True)),
                ('infra_medicine_kits', models.IntegerField(blank=True, null=True)),
                ('infra_adequate_space_pse', models.IntegerField(blank=True, null=True)),
                ('usage_num_hh_reg', models.IntegerField(blank=True, null=True)),
                ('usage_num_add_person', models.IntegerField(blank=True, null=True)),
                ('usage_num_add_pregnancy', models.IntegerField(blank=True, null=True)),
                ('training_phase', models.IntegerField(blank=True, null=True)),
                ('trained_phase_1', models.IntegerField(blank=True, null=True)),
                ('trained_phase_2', models.IntegerField(blank=True, null=True)),
                ('trained_phase_3', models.IntegerField(blank=True, null=True)),
                ('trained_phase_4', models.IntegerField(blank=True, null=True)),
            ],
            options={
                'db_table': 'agg_awc_monthly',
                'managed': False,
            },
        ),
        migrations.CreateModel(
            name='AggCcsRecordMonthly',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('awc_id', models.TextField(blank=True, null=True)),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField(blank=True, null=True)),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField(blank=True, null=True)),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField(blank=True, null=True)),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField(blank=True, null=True)),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
                ('month', models.DateField(blank=True, null=True)),
                ('ccs_status', models.TextField(blank=True, null=True)),
                ('trimester', models.TextField(blank=True, null=True)),
                ('caste', models.TextField(blank=True, null=True)),
                ('disabled', models.TextField(blank=True, null=True)),
                ('minority', models.TextField(blank=True, null=True)),
                ('resident', models.TextField(blank=True, null=True)),
                ('valid_in_month', models.IntegerField(blank=True, null=True)),
                ('lactating', models.IntegerField(blank=True, null=True)),
                ('pregnant', models.IntegerField(blank=True, null=True)),
                ('thr_eligible', models.IntegerField(blank=True, null=True)),
                ('rations_21_plus_distributed', models.IntegerField(blank=True, null=True)),
                ('tetanus_complete', models.IntegerField(blank=True, null=True)),
                ('delivered_in_month', models.IntegerField(blank=True, null=True)),
                ('anc1_received_at_delivery', models.IntegerField(blank=True, null=True)),
                ('anc2_received_at_delivery', models.IntegerField(blank=True, null=True)),
                ('anc3_received_at_delivery', models.IntegerField(blank=True, null=True)),
                ('anc4_received_at_delivery', models.IntegerField(blank=True, null=True)),
                ('registration_trimester_at_delivery', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('using_ifa', models.IntegerField(blank=True, null=True)),
                ('ifa_consumed_last_seven_days', models.IntegerField(blank=True, null=True)),
                ('anemic_normal', models.IntegerField(blank=True, null=True)),
                ('anemic_moderate', models.IntegerField(blank=True, null=True)),
                ('anemic_severe', models.IntegerField(blank=True, null=True)),
                ('anemic_unknown', models.IntegerField(blank=True, null=True)),
                ('extra_meal', models.IntegerField(blank=True, null=True)),
                ('resting_during_pregnancy', models.IntegerField(blank=True, null=True)),
                ('bp1_complete', models.IntegerField(blank=True, null=True)),
                ('bp2_complete', models.IntegerField(blank=True, null=True)),
                ('bp3_complete', models.IntegerField(blank=True, null=True)),
                ('pnc_complete', models.IntegerField(blank=True, null=True)),
                ('trimester_2', models.IntegerField(blank=True, null=True)),
                ('trimester_3', models.IntegerField(blank=True, null=True)),
                ('postnatal', models.IntegerField(blank=True, null=True)),
                ('counsel_bp_vid', models.IntegerField(blank=True, null=True)),
                ('counsel_preparation', models.IntegerField(blank=True, null=True)),
                ('counsel_immediate_bf', models.IntegerField(blank=True, null=True)),
                ('counsel_fp_vid', models.IntegerField(blank=True, null=True)),
                ('counsel_immediate_conception', models.IntegerField(blank=True, null=True)),
                ('counsel_accessible_postpartum_fp', models.IntegerField(blank=True, null=True)),
            ],
            options={
                'db_table': 'agg_ccs_record_monthly',
                'managed': False,
            },
        ),
        migrations.CreateModel(
            name='AggChildHealthMonthly',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('awc_id', models.TextField(blank=True, null=True)),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField(blank=True, null=True)),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField(blank=True, null=True)),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField(blank=True, null=True)),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField(blank=True, null=True)),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
                ('month', models.DateField(blank=True, null=True)),
                ('month_display', models.TextField(blank=True, null=True)),
                ('gender', models.TextField(blank=True, null=True)),
                ('age_tranche', models.TextField(blank=True, null=True)),
                ('caste', models.TextField(blank=True, null=True)),
                ('disabled', models.TextField(blank=True, null=True)),
                ('minority', models.TextField(blank=True, null=True)),
                ('resident', models.TextField(blank=True, null=True)),
                ('valid_in_month', models.IntegerField(blank=True, null=True)),
                ('nutrition_status_weighed', models.IntegerField(blank=True, null=True)),
                ('nutrition_status_unweighed', models.IntegerField(blank=True, null=True)),
                ('nutrition_status_normal', models.IntegerField(blank=True, null=True)),
                ('nutrition_status_moderately_underweight', models.IntegerField(blank=True, null=True)),
                ('nutrition_status_severely_underweight', models.IntegerField(blank=True, null=True)),
                ('wer_eligible', models.IntegerField(blank=True, null=True)),
                ('thr_eligible', models.IntegerField(blank=True, null=True)),
                ('rations_21_plus_distributed', models.IntegerField(blank=True, null=True)),
                ('pse_eligible', models.IntegerField(blank=True, null=True)),
                ('pse_attended_16_days', models.IntegerField(blank=True, null=True)),
                ('born_in_month', models.IntegerField(blank=True, null=True)),
                ('low_birth_weight_in_month', models.IntegerField(blank=True, null=True)),
                ('bf_at_birth', models.IntegerField(blank=True, null=True)),
                ('ebf_eligible', models.IntegerField(blank=True, null=True)),
                ('ebf_in_month', models.IntegerField(blank=True, null=True)),
                ('cf_eligible', models.IntegerField(blank=True, null=True)),
                ('cf_in_month', models.IntegerField(blank=True, null=True)),
                ('cf_diet_diversity', models.IntegerField(blank=True, null=True)),
                ('cf_diet_quantity', models.IntegerField(blank=True, null=True)),
                ('cf_demo', models.IntegerField(blank=True, null=True)),
                ('cf_handwashing', models.IntegerField(blank=True, null=True)),
                ('counsel_increase_food_bf', models.IntegerField(blank=True, null=True)),
                ('counsel_manage_breast_problems', models.IntegerField(blank=True, null=True)),
                ('counsel_ebf', models.IntegerField(blank=True, null=True)),
                ('counsel_adequate_bf', models.IntegerField(blank=True, null=True)),
                ('counsel_pediatric_ifa', models.IntegerField(blank=True, null=True)),
                ('counsel_play_cf_video', models.IntegerField(blank=True, null=True)),
                ('fully_immunized_eligible', models.IntegerField(blank=True, null=True)),
                ('fully_immunized_on_time', models.IntegerField(blank=True, null=True)),
                ('fully_immunized_late', models.IntegerField(blank=True, null=True)),
            ],
            options={
                'db_table': 'agg_child_health_monthly',
                'managed': False,
            },
        ),
        migrations.CreateModel(
            name='AggDailyUsageView',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('awc_id', models.TextField(blank=True, null=True)),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField(blank=True, null=True)),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField(blank=True, null=True)),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField(blank=True, null=True)),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField(blank=True, null=True)),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
                ('date', models.DateField(blank=True, null=True)),
                ('daily_attendance_open', models.IntegerField(blank=True, null=True)),
                ('usage_num_forms', models.IntegerField(blank=True, null=True)),
                ('usage_num_home_visit', models.IntegerField(blank=True, null=True)),
                ('usage_num_gmp', models.IntegerField(blank=True, null=True)),
                ('usage_num_thr', models.IntegerField(blank=True, null=True)),
                ('awc_count', models.IntegerField(blank=True, null=True)),
            ],
            options={
                'db_table': 'agg_daily_usage_view',
                'managed': False,
            },
        ),
        migrations.CreateModel(
            name='AggThrMonthly',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('awc_id', models.TextField(blank=True, null=True)),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField(blank=True, null=True)),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField(blank=True, null=True)),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField(blank=True, null=True)),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField(blank=True, null=True)),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
                ('month', models.DateField(blank=True, null=True)),
                ('beneficiary_type', models.TextField(blank=True, null=True)),
                ('caste', models.TextField(blank=True, null=True)),
                ('disabled', models.TextField(blank=True, null=True)),
                ('minority', models.TextField(blank=True, null=True)),
                ('resident', models.TextField(blank=True, null=True)),
                ('thr_eligible', models.IntegerField(blank=True, null=True)),
                ('rations_21_plus_distributed', models.IntegerField(blank=True, null=True)),
            ],
            options={
                'db_table': 'agg_thr_monthly',
                'managed': False,
            },
        ),
        migrations.CreateModel(
            name='AwcLocation',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('doc_id', models.TextField()),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField()),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField()),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField()),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField()),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
            ],
            options={
                'db_table': 'awc_location',
                'managed': False,
            },
        ),
        migrations.CreateModel(
            name='AwcLocationMonths',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('awc_id', models.TextField(blank=True, null=True)),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField(blank=True, null=True)),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField(blank=True, null=True)),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField(blank=True, null=True)),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField(blank=True, null=True)),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
                ('month', models.DateField(blank=True, null=True)),
                ('month_display', models.TextField(blank=True, null=True)),
            ],
            options={
                'db_table': 'awc_location_months',
                'managed': False,
            },
        ),
        migrations.CreateModel(
            name='DailyAttendanceView',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('awc_id', models.TextField(blank=True, null=True)),
                ('awc_name', models.TextField(blank=True, null=True)),
                ('awc_site_code', models.TextField(blank=True, null=True)),
                ('supervisor_id', models.TextField(blank=True, null=True)),
                ('supervisor_name', models.TextField(blank=True, null=True)),
                ('supervisor_site_code', models.TextField(blank=True, null=True)),
                ('block_id', models.TextField(blank=True, null=True)),
                ('block_name', models.TextField(blank=True, null=True)),
                ('block_site_code', models.TextField(blank=True, null=True)),
                ('district_id', models.TextField(blank=True, null=True)),
                ('district_name', models.TextField(blank=True, null=True)),
                ('district_site_code', models.TextField(blank=True, null=True)),
                ('state_id', models.TextField(blank=True, null=True)),
                ('state_name', models.TextField(blank=True, null=True)),
                ('state_site_code', models.TextField(blank=True, null=True)),
                ('aggregation_level', models.IntegerField(blank=True, null=True)),
                ('month', models.DateField(blank=True, null=True)),
                ('doc_id', models.TextField(blank=True, null=True)),
                ('pse_date', models.DateField(blank=True, null=True)),
                ('awc_open_count', models.IntegerField(blank=True, null=True)),
                ('count', models.IntegerField(blank=True, null=True)),
                ('eligible_children', models.IntegerField(blank=True, null=True)),
                ('attended_children', models.IntegerField(blank=True, null=True)),
                ('attended_children_percent', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('form_location', models.TextField(blank=True, null=True)),
                ('form_location_lat', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('form_location_long', models.DecimalField(blank=True, decimal_places=65535, max_digits=65535, null=True)),
                ('image_name', models.TextField(blank=True, null=True)),
            ],
            options={
                'db_table': 'daily_attendance_view',
                'managed': False,
            },
        ),
    ]
| 70.264368 | 139 | 0.604744 | 3,277 | 30,565 | 5.433628 | 0.085444 | 0.167303 | 0.229249 | 0.299787 | 0.925755 | 0.91705 | 0.892284 | 0.766146 | 0.665001 | 0.580254 | 0 | 0.009757 | 0.248912 | 30,565 | 434 | 140 | 70.426267 | 0.765867 | 0.002225 | 0 | 0.504695 | 1 | 0 | 0.182331 | 0.056109 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.007042 | 0 | 0.016432 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1c460e435bc0e519d5da56e295c2516fae50f58a | 2,381 | py | Python | pgmpy/exceptions/Exceptions.py | NunoEdgarGFlowHub/pgmpy | ac0ecc8f5bdd14999c386c6b00a3ce77407b83ce | [
"MIT"
] | 1 | 2016-08-27T18:30:57.000Z | 2016-08-27T18:30:57.000Z | pgmpy/exceptions/Exceptions.py | NunoEdgarGFlowHub/pgmpy | ac0ecc8f5bdd14999c386c6b00a3ce77407b83ce | [
"MIT"
] | null | null | null | pgmpy/exceptions/Exceptions.py | NunoEdgarGFlowHub/pgmpy | ac0ecc8f5bdd14999c386c6b00a3ce77407b83ce | [
"MIT"
] | 1 | 2016-08-27T18:31:00.000Z | 2016-08-27T18:31:00.000Z | #!/usr/bin/env python3
"""Contains all the user-defined exceptions created for PgmPy"""


class MissingParentsError(Exception):
    def __init__(self, *missing):
        self.missing = missing

    def __str__(self):
        return repr("Parents are missing: " + str(self.missing))


class ExtraParentsError(Exception):
    def __init__(self, *extra):
        self.extra = extra

    def __str__(self):
        return repr("Following are not parents: " + str(self.extra))


class MissingStatesError(Exception):
    def __init__(self, *missing):
        self.missing = missing

    def __str__(self):
        return repr("States are missing: " + str(self.missing))


class ExtraStatesError(Exception):
    def __init__(self, *extra):
        self.extra = extra

    def __str__(self):
        return repr("Following are not states: " + str(self.extra))


class SelfLoopError(Exception):
    def __init__(self, *extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class CycleError(Exception):
    def __init__(self, *extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class StateError(Exception):
    def __init__(self, *extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class NodeNotFoundError(Exception):
    def __init__(self, *extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class ScopeError(Exception):
    def __init__(self, extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class SizeError(Exception):
    def __init__(self, extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class CardinalityError(Exception):
    def __init__(self, extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class RequiredError(Exception):
    def __init__(self, extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class ModelError(Exception):
    def __init__(self, extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))


class InvalidValueError(Exception):
    def __init__(self, extra):
        self.extra = extra

    def __str__(self):
        return repr(str(self.extra))
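The exception classes above all follow one pattern: capture the offending items in the constructor and render them in `__str__`. A minimal usage sketch (standalone — it redefines one of the classes so it runs without pgmpy installed; `check_parents` is a hypothetical helper, not part of pgmpy):

```python
class MissingParentsError(Exception):
    """Mirror of the pgmpy exception above, redefined so this snippet is standalone."""
    def __init__(self, *missing):
        self.missing = missing

    def __str__(self):
        return repr("Parents are missing: " + str(self.missing))


def check_parents(declared, expected):
    """Hypothetical validator: raise if any expected parent was not declared."""
    missing = [p for p in expected if p not in declared]
    if missing:
        raise MissingParentsError(*missing)


try:
    check_parents(declared=["rain"], expected=["rain", "sprinkler"])
except MissingParentsError as err:
    message = str(err)

print(message)
```

Because `__init__` takes `*missing`, callers can pass any number of offending names and the message lists them all at once.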
| 20.704348 | 68 | 0.642167 | 279 | 2,381 | 5.078853 | 0.150538 | 0.228652 | 0.15808 | 0.197601 | 0.769936 | 0.769936 | 0.729005 | 0.729005 | 0.729005 | 0.729005 | 0 | 0.000555 | 0.242755 | 2,381 | 114 | 69 | 20.885965 | 0.785358 | 0.033599 | 0 | 0.742857 | 0 | 0 | 0.040959 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.2 | 0.8 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 11 |
1c8353e57f6cbc0d69f2c2272776460a7e6e9aac | 11,251 | py | Python | oops_fhir/r4/code_system/example_service_place_codes.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | oops_fhir/r4/code_system/example_service_place_codes.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | oops_fhir/r4/code_system/example_service_place_codes.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | from pathlib import Path
from fhir.resources.codesystem import CodeSystem
from oops_fhir.utils import CodeSystemConcept
__all__ = ["ExampleServicePlaceCodes"]
_resource = CodeSystem.parse_file(Path(__file__).with_suffix(".json"))
class ExampleServicePlaceCodes:
    """
    Example Service Place Codes

    This value set includes a smattering of Service Place codes.

    Status: draft - Version: 4.0.1

    Copyright This is an example set.

    http://terminology.hl7.org/CodeSystem/ex-serviceplace
    """

    zero1 = CodeSystemConcept(
        {
            "code": "01",
            "definition": "A facility or location where drugs and other medically related items and services are sold, dispensed, or otherwise provided directly to patients.",
            "display": "Pharmacy",
        }
    )
    """
    Pharmacy

    A facility or location where drugs and other medically related items and services are sold, dispensed, or otherwise provided directly to patients.
    """
    zero3 = CodeSystemConcept(
        {
            "code": "03",
            "definition": "A facility whose primary purpose is education.",
            "display": "School",
        }
    )
    """
    School

    A facility whose primary purpose is education.
    """

    zero4 = CodeSystemConcept(
        {
            "code": "04",
            "definition": "A facility or location whose primary purpose is to provide temporary housing to homeless individuals (e.g., emergency shelters, individual or family shelters).",
            "display": "Homeless Shelter",
        }
    )
    """
    Homeless Shelter

    A facility or location whose primary purpose is to provide temporary housing to homeless individuals (e.g., emergency shelters, individual or family shelters).
    """

    zero5 = CodeSystemConcept(
        {
            "code": "05",
            "definition": "A facility or location, owned and operated by the Indian Health Service, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services to American Indians and Alaska Natives who do not require hospitalization.",
            "display": "Indian Health Service Free-standing Facility",
        }
    )
    """
    Indian Health Service Free-standing Facility

    A facility or location, owned and operated by the Indian Health Service, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services to American Indians and Alaska Natives who do not require hospitalization.
    """
    zero6 = CodeSystemConcept(
        {
            "code": "06",
            "definition": "A facility or location, owned and operated by the Indian Health Service, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services rendered by, or under the supervision of, physicians to American Indians and Alaska Natives admitted as inpatients or outpatients.",
            "display": "Indian Health Service Provider-based Facility",
        }
    )
    """
    Indian Health Service Provider-based Facility

    A facility or location, owned and operated by the Indian Health Service, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services rendered by, or under the supervision of, physicians to American Indians and Alaska Natives admitted as inpatients or outpatients.
    """

    zero7 = CodeSystemConcept(
        {
            "code": "07",
            "definition": "A facility or location owned and operated by a federally recognized American Indian or Alaska Native tribe or tribal organization under a 638 agreement, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services to tribal members who do not require hospitalization.",
            "display": "Tribal 638 Free-Standing Facility",
        }
    )
    """
    Tribal 638 Free-Standing Facility

    A facility or location owned and operated by a federally recognized American Indian or Alaska Native tribe or tribal organization under a 638 agreement, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services to tribal members who do not require hospitalization.
    """

    zero8 = CodeSystemConcept(
        {
            "code": "08",
            "definition": "A facility or location owned and operated by a federally recognized American Indian or Alaska Native tribe or tribal organization under a 638 agreement, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services to tribal members admitted as inpatients or outpatients.",
            "display": "Tribal 638 Provider-Based Facility",
        }
    )
    """
    Tribal 638 Provider-Based Facility

    A facility or location owned and operated by a federally recognized American Indian or Alaska Native tribe or tribal organization under a 638 agreement, which provides diagnostic, therapeutic (surgical and nonsurgical), and rehabilitation services to tribal members admitted as inpatients or outpatients.
    """

    zero9 = CodeSystemConcept(
        {
            "code": "09",
            "definition": "A prison, jail, reformatory, work farm, detention center, or any other similar facility maintained by either Federal, State or local authorities for the purpose of confinement or rehabilitation of adult or juvenile criminal offenders.",
            "display": "Prison/Correctional Facility",
        }
    )
    """
    Prison/Correctional Facility

    A prison, jail, reformatory, work farm, detention center, or any other similar facility maintained by either Federal, State or local authorities for the purpose of confinement or rehabilitation of adult or juvenile criminal offenders.
    """
    one1 = CodeSystemConcept(
        {
            "code": "11",
            "definition": "Location, other than a hospital, skilled nursing facility (SNF), military treatment facility, community health center, State or local public health clinic, or intermediate care facility (ICF), where the health professional routinely provides health examinations, diagnosis, and treatment of illness or injury on an ambulatory basis.",
            "display": "Office",
        }
    )
    """
    Office

    Location, other than a hospital, skilled nursing facility (SNF), military treatment facility, community health center, State or local public health clinic, or intermediate care facility (ICF), where the health professional routinely provides health examinations, diagnosis, and treatment of illness or injury on an ambulatory basis.
    """

    one2 = CodeSystemConcept(
        {
            "code": "12",
            "definition": "Location, other than a hospital or other facility, where the patient receives care in a private residence.",
            "display": "Home",
        }
    )
    """
    Home

    Location, other than a hospital or other facility, where the patient receives care in a private residence.
    """

    one3 = CodeSystemConcept(
        {
            "code": "13",
            "definition": "Congregate residential facility with self-contained living units providing assessment of each resident's needs and on-site support 24 hours a day, 7 days a week, with the capacity to deliver or arrange for services including some health care and other services.",
            "display": "Assisted Living Fa",
        }
    )
    """
    Assisted Living Fa

    Congregate residential facility with self-contained living units providing assessment of each resident's needs and on-site support 24 hours a day, 7 days a week, with the capacity to deliver or arrange for services including some health care and other services.
    """

    one4 = CodeSystemConcept(
        {
            "code": "14",
            "definition": "A residence, with shared living areas, where clients receive supervision and other services such as social and/or behavioral services, custodial service, and minimal services (e.g., medication administration).",
            "display": "Group Home",
        }
    )
    """
    Group Home

    A residence, with shared living areas, where clients receive supervision and other services such as social and/or behavioral services, custodial service, and minimal services (e.g., medication administration).
    """

    one5 = CodeSystemConcept(
        {
            "code": "15",
            "definition": "A facility/unit that moves from place-to-place equipped to provide preventive, screening, diagnostic, and/or treatment services.",
            "display": "Mobile Unit",
        }
    )
    """
    Mobile Unit

    A facility/unit that moves from place-to-place equipped to provide preventive, screening, diagnostic, and/or treatment services.
    """
    one9 = CodeSystemConcept(
        {
            "code": "19",
            "definition": "portion of an off-campus hospital provider-based department which provides diagnostic, therapeutic (both surgical and nonsurgical), and rehabilitation services to sick or injured persons who do not require hospitalization or institutionalization.",
            "display": "Off Campus-Outpatient Hospital",
        }
    )
    """
    Off Campus-Outpatient Hospital

    portion of an off-campus hospital provider-based department which provides diagnostic, therapeutic (both surgical and nonsurgical), and rehabilitation services to sick or injured persons who do not require hospitalization or institutionalization.
    """

    two0 = CodeSystemConcept(
        {
            "code": "20",
            "definition": "Location, distinct from a hospital emergency room, an office, or a clinic, whose purpose is to diagnose and treat illness or injury for unscheduled, ambulatory patients seeking immediate medical attention.",
            "display": "Urgent Care Facility",
        }
    )
    """
    Urgent Care Facility

    Location, distinct from a hospital emergency room, an office, or a clinic, whose purpose is to diagnose and treat illness or injury for unscheduled, ambulatory patients seeking immediate medical attention.
    """

    two1 = CodeSystemConcept(
        {
            "code": "21",
            "definition": "A facility, other than psychiatric, which primarily provides diagnostic, therapeutic (both surgical and nonsurgical), and rehabilitation services by, or under, the supervision of physicians to patients admitted for a variety of medical conditions.",
            "display": "Inpatient Hospital",
        }
    )
    """
    Inpatient Hospital

    A facility, other than psychiatric, which primarily provides diagnostic, therapeutic (both surgical and nonsurgical), and rehabilitation services by, or under, the supervision of physicians to patients admitted for a variety of medical conditions.
    """

    four1 = CodeSystemConcept(
        {
            "code": "41",
            "definition": "A land vehicle specifically designed, equipped and staffed for lifesaving and transporting the sick or injured.",
            "display": "Ambulance—Land",
        }
    )
    """
    Ambulance—Land

    A land vehicle specifically designed, equipped and staffed for lifesaving and transporting the sick or injured.
    """

    class Meta:
        resource = _resource
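Each class attribute above wraps one FHIR concept dict. A standalone sketch of the access pattern (with a minimal stand-in for `CodeSystemConcept`, since the real class lives in `oops_fhir.utils` and its exact API is assumed here, not confirmed by this chunk):

```python
class CodeSystemConcept:
    """Minimal stand-in: stores the FHIR concept dict and exposes its fields."""
    def __init__(self, data):
        self.data = data

    @property
    def code(self):
        return self.data["code"]

    @property
    def display(self):
        return self.data["display"]


class ExampleServicePlaceCodes:
    # One concept from the generated class above, abridged.
    zero1 = CodeSystemConcept(
        {
            "code": "01",
            "definition": "A facility or location where drugs are dispensed (abridged).",
            "display": "Pharmacy",
        }
    )


pharmacy = ExampleServicePlaceCodes.zero1
print(pharmacy.code, pharmacy.display)
```

The generated attribute names (`zero1`, `one9`, `two0`, ...) spell out the leading digit of each code so they stay valid Python identifiers.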
| 45.184739 | 361 | 0.690161 | 1,284 | 11,251 | 6.038941 | 0.228972 | 0.020892 | 0.017023 | 0.029404 | 0.81919 | 0.80668 | 0.785401 | 0.775342 | 0.775342 | 0.775342 | 0 | 0.009953 | 0.240956 | 11,251 | 248 | 362 | 45.366935 | 0.897775 | 0.018754 | 0 | 0 | 0 | 0.110236 | 0.64641 | 0.006702 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.023622 | 0 | 0.173228 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1c84a77d3c6643ddfd4e91ce7298fdbc0fe4c956 | 17,206 | py | Python | test/test_cases.py | IntelEuclid/euclid_configuration_node | e46af8d31512805bd22136ab12460334cc3189bd | [
"BSD-3-Clause"
] | 1 | 2019-04-18T06:03:19.000Z | 2019-04-18T06:03:19.000Z | test/test_cases.py | IntelEuclid/euclid_configuration_node | e46af8d31512805bd22136ab12460334cc3189bd | [
"BSD-3-Clause"
] | null | null | null | test/test_cases.py | IntelEuclid/euclid_configuration_node | e46af8d31512805bd22136ab12460334cc3189bd | [
"BSD-3-Clause"
] | 2 | 2018-01-31T10:03:08.000Z | 2020-04-22T05:12:30.000Z |
#!/usr/bin/env python
import unittest
import rospy
import subprocess
import time
import os.path
import os
from configuration_node.srv import *
class Utils():
    @staticmethod
    def killSystem():
        print "killing system!"
        subprocess.call(['/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/killSystem.bash'])


class OOBENodesNotEmpty(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_1.xml'])
        oobe_service = rospy.ServiceProxy('/cs/get_oobe_nodes',GetOOBENodes)
        rospy.wait_for_service('/cs/get_oobe_nodes')
        res = oobe_service()
        self.assertTrue(len(res.nodes) == 1)
        self.assertEqual( res.nodes[0], "Test 1 node")
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(2)
class OOBENodesEmpty(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_2.xml'])
        oobe_service = rospy.ServiceProxy('/cs/get_oobe_nodes',GetOOBENodes)
        rospy.wait_for_service('/cs/get_oobe_nodes')
        res = oobe_service()
        self.assertTrue(len(res.nodes) == 0)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(2)


class SystemNodesAll(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_1.xml'])
        oobe_service = rospy.ServiceProxy('/cs/get_system_nodes',GetSystemNodes)
        rospy.wait_for_service('/cs/get_system_nodes')
        res = oobe_service('all')
        self.assertTrue(len(res.nodes) == 1)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(2)


class SystemNodesActive(unittest.TestCase):
    def runTest(self):
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        Utils.killSystem()
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_1.xml'])
        oobe_service = rospy.ServiceProxy('/cs/get_system_nodes',GetSystemNodes)
        rospy.wait_for_service('/cs/get_system_nodes')
        res = oobe_service('active')
        self.assertTrue(len(res.nodes) == 0)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(2)
class SystemNodesWrong(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_1.xml'])
        oobe_service = rospy.ServiceProxy('/cs/get_system_nodes',GetSystemNodes)
        rospy.wait_for_service('/cs/get_system_nodes')
        res = oobe_service('koko')
        self.assertTrue(len(res.nodes) == 0)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(2)


class RunScenarioValid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_1.xml'])
        oobe_service = rospy.ServiceProxy('/cs/run_scenario',RunScenario)
        rospy.wait_for_service('/cs/run_scenario')
        res = oobe_service('Camera')
        self.assertTrue(res.res)
        subprocess.call(['rosnode','kill','/CsScenarioManager'])
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(8)
        Utils.killSystem()


class RunScenarioNotValid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_1.xml'])
        time.sleep(10)
        oobe_service = rospy.ServiceProxy('/cs/run_scenario',RunScenario)
        rospy.wait_for_service('/cs/run_scenario')
        res = oobe_service('koko')
        self.assertFalse(res.res)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(2)
class GetActiveScenarioValid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_3.xml'])
        time.sleep(3)
        oobe_service = rospy.ServiceProxy('/cs/get_active_scenario',GetActiveScenario)
        rospy.wait_for_service('/cs/get_active_scenario')
        res = oobe_service()
        self.assertEqual(res.name,'Camera')
        time.sleep(2)


class GetActiveScenarioUnvalid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_1.xml'])
        time.sleep(5)
        oobe_service = rospy.ServiceProxy('/cs/get_active_scenario',GetActiveScenario)
        rospy.wait_for_service('/cs/get_active_scenario')
        res = oobe_service()
        self.assertEqual(res.name,'')
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(2)


class SetActiveScenarioValid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        subprocess.call(['cp','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        time.sleep(2)
        oobe_service = rospy.ServiceProxy('/cs/set_active_scenario',SetActiveScenario)
        rospy.wait_for_service('/cs/set_active_scenario')
        res = oobe_service('Camera')
        self.assertTrue(res.res)
        print "Doing the grep.."
        p = subprocess.Popen(['grep', '-A 3','<active_scenario>','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'],stdout=subprocess.PIPE,stderr=subprocess.PIPE)
        output, err = p.communicate()
        print "Output: " + output
        expStr=" <active_scenario>\n <name>Camera</name>\n </active_scenario>\n <system_nodes>\n"
        self.assertEqual(output,expStr)
class SetActiveScenarioUnvalid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        subprocess.call(['cp','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        time.sleep(2)
        oobe_service = rospy.ServiceProxy('/cs/set_active_scenario',SetActiveScenario)
        rospy.wait_for_service('/cs/set_active_scenario')
        res = oobe_service('koko')
        self.assertFalse(res.res)
        p = subprocess.Popen(['grep', '-A 3','<active_scenario>','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'],stdout=subprocess.PIPE,stderr=subprocess.PIPE)
        output, err = p.communicate()
        expStr=" <active_scenario>\n <name></name>\n </active_scenario>\n <system_nodes>\n"
        self.assertEqual(output,expStr)


class GetScenariosTest(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml'])
        time.sleep(2)
        oobe_service = rospy.ServiceProxy('/cs/get_scenarios',GetScenarios)
        rospy.wait_for_service('/cs/get_scenarios')
        res = oobe_service()
        self.assertEqual(len(res.scenarios),1)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(1)


class GetRobotsTest(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml'])
        time.sleep(5)
        oobe_service = rospy.ServiceProxy('/cs/get_robots',GetRobots)
        rospy.wait_for_service('/cs/get_robots')
        res = oobe_service()
        self.assertEqual(len(res.robots),1)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(1)
class RemoveScenarioValid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        subprocess.call(['cp','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        time.sleep(5)
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        time.sleep(2)
        oobe_service = rospy.ServiceProxy('/cs/remove_scenarios',RemoveScenarios)
        rospy.wait_for_service('/cs/remove_scenarios')
        res = oobe_service(['Camera'])
        self.assertTrue(res.res)
        p = subprocess.Popen(['grep','-A 1','<scenarios>','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'],stdout=subprocess.PIPE,stderr=subprocess.PIPE)
        output, err = p.communicate()
        print "output: " + output
        expStr=" <scenarios>\n </scenarios>\n"
        self.assertEqual(output,expStr)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(1)


class RemoveScenarioUnvalid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        subprocess.call(['cp','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        time.sleep(5)
        oobe_service = rospy.ServiceProxy('/cs/remove_scenarios',RemoveScenarios)
        rospy.wait_for_service('/cs/remove_scenarios')
        res = oobe_service(['koko'])
        self.assertFalse(res.res)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(1)


class CreateScenarioIllegal(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        # subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml'])
        time.sleep(2)
        oobe_service = rospy.ServiceProxy('/cs/create_scenario',CreateScenario)
        rospy.wait_for_service('/cs/create_scenario')
        res = oobe_service('koko space',['Test 1 node'])
        self.assertFalse(res.res)
        res = oobe_service('koko$dollar',['Test 1 node'])
        self.assertFalse(res.res)
        res = oobe_service('Camera',['Test 1 node'])
        self.assertFalse(res.res)
        res = oobe_service('Valid',[])
        self.assertFalse(res.res)
        res = oobe_service('Valid',['WhoAmI?'])
        self.assertFalse(res.res)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(1)
class CreateScenarioValid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        # subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        subprocess.call(['cp','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml','/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_t.xml'])
        time.sleep(2)
        oobe_service = rospy.ServiceProxy('/cs/create_scenario',CreateScenario)
        rospy.wait_for_service('/cs/create_scenario')
        res = oobe_service('TestScenario',['Test 1 node'])
        self.assertTrue(res.res)
        #rospy.wait_for_service('/cs/save_configuration')
        time.sleep(3)
        subprocess.call(['rosnode','kill','/CsScenarioManager'])
        time.sleep(1)
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(1)


class GenerateArduinoTest(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        # subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_4.xml'])
        time.sleep(2)
        oobe_service = rospy.ServiceProxy('/cs/generate_arduino_library',GenerateArduinoLibrary)
        rospy.wait_for_service('/cs/generate_arduino_library')
        subprocess.call(['rm','-rf','/intel/euclid/public_html/files/euclid_lib.zip'])
        res = oobe_service()
        self.assertTrue(res.res)
        self.assertTrue(os.path.isfile('/intel/euclid/public_html/files/euclid_lib.zip'))
        subprocess.call(['kill','-9',str(proc.pid)])
        time.sleep(1)


class BadConfigFile(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        # subprocess.call(['rosnode','kill','/CsConfigurationNode'])
        proc = subprocess.Popen(['rosrun','configuration_node','CsConfigModule.py' ,'/intel/euclid/euclid_ws/src/system_nodes/configuration_node/test/test_config_bad.xml'])
        time.sleep(2)
        rospy.wait_for_service('/cs/generate_arduino_library')
        time.sleep(2)
        subprocess.call(['rosnode','kill','/CsDeviceMonitor'])
        subprocess.call(['rosnode','kill','/CsNetworkManager'])
        self.assertTrue(True)


class ScenarioManagerError(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #ret = subprocess.call(['ps auwwxf | grep CsScenarioManager | grep -v grep'],shell=True)
        proc = subprocess.Popen(['rosrun','configuration_node','CsScenarioManager.py' ,'scenario:=koko'])
        time.sleep(5)
        ret = subprocess.call(['ps auwwxf | grep CsScenarioManager | grep -v grep'],shell=True)
        self.assertEqual(ret,1)


class ScenarioManagerValid(unittest.TestCase):
    def runTest(self):
        Utils.killSystem()
        #ret = subprocess.call(['ps auwwxf | grep CsScenarioManager | grep -v grep'],shell=True)
        proc = subprocess.Popen(['rosrun','configuration_node','CsScenarioManager.py' ,'Camera'])
        time.sleep(5)
        ret = subprocess.call(['ps auwwxf | grep CsScenarioManager | grep -v grep'],shell=True)
        self.assertEqual(ret,0)
        Utils.killSystem()
class RunTests(unittest.TestSuite):
    def __init__(self):
        super(RunTests,self).__init__()
        self.addTest(OOBENodesNotEmpty())
        self.addTest(OOBENodesEmpty())
        self.addTest(SystemNodesAll())
        self.addTest(SystemNodesActive())
        self.addTest(SystemNodesWrong())
        self.addTest(RunScenarioValid())
        self.addTest(RunScenarioNotValid())
        self.addTest(GetActiveScenarioValid())
        self.addTest(GetActiveScenarioUnvalid())
        self.addTest(SetActiveScenarioValid())
        self.addTest(SetActiveScenarioUnvalid())
        self.addTest(GetScenariosTest())
        self.addTest(GetRobotsTest())
        self.addTest(RemoveScenarioValid())
        self.addTest(RemoveScenarioUnvalid())
        self.addTest(CreateScenarioIllegal())
        self.addTest(CreateScenarioValid())
        self.addTest(GenerateArduinoTest())
        self.addTest(BadConfigFile())
        self.addTest(ScenarioManagerError())
        self.addTest(ScenarioManagerValid())
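`RunTests` aggregates every case into one `unittest.TestSuite`; the entry point that executes it is not shown in this chunk, but with a suite defined, a runner would invoke it roughly like this (standalone sketch using a trivial stand-in case instead of the ROS-dependent ones above):

```python
import unittest


class AlwaysPasses(unittest.TestCase):
    # Stand-in case; in the real file the suite holds the ROS service tests.
    def runTest(self):
        self.assertTrue(True)


suite = unittest.TestSuite()
suite.addTest(AlwaysPasses())
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

`TextTestRunner.run` returns a `TestResult`, so a caller can use `wasSuccessful()` to derive the process exit code.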
| 46.882834 | 201 | 0.681681 | 1,968 | 17,206 | 5.785061 | 0.080285 | 0.082126 | 0.049275 | 0.055072 | 0.837154 | 0.811946 | 0.805182 | 0.794554 | 0.774089 | 0.763988 | 0 | 0.006078 | 0.168139 | 17,206 | 366 | 202 | 47.010929 | 0.789352 | 0.074974 | 0 | 0.65614 | 0 | 0 | 0.328403 | 0.194285 | 0 | 0 | 0 | 0 | 0.105263 | 0 | null | null | 0 | 0.024561 | null | null | 0.014035 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c72299ef7d1ebe073b9ddbf5d3f438c4f4816859 | 144 | py | Python | {{cookiecutter.project_slug}}/apps/cms/tests/conftest.py | ukwahlula/django-server-boilerplate | 6bd4b83511ea7e3370349957cf0b6dbff4003ab1 | [
"BSD-3-Clause"
] | 2 | 2020-10-30T09:47:07.000Z | 2020-10-30T09:48:11.000Z | {{cookiecutter.project_slug}}/apps/cms/tests/conftest.py | ukwahlula/django-server-boilerplate | 6bd4b83511ea7e3370349957cf0b6dbff4003ab1 | [
"BSD-3-Clause"
] | null | null | null | {{cookiecutter.project_slug}}/apps/cms/tests/conftest.py | ukwahlula/django-server-boilerplate | 6bd4b83511ea7e3370349957cf0b6dbff4003ab1 | [
"BSD-3-Clause"
] | null | null | null | from apps.cms.tests.fixtures import * # noqa
from apps.generic.tests.fixtures import * # noqa
from apps.users.tests.fixtures import * # noqa
| 36 | 49 | 0.75 | 21 | 144 | 5.142857 | 0.428571 | 0.222222 | 0.527778 | 0.638889 | 0.574074 | 0.574074 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 144 | 3 | 50 | 48 | 0.878049 | 0.097222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
c736acd262715edc5e98f8abd4039ae158e62079 | 17,009 | py | Python | third_party/infra_libs/ts_mon/test/config_test.py | hustwei/chromite | 10eb79abeb64e859362546214b7e039096ac9830 | [
"BSD-3-Clause"
] | null | null | null | third_party/infra_libs/ts_mon/test/config_test.py | hustwei/chromite | 10eb79abeb64e859362546214b7e039096ac9830 | [
"BSD-3-Clause"
] | null | null | null | third_party/infra_libs/ts_mon/test/config_test.py | hustwei/chromite | 10eb79abeb64e859362546214b7e039096ac9830 | [
"BSD-3-Clause"
] | null | null | null | # Copyright 2015 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import argparse
import json
import os
import requests
import tempfile
import unittest
import mock
from testing_support import auto_stub
from infra_libs.ts_mon import config
from infra_libs.ts_mon.common import interface
from infra_libs.ts_mon.common import standard_metrics
from infra_libs.ts_mon.common import monitors
from infra_libs.ts_mon.common import targets
from infra_libs.ts_mon.common.test import stubs
import infra_libs
DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'data')
class GlobalsTest(auto_stub.TestCase):
def setUp(self):
super(GlobalsTest, self).setUp()
self.mock(config, 'load_machine_config', lambda x: {})
def tearDown(self):
# It's important to call close() before un-setting the mock state object,
# because any FlushThread started by the test is stored in that mock state
# and needs to be stopped before running any other tests.
interface.close()
# This should probably live in interface.close()
interface.state = interface.State()
super(GlobalsTest, self).tearDown()
@mock.patch('requests.get', autospec=True)
@mock.patch('socket.getfqdn', autospec=True)
def test_pubsub_monitor_args(self, fake_fqdn, fake_get):
fake_fqdn.return_value = 'slave1-a1.reg.tld'
fake_get.return_value.side_effect = requests.exceptions.ConnectionError
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([
'--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-endpoint', 'pubsub://invalid-project/invalid-topic'])
config.process_argparse_options(args)
self.assertIsInstance(interface.state.global_monitor,
monitors.PubSubMonitor)
self.assertIsInstance(interface.state.target, targets.DeviceTarget)
self.assertEquals(interface.state.target.hostname, 'slave1-a1')
self.assertEquals(interface.state.target.region, 'reg')
self.assertEquals(args.ts_mon_flush, 'auto')
self.assertIsNotNone(interface.state.flush_thread)
self.assertTrue(standard_metrics.up.get())
@mock.patch('requests.get', autospec=True)
@mock.patch('socket.getfqdn', autospec=True)
@mock.patch('infra_libs.ts_mon.common.monitors.HttpsMonitor.'
'_load_credentials', autospec=True)
def test_https_monitor_args(self, _load_creds, fake_fqdn, fake_get):
    print([_load_creds, fake_fqdn, fake_get])
fake_fqdn.return_value = 'slave1-a1.reg.tld'
fake_get.return_value.side_effect = requests.exceptions.ConnectionError
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([
'--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-endpoint', 'https://test/random:insert'])
config.process_argparse_options(args)
self.assertIsInstance(interface.state.global_monitor,
monitors.HttpsMonitor)
self.assertIsInstance(interface.state.target, targets.DeviceTarget)
self.assertEquals(interface.state.target.hostname, 'slave1-a1')
self.assertEquals(interface.state.target.region, 'reg')
self.assertEquals(args.ts_mon_flush, 'auto')
self.assertIsNotNone(interface.state.flush_thread)
self.assertTrue(standard_metrics.up.get())
@mock.patch('requests.get', autospec=True)
@mock.patch('socket.getfqdn', autospec=True)
def test_default_target_uppercase_fqdn(self, fake_fqdn, fake_get):
fake_fqdn.return_value = 'SLAVE1-A1.REG.TLD'
fake_get.return_value.side_effect = requests.exceptions.ConnectionError
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([
'--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-endpoint', 'unsupported://www.googleapis.com/some/api'])
config.process_argparse_options(args)
self.assertIsInstance(interface.state.target, targets.DeviceTarget)
self.assertEquals(interface.state.target.hostname, 'slave1-a1')
self.assertEquals(interface.state.target.region, 'reg')
@mock.patch('requests.get', autospec=True)
@mock.patch('socket.getfqdn', autospec=True)
def test_default_target_fqdn_without_domain(self, fake_fqdn, fake_get):
fake_fqdn.return_value = 'SLAVE1-A1'
fake_get.return_value.side_effect = requests.exceptions.ConnectionError
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([
'--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-endpoint', 'unsupported://www.googleapis.com/some/api'])
config.process_argparse_options(args)
self.assertIsInstance(interface.state.target, targets.DeviceTarget)
self.assertEquals(interface.state.target.hostname, 'slave1-a1')
self.assertEquals(interface.state.target.region, '')
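The default-target tests above pin down how the device target is derived from the FQDN when the GCE metadata probe fails (as the mocked `requests.get` ensures here): the name is lowercased, the first dot-separated label becomes the hostname, and the second label, if any, becomes the region. A standalone sketch of that implied derivation; the function name is ours, not part of the infra_libs API:

```python
def default_target_fields(fqdn):
    # Behavior implied by the tests: lowercase the FQDN, take the first
    # label as the hostname and the second as the region ('' when absent).
    parts = fqdn.lower().split(".")
    hostname = parts[0]
    region = parts[1] if len(parts) > 1 else ""
    return hostname, region
```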
@mock.patch('requests.get', autospec=True)
@mock.patch('socket.getfqdn', autospec=True)
def test_fallback_monitor_args(self, fake_fqdn, fake_get):
fake_fqdn.return_value = 'foo'
fake_get.return_value.side_effect = requests.exceptions.ConnectionError
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([
'--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-endpoint', 'unsupported://www.googleapis.com/some/api'])
config.process_argparse_options(args)
self.assertIsInstance(interface.state.global_monitor,
monitors.NullMonitor)
@mock.patch('requests.get', autospec=True)
@mock.patch('socket.getfqdn', autospec=True)
def test_explicit_disable_args(self, fake_fqdn, fake_get):
fake_fqdn.return_value = 'foo'
fake_get.return_value.side_effect = requests.exceptions.ConnectionError
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([
'--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-endpoint', 'none'])
config.process_argparse_options(args)
self.assertIsInstance(interface.state.global_monitor,
monitors.NullMonitor)
@mock.patch('requests.get', autospec=True)
@mock.patch('socket.getfqdn', autospec=True)
def test_manual_flush(self, fake_fqdn, fake_get):
fake_fqdn.return_value = 'foo'
fake_get.return_value.side_effect = requests.exceptions.ConnectionError
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-flush', 'manual'])
config.process_argparse_options(args)
self.assertIsNone(interface.state.flush_thread)
@mock.patch('infra_libs.ts_mon.common.monitors.PubSubMonitor', autospec=True)
def test_pubsub_args(self, fake_monitor):
singleton = mock.Mock()
fake_monitor.return_value = singleton
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-endpoint', 'pubsub://mytopic/myproject'])
config.process_argparse_options(args)
fake_monitor.assert_called_once_with(
'/path/to/creds.p8.json', 'mytopic', 'myproject',
use_instrumented_http=True)
self.assertIs(interface.state.global_monitor, singleton)
@mock.patch('infra_libs.ts_mon.common.monitors.PubSubMonitor', autospec=True)
def test_pubsub_without_credentials(self, fake_monitor):
# safety net, not supposed to be called.
singleton = mock.Mock()
fake_monitor.return_value = singleton
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-config-file',
os.path.join(DATA_DIR, 'empty-config-file.json'),
'--ts-mon-endpoint', 'pubsub://mytopic/myproject'])
config.process_argparse_options(args)
self.assertIsInstance(interface.state.global_monitor, monitors.NullMonitor)
  @mock.patch('infra_libs.ts_mon.common.monitors.DebugMonitor', autospec=True)
def test_dryrun_args(self, fake_monitor):
singleton = mock.Mock()
fake_monitor.return_value = singleton
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-endpoint', 'file://foo.txt'])
config.process_argparse_options(args)
fake_monitor.assert_called_once_with('foo.txt')
self.assertIs(interface.state.global_monitor, singleton)
def test_device_args(self):
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-target-type', 'device',
'--ts-mon-device-region', 'reg',
'--ts-mon-device-role', 'role',
'--ts-mon-device-network', 'net',
'--ts-mon-device-hostname', 'host'])
config.process_argparse_options(args)
self.assertEqual(interface.state.target.region, 'reg')
self.assertEqual(interface.state.target.role, 'role')
self.assertEqual(interface.state.target.network, 'net')
self.assertEqual(interface.state.target.hostname, 'host')
def test_autogen_device_args(self):
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-target-type', 'device',
'--ts-mon-device-region', 'reg',
'--ts-mon-device-role', 'role',
'--ts-mon-device-network', 'net',
'--ts-mon-device-hostname', 'host',
'--ts-mon-autogen-hostname'])
config.process_argparse_options(args)
self.assertEqual(interface.state.target.region, 'reg')
self.assertEqual(interface.state.target.role, 'role')
self.assertEqual(interface.state.target.network, 'net')
self.assertEqual(interface.state.target.hostname, 'autogen:host')
def test_autogen_device_config(self):
self.mock(config, 'load_machine_config', lambda x: {
'autogen_hostname': True,
'credentials': '/path/to/creds.p8.json',
'endpoint': 'test://endpoint'})
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([
'--ts-mon-target-type', 'device',
'--ts-mon-device-region', 'reg',
'--ts-mon-device-role', 'role',
'--ts-mon-device-network', 'net',
'--ts-mon-device-hostname', 'host'])
config.process_argparse_options(args)
self.assertEqual(interface.state.target.region, 'reg')
self.assertEqual(interface.state.target.role, 'role')
self.assertEqual(interface.state.target.network, 'net')
self.assertEqual(interface.state.target.hostname, 'autogen:host')
def test_task_args(self):
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-target-type', 'task',
'--ts-mon-task-service-name', 'serv',
'--ts-mon-task-job-name', 'job',
'--ts-mon-task-region', 'reg',
'--ts-mon-task-hostname', 'host',
'--ts-mon-task-number', '1'])
config.process_argparse_options(args)
self.assertEqual(interface.state.target.service_name, 'serv')
self.assertEqual(interface.state.target.job_name, 'job')
self.assertEqual(interface.state.target.region, 'reg')
self.assertEqual(interface.state.target.hostname, 'host')
self.assertEqual(interface.state.target.task_num, 1)
def test_autogen_task_args(self):
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-target-type', 'task',
'--ts-mon-task-service-name', 'serv',
'--ts-mon-task-job-name', 'job',
'--ts-mon-task-region', 'reg',
'--ts-mon-task-hostname', 'host',
'--ts-mon-task-number', '1',
'--ts-mon-autogen-hostname'])
config.process_argparse_options(args)
self.assertEqual(interface.state.target.service_name, 'serv')
self.assertEqual(interface.state.target.job_name, 'job')
self.assertEqual(interface.state.target.region, 'reg')
self.assertEqual(interface.state.target.hostname, 'autogen:host')
self.assertEqual(interface.state.target.task_num, 1)
def test_autogen_task_config(self):
self.mock(config, 'load_machine_config', lambda x: {
'autogen_hostname': True,
'credentials': '/path/to/creds.p8.json',
'endpoint': 'test://endpoint'})
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-target-type', 'task',
'--ts-mon-task-service-name', 'serv',
'--ts-mon-task-job-name', 'job',
'--ts-mon-task-region', 'reg',
'--ts-mon-task-hostname', 'host',
'--ts-mon-task-number', '1'])
config.process_argparse_options(args)
self.assertEqual(interface.state.target.service_name, 'serv')
self.assertEqual(interface.state.target.job_name, 'job')
self.assertEqual(interface.state.target.region, 'reg')
self.assertEqual(interface.state.target.hostname, 'autogen:host')
self.assertEqual(interface.state.target.task_num, 1)
def test_task_args_missing_service_name(self):
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-target-type', 'task',
'--ts-mon-task-job-name', 'job',
'--ts-mon-task-region', 'reg',
'--ts-mon-task-hostname', 'host',
'--ts-mon-task-number', '1'])
with self.assertRaises(SystemExit):
config.process_argparse_options(args)
def test_task_args_missing_job_name(self):
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-credentials', '/path/to/creds.p8.json',
'--ts-mon-target-type', 'task',
'--ts-mon-task-service-name', 'serv',
'--ts-mon-task-region', 'reg',
'--ts-mon-task-hostname', 'host',
'--ts-mon-task-number', '1'])
with self.assertRaises(SystemExit):
config.process_argparse_options(args)
def test_metric_name_prefix(self):
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args(['--ts-mon-metric-name-prefix', '/test/random/'])
config.process_argparse_options(args)
self.assertEqual('/test/random/', interface.state.metric_name_prefix)
@mock.patch('infra_libs.ts_mon.common.monitors.NullMonitor', autospec=True)
def test_no_args(self, fake_monitor):
singleton = mock.Mock()
fake_monitor.return_value = singleton
p = argparse.ArgumentParser()
config.add_argparse_options(p)
args = p.parse_args([])
config.process_argparse_options(args)
self.assertEqual(1, len(fake_monitor.mock_calls))
self.assertEqual('/chrome/infra/', interface.state.metric_name_prefix)
self.assertIs(interface.state.global_monitor, singleton)
@mock.patch('requests.get', autospec=True)
def test_gce_region(self, mock_get):
r = mock_get.return_value
r.status_code = 200
r.text = 'projects/182615506979/zones/us-central1-f'
self.assertEquals('us-central1-f', config._default_region('foo.golo'))
@mock.patch('requests.get', autospec=True)
def test_gce_region_timeout(self, mock_get):
mock_get.side_effect = requests.exceptions.Timeout
self.assertEquals('golo', config._default_region('foo.golo'))
@mock.patch('requests.get', autospec=True)
def test_gce_region_404(self, mock_get):
r = mock_get.return_value
r.status_code = 404
self.assertEquals('golo', config._default_region('foo.golo'))
class ConfigTest(unittest.TestCase):
def test_load_machine_config(self):
with infra_libs.temporary_directory() as temp_dir:
filename = os.path.join(temp_dir, 'config')
with open(filename, 'w') as fh:
json.dump({'foo': 'bar'}, fh)
self.assertEquals({'foo': 'bar'}, config.load_machine_config(filename))
def test_load_machine_config_bad(self):
with infra_libs.temporary_directory() as temp_dir:
filename = os.path.join(temp_dir, 'config')
with open(filename, 'w') as fh:
fh.write('not a json file')
with self.assertRaises(ValueError):
config.load_machine_config(filename)
def test_load_machine_config_not_exists(self):
self.assertEquals({}, config.load_machine_config('does not exist'))
| 43.170051 | 79 | 0.672761 | 2,105 | 17,009 | 5.262708 | 0.113539 | 0.037462 | 0.07041 | 0.070681 | 0.849792 | 0.833363 | 0.824968 | 0.800957 | 0.782361 | 0.773696 | 0 | 0.004914 | 0.18649 | 17,009 | 393 | 80 | 43.279898 | 0.795693 | 0.025986 | 0 | 0.71988 | 0 | 0 | 0.199783 | 0.087329 | 0 | 0 | 0 | 0 | 0.201807 | 1 | 0.084337 | false | 0 | 0.045181 | 0 | 0.135542 | 0.003012 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1be5cc6ba1cde99570f25357f011bc1dc887d721 | 338 | py | Python | main.py | 4nth0nySLT/logica-python | 4d29e6f5f6b949ed9d817948cb8f420c330f2598 | [
"MIT"
] | null | null | null | main.py | 4nth0nySLT/logica-python | 4d29e6f5f6b949ed9d817948cb8f420c330f2598 | [
"MIT"
] | null | null | null | main.py | 4nth0nySLT/logica-python | 4d29e6f5f6b949ed9d817948cb8f420c330f2598 | [
"MIT"
] | null | null | null | import tabla_de_verdad
print("x=tabla_de_verdad.tabla(20)")
x=tabla_de_verdad.tabla(20)
print("x=tabla_de_verdad.tabla_version2(20)")
x=tabla_de_verdad.tabla_version2(20)
print("x=tabla_de_verdad.tabla_version3(20)")
x=tabla_de_verdad.tabla_version3(20)
print("x=tabla_de_verdad.tabla_version4(20)")
x=tabla_de_verdad.tabla_version4(20)
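main.py exercises a local `tabla_de_verdad` module ("truth table" in Spanish) that is not part of this dump, so the actual semantics of `tabla(20)` and its versioned variants are unknown. Purely as a hedged guess at its shape, a truth-table generator over n boolean variables might look like:

```python
from itertools import product


def tabla(n):
    # Hypothetical stand-in for tabla_de_verdad.tabla: enumerate every
    # True/False assignment for n boolean variables. The real module's
    # meaning for its integer argument may well differ.
    return list(product([False, True], repeat=n))
```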
| 30.727273 | 45 | 0.825444 | 62 | 338 | 4.112903 | 0.16129 | 0.247059 | 0.458824 | 0.439216 | 0.92549 | 0.92549 | 0.74902 | 0 | 0 | 0 | 0 | 0.067278 | 0.032544 | 338 | 10 | 46 | 33.8 | 0.712538 | 0 | 0 | 0 | 0 | 0 | 0.400593 | 0.400593 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0.444444 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
40595aeb38553b6424df2cf5e8a422fba7103289 | 1,526 | py | Python | tests/test_1957.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | [
"MIT"
] | null | null | null | tests/test_1957.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | [
"MIT"
] | null | null | null | tests/test_1957.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import pytest
"""
Test 1957. Delete Characters to Make Fancy String
"""
@pytest.fixture(scope="session")
def init_variables_1957():
from src.leetcode_1957_delete_characters_to_make_fancy_string import Solution
solution = Solution()
def _init_variables_1957():
return solution
yield _init_variables_1957
class TestClass1957:
def test_solution_0(self, init_variables_1957):
assert init_variables_1957().makeFancyString("leeetcode") == "leetcode"
def test_solution_1(self, init_variables_1957):
assert init_variables_1957().makeFancyString("aaabaaaa") == "aabaa"
def test_solution_2(self, init_variables_1957):
assert init_variables_1957().makeFancyString("aab") == "aab"
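The test file above imports a `Solution` class from `src.leetcode_1957_delete_characters_to_make_fancy_string`, whose source is not included in this dump. A sketch that satisfies all three assertions, keeping each character unless it would create a run of three identical characters:

```python
class Solution:
    def makeFancyString(self, s):
        # Append each character unless the last two emitted characters
        # are already equal to it (which would make a run of three).
        out = []
        for ch in s:
            if len(out) >= 2 and ch == out[-1] == out[-2]:
                continue
            out.append(ch)
        return "".join(out)
```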
| 24.222222 | 81 | 0.737877 | 186 | 1,526 | 5.698925 | 0.204301 | 0.220755 | 0.288679 | 0.118868 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0.079937 | 0.163827 | 1,526 | 62 | 82 | 24.612903 | 0.750784 | 0.026212 | 0 | 1 | 0 | 0 | 0.062774 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.333333 | false | 0 | 0.133333 | 0.066667 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 11 |
40859ddc4b05c893c81195fc0a637ebee95bedb4 | 18,093 | py | Python | config.1.py | jeffreyyang3/Lines_Queueing | 17e0be2e01d0649ec0a953210d3e3647562bf7dc | [
"MIT"
] | null | null | null | config.1.py | jeffreyyang3/Lines_Queueing | 17e0be2e01d0649ec0a953210d3e3647562bf7dc | [
"MIT"
] | 7 | 2020-05-15T23:58:47.000Z | 2020-08-09T16:51:04.000Z | config.1.py | jeffreyyang3/Lines_Queueing | 17e0be2e01d0649ec0a953210d3e3647562bf7dc | [
"MIT"
] | 4 | 2018-10-02T01:03:01.000Z | 2020-09-21T03:37:23.000Z | import random
import math
data = [[
{ # Type 1: double, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "double",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 2: swap, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "swap",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 3: bid, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "bid",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 4: swap, communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "swap",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": True,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 5: bid, communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "bid",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": True,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
    },
    { # Type 1: double, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "double",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 2: swap, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "swap",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 3: bid, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "bid",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 4: swap, communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "swap",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": True,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 5: bid, communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "bid",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": True,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
    },
    { # Type 1: double, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "double",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 2: swap, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "swap",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 3: bid, no communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "bid",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": False,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 4: swap, communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "swap",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": True,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
{ # Type 5: bid, communication, 8 players
#
"settings": {
"duration": 100,
"swap_method": "bid",
"pay_method": "gain",
"k": 0.8,
"service_distribution": 1,
"discrete": True,
"messaging": True,
},
"players": [
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
{"pay_rate": 4, "endowment": 4, "c": random.random()},
],
},
]]
def shuffle(data):
for i, group in enumerate(data):
for j, period in enumerate(group):
if "start_pos" not in data[i][j]["players"][0]:
positions = [n for n in range(1, len(period["players"]) + 1)]
random.shuffle(positions)
for k, player in enumerate(period["players"]):
data[i][j]["players"][k]["start_pos"] = positions[k]
random.shuffle(
data[i][j]["players"]
) # shuffle order of players within periods
random.shuffle(data[i]) # shuffle order of periods withing groups
random.shuffle(data) # shuffle order of groups
return data
# exports data to a csv format
def export_csv(fname, data):
pass
# exports data to models.py
# formats data to make it easier for models.py to parse it
def export_data():
# error handling & filling defaults
for i, group in enumerate(data):
for j, period in enumerate(group):
if "settings" not in period:
raise ValueError("Each period must contain settings dict")
if "players" not in period:
raise ValueError("Each period must contain players dict")
settings = period["settings"]
players = period["players"]
if "duration" not in settings:
raise ValueError("Each period settings must have a duration")
if "swap_method" not in settings:
raise ValueError(
"Each period settings must have a swap_method variable"
)
# For now, will comment out this swap_method check to allow for testing
# of the double auction
"""
if settings['swap_method'] not in ['cut', 'swap', 'bid']:
raise ValueError('Each period settings swap_method variable \
must be either \'bid\', \'swap\' or \'cut\'')
"""
if "pay_method" not in settings:
raise ValueError(
"Each period settings must have a pay_method variable")
if settings["pay_method"] not in ["gain", "lose"]:
                raise ValueError(
                    "Each period settings pay_method variable "
                    "must be either 'gain' or 'lose'"
                )
if "pay_rate" not in players[0]:
raise ValueError("Players must have pay_rates")
if "service_time" not in players[0]:
if "k" not in settings:
                    raise ValueError(
                        "Period settings must have a k variable if players "
                        "do not define service times"
                    )
if "service_distribution" not in settings:
data[i][j]["settings"]["service_distribution"] = 1
sd = settings["service_distribution"]
t = settings["duration"]
k = settings["k"]
vals = [random.randrange(sd) + 1 for p in players]
vals = [v / sum(vals) for v in vals]
vals = [round(v * k * t) for v in vals]
positions = [n for n in range(1, len(period["players"]) + 1)]
for k, _ in enumerate(players):
data[i][j]["players"][k]["service_time"] = vals[k]
data[i][j]["players"][k]["start_pos"] = positions[k]
print("exported data is")
print(data[0][0])
return data
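# NOTE: the following is an illustrative standalone sketch (not part of the
# original module) of the service-time allocation performed in export_data
# above: draw random integer weights, normalize them, then scale to
# k * duration. All numbers below are made up for demonstration.

```python
import random


def allocate_service_times(n_players, sd, k, t, seed=0):
    """Draw weights in [1, sd], normalize them, then scale to k * t."""
    rng = random.Random(seed)
    weights = [rng.randrange(sd) + 1 for _ in range(n_players)]
    total = sum(weights)
    fractions = [w / total for w in weights]
    # rounding means the total is only approximately k * t
    return [round(f * k * t) for f in fractions]


times = allocate_service_times(n_players=4, sd=5, k=2, t=60)
```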
# coding=utf-8
# --------------------------------------------------------------------------
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------

from msrest.pipeline import ClientRawResponse
from msrest.exceptions import HttpOperationError

from .. import models


class WeatherOperations(object):
    """WeatherOperations operations.

    :param client: Client for service requests.
    :param config: Configuration of service client.
    :param serializer: An object model serializer.
    :param deserializer: An object model deserializer.
    """

    models = models

    def __init__(self, client, config, serializer, deserializer):
        self._client = client
        self._serialize = serializer
        self._deserialize = deserializer
        self.config = config
    def parse_taf(
            self, options, custom_headers=None, raw=False, **operation_config):
        """Parses and validates a raw TAF report and returns parsed information
        in a format that can be more easily consumed.

        The TAF format doesn't contain contextual information about the date
        or month the TAF was issued. This information can be specified in the
        request as an ISO 8601 date format, typically as "YYYY-MM-DD" or
        "YYYY-MM".

        :param options: An object with the TAF string to parse along with
         extra information to help with parsing.
        :type options:
         ~emsapi.models.AdiEmsWebApiV2DtoWeatherTafTafParseOptions
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: object or ClientRawResponse if raw=true
        :rtype: object or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`HttpOperationError<msrest.exceptions.HttpOperationError>`
        """
        # Construct URL
        url = self.parse_taf.metadata['url']

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        body_content = self._serialize.body(options, 'AdiEmsWebApiV2DtoWeatherTafTafParseOptions')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 400, 401, 503]:
            raise HttpOperationError(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('AdiEmsWebApiV2DtoWeatherTafTafReport', response)
        if response.status_code == 400:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 401:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 503:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    parse_taf.metadata = {'url': '/v2/weather/taf/parse'}
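# NOTE: illustrative sketch, not part of the generated client. The repeated
# per-status-code deserialization chain in parse_taf can be viewed as a
# lookup table; the model names below are copied from the method above, but
# the dispatch table itself is a hypothetical simplification.

```python
# Every error status maps to the same error model; only 200 differs.
TAF_RESPONSE_MODELS = {
    200: 'AdiEmsWebApiV2DtoWeatherTafTafReport',
    400: 'AdiEmsWebApiModelError',
    401: 'AdiEmsWebApiModelError',
    503: 'AdiEmsWebApiModelError',
}


def model_for_status(status_code):
    """Return the model name to deserialize into, or None when unexpected."""
    return TAF_RESPONSE_MODELS.get(status_code)
```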
    def parse_metar(
            self, options, custom_headers=None, raw=False, **operation_config):
        """Parses and validates a raw METAR report and returns parsed
        information in a format that can be more easily consumed.

        :param options: An object with the METAR string to parse.
        :type options:
         ~emsapi.models.AdiEmsWebApiV2DtoWeatherMetarMetarParseOptions
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: object or ClientRawResponse if raw=true
        :rtype: object or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`HttpOperationError<msrest.exceptions.HttpOperationError>`
        """
        # Construct URL
        url = self.parse_metar.metadata['url']

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        body_content = self._serialize.body(options, 'AdiEmsWebApiV2DtoWeatherMetarMetarParseOptions')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 400, 401, 503]:
            raise HttpOperationError(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('AdiEmsWebApiV2DtoWeatherMetarMetarReport', response)
        if response.status_code == 400:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 401:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 503:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    parse_metar.metadata = {'url': '/v2/weather/metar/parse'}
    def query_tafs(
            self, ems_system_id, options, custom_headers=None, raw=False, **operation_config):
        """Returns a list of collected TAF reports matching the specified
        search criteria for the provided EMS system.

        This API will return TAF reports matching your search options. If none
        are found, an empty list is returned.

        :param ems_system_id: The unique identifier of the system containing
         the EMS data.
        :type ems_system_id: int
        :param options: An object defining the search criteria to use for
         querying TAF reports.
        :type options: ~emsapi.models.AdiEmsWebApiV2DtoWeatherTafTafQuery
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: object or ClientRawResponse if raw=true
        :rtype: object or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`HttpOperationError<msrest.exceptions.HttpOperationError>`
        """
        # Construct URL
        url = self.query_tafs.metadata['url']
        path_format_arguments = {
            'emsSystemId': self._serialize.url("ems_system_id", ems_system_id, 'int')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        body_content = self._serialize.body(options, 'AdiEmsWebApiV2DtoWeatherTafTafQuery')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 400, 401, 503]:
            raise HttpOperationError(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('[AdiEmsWebApiV2DtoWeatherTafTafReport]', response)
        if response.status_code == 400:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 401:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 503:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    query_tafs.metadata = {'url': '/v2/ems-systems/{emsSystemId}/weather/tafs'}
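# NOTE: illustrative sketch, not part of the generated client. query_tafs
# substitutes the serialized emsSystemId into its URL template; str.format
# approximates what the msrest client's format_url performs (the real client
# additionally URL-encodes and validates each path argument).

```python
url_template = '/v2/ems-systems/{emsSystemId}/weather/tafs'
url = url_template.format(emsSystemId=5)
```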
    def query_metars(
            self, ems_system_id, options, custom_headers=None, raw=False, **operation_config):
        """Returns a list of collected METAR reports matching the specified
        search criteria for the provided EMS system.

        This API will return METAR reports matching your search options. If
        none are found, an empty list is returned.

        :param ems_system_id: The unique identifier of the system containing
         the EMS data.
        :type ems_system_id: int
        :param options: An object defining the search criteria to use for
         querying METAR reports.
        :type options: ~emsapi.models.AdiEmsWebApiV2DtoWeatherMetarMetarQuery
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: object or ClientRawResponse if raw=true
        :rtype: object or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`HttpOperationError<msrest.exceptions.HttpOperationError>`
        """
        # Construct URL
        url = self.query_metars.metadata['url']
        path_format_arguments = {
            'emsSystemId': self._serialize.url("ems_system_id", ems_system_id, 'int')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct body
        body_content = self._serialize.body(options, 'AdiEmsWebApiV2DtoWeatherMetarMetarQuery')

        # Construct and send request
        request = self._client.post(url, query_parameters, header_parameters, body_content)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 400, 401, 503]:
            raise HttpOperationError(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('[AdiEmsWebApiV2DtoWeatherMetarMetarReport]', response)
        if response.status_code == 400:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 401:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 503:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    query_metars.metadata = {'url': '/v2/ems-systems/{emsSystemId}/weather/metars'}
    def get_weather_for_flight(
            self, ems_system_id, flight_id, custom_headers=None, raw=False, **operation_config):
        """Returns a list of all collected TAF and METAR reports for a
        specified flight.

        <p>
        If the specified flight doesn't have any weather data, empty lists
        will be returned in the result.
        </p>
        <p>
        METAR and TAF reports that are related to flights in EMS can also be
        queried by building queries using the Database APIs. This API provides
        a simple wrapper around those APIs to make it easy to query for
        weather data.
        </p>

        :param ems_system_id: The unique identifier of the system containing
         the EMS data.
        :type ems_system_id: int
        :param flight_id: The integer ID of the flight record for which to
         return TAF reports.
        :type flight_id: int
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: object or ClientRawResponse if raw=true
        :rtype: object or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`HttpOperationError<msrest.exceptions.HttpOperationError>`
        """
        # Construct URL
        url = self.get_weather_for_flight.metadata['url']
        path_format_arguments = {
            'emsSystemId': self._serialize.url("ems_system_id", ems_system_id, 'int'),
            'flightId': self._serialize.url("flight_id", flight_id, 'int')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 401, 404, 503]:
            raise HttpOperationError(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('AdiEmsWebApiV2DtoWeatherWeatherReport', response)
        if response.status_code == 401:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 404:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 503:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_weather_for_flight.metadata = {'url': '/v2/ems-systems/{emsSystemId}/flights/{flightId}/weather'}
    def get_tafs_for_flight(
            self, ems_system_id, flight_id, custom_headers=None, raw=False, **operation_config):
        """Returns a list of collected TAF reports for a specified flight.

        <p>
        If the specified flight doesn't have any TAF reports, an empty list
        will be returned in the result.
        </p>
        <p>
        TAF reports that are related to flights in EMS can also be queried by
        building queries using the Database APIs. This API provides a simple
        wrapper around those APIs to make it easy to query for TAFs.
        </p>

        :param ems_system_id: The unique identifier of the system containing
         the EMS data.
        :type ems_system_id: int
        :param flight_id: The integer ID of the flight record for which to
         return TAF reports.
        :type flight_id: int
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: object or ClientRawResponse if raw=true
        :rtype: object or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`HttpOperationError<msrest.exceptions.HttpOperationError>`
        """
        # Construct URL
        url = self.get_tafs_for_flight.metadata['url']
        path_format_arguments = {
            'emsSystemId': self._serialize.url("ems_system_id", ems_system_id, 'int'),
            'flightId': self._serialize.url("flight_id", flight_id, 'int')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 401, 404, 503]:
            raise HttpOperationError(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('[AdiEmsWebApiV2DtoWeatherTafTafReport]', response)
        if response.status_code == 401:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 404:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 503:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_tafs_for_flight.metadata = {'url': '/v2/ems-systems/{emsSystemId}/flights/{flightId}/weather/tafs'}
    def get_metars_for_flight(
            self, ems_system_id, flight_id, custom_headers=None, raw=False, **operation_config):
        """Returns a list of collected METAR reports for a specified flight.

        <p>
        If the specified flight doesn't have any METAR reports, an empty list
        will be returned in the result.
        </p>
        <p>
        METAR reports that are related to flights in EMS can also be queried
        by building queries using the Database APIs. This API provides a
        simple wrapper around those APIs to make it easy to query for METARs.
        </p>

        :param ems_system_id: The unique identifier of the system containing
         the EMS data.
        :type ems_system_id: int
        :param flight_id: The integer ID of the flight record for which to
         return METAR reports.
        :type flight_id: int
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: object or ClientRawResponse if raw=true
        :rtype: object or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`HttpOperationError<msrest.exceptions.HttpOperationError>`
        """
        # Construct URL
        url = self.get_metars_for_flight.metadata['url']
        path_format_arguments = {
            'emsSystemId': self._serialize.url("ems_system_id", ems_system_id, 'int'),
            'flightId': self._serialize.url("flight_id", flight_id, 'int')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}

        # Construct headers
        header_parameters = {}
        header_parameters['Accept'] = 'application/json'
        if custom_headers:
            header_parameters.update(custom_headers)

        # Construct and send request
        request = self._client.get(url, query_parameters, header_parameters)
        response = self._client.send(request, stream=False, **operation_config)

        if response.status_code not in [200, 401, 404, 503]:
            raise HttpOperationError(self._deserialize, response)

        deserialized = None
        if response.status_code == 200:
            deserialized = self._deserialize('[AdiEmsWebApiV2DtoWeatherMetarMetarReport]', response)
        if response.status_code == 401:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 404:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)
        if response.status_code == 503:
            deserialized = self._deserialize('AdiEmsWebApiModelError', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
    get_metars_for_flight.metadata = {'url': '/v2/ems-systems/{emsSystemId}/flights/{flightId}/weather/metars'}
from .config import *
from .instructions import Inst
# csr change start
# Control signal decode map and default case
# Reg_Write Imm_sel ALU_Src ALUOp Branch Branch_Src Mem_Read Mem_Write Data_Size Load_Type Mem_to_Reg Jump_Type CSR_src Write_CSR is_Illegal
# | | | | | | | | | | | | | | |
default = [Reg_Write_False, IMM_X , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_True]
decode_map = {
Inst.ADD : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SUB : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_SUB , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.AND : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_AND , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.OR : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_OR , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.XOR : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_XOR , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.ADDI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.ANDI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_AND , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.ORI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_OR , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.XORI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_XOR , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SLL : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_SLL , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SRL : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_SRL , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SRA : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_SRA , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SLLI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_SLL , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SRLI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_SRL , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SRAI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_SRA , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SLT : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_SLT , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SLTU : [Reg_Write_True , IMM_R , ALU_B_rs2, ALU_SLTU , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SLTI : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_SLT , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SLTIU : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_SLTU , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ALU, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.LW : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_True , Mem_Write_False, Data_Size_W, Load_Signed , RegWrite_Mem, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.LH : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_True , Mem_Write_False, Data_Size_H, Load_Signed , RegWrite_Mem, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.LB : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_True , Mem_Write_False, Data_Size_B, Load_Signed , RegWrite_Mem, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.LHU : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_True , Mem_Write_False, Data_Size_H, Load_Unsigned, RegWrite_Mem, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.LBU : [Reg_Write_True , IMM_I , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_True , Mem_Write_False, Data_Size_B, Load_Unsigned, RegWrite_Mem, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SW : [Reg_Write_False, IMM_S , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_True , Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SH : [Reg_Write_False, IMM_S , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_True , Data_Size_H, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.SB : [Reg_Write_False, IMM_S , ALU_B_imm, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_True , Data_Size_B, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.BEQ : [Reg_Write_False, IMM_SB, ALU_B_rs2, ALU_BEQ , Branch_True , Branch_PC , Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.BNE : [Reg_Write_False, IMM_SB, ALU_B_rs2, ALU_BNE , Branch_True , Branch_PC , Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.BLT : [Reg_Write_False, IMM_SB, ALU_B_rs2, ALU_BLT , Branch_True , Branch_PC , Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.BGE : [Reg_Write_False, IMM_SB, ALU_B_rs2, ALU_BGE , Branch_True , Branch_PC , Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.BLTU : [Reg_Write_False, IMM_SB, ALU_B_rs2, ALU_BLTU , Branch_True , Branch_PC , Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.BGEU : [Reg_Write_False, IMM_SB, ALU_B_rs2, ALU_BGEU , Branch_True , Branch_PC , Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.JAL : [Reg_Write_True , IMM_UJ, ALU_B_rs2, ALU_ADD , Branch_True , Branch_PC , Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_PC_4, NonConditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.JALR : [Reg_Write_True , IMM_I , ALU_B_rs2, ALU_ADD , Branch_True , Branch_Rs1, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_PC_4, NonConditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.LUI : [Reg_Write_True , IMM_U , ALU_B_rs2, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_imm , Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.AUIPC : [Reg_Write_True , IMM_U , ALU_B_rs2, ALU_ADD , Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_ipc , Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.NOP : [Reg_Write_False, IMM_X , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
Inst.CSRRW : [Reg_Write_True , IMM_I , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_CSR, Conditional, CSR_src_rs1, Write_CSR_True_W, is_Illegal_False],
Inst.CSRRS : [Reg_Write_True , IMM_I , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_CSR, Conditional, CSR_src_rs1, Write_CSR_True_S, is_Illegal_False],
Inst.CSRRC : [Reg_Write_True , IMM_I , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_CSR, Conditional, CSR_src_rs1, Write_CSR_True_C, is_Illegal_False],
Inst.CSRRWI : [Reg_Write_True , IMM_I , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_CSR, Conditional, CSR_src_imm, Write_CSR_True_WI, is_Illegal_False],
Inst.CSRRSI : [Reg_Write_True , IMM_I , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_CSR, Conditional, CSR_src_imm, Write_CSR_True_SI, is_Illegal_False],
Inst.CSRRCI : [Reg_Write_True , IMM_I , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_CSR, Conditional, CSR_src_imm, Write_CSR_True_CI, is_Illegal_False],
Inst.MRET : [Reg_Write_False, IMM_X , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_Return, is_Illegal_False],
Inst.VOID : [Reg_Write_False, IMM_X , ALU_B_XXX, ALU_OP_XXX, Branch_False, Branch_XXX, Mem_Read_False, Mem_Write_False, Data_Size_W, Load_XXX , RegWrite_XXX, Conditional, CSR_src_XXX, Write_CSR_False, is_Illegal_False],
...: default
}
# csr change end
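# NOTE: toy illustration, not part of the hardware description. The
# `...: default` entry in decode_map above uses Ellipsis as a catch-all key
# for the LookUpTable; a plain-Python analogue of that dispatch looks like
# this (signal tuples here are invented for demonstration).

```python
toy_map = {
    'ADD': ('reg_write', 'alu_add'),
    'SUB': ('reg_write', 'alu_sub'),
    ...: ('no_write', 'illegal'),  # Ellipsis key acts as the default case
}


def toy_decode(inst):
    """Look up an instruction's control signals, falling back to default."""
    return toy_map.get(inst, toy_map[...])
```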
class Control(Module):
io = IO(
inst=Input(U.w(WLEN)),
Reg_Write=Output(U.w(REG_WRITE_SIG_WIDTH)),
Imm_Sel=Output(U.w(IMM_SEL_SIG_WIDTH)),
ALU_Src=Output(U.w(ALU_SRC_SIG_LEN)),
ALUOp=Output(U.w(ALUOP_SIG_LEN)),
Branch=Output(U.w(BRANCH_SIG_LEN)),
Branch_Src=Output(U.w(BRANCH_SRC_SIG_LEN)),
Mem_Read=Output(U.w(MEM_READ_SIG_LEN)),
Mem_Write=Output(U.w(MEM_WRITE_SIG_LEN)),
Data_Size=Output(U.w(DATA_SIZE_SIG_LEN)),
Load_Type=Output(U.w(LOAD_TYPE_SIG_LEN)),
Mem_to_Reg=Output(U.w(REG_SRC_SIG_LEN)),
Jump_Type=Output(U.w(JUMP_TYPE_SIG_LEN)),
# csr add start
CSR_src=Output(U.w(CSR_SRC_SIG_LEN)),
Write_CSR=Output(U.w(WRITE_CSR_SIG_LEN)),
is_Illegal=Output(U.w(IS_ILLEGAL_SIG_LEN))
# csr add end
)
ctrlsignals = LookUpTable(io.inst, decode_map)
# Control signals for ID stage
io.Imm_Sel <<= ctrlsignals[1]
# Control signals for EX stage
io.ALU_Src <<= ctrlsignals[2]
io.ALUOp <<= ctrlsignals[3]
io.Branch <<= ctrlsignals[4]
io.Branch_Src <<= ctrlsignals[5]
io.Jump_Type <<= ctrlsignals[11]
# Control signals for MEM stage
io.Mem_Read <<= ctrlsignals[6]
io.Mem_Write <<= ctrlsignals[7]
io.Data_Size <<= ctrlsignals[8]
io.Load_Type <<= ctrlsignals[9]
# csr add start
io.CSR_src <<= ctrlsignals[12]
io.Write_CSR <<= ctrlsignals[13]
io.is_Illegal <<= ctrlsignals[14]
# csr add end
# Control signals for WB stage
io.Reg_Write <<= ctrlsignals[0]
io.Mem_to_Reg <<= ctrlsignals[10]
if __name__ == '__main__':
f = Emitter.dump(Emitter.emit(Control()), "Control.fir")
Emitter.dumpVerilog(f)
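The `decode_map` above pairs each instruction with a fixed, ordered row of control signals, and `Control` just indexes into the selected row (`ctrlsignals[0]`, `ctrlsignals[1]`, ...). A plain-Python sketch of that row-indexing idea, with made-up stand-in constants (the real signal constants and the hardware `LookUpTable` are assumed, not shown):

```python
# Hypothetical stand-ins for the control-signal constants used above.
REG_WRITE_TRUE, REG_WRITE_FALSE = 1, 0
IMM_I, IMM_X = 0b001, 0b000

# A two-row miniature of decode_map: instruction name -> ordered signal row.
decode_map = {
    "CSRRC": [REG_WRITE_TRUE, IMM_I],
    "MRET":  [REG_WRITE_FALSE, IMM_X],
}

def decode(inst):
    """Look up the signal row, falling back to a default (like `...: default`)."""
    default = [REG_WRITE_FALSE, IMM_X]
    row = decode_map.get(inst, default)
    # Positional indexing mirrors io.Reg_Write <<= ctrlsignals[0], etc.
    return {"Reg_Write": row[0], "Imm_Sel": row[1]}

print(decode("CSRRC"))  # {'Reg_Write': 1, 'Imm_Sel': 1}
```

Unknown instructions fall through to the default row, matching the `...: default` entry that closes the hardware lookup table.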
| 117.937008 | 264 | 0.652357 | 2,170 | 14,978 | 3.974194 | 0.067742 | 0.066095 | 0.074675 | 0.093924 | 0.810181 | 0.810181 | 0.810181 | 0.80821 | 0.80334 | 0.736317 | 0 | 0.004207 | 0.270063 | 14,978 | 126 | 265 | 118.873016 | 0.784597 | 0.04974 | 0 | 0 | 0 | 0 | 0.001348 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.022222 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
904aeda7dad7383475ed944abb1ec2085983eb78 | 6,740 | py | Python | mtist/graphing_utils.py | granthussey/mtist_platform | eb7474b99b745f71b3b34c3a1257bf0d57d29470 | [
"MIT"
] | null | null | null | mtist/graphing_utils.py | granthussey/mtist_platform | eb7474b99b745f71b3b34c3a1257bf0d57d29470 | [
"MIT"
] | null | null | null | mtist/graphing_utils.py | granthussey/mtist_platform | eb7474b99b745f71b3b34c3a1257bf0d57d29470 | [
"MIT"
] | null | null | null | from collections.abc import Iterable
import matplotlib.pyplot as plt
import seaborn as sns
def easy_subplots(ncols=1, nrows=1, base_figsize=None, **kwargs):
if base_figsize is None:
base_figsize = (8, 5)
fig, axes = plt.subplots(
ncols=ncols,
nrows=nrows,
figsize=(base_figsize[0] * ncols, base_figsize[1] * nrows),
**kwargs
)
    # plt.subplots returns a bare Axes when ncols == nrows == 1, so only
    # flatten when we actually received an array of axes.
    try:
        axes = axes.reshape(-1)
    except AttributeError:
        pass
return fig, axes
def despine(fig=None, axes=None):
if fig is not None:
sns.despine(trim=True, offset=0.5, fig=fig)
elif axes is not None:
if not isinstance(axes, Iterable): # to generalize to a single ax
axes = [axes]
for ax in axes:
sns.despine(trim=True, offset=0.5, ax=ax)
else:
fig = plt.gcf()
sns.despine(trim=True, offset=0.5, fig=fig)
def savefig(fig, filename, ft=None):
if ft is None:
ft = "jpg"
fig.savefig("{}.{}".format(filename, ft), dpi=300, bbox_inches="tight")
def score_heatmap(meta, df_es_scores, plot_floored=True, plot_low_seq_depth=True, **kwargs):
"""meta should have index of did"""
# Get heatmaps across seq_depth low, high and raw, floored scores
    def _pivot(depth, value):
        # One seq_depth slice, median score per condition, pivoted for sns.heatmap.
        return (
            df_es_scores.join(meta)
            .query(f'seq_depth == "{depth}"')
            .groupby(["n_species", "noise", "n_timeseries", "sampling_scheme", "n_timepoints"])
            .median()
            .pivot_table(
                index=["noise", "sampling_scheme", "n_timepoints"],
                columns=["n_species", "n_timeseries"],
                values=value,
            )
        )

    hm_high_raw = _pivot("high", "raw")
    hm_low_raw = _pivot("low", "raw")
    hm_high_floored = _pivot("high", "floored")
    hm_low_floored = _pivot("low", "floored")
if plot_floored:
to_plot = [hm_high_raw, hm_low_raw, hm_high_floored, hm_low_floored]
ax_titles = [
"SeqDepth==High, Non-floored ES Score",
"SeqDepth==Low, Non-floored ES Score",
"SeqDepth==High, Floored ES Score",
"SeqDepth==Low, Floored ES Score",
]
nrows = 2
ncols = 2
else:
to_plot = [hm_high_raw, hm_low_raw]
ax_titles = [
"SeqDepth==High, Non-floored ES Score",
"SeqDepth==Low, Non-floored ES Score",
]
nrows = 1
ncols = 2
fig, axes = easy_subplots(
base_figsize=(10, 10), nrows=nrows, ncols=ncols, sharex=True, sharey=True
)
plotting_kwargs = {"center": 0.5, "cmap": "coolwarm", "annot": True}
    # Caller-supplied kwargs override the plotting defaults above
if kwargs:
plotting_kwargs.update(kwargs)
for i, ax in enumerate(axes):
sns.heatmap(to_plot[i], ax=ax, **plotting_kwargs)
ax.set_title(ax_titles[i])
plt.tight_layout()
return fig
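The pivot step used in `score_heatmap` (filter one `seq_depth`, take the median score per condition, pivot the groups into a 2-D table for `sns.heatmap`) can be run on a miniature, made-up table. The column set here is reduced for illustration; the real joined table also carries `sampling_scheme`, `n_timepoints`, and a `floored` score:

```python
import pandas as pd

# Hypothetical miniature of the joined df_es_scores/meta table used above.
df = pd.DataFrame({
    "seq_depth": ["high", "high", "low", "low"],
    "n_species": [3, 3, 3, 3],
    "noise": [0.1, 0.1, 0.1, 0.1],
    "raw": [0.2, 0.4, 0.6, 0.8],
})

# Filter one seq_depth, median per group, then pivot groups to a 2-D table.
hm = (
    df.query('seq_depth == "high"')
      .groupby(["noise", "n_species"])["raw"]
      .median()
      .reset_index()
      .pivot_table(index="noise", columns="n_species", values="raw")
)
print(hm.loc[0.1, 3])  # median of 0.2 and 0.4
```

Each cell of `hm` is then one annotated square in the heatmap, with the grouping keys spread across the row and column index.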
def score_heatmap_expanded(meta, df_es_scores, plot_floored=True, return_ax=False, **kwargs):
"""meta should have index of did"""
# Get heatmaps across seq_depth low, high and raw, floored scores
    def _pivot(depth, value):
        # One seq_depth slice, median score per condition, pivoted for sns.heatmap.
        return (
            df_es_scores.join(meta)
            .query(f'seq_depth == "{depth}"')
            .groupby(["ground_truth", "noise", "n_timeseries", "sampling_scheme", "n_timepoints"])
            .median()
            .pivot_table(
                index=["noise", "sampling_scheme", "n_timepoints"],
                columns=["ground_truth", "n_timeseries"],
                values=value,
            )
        )

    hm_high_raw = _pivot("high", "raw")
    hm_low_raw = _pivot("low", "raw")
    hm_high_floored = _pivot("high", "floored")
    hm_low_floored = _pivot("low", "floored")
if plot_floored:
to_plot = [hm_high_raw, hm_low_raw, hm_high_floored, hm_low_floored]
ax_titles = [
"SeqDepth==High, Non-floored ES Score",
"SeqDepth==Low, Non-floored ES Score",
"SeqDepth==High, Floored ES Score",
"SeqDepth==Low, Floored ES Score",
]
nrows = 2
ncols = 2
else:
to_plot = [hm_high_raw, hm_low_raw]
ax_titles = [
"SeqDepth==High, Non-floored ES Score",
"SeqDepth==Low, Non-floored ES Score",
]
nrows = 1
ncols = 2
fig, axes = easy_subplots(
base_figsize=(10, 10), nrows=nrows, ncols=ncols, sharex=True, sharey=True
)
plotting_kwargs = {"center": 0.5, "cmap": "coolwarm", "annot": True}
    # Caller-supplied kwargs override the plotting defaults above
if kwargs:
plotting_kwargs.update(kwargs)
for i, ax in enumerate(axes):
sns.heatmap(to_plot[i], ax=ax, **plotting_kwargs)
ax.set_title(ax_titles[i])
plt.tight_layout()
if return_ax:
return fig, axes
else:
return fig | 27.736626 | 94 | 0.566024 | 803 | 6,740 | 4.518057 | 0.158157 | 0.048512 | 0.066152 | 0.110254 | 0.821389 | 0.821389 | 0.821389 | 0.798236 | 0.798236 | 0.780595 | 0 | 0.007579 | 0.295252 | 6,740 | 243 | 95 | 27.736626 | 0.756211 | 0.048071 | 0 | 0.710383 | 0 | 0 | 0.238869 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027322 | false | 0.005464 | 0.016393 | 0 | 0.065574 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
907fdd66eb0441a2c2ab27306cde81be62552a89 | 36 | py | Python | 100/79.py | ElyKar/Euler | 38744b553b22565ac30ece06e2e3fbf3408068e2 | [
"MIT"
] | null | null | null | 100/79.py | ElyKar/Euler | 38744b553b22565ac30ece06e2e3fbf3408068e2 | [
"MIT"
] | null | null | null | 100/79.py | ElyKar/Euler | 38744b553b22565ac30ece06e2e3fbf3408068e2 | [
"MIT"
] | null | null | null | #!/bin/python3.4
print('73162890')
| 9 | 17 | 0.666667 | 5 | 36 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.30303 | 0.083333 | 36 | 3 | 18 | 12 | 0.424242 | 0.416667 | 0 | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
90a9cb020e586ff0d3a375cd7a41240ec9f1d510 | 7,548 | py | Python | benchmarks/test_core.py | Quansight-Labs/python-moa | 1f02519425ab0215896a5fb9be00631c20d34895 | [
"BSD-3-Clause"
] | 22 | 2019-04-19T20:42:53.000Z | 2022-02-03T20:42:48.000Z | benchmarks/test_core.py | costrouc/python-moa | 1f02519425ab0215896a5fb9be00631c20d34895 | [
"BSD-3-Clause"
] | 15 | 2019-03-12T17:35:42.000Z | 2019-05-20T20:15:14.000Z | benchmarks/test_core.py | costrouc/python-moa | 1f02519425ab0215896a5fb9be00631c20d34895 | [
"BSD-3-Clause"
] | 2 | 2019-11-12T22:50:04.000Z | 2020-04-24T15:56:43.000Z | import pytest
import numpy
import numba
import torch
import tensorflow
from moa.frontend import LazyArray
@pytest.mark.benchmark(group="addition", warmup=True)
def test_moa_numba_addition(benchmark):
n = 1000
m = 1000
expression = LazyArray(name='A', shape=('n', 'm')) + LazyArray(name='B', shape=('n', 'm'))
local_dict = {}
exec(expression.compile(backend='python', use_numba=True), globals(), local_dict)
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
benchmark(local_dict['f'], A, B)
@pytest.mark.benchmark(group="addition")
def test_numpy_addition(benchmark):
n = 1000
m = 1000
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
def _test():
A + B
benchmark(_test)
@pytest.mark.benchmark(group="addition")
def test_pytorch_addition(benchmark):
n = 1000
m = 1000
A = torch.rand(n, m)
B = torch.rand(n, m)
def _test():
torch.add(A, B)
benchmark(_test)
@pytest.mark.benchmark(group="addition")
def test_tensorflow_addition(benchmark):
n = 1000
m = 1000
A = tensorflow.random.uniform((n, m))
B = tensorflow.random.uniform((n, m))
session = tensorflow.Session()
session.run(tensorflow.initialize_all_variables())
result = tensorflow.math.add(A, B)
def _test():
session.run(result)
benchmark(_test)
@pytest.mark.benchmark(group="addition_index", warmup=True)
def test_moa_numba_addition_index(benchmark):
n = 1000
m = 1000
expression = (LazyArray(name='A', shape=('n', 'm')) + LazyArray(name='B', shape=('n', 'm')))[0]
local_dict = {}
exec(expression.compile(backend='python', use_numba=True), globals(), local_dict)
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
benchmark(local_dict['f'], A, B)
@pytest.mark.benchmark(group="addition_index")
def test_numpy_addition_index(benchmark):
n = 1000
m = 1000
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
def _test():
A[0] + B[0]
benchmark(_test)
@pytest.mark.benchmark(group="addition_index")
def test_pytorch_addition_index(benchmark):
n = 1000
m = 1000
A = torch.rand(n, m)
B = torch.rand(n, m)
def _test():
torch.add(A[0], B[0])
benchmark(_test)
@pytest.mark.benchmark(group="addition_index")
def test_tensorflow_addition_index(benchmark):
n = 1000
m = 1000
A = tensorflow.random.uniform((n, m))
B = tensorflow.random.uniform((n, m))
index = tensorflow.constant(0)
session = tensorflow.Session()
session.run(tensorflow.initialize_all_variables())
result = tensorflow.gather(tensorflow.math.add(A, B), index)
def _test():
session.run(result)
benchmark(_test)
@pytest.mark.benchmark(group="double_addition", warmup=True)
def test_moa_numba_double_addition(benchmark):
n = 1000
m = 1000
expression = LazyArray(name='A', shape=('n', 'm')) + LazyArray(name='B', shape=('n', 'm')) + LazyArray(name='C', shape=('n', 'm'))
local_dict = {}
exec(expression.compile(backend='python', use_numba=True), globals(), local_dict)
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
C = numpy.random.random((n, m))
benchmark(local_dict['f'], A=A, B=B, C=C)
@pytest.mark.benchmark(group="double_addition")
def test_numpy_double_addition(benchmark):
n = 1000
m = 1000
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
C = numpy.random.random((n, m))
def _test():
A + B + C
benchmark(_test)
@pytest.mark.benchmark(group="double_addition")
def test_pytorch_double_addition(benchmark):
n = 1000
m = 1000
A = torch.rand(n, m)
B = torch.rand(n, m)
C = torch.rand(n, m)
def _test():
torch.add(torch.add(A, B), C)
benchmark(_test)
@pytest.mark.benchmark(group="double_addition")
def test_tensorflow_double_addition(benchmark):
n = 1000
m = 1000
A = tensorflow.random.uniform((n, m))
B = tensorflow.random.uniform((n, m))
C = tensorflow.random.uniform((n, m))
session = tensorflow.Session()
session.run(tensorflow.initialize_all_variables())
result = tensorflow.math.add(tensorflow.math.add(A, B), C)
def _test():
session.run(result)
benchmark(_test)
@pytest.mark.benchmark(group="outer_product", warmup=True)
def test_moa_numba_outer_product(benchmark):
n = 100
m = 100
expression = LazyArray(name='A', shape=('n', 'm')).outer('*', LazyArray(name='B', shape=('n', 'm')))
local_dict = {}
exec(expression.compile(backend='python', use_numba=True), globals(), local_dict)
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
benchmark(local_dict['f'], A, B)
@pytest.mark.benchmark(group="outer_product")
def test_numpy_outer_product(benchmark):
n = 100
m = 100
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
def _test():
numpy.outer(A, B)
benchmark(_test)
@pytest.mark.benchmark(group="reduce", warmup=True)
def test_moa_numba_reduce(benchmark):
n = 1000
m = 1000
expression = LazyArray(name='A', shape=('n', 'm')).reduce('+')
local_dict = {}
exec(expression.compile(backend='python', use_numba=True), globals(), local_dict)
A = numpy.random.random((n, m))
benchmark(local_dict['f'], A)
@pytest.mark.benchmark(group="reduce")
def test_numpy_reduce(benchmark):
n = 1000
m = 1000
A = numpy.random.random((n, m))
def _test():
A.sum()
benchmark(_test)
@pytest.mark.benchmark(group="reduce")
def test_pytorch_reduce(benchmark):
n = 1000
m = 1000
A = torch.rand(n, m)
def _test():
torch.sum(A)
benchmark(_test)
@pytest.mark.benchmark(group="reduce")
def test_tensorflow_reduce(benchmark):
n = 1000
m = 1000
A = tensorflow.random.uniform((n, m))
session = tensorflow.Session()
session.run(tensorflow.initialize_all_variables())
result = tensorflow.math.reduce_sum(A)
def _test():
session.run(result)
benchmark(_test)
@pytest.mark.benchmark(group="inner_product", warmup=True)
def test_moa_numba_inner_product(benchmark):
n = 1000
m = 1000
_A = LazyArray(name='A', shape=('n', 'm'))
_B = LazyArray(name='B', shape=('m', 'k'))
expression = _A.inner('+', '*', _B)
local_dict = {}
exec(expression.compile(backend='python', use_numba=True), globals(), local_dict)
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
benchmark(local_dict['f'], A, B)
@pytest.mark.benchmark(group="inner_product")
def test_numpy_inner_product(benchmark):
n = 1000
m = 1000
A = numpy.random.random((n, m))
B = numpy.random.random((n, m))
def _test():
A.dot(B)
benchmark(_test)
@pytest.mark.benchmark(group="inner_product")
def test_pytorch_inner_product(benchmark):
n = 1000
m = 1000
A = torch.rand(n, m)
B = torch.rand(n, m)
def _test():
torch.mm(A, B)
benchmark(_test)
@pytest.mark.benchmark(group="inner_product")
def test_tensorflow_inner_product(benchmark):
n = 1000
m = 1000
A = tensorflow.random.uniform((n, m))
B = tensorflow.random.uniform((n, m))
session = tensorflow.Session()
session.run(tensorflow.initialize_all_variables())
result = tensorflow.linalg.matmul(A, B)
def _test():
session.run(result)
benchmark(_test)
| 21.202247 | 134 | 0.6354 | 1,033 | 7,548 | 4.498548 | 0.061955 | 0.023671 | 0.087799 | 0.092963 | 0.922531 | 0.906391 | 0.891543 | 0.82096 | 0.753389 | 0.721756 | 0 | 0.029617 | 0.203763 | 7,548 | 355 | 135 | 21.261972 | 0.743594 | 0 | 0 | 0.719298 | 0 | 0 | 0.043985 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.026316 | 0 | 0.192982 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
90efadc0aa69103a414acb6c0dfbe0985570b898 | 86 | py | Python | application/games/scrambledwords/util/time_util.py | Tyler-Yates/game-box | dc838270c3777372c3eeaf1e09fb1962c36fc2a8 | [
"MIT"
] | 1 | 2020-12-13T02:41:19.000Z | 2020-12-13T02:41:19.000Z | application/games/scrambledwords/util/time_util.py | Tyler-Yates/game-box | dc838270c3777372c3eeaf1e09fb1962c36fc2a8 | [
"MIT"
] | null | null | null | application/games/scrambledwords/util/time_util.py | Tyler-Yates/game-box | dc838270c3777372c3eeaf1e09fb1962c36fc2a8 | [
"MIT"
] | null | null | null | import time
def get_time_millis() -> int:
return int(round(time.time() * 1000))
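A quick self-contained sketch of how this helper might be used to time an operation in milliseconds. The function is repeated so the snippet runs on its own, and the `sleep` is a stand-in for real work (the surrounding game code is assumed, not shown):

```python
import time

def get_time_millis() -> int:
    return int(round(time.time() * 1000))

start = get_time_millis()
time.sleep(0.05)  # stand-in for real work
elapsed_ms = get_time_millis() - start
print(elapsed_ms)
```

Because both endpoints are rounded independently, an elapsed value can be off by about a millisecond either way; that is fine for game-timer granularity.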
| 14.333333 | 41 | 0.662791 | 13 | 86 | 4.230769 | 0.692308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 0.186047 | 86 | 5 | 42 | 17.2 | 0.728571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
2906815f9ab5aa9537e164dfc64b273352f2922b | 13,536 | py | Python | tt/tests/unit/expressions/test_bexpr_init_from_tree.py | fkromer/tt | b4dfc90f7d0f9b5794e1f5054b640e22f6f75bf7 | [
"MIT"
] | 233 | 2016-02-05T20:13:06.000Z | 2022-03-26T13:01:10.000Z | tt/tests/unit/expressions/test_bexpr_init_from_tree.py | fkromer/tt | b4dfc90f7d0f9b5794e1f5054b640e22f6f75bf7 | [
"MIT"
] | 8 | 2017-12-20T17:07:58.000Z | 2020-08-06T15:44:55.000Z | tt/tests/unit/expressions/test_bexpr_init_from_tree.py | fkromer/tt | b4dfc90f7d0f9b5794e1f5054b640e22f6f75bf7 | [
"MIT"
] | 15 | 2016-03-22T23:37:56.000Z | 2022-02-27T17:51:08.000Z | """Test expression initialization from expression tree nodes."""
import unittest
from tt.expressions import BooleanExpression
from tt.trees import ExpressionTreeNode
class TestBooleanExpressionInitFromTrees(unittest.TestCase):
def _bexpr_from_postfix_tokens(self, postfix_tokens):
return BooleanExpression(ExpressionTreeNode.build_tree(postfix_tokens))
def test_single_operand(self):
"""Test from a single operand tree."""
for token in ('0', '1', 'operand'):
b = self._bexpr_from_postfix_tokens([token])
self.assertEqual(b.symbols, [] if token in {'0', '1'} else [token])
self.assertEqual(b.tokens, [token])
self.assertEqual(b.postfix_tokens, [token])
self.assertEqual(b.raw_expr, token)
self.assertTrue(b.tree is not None)
def test_only_symbolic_unary_operators(self):
"""Test from trees containing only symbol unary operators."""
b = self._bexpr_from_postfix_tokens(['A', '~'])
self.assertEqual(b.symbols, ['A'])
self.assertEqual(b.tokens, ['~', 'A'])
self.assertEqual(b.postfix_tokens, ['A', '~'])
self.assertEqual(b.raw_expr, '~A')
self.assertTrue(b.tree is not None)
b = self._bexpr_from_postfix_tokens(['A', '~', '~'])
self.assertEqual(b.symbols, ['A'])
self.assertEqual(b.tokens, ['~', '~', 'A'])
self.assertEqual(b.postfix_tokens, ['A', '~', '~'])
self.assertEqual(b.raw_expr, '~~A')
self.assertTrue(b.tree is not None)
b = self._bexpr_from_postfix_tokens(['A', '~', '~', '~'])
self.assertEqual(b.symbols, ['A'])
self.assertEqual(b.tokens, ['~', '~', '~', 'A'])
self.assertEqual(b.postfix_tokens, ['A', '~', '~', '~'])
self.assertEqual(b.raw_expr, '~~~A')
self.assertTrue(b.tree is not None)
def test_only_plain_english_symbolic_unary_operators(self):
"""Test from trees containing only plain-English unary operators."""
b = self._bexpr_from_postfix_tokens(['A', 'not'])
self.assertEqual(b.symbols, ['A'])
self.assertEqual(b.tokens, ['not', 'A'])
self.assertEqual(b.postfix_tokens, ['A', 'not'])
self.assertEqual(b.raw_expr, 'not A')
self.assertTrue(b.tree is not None)
b = self._bexpr_from_postfix_tokens(['A', 'not', 'not'])
self.assertEqual(b.symbols, ['A'])
self.assertEqual(b.tokens, ['not', 'not', 'A'])
self.assertEqual(b.postfix_tokens, ['A', 'not', 'not'])
self.assertEqual(b.raw_expr, 'not not A')
self.assertTrue(b.tree is not None)
b = self._bexpr_from_postfix_tokens(['A', 'not', 'not', 'not'])
self.assertEqual(b.symbols, ['A'])
self.assertEqual(b.tokens, ['not', 'not', 'not', 'A'])
self.assertEqual(b.postfix_tokens, ['A', 'not', 'not', 'not'])
self.assertEqual(b.raw_expr, 'not not not A')
self.assertTrue(b.tree is not None)
def test_unary_operator_applied_to_binary_operator(self):
"""Test applying a unary operator to exprs of binary operators."""
b = self._bexpr_from_postfix_tokens(['A', 'B', '&', 'not'])
self.assertEqual(b.symbols, ['A', 'B'])
self.assertEqual(b.tokens, ['not', '(', 'A', '&', 'B', ')'])
self.assertEqual(b.postfix_tokens, ['A', 'B', '&', 'not'])
self.assertEqual(b.raw_expr, 'not (A & B)')
self.assertTrue(b.tree is not None)
b = self._bexpr_from_postfix_tokens(['A', 'B', '&', '~'])
self.assertEqual(b.symbols, ['A', 'B'])
self.assertEqual(b.tokens, ['~', '(', 'A', '&', 'B', ')'])
self.assertEqual(b.postfix_tokens, ['A', 'B', '&', '~'])
self.assertEqual(b.raw_expr, '~(A & B)')
self.assertTrue(b.tree is not None)
def test_negated_single_operand_clauses(self):
"""Test single-operand clauses that are negated several times."""
b = self._bexpr_from_postfix_tokens(
['A', 'not', 'not', 'not', 'not',
'B', '!',
'C',
'D', '~', '~',
'or', 'or', 'or'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D'])
self.assertEqual(
b.tokens,
['not', 'not', 'not', 'not', 'A', 'or', '!', 'B', 'or', 'C', 'or',
'~', '~', 'D'])
self.assertEqual(
b.postfix_tokens,
['A', 'not', 'not', 'not', 'not',
'B', '!',
'C',
'D', '~', '~',
'or', 'or', 'or'])
self.assertEqual(b.raw_expr, 'not not not not A or !B or C or ~~D')
self.assertTrue(b.tree is not None)
def test_parens_for_leading_chained_operator_clause(self):
"""Test paren insertion for a leading clause of chained operators."""
b = self._bexpr_from_postfix_tokens(
['A', 'B', 'C', 'D', 'and', 'and', 'and',
'E',
'or'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E'])
self.assertEqual(
b.tokens,
['(', 'A', 'and', 'B', 'and', 'C', 'and', 'D', ')', 'or', 'E'])
self.assertEqual(
b.postfix_tokens,
['A', 'B', 'C', 'D', 'and', 'and', 'and',
'E',
'or'])
self.assertEqual(
b.raw_expr,
'(A and B and C and D) or E')
self.assertTrue(b.tree is not None)
def test_parens_for_trailing_chained_operator_clause(self):
"""Test paren insertion for a trailing clause of chained operators."""
b = self._bexpr_from_postfix_tokens(
['A',
'B', 'C', 'or',
'D', 'E', 'F', 'or', 'or',
'and', 'and'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E', 'F'])
self.assertEqual(
b.tokens,
['A', 'and', '(', 'B', 'or', 'C', ')', 'and',
'(', 'D', 'or', 'E', 'or', 'F', ')'])
self.assertEqual(
b.postfix_tokens,
['A',
'B', 'C', 'or',
'D', 'E', 'F', 'or', 'or',
'and', 'and'])
self.assertEqual(b.raw_expr, 'A and (B or C) and (D or E or F)')
self.assertTrue(b.tree is not None)
def test_parens_for_sandwiched_chained_operator_clause(self):
"""Test paren insertion for a clause in the middle of an expression."""
b = self._bexpr_from_postfix_tokens(
['A',
'B', 'C', 'D', '&', '&',
'E',
'->', '->'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E'])
self.assertEqual(
b.tokens,
['A', '->', '(', 'B', '&', 'C', '&', 'D', ')', '->', 'E'])
self.assertEqual(
b.postfix_tokens,
['A',
'B', 'C', 'D', '&', '&',
'E',
'->', '->'])
self.assertEqual(b.raw_expr, 'A -> (B & C & D) -> E')
self.assertTrue(b.tree is not None)
def test_parens_for_negated_leading_chained_operator_clause(self):
"""Test paren insertion for negated leading clause."""
b = self._bexpr_from_postfix_tokens(
['A', 'B', 'iff', '!',
'0', '1', 'and',
'and'])
self.assertEqual(
b.symbols,
['A', 'B'])
self.assertEqual(
b.tokens,
['!', '(', 'A', 'iff', 'B', ')', 'and', '0', 'and', '1'])
self.assertEqual(
b.postfix_tokens,
['A', 'B', 'iff', '!',
'0', '1', 'and',
'and'])
self.assertEqual(b.raw_expr, '!(A iff B) and 0 and 1')
self.assertTrue(b.tree is not None)
def test_parens_for_sandwiched_negated_chained_operator_clause(self):
"""Test paren insertion for negated sandwiched clause."""
b = self._bexpr_from_postfix_tokens(
['A',
'B', 'C', 'D', 'xor', 'xor', '~',
'E',
'->', '->'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E'])
self.assertEqual(
b.tokens,
['A', '->', '~', '(', 'B', 'xor', 'C', 'xor', 'D', ')', '->', 'E'])
self.assertEqual(
b.postfix_tokens,
['A',
'B', 'C', 'D', 'xor', 'xor', '~',
'E',
'->', '->'])
self.assertEqual(b.raw_expr, 'A -> ~(B xor C xor D) -> E')
self.assertTrue(b.tree is not None)
def test_parens_for_trailing_negated_chained_operator_clause(self):
"""Test paren insertion for negated trailing clause."""
b = self._bexpr_from_postfix_tokens(
['A',
'B',
'C', 'D', 'E', 'and', 'and', 'not',
'xnor', 'xnor'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E'])
self.assertEqual(
b.tokens,
['A', 'xnor', 'B', 'xnor',
'not', '(', 'C', 'and', 'D', 'and', 'E', ')'])
self.assertEqual(
b.postfix_tokens,
['A',
'B',
'C', 'D', 'E', 'and', 'and', 'not',
'xnor', 'xnor'])
self.assertEqual(b.raw_expr, 'A xnor B xnor not (C and D and E)')
self.assertTrue(b.tree is not None)
def test_multiple_negated_chained_clauses(self):
"""Test clauses that are negated multiple times."""
b = self._bexpr_from_postfix_tokens(
['A', '~', '~', '~',
'B', 'C', 'D', '&', '&', '~', '~',
'E', '~',
'->', '->'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E'])
self.assertEqual(
b.tokens,
['~', '~', '~', 'A', '->',
'~', '~', '(', 'B', '&', 'C', '&', 'D', ')', '->',
'~', 'E'])
self.assertEqual(
b.postfix_tokens,
['A', '~', '~', '~',
'B', 'C', 'D', '&', '&', '~', '~',
'E', '~',
'->', '->'])
self.assertEqual(b.raw_expr, '~~~A -> ~~(B & C & D) -> ~E')
self.assertTrue(b.tree is not None)
def test_parens_for_nested_chained_clause(self):
"""Test paren insertion for nested chained clauses."""
b = self._bexpr_from_postfix_tokens(
['A', 'B', 'C', 'D', '->', '->', 'E', 'nand', 'xor', '~',
'A', 'B', 'xor', '~', 'A', 'B', 'xnor', '!', 'C', 'or', 'nand',
'iff'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E'])
self.assertEqual(
b.tokens,
['~', '(', 'A', 'xor', '(', '(', 'B', '->', 'C', '->', 'D', ')',
'nand', 'E', ')', ')', 'iff', '(', '~', '(', 'A', 'xor', 'B', ')',
'nand', '(', '!', '(', 'A', 'xnor', 'B', ')', 'or', 'C', ')',
')'])
self.assertEqual(
b.postfix_tokens,
['A', 'B', 'C', 'D', '->', '->', 'E', 'nand', 'xor', '~',
'A', 'B', 'xor', '~', 'A', 'B', 'xnor', '!', 'C', 'or', 'nand',
'iff'])
self.assertEqual(
b.raw_expr,
'~(A xor ((B -> C -> D) nand E)) iff (~(A xor B) nand '
'(!(A xnor B) or C))')
self.assertTrue(b.tree is not None)
def test_parens_for_leading_single_operand_same_op_as_next(self):
"""Parens are omitted for a leading op the same as the next clause."""
b = self._bexpr_from_postfix_tokens(
['A',
'B', 'C', 'and',
'and'])
self.assertEqual(
b.symbols,
['A', 'B', 'C'])
self.assertEqual(
b.tokens,
['A', 'and', 'B', 'and', 'C'])
self.assertEqual(
b.postfix_tokens,
['A',
'B', 'C', 'and',
'and'])
self.assertEqual(
b.raw_expr,
'A and B and C')
self.assertTrue(b.tree is not None)
def test_parens_for_trailing_single_operand_same_op_as_prev(self):
"""Parens are omitted for a trailing op the same as the prev clause."""
b = self._bexpr_from_postfix_tokens(
['A', 'B', 'and',
'C',
'and'])
self.assertEqual(
b.symbols,
['A', 'B', 'C'])
self.assertEqual(
b.tokens,
['A', 'and', 'B', 'and', 'C'])
self.assertEqual(
b.postfix_tokens,
['A', 'B', 'and',
'C',
'and'])
self.assertEqual(
b.raw_expr,
'A and B and C')
self.assertTrue(b.tree is not None)
def test_parens_for_sandwiched_single_operand_same_op_as_neighbors(self):
"""Parens are omitted for an op the same as the adjacent clauses."""
b = self._bexpr_from_postfix_tokens(
['A', 'B', 'and',
'C',
'D', 'E', 'and',
'and', 'and'])
self.assertEqual(
b.symbols,
['A', 'B', 'C', 'D', 'E'])
self.assertEqual(
b.tokens,
['A', 'and', 'B', 'and', 'C', 'and', 'D', 'and', 'E'])
self.assertEqual(
b.postfix_tokens,
['A', 'B', 'and',
'C',
'D', 'E', 'and',
'and', 'and'])
self.assertEqual(
b.raw_expr,
'A and B and C and D and E')
self.assertTrue(b.tree is not None)
| 37.6 | 79 | 0.457225 | 1,533 | 13,536 | 3.887149 | 0.061318 | 0.211445 | 0.225541 | 0.065447 | 0.861554 | 0.834368 | 0.804497 | 0.79124 | 0.767075 | 0.657325 | 0 | 0.001331 | 0.333703 | 13,536 | 359 | 80 | 37.704735 | 0.659386 | 0.071365 | 0 | 0.708861 | 0 | 0 | 0.10173 | 0 | 0 | 0 | 0 | 0 | 0.332278 | 1 | 0.053797 | false | 0 | 0.009494 | 0.003165 | 0.06962 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
292a83a842d7d1d8d8ecf1d02855035ec5f29698 | 82 | py | Python | PredictiveOutlierExplanationBenchmark/src/models/detectors/__init__.py | myrtakis/MxM | 552fa5ad27e542f41ade04ad1af565fdc6213be7 | [
"Apache-2.0"
] | 5 | 2021-07-13T07:48:32.000Z | 2022-02-28T10:59:15.000Z | PredictiveOutlierExplanationBenchmark/src/models/detectors/__init__.py | myrtakis/MxM | 552fa5ad27e542f41ade04ad1af565fdc6213be7 | [
"Apache-2.0"
] | 5 | 2021-03-19T15:30:01.000Z | 2022-03-12T00:51:20.000Z | PredictiveOutlierExplanationBenchmark/src/models/detectors/__init__.py | myrtakis/MxM | 552fa5ad27e542f41ade04ad1af565fdc6213be7 | [
"Apache-2.0"
] | null | null | null | from models.detectors.Lof import Lof
from models.detectors.iForest import iForest
| 27.333333 | 44 | 0.853659 | 12 | 82 | 5.833333 | 0.5 | 0.285714 | 0.542857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 82 | 2 | 45 | 41 | 0.945946 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
2930f63d2201dc2ee49d44bbae1fe72a46133339 | 190 | py | Python | lhvqt/__init__.py | cwitkowitz/LHVQT | 19e803f9c9d75620ecace8ea760c7e2e097a49bb | [
"MIT"
] | 11 | 2019-11-11T08:07:41.000Z | 2021-03-21T19:45:45.000Z | lhvqt/__init__.py | mjhydri/lhvqt | 19e803f9c9d75620ecace8ea760c7e2e097a49bb | [
"MIT"
] | null | null | null | lhvqt/__init__.py | mjhydri/lhvqt | 19e803f9c9d75620ecace8ea760c7e2e097a49bb | [
"MIT"
] | 1 | 2021-10-30T17:51:28.000Z | 2021-10-30T17:51:28.000Z | from .lhvqt import *
from .lhvqt_comb import *
from .lvqt import *
from .lvqt_hilb import *
from .lvqt_orig import *
from .lvqt_real import *
from .utils import *
from .variational import *
| 21.111111 | 26 | 0.747368 | 28 | 190 | 4.928571 | 0.357143 | 0.507246 | 0.405797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168421 | 190 | 8 | 27 | 23.75 | 0.873418 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
29817eb4a6aed42df261446c7750a0ece8cc3cc9 | 240 | py | Python | cli/scaleout/repository/helpers.py | aitmlouk/fedn | d6b663554492464e4eaed391a7048ec09cb11f25 | [
"Apache-2.0"
] | null | null | null | cli/scaleout/repository/helpers.py | aitmlouk/fedn | d6b663554492464e4eaed391a7048ec09cb11f25 | [
"Apache-2.0"
] | null | null | null | cli/scaleout/repository/helpers.py | aitmlouk/fedn | d6b663554492464e4eaed391a7048ec09cb11f25 | [
"Apache-2.0"
] | null | null | null | from scaleout.repository.miniorepository import MINIORepository
from scaleout.repository.s3modelrepository import S3ModelRepository
def get_repository(config=None):
return S3ModelRepository(config)
#return MINIORepository(config)
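`get_repository` is a small factory: swapping the commented line switches every caller to MinIO without touching call sites. A self-contained sketch of the same pattern with hypothetical stand-in classes (the real `S3ModelRepository`/`MINIORepository` take a config the same way, but live in `scaleout.repository`):

```python
class S3ModelRepository:
    """Hypothetical stand-in for the S3-backed repository."""
    def __init__(self, config=None):
        self.config = config

class MINIORepository:
    """Hypothetical stand-in for the MinIO-backed repository."""
    def __init__(self, config=None):
        self.config = config

def get_repository(config=None, backend="s3"):
    # Single switch point: callers never name a concrete repository class.
    if backend == "minio":
        return MINIORepository(config)
    return S3ModelRepository(config)

repo = get_repository({"bucket": "models"})
print(type(repo).__name__)  # S3ModelRepository
```

An explicit `backend` argument (illustrative, not in the original) avoids editing the factory body when switching stores; the original instead toggles the commented return line.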
| 30 | 67 | 0.845833 | 23 | 240 | 8.782609 | 0.478261 | 0.118812 | 0.217822 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013889 | 0.1 | 240 | 7 | 68 | 34.285714 | 0.921296 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
31064c7f33f26a3093bc5fe4574ed9f7cff1982e | 9,327 | py | Python | tests/test_models/test_recognizers/test_recognizer3d.py | kiyoon/Video-Swin-Transformer | 7a0d40ced8fb52c064d1cd11ffa8b0c3bbb77607 | [
"Apache-2.0"
] | 648 | 2021-06-24T19:33:09.000Z | 2022-03-31T06:27:24.000Z | tests/test_models/test_recognizers/test_recognizer3d.py | jayleicn/mmaction2-1 | 0a6fde1abb8403f1f68b568f5b4694c6f828e27e | [
"Apache-2.0"
] | 53 | 2021-07-01T03:07:52.000Z | 2022-03-27T16:15:29.000Z | tests/test_models/test_recognizers/test_recognizer3d.py | jayleicn/mmaction2-1 | 0a6fde1abb8403f1f68b568f5b4694c6f828e27e | [
"Apache-2.0"
] | 117 | 2021-06-25T01:22:32.000Z | 2022-03-31T08:33:55.000Z | import torch
from mmaction.models import build_recognizer
from ..base import generate_recognizer_demo_inputs, get_recognizer_cfg
def test_i3d():
config = get_recognizer_cfg('i3d/i3d_r50_32x2x1_100e_kinetics400_rgb.py')
config.model['backbone']['pretrained2d'] = False
config.model['backbone']['pretrained'] = None
recognizer = build_recognizer(config.model)
input_shape = (1, 3, 3, 8, 32, 32)
demo_inputs = generate_recognizer_demo_inputs(input_shape, '3D')
imgs = demo_inputs['imgs']
gt_labels = demo_inputs['gt_labels']
# parrots 3dconv is only implemented on gpu
if torch.__version__ == 'parrots':
if torch.cuda.is_available():
recognizer = recognizer.cuda()
imgs = imgs.cuda()
gt_labels = gt_labels.cuda()
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
# Test forward dummy
recognizer.forward_dummy(imgs, softmax=False)
res = recognizer.forward_dummy(imgs, softmax=True)[0]
assert torch.min(res) >= 0
assert torch.max(res) <= 1
else:
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
# Test forward dummy
recognizer.forward_dummy(imgs, softmax=False)
res = recognizer.forward_dummy(imgs, softmax=True)[0]
assert torch.min(res) >= 0
assert torch.max(res) <= 1
def test_r2plus1d():
config = get_recognizer_cfg(
'r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb.py')
config.model['backbone']['pretrained2d'] = False
config.model['backbone']['pretrained'] = None
config.model['backbone']['norm_cfg'] = dict(type='BN3d')
recognizer = build_recognizer(config.model)
input_shape = (1, 3, 3, 8, 32, 32)
demo_inputs = generate_recognizer_demo_inputs(input_shape, '3D')
imgs = demo_inputs['imgs']
gt_labels = demo_inputs['gt_labels']
# parrots 3dconv is only implemented on gpu
if torch.__version__ == 'parrots':
if torch.cuda.is_available():
recognizer = recognizer.cuda()
imgs = imgs.cuda()
gt_labels = gt_labels.cuda()
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
else:
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
def test_slowfast():
config = get_recognizer_cfg(
'slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb.py')
recognizer = build_recognizer(config.model)
input_shape = (1, 3, 3, 16, 32, 32)
demo_inputs = generate_recognizer_demo_inputs(input_shape, '3D')
imgs = demo_inputs['imgs']
gt_labels = demo_inputs['gt_labels']
# parrots 3dconv is only implemented on gpu
if torch.__version__ == 'parrots':
if torch.cuda.is_available():
recognizer = recognizer.cuda()
imgs = imgs.cuda()
gt_labels = gt_labels.cuda()
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
else:
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
# Test the feature max_testing_views
config.model.test_cfg['max_testing_views'] = 1
recognizer = build_recognizer(config.model)
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
def test_csn():
config = get_recognizer_cfg(
'csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb.py')
config.model['backbone']['pretrained2d'] = False
config.model['backbone']['pretrained'] = None
recognizer = build_recognizer(config.model)
input_shape = (1, 3, 3, 8, 32, 32)
demo_inputs = generate_recognizer_demo_inputs(input_shape, '3D')
imgs = demo_inputs['imgs']
gt_labels = demo_inputs['gt_labels']
# parrots 3dconv is only implemented on gpu
if torch.__version__ == 'parrots':
if torch.cuda.is_available():
recognizer = recognizer.cuda()
imgs = imgs.cuda()
gt_labels = gt_labels.cuda()
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
else:
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
def test_tpn():
config = get_recognizer_cfg(
'tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb.py')
config.model['backbone']['pretrained'] = None
recognizer = build_recognizer(config.model)
input_shape = (1, 8, 3, 1, 32, 32)
demo_inputs = generate_recognizer_demo_inputs(input_shape, '3D')
imgs = demo_inputs['imgs']
gt_labels = demo_inputs['gt_labels']
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
# Test dummy forward
with torch.no_grad():
_recognizer = build_recognizer(config.model)
img_list = [img[None, :] for img in imgs]
if hasattr(_recognizer, 'forward_dummy'):
_recognizer.forward = _recognizer.forward_dummy
for one_img in img_list:
_recognizer(one_img)
def test_c3d():
config = get_recognizer_cfg('c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb.py')
config.model['backbone']['pretrained'] = None
recognizer = build_recognizer(config.model)
input_shape = (1, 3, 3, 16, 112, 112)
demo_inputs = generate_recognizer_demo_inputs(input_shape, '3D')
imgs = demo_inputs['imgs']
gt_labels = demo_inputs['gt_labels']
losses = recognizer(imgs, gt_labels)
assert isinstance(losses, dict)
# Test forward test
with torch.no_grad():
img_list = [img[None, :] for img in imgs]
for one_img in img_list:
recognizer(one_img, None, return_loss=False)
# Test forward gradcam
recognizer(imgs, gradcam=True)
for one_img in img_list:
recognizer(one_img, gradcam=True)
| 32.841549 | 77 | 0.61724 | 1,150 | 9,327 | 4.761739 | 0.091304 | 0.04821 | 0.036158 | 0.044193 | 0.873996 | 0.860847 | 0.860847 | 0.860847 | 0.856099 | 0.84989 | 0 | 0.022895 | 0.288196 | 9,327 | 283 | 78 | 32.957597 | 0.801928 | 0.069583 | 0 | 0.875648 | 1 | 0 | 0.07076 | 0.033992 | 0 | 0 | 0 | 0 | 0.072539 | 1 | 0.031088 | false | 0 | 0.015544 | 0 | 0.046632 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
3111b9a5eace1c63c09b9f4335fc8ec7828a8b81 | 6,821 | py | Python | mosaic/test/test_space.py | JohnKurian/mosaic | bc985640844e10b9aded478d6ef7a5f286ab2d82 | [
"BSD-3-Clause"
] | 6 | 2018-09-17T13:27:46.000Z | 2021-09-03T15:46:15.000Z | mosaic/test/test_space.py | JohnKurian/mosaic | bc985640844e10b9aded478d6ef7a5f286ab2d82 | [
"BSD-3-Clause"
] | null | null | null | mosaic/test/test_space.py | JohnKurian/mosaic | bc985640844e10b9aded478d6ef7a5f286ab2d82 | [
"BSD-3-Clause"
] | 2 | 2019-11-21T13:17:11.000Z | 2020-11-30T03:32:41.000Z | import unittest
from mosaic.space import Space
from mosaic.simulation.parameter import Parameter
from mosaic.simulation.rules import ChildRule
from mosaic.simulation.scenario import WorkflowListTask, WorkflowChoiceScenario
class TestSpace(unittest.TestCase):
def test_init(self):
pass
def test_next_params(self):
def a_func(): return 0
def b_func(): return 0
def c_func(): return 0
x1 = WorkflowListTask(is_ordered=False, name ="x1", tasks = ["x1__p1", "x1__p2"])
x2 = WorkflowListTask(is_ordered=True, name ="x2", tasks = ["x2__p1", "x2__p2", "x2__p3"])
start = WorkflowChoiceScenario(name ="Model", scenarios=[x1, x2])
sampler = { "x1__p1": Parameter("x1__p1", [0, 1], "uniform", "float"),
"x1__p2": Parameter("x1__p2", [1, 2, 3, 4, 5, 6, 7], "choice", "int"),
"x2__p1": Parameter("x2__p1", ["a", "b", "c", "d"], "choice", "string"),
"x2__p2": Parameter("x2__p2", [a_func, b_func, c_func], "choice", "func"),
"x2__p3": Parameter("x2__p3", "lol", "constant", "string"),
}
space = Space(scenario = start, sampler = sampler)
for i in range(50):
assert(space.next_params(history=[("Model", None), ("x2", None), ("x2__p1", "c"), ("x2__p2", a_func)]) == ("x2__p3", "lol", True))
assert(space.next_params(history=[("Model", None), ("x2", None), ("x2__p1", "c")])[0] == "x2__p2")
assert(space.next_params(history=[("Model", None), ("x2", None), ("x2__p1", "a")])[0] == "x2__p2")
assert(space.next_params(history=[("Model", None), ("x2", None)])[0] == "x2__p1")
assert(space.next_params(history=[("Model", None), ("x1", None)])[0] in ["x1__p1", "x1__p2"])
assert(space.next_params(history=[("Model", None), ("x1", None), ("x1__p2", 5)])[0] in ["x1__p1", "x1__p2"])
assert(space.next_params(history=[("Model", None)])[0] in ["x1", "x2"])
def test_is_valid(self):
x1 = WorkflowListTask(is_ordered=False, name ="x1", tasks = ["x1__p1", "x1__p2"])
x2 = WorkflowListTask(is_ordered=True, name ="x2", tasks = ["x2__p1", "x2__p2", "x2__p3"])
start = WorkflowChoiceScenario(name ="Model", scenarios=[x1, x2])
sampler = { "x1__p1": Parameter("x1__p1", [0, 1], "uniform", "float"),
"x1__p2": Parameter("x1__p2", [1, 2, 3, 4, 5, 6, 7], "choice", "int"),
"x2__p1": Parameter("x2__p1", ["a", "b", "c", "d"], "choice", "string"),
"x2__p2": Parameter("x2__p2", [10, 11, 12], "choice", "int"),
"x2__p3": Parameter("x2__p3", "lol", "constant", "string"),
}
rules = [ChildRule(applied_to = ["x2__p2"], parent = "x2__p1", value = ["a"])]
space = Space(scenario = start, sampler = sampler, rules = rules)
assert(space.next_params(history=[("Model", None), ("x2", None), ("x2__p1", "a")])[0] in ["x2__p2", "x2__p3"])
assert(space.has_finite_child(history=[("Model", None), ("x2", None), ("x2__p1", "b")])[1] == 0)
assert(space.has_finite_child(history=[("Model", None), ("x2", None), ("x2__p1", "a")])[1] > 0)
def test_sample(self):
def a_func(): return 0
def b_func(): return 0
def c_func(): return 0
x1 = WorkflowListTask(is_ordered=False, name ="x1", tasks = ["x1__p1", "x1__p2"])
x2 = WorkflowListTask(is_ordered=True, name ="x2", tasks = ["x2__p1", "x2__p2"])
start = WorkflowChoiceScenario(name ="Model", scenarios=[x1, x2])
sampler = { "x1__p1": Parameter("x1__p1", [0, 1], "uniform", "float"),
"x1__p2": Parameter("x1__p2", [1, 2, 3, 4, 5, 6, 7], "choice", "int"),
"x2__p1": Parameter("x2__p1", ["a", "b", "c", "d"], "choice", "string"),
"x2__p2": Parameter("x2__p2", [a_func, b_func, c_func], "choice", "func"),
}
space = Space(scenario = start, sampler = sampler)
for i in range(10):
v = space.sample("x1__p1")
assert(v >= 0)
assert(v <= 1)
assert(space.sample("x1__p2") in [1, 2, 3, 4, 5, 6, 7])
assert(space.sample("x2__p1") in ["a", "b", "c", "d"])
assert(space.sample("x2__p2") in [a_func, b_func, c_func])
def test_playout(self):
def a_func(): return 0
def b_func(): return 0
def c_func(): return 0
x1 = WorkflowListTask(is_ordered=False, name ="x1", tasks = ["x1__p1", "x1__p2"])
x2 = WorkflowListTask(is_ordered=True, name ="x2", tasks = ["x2__p1", "x2__p2"])
start = WorkflowChoiceScenario(name ="Model", scenarios=[x1, x2])
sampler = { "x1__p1": Parameter("x1__p1", [0, 1], "uniform", "float"),
"x1__p2": Parameter("x1__p2", [1, 2, 3, 4, 5, 6, 7], "choice", "int"),
"x2__p1": Parameter("x2__p1", ["a", "b", "c", "d"], "choice", "string"),
"x2__p2": Parameter("x2__p2", [a_func, b_func, c_func], "choice", "func"),
}
space = Space(scenario = start, sampler = sampler)
for i in range(10):
space.playout(history = [("Model", None)])
def test_has_finite_child(self):
def a_func(): return 0
def b_func(): return 0
def c_func(): return 0
x1 = WorkflowListTask(is_ordered=False, name ="x1", tasks = ["x1__p1", "x1__p2"])
x2 = WorkflowListTask(is_ordered=True, name ="x2", tasks = ["x2__p1", "x2__p2"])
start = WorkflowChoiceScenario(name ="Model", scenarios=[x1, x2])
sampler = { "x1__p1": Parameter("x1__p1", [0, 1], "uniform", "float"),
"x1__p2": Parameter("x1__p2", [1, 2, 3, 4, 5, 6, 7], "choice", "int"),
"x2__p1": Parameter("x2__p1", ["a", "b", "c", "d"], "choice", "string"),
"x2__p2": Parameter("x2__p2", [a_func, b_func, c_func], "choice", "func"),
}
space = Space(scenario = start, sampler = sampler)
assert(space.has_finite_child(history = [("Model", None)]) == (False, 2))
assert(space.has_finite_child(history = [("Model", None), ("x1", None)]) == (False, 17))
assert(space.has_finite_child(history = [("Model", None), ("x1", None), ("x1__p1", 0.5)]) == (False, 7))
assert(space.has_finite_child(history = [("Model", None), ("x1", None), ("x1__p1", 0.5), ("x1__p2", 1)]) == (False, 0))
assert(space.has_finite_child(history = [("Model", None), ("x2", None)]) == (False, 4))
assert(space.has_finite_child(history = [("Model", None), ("x2", None), ("x2__p1", "a")]) == (False, 3))
assert(space.has_finite_child(history = [("Model", None), ("x2", None), ("x2__p1", "c"), ("x2__p2", c_func)]) == (False, 0))
| 51.285714 | 142 | 0.545081 | 897 | 6,821 | 3.846154 | 0.090301 | 0.030145 | 0.083478 | 0.052174 | 0.817971 | 0.817971 | 0.802319 | 0.80029 | 0.768696 | 0.754493 | 0 | 0.066343 | 0.244246 | 6,821 | 132 | 143 | 51.674242 | 0.60291 | 0 | 0 | 0.54 | 0 | 0 | 0.150858 | 0 | 0 | 0 | 0 | 0 | 0.22 | 1 | 0.18 | false | 0.01 | 0.05 | 0.12 | 0.24 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 7 |
3126ed0ba314d3fe98cee4e981455c8dadfb215b | 7,115 | py | Python | tests/examples/app/test_app.py | rgreinho/OpenAlchemy | 23202bdecb94763d09b6d9e84eb9b29506c811ae | [
"Apache-2.0"
] | null | null | null | tests/examples/app/test_app.py | rgreinho/OpenAlchemy | 23202bdecb94763d09b6d9e84eb9b29506c811ae | [
"Apache-2.0"
] | 53 | 2020-12-30T15:32:55.000Z | 2022-03-31T10:07:00.000Z | tests/examples/app/test_app.py | rgreinho/OpenAlchemy | 23202bdecb94763d09b6d9e84eb9b29506c811ae | [
"Apache-2.0"
] | null | null | null | """Tests for example app."""
# pylint: disable=no-member
import json
import pytest
from examples.app import models_autogenerated
from open_alchemy import models
@pytest.mark.app
def test_post(client, db_session):
"""
GIVEN employee
WHEN /employee POST is called with the employee
THEN the employee is in the database.
"""
employee = {"id": 1, "name": "name 1", "division": "division 1", "salary": 1.0}
response = client.post(
"/employee",
data=json.dumps(employee),
headers={"Content-Type": "application/json"},
)
assert response.status_code == 204
db_employees = db_session.query(models.Employee).all()
assert len(db_employees) == 1
db_employee = db_employees[0]
assert db_employee.id == employee["id"]
assert db_employee.name == employee["name"]
assert db_employee.division == employee["division"]
assert db_employee.salary == employee["salary"]
@pytest.mark.app
def test_post_duplicate(client):
"""
GIVEN employee
WHEN /employee POST is called twice with the employee
THEN 400 is returned.
"""
employee = {"id": 1, "name": "name 1", "division": "division 1", "salary": 1.0}
client.post(
"/employee",
data=json.dumps(employee),
headers={"Content-Type": "application/json"},
)
response = client.post(
"/employee",
data=json.dumps(employee),
headers={"Content-Type": "application/json"},
)
assert response.status_code == 400
@pytest.mark.app
def test_get(client, db_session):
"""
GIVEN database with employee
WHEN /employee GET is called
THEN the employee is returned.
"""
db_employee = models.Employee(
id=1, name="name 1", division="division 1", salary=1.0
)
db_session.add(db_employee)
db_session.flush()
response = client.get("/employee")
assert response.status_code == 200
employees = response.json
assert len(employees) == 1
employee = employees[0]
assert employee["id"] == db_employee.id
assert employee["name"] == db_employee.name
assert employee["division"] == db_employee.division
assert employee["salary"] == db_employee.salary
@pytest.mark.app
def test_get_id_miss(client, db_session):
"""
GIVEN database with employee
WHEN /employee/{id} GET is called with a different id
THEN 404 is returned.
"""
db_employee = models.Employee(
id=1, name="name 1", division="division 1", salary=1.0
)
db_session.add(db_employee)
db_session.flush()
response = client.get("/employee/2")
assert response.status_code == 404
@pytest.mark.app
def test_get_id_hit(client, db_session):
"""
GIVEN database with employee
WHEN /employee/{id} GET is called with the id of the employee
THEN the employee is returned.
"""
db_employee = models.Employee(
id=1, name="name 1", division="division 1", salary=1.0
)
db_session.add(db_employee)
db_session.flush()
response = client.get(f"/employee/{db_employee.id}")
assert response.status_code == 200
employee = response.json
assert employee["id"] == db_employee.id
assert employee["name"] == db_employee.name
assert employee["division"] == db_employee.division
assert employee["salary"] == db_employee.salary
@pytest.mark.app
def test_patch_id_miss(client, db_session):
"""
GIVEN database with employee
WHEN /employee/{id} PATCH is called with a different id
THEN 404 is returned.
"""
db_employee = models.Employee(
id=1, name="name 1", division="division 1", salary=1.0
)
db_session.add(db_employee)
db_session.flush()
employee = {"id": 2, "name": "name 2", "division": "division 2", "salary": 2.0}
response = client.patch(
"/employee/2",
data=json.dumps(employee),
headers={"Content-Type": "application/json"},
)
assert response.status_code == 404
@pytest.mark.app
def test_patch_id_hit(client, db_session):
"""
GIVEN database with employee
WHEN /employee/{id} PATCH is called with the id of the employee
THEN the employee is updated.
"""
db_employee = models.Employee(
id=1, name="name 1", division="division 1", salary=1.0
)
db_session.add(db_employee)
db_session.flush()
employee = {
"id": db_employee.id,
"name": "name 2",
"division": "division 2",
"salary": 2.0,
}
response = client.patch(
f"/employee/{db_employee.id}",
data=json.dumps(employee),
headers={"Content-Type": "application/json"},
)
assert response.status_code == 200
db_session.refresh(db_employee)
assert db_employee.id == employee["id"]
assert db_employee.name == employee["name"]
assert db_employee.division == employee["division"]
assert db_employee.salary == employee["salary"]
@pytest.mark.app
def test_delete_id_miss(client, db_session):
"""
GIVEN database with employee
WHEN /employee/{id} DELETE is called with a different id
THEN 404 is returned.
"""
db_employee = models.Employee(
id=1, name="name 1", division="division 1", salary=1.0
)
db_session.add(db_employee)
db_session.flush()
response = client.delete("/employee/2")
assert response.status_code == 404
@pytest.mark.app
def test_delete_id_hit(client, db_session):
"""
GIVEN database with employee
WHEN /employee/{id} DELETE is called with the id of the employee
THEN the employee is updated.
"""
db_employee = models.Employee(
id=1, name="name 1", division="division 1", salary=1.0
)
db_session.add(db_employee)
db_session.flush()
response = client.delete(f"/employee/{db_employee.id}")
assert response.status_code == 200
db_employees = db_session.query(models.Employee).all()
assert len(db_employees) == 0
@pytest.mark.app
def test_models_autogen_init(db_session, employee_kwargs):
"""
GIVEN autogenerated models
WHEN a model is constructed using __init__ and added to the session
THEN the employee is in the database.
"""
employee = models_autogenerated.Employee(**employee_kwargs)
db_session.add(employee)
queried_employees = db_session.query(models.Employee).all()
assert len(queried_employees) == 1
queried_employee = queried_employees[0]
for key, value in employee_kwargs.items():
assert getattr(queried_employee, key) == value
@pytest.mark.app
def test_models_autogen_from_dict(db_session, employee_kwargs):
"""
GIVEN autogenerated models
WHEN a model is constructed using __init__ and added to the session
THEN the employee is in the database.
"""
employee = models_autogenerated.Employee.from_dict(**employee_kwargs)
db_session.add(employee)
queried_employees = db_session.query(models.Employee).all()
assert len(queried_employees) == 1
queried_employee = queried_employees[0]
for key, value in employee_kwargs.items():
assert getattr(queried_employee, key) == value
| 28.011811 | 83 | 0.665495 | 931 | 7,115 | 4.936627 | 0.09667 | 0.078329 | 0.031114 | 0.038294 | 0.91906 | 0.906005 | 0.891427 | 0.854874 | 0.848346 | 0.837032 | 0 | 0.018773 | 0.213914 | 7,115 | 253 | 84 | 28.12253 | 0.802968 | 0.180604 | 0 | 0.637584 | 1 | 0 | 0.111829 | 0.014001 | 0 | 0 | 0 | 0 | 0.214765 | 1 | 0.073826 | false | 0 | 0.026846 | 0 | 0.100671 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3133cad967e604fa090b0ff28cb17ab87dadbbb0 | 575 | py | Python | deeptensor/vis/__init__.py | cicicici/deeptensor | efcd7b9ca2d758cb2461b64fa5ba1268685e4dab | [
"MIT"
] | 1 | 2018-04-04T09:00:45.000Z | 2018-04-04T09:00:45.000Z | deeptensor/vis/__init__.py | cicicici/deeptensor | efcd7b9ca2d758cb2461b64fa5ba1268685e4dab | [
"MIT"
] | null | null | null | deeptensor/vis/__init__.py | cicicici/deeptensor | efcd7b9ca2d758cb2461b64fa5ba1268685e4dab | [
"MIT"
] | null | null | null | from .summary import set_default_writer, get_default_writer, add_graph, add_image, add_images_grid, \
add_images, add_scalar, add_histogram, add_figure, add_video, add_audio, add_text, \
summary_tensor, summary_tensor_abs, summary_tensor_clamp, set_default_writer, \
get_default_writer, add_graph, add_image, add_images_grid, add_images, add_scalar, \
add_histogram, add_figure, add_video, add_audio, add_text, summary_tensor, \
summary_tensor_abs, summary_tensor_clamp
| 82.142857 | 105 | 0.690435 | 73 | 575 | 4.890411 | 0.287671 | 0.218487 | 0.089636 | 0.106443 | 0.952381 | 0.952381 | 0.952381 | 0.952381 | 0.952381 | 0.952381 | 0 | 0 | 0.248696 | 575 | 6 | 106 | 95.833333 | 0.826389 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.166667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
313b7605c7cdd093c73a7fab2c006a1cdb50229f | 9,930 | py | Python | tests/functional/testplan/runnable/interactive/reports/basic_run_suite_test2.py | raoyitao/testplan | aae3e9cee597ca3d01b6d64eed2642c421c56cbb | [
"Apache-2.0"
] | 96 | 2018-03-14T13:14:50.000Z | 2021-01-14T08:26:08.000Z | tests/functional/testplan/runnable/interactive/reports/basic_run_suite_test2.py | raoyitao/testplan | aae3e9cee597ca3d01b6d64eed2642c421c56cbb | [
"Apache-2.0"
] | 135 | 2018-06-28T02:41:05.000Z | 2021-01-19T02:16:58.000Z | tests/functional/testplan/runnable/interactive/reports/basic_run_suite_test2.py | raoyitao/testplan | aae3e9cee597ca3d01b6d64eed2642c421c56cbb | [
"Apache-2.0"
] | 53 | 2018-03-17T14:39:15.000Z | 2021-01-21T10:54:13.000Z | from collections import Counter
REPORT = {
"strict_order": False,
"status_override": None,
"env_status": "STARTED",
"status_reason": None,
"fix_spec_path": None,
"uid": "Test2",
"entries": [
{
"strict_order": False,
"status_override": None,
"env_status": None,
"status_reason": None,
"fix_spec_path": None,
"uid": "BasicSuite",
"entries": [
{
"strict_order": False,
"status_override": None,
"env_status": None,
"status_reason": None,
"fix_spec_path": None,
"uid": "basic_case",
"entries": [
{
"logs": [],
"status_override": None,
"suite_related": False,
"runtime_status": "ready",
"timer": {},
"tags": {},
"description": None,
"parent_uids": [
"InteractivePlan",
"Test2",
"BasicSuite",
"basic_case",
],
"counter": Counter(
{
"total": 1,
"unknown": 1,
"passed": 0,
"failed": 0,
}
),
"status": "unknown",
"category": "testcase",
"status_reason": None,
"name": "basic_case <arg=0>",
"uid": "basic_case__arg_0",
"type": "TestCaseReport",
"entries": [],
"hash": 0,
},
{
"logs": [],
"status_override": None,
"suite_related": False,
"runtime_status": "ready",
"timer": {},
"tags": {},
"description": None,
"parent_uids": [
"InteractivePlan",
"Test2",
"BasicSuite",
"basic_case",
],
"counter": Counter(
{
"total": 1,
"unknown": 1,
"passed": 0,
"failed": 0,
}
),
"status": "unknown",
"category": "testcase",
"status_reason": None,
"name": "basic_case <arg=1>",
"uid": "basic_case__arg_1",
"type": "TestCaseReport",
"entries": [],
"hash": 0,
},
{
"logs": [],
"status_override": None,
"suite_related": False,
"runtime_status": "ready",
"timer": {},
"tags": {},
"description": None,
"parent_uids": [
"InteractivePlan",
"Test2",
"BasicSuite",
"basic_case",
],
"counter": Counter(
{
"total": 1,
"unknown": 1,
"passed": 0,
"failed": 0,
}
),
"status": "unknown",
"category": "testcase",
"status_reason": None,
"name": "basic_case <arg=2>",
"uid": "basic_case__arg_2",
"type": "TestCaseReport",
"entries": [],
"hash": 0,
},
],
"counter": Counter(
{"total": 3, "unknown": 3, "passed": 0, "failed": 0}
),
"part": None,
"status": "unknown",
"type": "TestGroupReport",
"timer": {},
"runtime_status": "ready",
"description": None,
"category": "parametrization",
"name": "basic_case",
"logs": [],
"tags": {},
"parent_uids": ["InteractivePlan", "Test2", "BasicSuite"],
"hash": 0,
}
],
"counter": Counter(
{"total": 3, "unknown": 3, "passed": 0, "failed": 0}
),
"part": None,
"status": "unknown",
"type": "TestGroupReport",
"timer": {},
"runtime_status": "ready",
"description": None,
"category": "testsuite",
"name": "BasicSuite",
"logs": [],
"tags": {},
"parent_uids": ["InteractivePlan", "Test2"],
"hash": 0,
},
{
"strict_order": False,
"status_override": None,
"env_status": None,
"status_reason": None,
"fix_spec_path": None,
"uid": "Custom_0",
"entries": [
{
"logs": [],
"status_override": None,
"suite_related": False,
"runtime_status": "ready",
"timer": {},
"tags": {},
"description": "Client sends a message, server received and responds back.",
"parent_uids": ["InteractivePlan", "Test2", "Custom_0"],
"counter": Counter(
{"total": 1, "unknown": 1, "passed": 0, "failed": 0}
),
"status": "unknown",
"category": "testcase",
"status_reason": None,
"name": "send_and_receive_msg",
"uid": "send_and_receive_msg",
"type": "TestCaseReport",
"entries": [],
"hash": 0,
}
],
"counter": Counter(
{"total": 1, "unknown": 1, "passed": 0, "failed": 0}
),
"part": None,
"status": "unknown",
"type": "TestGroupReport",
"timer": {},
"runtime_status": "ready",
"description": None,
"category": "testsuite",
"name": "Custom_0",
"logs": [],
"tags": {},
"parent_uids": ["InteractivePlan", "Test2"],
"hash": 0,
},
{
"strict_order": False,
"status_override": None,
"env_status": None,
"status_reason": None,
"fix_spec_path": None,
"uid": "Custom_1",
"entries": [
{
"logs": [],
"status_override": None,
"suite_related": False,
"runtime_status": "ready",
"timer": {},
"tags": {},
"description": "Client sends a message, server received and responds back.",
"parent_uids": ["InteractivePlan", "Test2", "Custom_1"],
"counter": Counter(
{"total": 1, "unknown": 1, "passed": 0, "failed": 0}
),
"status": "unknown",
"category": "testcase",
"status_reason": None,
"name": "send_and_receive_msg",
"uid": "send_and_receive_msg",
"type": "TestCaseReport",
"entries": [],
"hash": 0,
}
],
"counter": Counter(
{"total": 1, "unknown": 1, "passed": 0, "failed": 0}
),
"part": None,
"status": "unknown",
"type": "TestGroupReport",
"timer": {},
"runtime_status": "ready",
"description": None,
"category": "testsuite",
"name": "Custom_1",
"logs": [],
"tags": {},
"parent_uids": ["InteractivePlan", "Test2"],
"hash": 0,
},
],
"counter": Counter({"total": 5, "unknown": 5, "passed": 0, "failed": 0}),
"part": None,
"status": "unknown",
"type": "TestGroupReport",
"timer": {},
"runtime_status": "ready",
"description": None,
"category": "multitest",
"name": "Test2",
"logs": [],
"tags": {},
"parent_uids": ["InteractivePlan"],
"hash": 0,
}
| 38.045977 | 96 | 0.308862 | 555 | 9,930 | 5.340541 | 0.122523 | 0.033401 | 0.060729 | 0.047233 | 0.93556 | 0.914642 | 0.901822 | 0.901822 | 0.852901 | 0.852901 | 0 | 0.016809 | 0.562638 | 9,930 | 260 | 97 | 38.192308 | 0.665669 | 0 | 0 | 0.803089 | 0 | 0 | 0.274622 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.03861 | 0.003861 | 0 | 0.003861 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
31d6105ea8cf858fb59d779ec5ce8fc3269aeb38 | 136 | py | Python | help/access_db.py | SriGuru05/my | b720b96dae5030683ceea0a1ddd9e9e03e4c1c18 | [
"Apache-2.0"
] | null | null | null | help/access_db.py | SriGuru05/my | b720b96dae5030683ceea0a1ddd9e9e03e4c1c18 | [
"Apache-2.0"
] | null | null | null | help/access_db.py | SriGuru05/my | b720b96dae5030683ceea0a1ddd9e9e03e4c1c18 | [
"Apache-2.0"
] | null | null | null | # (c) @DKBOTZ
from bot import DATABASE_URI, BOT_USERNAME
from help.database import Database
db = Database(DATABASE_URI, BOT_USERNAME)
| 19.428571 | 42 | 0.794118 | 20 | 136 | 5.2 | 0.5 | 0.269231 | 0.269231 | 0.423077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132353 | 136 | 6 | 43 | 22.666667 | 0.881356 | 0.080882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
31d6ab4b8d176e2b7384c34efc580847602b387c | 995,847 | py | Python | logicmonitor_sdk/api/lm_api.py | JeremyTangCD/lm-sdk-python | 2a15e055e5a3f72d2f2e4fb43bdbed203c5a9983 | [
"Apache-2.0"
] | null | null | null | logicmonitor_sdk/api/lm_api.py | JeremyTangCD/lm-sdk-python | 2a15e055e5a3f72d2f2e4fb43bdbed203c5a9983 | [
"Apache-2.0"
] | null | null | null | logicmonitor_sdk/api/lm_api.py | JeremyTangCD/lm-sdk-python | 2a15e055e5a3f72d2f2e4fb43bdbed203c5a9983 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
LogicMonitor REST API
LogicMonitor is a SaaS-based performance monitoring platform that provides full visibility into complex, hybrid infrastructures, offering granular performance monitoring and actionable data and insights. logicmonitor_sdk enables you to manage your LogicMonitor account programmatically. # noqa: E501
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from logicmonitor_sdk.api_client import ApiClient
class LMApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def ack_alert_by_id(self, body, id, **kwargs): # noqa: E501
"""ack alert by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.ack_alert_by_id(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertAck body: (required)
:param str id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.ack_alert_by_id_with_http_info(body, id, **kwargs) # noqa: E501
else:
(data) = self.ack_alert_by_id_with_http_info(body, id, **kwargs) # noqa: E501
return data
def ack_alert_by_id_with_http_info(self, body, id, **kwargs): # noqa: E501
"""ack alert by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.ack_alert_by_id_with_http_info(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertAck body: (required)
:param str id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method ack_alert_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `ack_alert_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `ack_alert_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/alert/alerts/{id}/ack', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def ack_collector_down_alert_by_id(self, id, body, **kwargs): # noqa: E501
"""ack collector down alert # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.ack_collector_down_alert_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param AckCollectorDown body: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.ack_collector_down_alert_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.ack_collector_down_alert_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def ack_collector_down_alert_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""ack collector down alert # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.ack_collector_down_alert_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param AckCollectorDown body: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method ack_collector_down_alert_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `ack_collector_down_alert_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `ack_collector_down_alert_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
raise ValueError("Invalid value for parameter `id` when calling `ack_collector_down_alert_by_id`, must conform to the pattern `/\\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/collectors/{id}/ackdown', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_admin(self, body, **kwargs): # noqa: E501
"""add user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_admin(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Admin body: (required)
:return: Admin
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_admin_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_admin_with_http_info(body, **kwargs) # noqa: E501
return data
def add_admin_with_http_info(self, body, **kwargs): # noqa: E501
"""add user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_admin_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Admin body: (required)
:return: Admin
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_admin" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_admin`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Admin', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_alert_note_by_id(self, body, id, **kwargs): # noqa: E501
"""add alert note # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_alert_note_by_id(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertAck body: (required)
:param str id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_alert_note_by_id_with_http_info(body, id, **kwargs) # noqa: E501
else:
(data) = self.add_alert_note_by_id_with_http_info(body, id, **kwargs) # noqa: E501
return data
def add_alert_note_by_id_with_http_info(self, body, id, **kwargs): # noqa: E501
"""add alert note # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_alert_note_by_id_with_http_info(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertAck body: (required)
:param str id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_alert_note_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_alert_note_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `add_alert_note_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/alert/alerts/{id}/note', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_alert_rule(self, body, **kwargs): # noqa: E501
"""add alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_alert_rule(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertRule body: (required)
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_alert_rule_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_alert_rule_with_http_info(body, **kwargs) # noqa: E501
return data
def add_alert_rule_with_http_info(self, body, **kwargs): # noqa: E501
"""add alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_alert_rule_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertRule body: (required)
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_alert_rule" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_alert_rule`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/rules', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertRule', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_api_token_by_admin_id(self, admin_id, body, **kwargs): # noqa: E501
"""add api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_api_token_by_admin_id(admin_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param APIToken body: (required)
:return: APIToken
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_api_token_by_admin_id_with_http_info(admin_id, body, **kwargs) # noqa: E501
else:
(data) = self.add_api_token_by_admin_id_with_http_info(admin_id, body, **kwargs) # noqa: E501
return data
def add_api_token_by_admin_id_with_http_info(self, admin_id, body, **kwargs): # noqa: E501
"""add api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_api_token_by_admin_id_with_http_info(admin_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param APIToken body: (required)
:return: APIToken
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['admin_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_api_token_by_admin_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'admin_id' is set
if ('admin_id' not in params or
params['admin_id'] is None):
raise ValueError("Missing the required parameter `admin_id` when calling `add_api_token_by_admin_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_api_token_by_admin_id`") # noqa: E501
if 'admin_id' in params and not re.search(r'\d+', params['admin_id'] if type(params['admin_id']) is str else str(params['admin_id'])):  # noqa: E501
raise ValueError("Invalid value for parameter `admin_id` when calling `add_api_token_by_admin_id`, must conform to the pattern `/\\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'admin_id' in params:
path_params['adminId'] = params['admin_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/{adminId}/apitokens', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='APIToken', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_collector(self, body, **kwargs): # noqa: E501
"""add collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_collector(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Collector body: (required)
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_collector_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_collector_with_http_info(body, **kwargs) # noqa: E501
return data
def add_collector_with_http_info(self, body, **kwargs): # noqa: E501
"""add collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_collector_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Collector body: (required)
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_collector" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_collector`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/collectors', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collector', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_collector_group(self, body, **kwargs): # noqa: E501
"""add collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_collector_group(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CollectorGroup body: (required)
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_collector_group_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_collector_group_with_http_info(body, **kwargs) # noqa: E501
return data
def add_collector_group_with_http_info(self, body, **kwargs): # noqa: E501
"""add collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_collector_group_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param CollectorGroup body: (required)
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_collector_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_collector_group`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/groups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CollectorGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_dashboard(self, body, **kwargs): # noqa: E501
"""add dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_dashboard(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Dashboard body: (required)
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_dashboard_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_dashboard_with_http_info(body, **kwargs) # noqa: E501
return data
def add_dashboard_with_http_info(self, body, **kwargs): # noqa: E501
"""add dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_dashboard_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Dashboard body: (required)
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_dashboard" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_dashboard`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/dashboards', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Dashboard', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_dashboard_group(self, body, **kwargs): # noqa: E501
"""add dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_dashboard_group(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DashboardGroup body: (required)
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_dashboard_group_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_dashboard_group_with_http_info(body, **kwargs) # noqa: E501
return data
def add_dashboard_group_with_http_info(self, body, **kwargs): # noqa: E501
"""add dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_dashboard_group_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DashboardGroup body: (required)
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_dashboard_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_dashboard_group`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/groups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DashboardGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_device(self, body, **kwargs): # noqa: E501
"""add a new device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Device body: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param bool add_from_wizard:
:return: Device
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_device_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_device_with_http_info(body, **kwargs) # noqa: E501
return data
def add_device_with_http_info(self, body, **kwargs): # noqa: E501
"""add a new device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Device body: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param bool add_from_wizard:
:return: Device
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'start', 'end', 'netflow_filter', 'add_from_wizard'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_device" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_device`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'add_from_wizard' in params:
query_params.append(('addFromWizard', params['add_from_wizard'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Device', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_device_datasource_instance(self, device_id, hds_id, body, **kwargs): # noqa: E501
"""add device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_datasource_instance(device_id, hds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param DeviceDataSourceInstance body: (required)
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_device_datasource_instance_with_http_info(device_id, hds_id, body, **kwargs) # noqa: E501
else:
(data) = self.add_device_datasource_instance_with_http_info(device_id, hds_id, body, **kwargs) # noqa: E501
return data
def add_device_datasource_instance_with_http_info(self, device_id, hds_id, body, **kwargs): # noqa: E501
"""add device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_datasource_instance_with_http_info(device_id, hds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param DeviceDataSourceInstance body: (required)
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_device_datasource_instance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `add_device_datasource_instance`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `add_device_datasource_instance`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_device_datasource_instance`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `add_device_datasource_instance`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if type(params['hds_id']) is str else str(params['hds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `hds_id` when calling `add_device_datasource_instance`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstance', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_device_datasource_instance_group(self, device_id, device_ds_id, body, **kwargs): # noqa: E501
"""add device datasource instance group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_datasource_instance_group(device_id, device_ds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int device_ds_id: The device-datasource ID you'd like to add an instance group for (required)
:param DeviceDataSourceInstanceGroup body: (required)
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_device_datasource_instance_group_with_http_info(device_id, device_ds_id, body, **kwargs) # noqa: E501
else:
(data) = self.add_device_datasource_instance_group_with_http_info(device_id, device_ds_id, body, **kwargs) # noqa: E501
return data
def add_device_datasource_instance_group_with_http_info(self, device_id, device_ds_id, body, **kwargs): # noqa: E501
"""add device datasource instance group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_datasource_instance_group_with_http_info(device_id, device_ds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int device_ds_id: The device-datasource ID you'd like to add an instance group for (required)
:param DeviceDataSourceInstanceGroup body: (required)
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'device_ds_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_device_datasource_instance_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `add_device_datasource_instance_group`") # noqa: E501
# verify the required parameter 'device_ds_id' is set
if ('device_ds_id' not in params or
params['device_ds_id'] is None):
raise ValueError("Missing the required parameter `device_ds_id` when calling `add_device_datasource_instance_group`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_device_datasource_instance_group`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `add_device_datasource_instance_group`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'device_ds_id' in params and not re.search(r'\d+', params['device_ds_id'] if type(params['device_ds_id']) is str else str(params['device_ds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_ds_id` when calling `add_device_datasource_instance_group`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'device_ds_id' in params:
path_params['deviceDsId'] = params['device_ds_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{deviceDsId}/groups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_device_group(self, body, **kwargs): # noqa: E501
"""add device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_group(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DeviceGroup body: (required)
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_device_group_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_device_group_with_http_info(body, **kwargs) # noqa: E501
return data
def add_device_group_with_http_info(self, body, **kwargs): # noqa: E501
"""add device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_group_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param DeviceGroup body: (required)
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_device_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_device_group`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_device_group_cluster_alert_conf(self, device_group_id, body, **kwargs): # noqa: E501
"""Add cluster alert configuration # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_group_cluster_alert_conf(device_group_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param DeviceClusterAlertConfig body: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_device_group_cluster_alert_conf_with_http_info(device_group_id, body, **kwargs) # noqa: E501
else:
(data) = self.add_device_group_cluster_alert_conf_with_http_info(device_group_id, body, **kwargs) # noqa: E501
return data
def add_device_group_cluster_alert_conf_with_http_info(self, device_group_id, body, **kwargs): # noqa: E501
"""Add cluster alert configuration # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_group_cluster_alert_conf_with_http_info(device_group_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param DeviceClusterAlertConfig body: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_device_group_cluster_alert_conf" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `add_device_group_cluster_alert_conf`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_device_group_cluster_alert_conf`") # noqa: E501
        if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if type(params['device_group_id']) is str else str(params['device_group_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_group_id` when calling `add_device_group_cluster_alert_conf`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/clusterAlertConf', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceClusterAlertConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_device_group_property(self, gid, body, **kwargs): # noqa: E501
"""add device group property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_group_property(gid, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int gid: group ID (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_device_group_property_with_http_info(gid, body, **kwargs) # noqa: E501
else:
(data) = self.add_device_group_property_with_http_info(gid, body, **kwargs) # noqa: E501
return data
def add_device_group_property_with_http_info(self, gid, body, **kwargs): # noqa: E501
"""add device group property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_group_property_with_http_info(gid, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int gid: group ID (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['gid', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_device_group_property" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'gid' is set
if ('gid' not in params or
params['gid'] is None):
raise ValueError("Missing the required parameter `gid` when calling `add_device_group_property`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_device_group_property`") # noqa: E501
        if 'gid' in params and not re.search(r'\d+', params['gid'] if type(params['gid']) is str else str(params['gid'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `gid` when calling `add_device_group_property`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'gid' in params:
path_params['gid'] = params['gid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{gid}/properties', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EntityProperty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_device_property(self, device_id, body, **kwargs): # noqa: E501
"""add device property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_property(device_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_device_property_with_http_info(device_id, body, **kwargs) # noqa: E501
else:
(data) = self.add_device_property_with_http_info(device_id, body, **kwargs) # noqa: E501
return data
def add_device_property_with_http_info(self, device_id, body, **kwargs): # noqa: E501
"""add device property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_device_property_with_http_info(device_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_device_property" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `add_device_property`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_device_property`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `add_device_property`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/properties', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EntityProperty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_escalation_chain(self, body, **kwargs): # noqa: E501
"""add escalation chain # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_escalation_chain(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param EscalatingChain body: (required)
:return: EscalatingChain
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_escalation_chain_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_escalation_chain_with_http_info(body, **kwargs) # noqa: E501
return data
def add_escalation_chain_with_http_info(self, body, **kwargs): # noqa: E501
"""add escalation chain # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_escalation_chain_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param EscalatingChain body: (required)
:return: EscalatingChain
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_escalation_chain" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_escalation_chain`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/chains', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EscalatingChain', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_netscan(self, **kwargs): # noqa: E501
"""add a new netscan # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_netscan(async_req=True)
>>> result = thread.get()
:param async_req bool
:param Netscan body:
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_netscan_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.add_netscan_with_http_info(**kwargs) # noqa: E501
return data
def add_netscan_with_http_info(self, **kwargs): # noqa: E501
"""add a new netscan # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_netscan_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param Netscan body:
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_netscan" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/netscans', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Netscan', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_ops_note(self, body, **kwargs): # noqa: E501
"""add opsnote # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_ops_note(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param OpsNote body: (required)
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_ops_note_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.add_ops_note_with_http_info(body, **kwargs) # noqa: E501
return data
def add_ops_note_with_http_info(self, body, **kwargs): # noqa: E501
"""add opsnote # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_ops_note_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param OpsNote body: (required)
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_ops_note" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_ops_note`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/opsnotes', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OpsNote', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_recipient_group(self, body, **kwargs): # noqa: E501
"""add recipient group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_recipient_group(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param RecipientGroup body: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_recipient_group_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_recipient_group_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_recipient_group_with_http_info(self, body, **kwargs): # noqa: E501
"""add recipient group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_recipient_group_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param RecipientGroup body: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_recipient_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_recipient_group`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/recipientgroups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RecipientGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_report(self, body, **kwargs): # noqa: E501
"""add report # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_report(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param ReportBase body: (required)
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_report_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_report_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_report_with_http_info(self, body, **kwargs): # noqa: E501
"""add report # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_report_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param ReportBase body: (required)
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_report" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_report`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/reports', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportBase', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_report_group(self, body, **kwargs): # noqa: E501
"""add report group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_report_group(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param ReportGroup body: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_report_group_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_report_group_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_report_group_with_http_info(self, body, **kwargs): # noqa: E501
"""add report group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_report_group_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param ReportGroup body: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_report_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_report_group`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/groups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_role(self, body, **kwargs): # noqa: E501
"""add role # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_role(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Role body: (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_role_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_role_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_role_with_http_info(self, body, **kwargs): # noqa: E501
"""add role # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_role_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Role body: (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_role" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_role`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/roles', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Role', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_sdt(self, body, **kwargs): # noqa: E501
"""add SDT # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_sdt(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param SDT body: (required)
:return: SDT
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_sdt_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_sdt_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_sdt_with_http_info(self, body, **kwargs): # noqa: E501
"""add SDT # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_sdt_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param SDT body: (required)
:return: SDT
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_sdt" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_sdt`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/sdt/sdts', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SDT', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_website(self, body, **kwargs): # noqa: E501
"""add website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_website(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Website body: (required)
:return: Website
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_website_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_website_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_website_with_http_info(self, body, **kwargs): # noqa: E501
"""add website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_website_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Website body: (required)
:return: Website
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_website" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_website`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Website', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_website_group(self, body, **kwargs): # noqa: E501
"""add website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_website_group(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param WebsiteGroup body: (required)
:return: WebsiteGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_website_group_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_website_group_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_website_group_with_http_info(self, body, **kwargs): # noqa: E501
"""add website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_website_group_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param WebsiteGroup body: (required)
:return: WebsiteGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_website_group" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_website_group`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/groups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WebsiteGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def add_widget(self, body, **kwargs): # noqa: E501
"""add widget # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_widget(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Widget body: (required)
:return: Widget
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.add_widget_with_http_info(body, **kwargs) # noqa: E501
else:
data = self.add_widget_with_http_info(body, **kwargs)  # noqa: E501
return data
def add_widget_with_http_info(self, body, **kwargs): # noqa: E501
"""add widget # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.add_widget_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param Widget body: (required)
:return: Widget
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_widget" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `add_widget`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/widgets', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Widget', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def collect_device_config_source_config(self, device_id, hds_id, instance_id, **kwargs): # noqa: E501
"""collect a config for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.collect_device_config_source_config(device_id, hds_id, instance_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: (required)
:param int instance_id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.collect_device_config_source_config_with_http_info(device_id, hds_id, instance_id, **kwargs) # noqa: E501
else:
data = self.collect_device_config_source_config_with_http_info(device_id, hds_id, instance_id, **kwargs)  # noqa: E501
return data
def collect_device_config_source_config_with_http_info(self, device_id, hds_id, instance_id, **kwargs): # noqa: E501
"""collect a config for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.collect_device_config_source_config_with_http_info(device_id, hds_id, instance_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: (required)
:param int instance_id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'instance_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method collect_device_config_source_config" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `collect_device_config_source_config`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `collect_device_config_source_config`") # noqa: E501
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params or
params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `collect_device_config_source_config`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `collect_device_config_source_config`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if type(params['hds_id']) is str else str(params['hds_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `collect_device_config_source_config`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if type(params['instance_id']) is str else str(params['instance_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `instance_id` when calling `collect_device_config_source_config`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{instanceId}/config/collectNow', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_admin_by_id(self, id, **kwargs): # noqa: E501
"""delete user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_admin_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_admin_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.delete_admin_by_id_with_http_info(id, **kwargs)  # noqa: E501
return data
def delete_admin_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""delete user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_admin_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_admin_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `delete_admin_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `delete_admin_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_alert_rule_by_id(self, id, **kwargs): # noqa: E501
"""delete alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_alert_rule_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_alert_rule_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.delete_alert_rule_by_id_with_http_info(id, **kwargs)  # noqa: E501
return data
def delete_alert_rule_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""delete alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_alert_rule_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_alert_rule_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `delete_alert_rule_by_id`") # noqa: E501
if 'id' in params and not re.search('\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError("Invalid value for parameter `id` when calling `delete_alert_rule_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/rules/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

    def delete_api_token_by_id(self, admin_id, apitoken_id, **kwargs):  # noqa: E501
        """delete apiToken  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_api_token_by_id(admin_id, apitoken_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int admin_id: (required)
        :param int apitoken_id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_api_token_by_id_with_http_info(admin_id, apitoken_id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_api_token_by_id_with_http_info(admin_id, apitoken_id, **kwargs)  # noqa: E501
            return data

    def delete_api_token_by_id_with_http_info(self, admin_id, apitoken_id, **kwargs):  # noqa: E501
        """delete apiToken  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_api_token_by_id_with_http_info(admin_id, apitoken_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int admin_id: (required)
        :param int apitoken_id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['admin_id', 'apitoken_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_api_token_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'admin_id' is set
        if ('admin_id' not in params or
                params['admin_id'] is None):
            raise ValueError("Missing the required parameter `admin_id` when calling `delete_api_token_by_id`")  # noqa: E501
        # verify the required parameter 'apitoken_id' is set
        if ('apitoken_id' not in params or
                params['apitoken_id'] is None):
            raise ValueError("Missing the required parameter `apitoken_id` when calling `delete_api_token_by_id`")  # noqa: E501

        if 'admin_id' in params and not re.search(r'\d+', params['admin_id'] if type(params['admin_id']) is str else str(params['admin_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `admin_id` when calling `delete_api_token_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'apitoken_id' in params and not re.search(r'\d+', params['apitoken_id'] if type(params['apitoken_id']) is str else str(params['apitoken_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `apitoken_id` when calling `delete_api_token_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'admin_id' in params:
            path_params['adminId'] = params['admin_id']  # noqa: E501
        if 'apitoken_id' in params:
            path_params['apitokenId'] = params['apitoken_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/admins/{adminId}/apitokens/{apitokenId}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_collector_by_id(self, id, **kwargs):  # noqa: E501
        """delete collector  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_collector_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_collector_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_collector_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_collector_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete collector  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_collector_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_collector_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_collector_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_collector_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/collector/collectors/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_collector_group_by_id(self, id, **kwargs):  # noqa: E501
        """delete collector group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_collector_group_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_collector_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_collector_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_collector_group_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete collector group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_collector_group_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_collector_group_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_collector_group_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_collector_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/collector/groups/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_dashboard_by_id(self, id, **kwargs):  # noqa: E501
        """delete dashboard  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_dashboard_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_dashboard_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_dashboard_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_dashboard_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete dashboard  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_dashboard_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_dashboard_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_dashboard_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_dashboard_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/dashboard/dashboards/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_dashboard_group_by_id(self, id, **kwargs):  # noqa: E501
        """delete dashboard group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_dashboard_group_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param bool allow_non_empty_group:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_dashboard_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_dashboard_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_dashboard_group_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete dashboard group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_dashboard_group_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param bool allow_non_empty_group:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'allow_non_empty_group']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_dashboard_group_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_dashboard_group_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_dashboard_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'allow_non_empty_group' in params:
            query_params.append(('allowNonEmptyGroup', params['allow_non_empty_group']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/dashboard/groups/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_datasource_by_id(self, id, **kwargs):  # noqa: E501
        """delete datasource  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_datasource_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_datasource_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_datasource_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_datasource_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete datasource  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_datasource_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_datasource_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_datasource_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_datasource_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/datasources/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_device_by_id(self, id, **kwargs):  # noqa: E501
        """delete a device  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param int start:
        :param int end:
        :param str netflow_filter:
        :param bool delete_hard:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_device_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_device_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_device_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete a device  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param int start:
        :param int end:
        :param str netflow_filter:
        :param bool delete_hard:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'start', 'end', 'netflow_filter', 'delete_hard']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_device_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_device_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_device_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'start' in params:
            query_params.append(('start', params['start']))  # noqa: E501
        if 'end' in params:
            query_params.append(('end', params['end']))  # noqa: E501
        if 'netflow_filter' in params:
            query_params.append(('netflowFilter', params['netflow_filter']))  # noqa: E501
        if 'delete_hard' in params:
            query_params.append(('deleteHard', params['delete_hard']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_device_datasource_instance_by_id(self, device_id, hds_id, id, **kwargs):  # noqa: E501
        """delete a device instance  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_datasource_instance_by_id(device_id, hds_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param int hds_id: The device-datasource ID (required)
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, **kwargs)  # noqa: E501
            return data

    def delete_device_datasource_instance_by_id_with_http_info(self, device_id, hds_id, id, **kwargs):  # noqa: E501
        """delete a device instance  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param int hds_id: The device-datasource ID (required)
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['device_id', 'hds_id', 'id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_device_datasource_instance_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'device_id' is set
        if ('device_id' not in params or
                params['device_id'] is None):
            raise ValueError("Missing the required parameter `device_id` when calling `delete_device_datasource_instance_by_id`")  # noqa: E501
        # verify the required parameter 'hds_id' is set
        if ('hds_id' not in params or
                params['hds_id'] is None):
            raise ValueError("Missing the required parameter `hds_id` when calling `delete_device_datasource_instance_by_id`")  # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_device_datasource_instance_by_id`")  # noqa: E501

        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `delete_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if type(params['hds_id']) is str else str(params['hds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `hds_id` when calling `delete_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'device_id' in params:
            path_params['deviceId'] = params['device_id']  # noqa: E501
        if 'hds_id' in params:
            path_params['hdsId'] = params['hds_id']  # noqa: E501
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_device_group_by_id(self, id, **kwargs):  # noqa: E501
        """delete device group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_group_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param bool delete_children:
        :param bool delete_hard:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_device_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_device_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_device_group_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete device group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_group_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param bool delete_children:
        :param bool delete_hard:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'delete_children', 'delete_hard']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_device_group_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_device_group_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_device_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'delete_children' in params:
            query_params.append(('deleteChildren', params['delete_children']))  # noqa: E501
        if 'delete_hard' in params:
            query_params.append(('deleteHard', params['delete_hard']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/groups/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_device_group_cluster_alert_conf_by_id(self, device_group_id, id, **kwargs):  # noqa: E501
        """Delete cluster alert configuration  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_group_cluster_alert_conf_by_id(device_group_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_group_id: (required)
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, **kwargs)  # noqa: E501
            return data

    def delete_device_group_cluster_alert_conf_by_id_with_http_info(self, device_group_id, id, **kwargs):  # noqa: E501
        """Delete cluster alert configuration  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_group_id: (required)
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['device_group_id', 'id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_device_group_cluster_alert_conf_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'device_group_id' is set
        if ('device_group_id' not in params or
                params['device_group_id'] is None):
            raise ValueError("Missing the required parameter `device_group_id` when calling `delete_device_group_cluster_alert_conf_by_id`")  # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_device_group_cluster_alert_conf_by_id`")  # noqa: E501

        if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if type(params['device_group_id']) is str else str(params['device_group_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_group_id` when calling `delete_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'device_group_id' in params:
            path_params['deviceGroupId'] = params['device_group_id']  # noqa: E501
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/groups/{deviceGroupId}/clusterAlertConf/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_device_group_property_by_name(self, gid, name, **kwargs):  # noqa: E501
        """delete device group property  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_group_property_by_name(gid, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int gid: group ID (required)
        :param str name: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_device_group_property_by_name_with_http_info(gid, name, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_device_group_property_by_name_with_http_info(gid, name, **kwargs)  # noqa: E501
            return data

    def delete_device_group_property_by_name_with_http_info(self, gid, name, **kwargs):  # noqa: E501
        """delete device group property  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_group_property_by_name_with_http_info(gid, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int gid: group ID (required)
        :param str name: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['gid', 'name']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_device_group_property_by_name" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'gid' is set
        if ('gid' not in params or
                params['gid'] is None):
            raise ValueError("Missing the required parameter `gid` when calling `delete_device_group_property_by_name`")  # noqa: E501
        # verify the required parameter 'name' is set
        if ('name' not in params or
                params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `delete_device_group_property_by_name`")  # noqa: E501

        if 'gid' in params and not re.search(r'\d+', params['gid'] if type(params['gid']) is str else str(params['gid'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `gid` when calling `delete_device_group_property_by_name`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'name' in params and not re.search(r'[^\/]+', params['name'] if type(params['name']) is str else str(params['name'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `name` when calling `delete_device_group_property_by_name`, must conform to the pattern `/[^\/]+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'gid' in params:
            path_params['gid'] = params['gid']  # noqa: E501
        if 'name' in params:
            path_params['name'] = params['name']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/groups/{gid}/properties/{name}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_device_property_by_name(self, device_id, name, **kwargs):  # noqa: E501
        """delete device property  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_property_by_name(device_id, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param str name: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_device_property_by_name_with_http_info(device_id, name, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_device_property_by_name_with_http_info(device_id, name, **kwargs)  # noqa: E501
            return data

    def delete_device_property_by_name_with_http_info(self, device_id, name, **kwargs):  # noqa: E501
        """delete device property  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_device_property_by_name_with_http_info(device_id, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param str name: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['device_id', 'name']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_device_property_by_name" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'device_id' is set
        if ('device_id' not in params or
                params['device_id'] is None):
            raise ValueError("Missing the required parameter `device_id` when calling `delete_device_property_by_name`")  # noqa: E501
        # verify the required parameter 'name' is set
        if ('name' not in params or
                params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `delete_device_property_by_name`")  # noqa: E501

        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `delete_device_property_by_name`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'name' in params and not re.search(r'[^\/]+', params['name'] if type(params['name']) is str else str(params['name'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `name` when calling `delete_device_property_by_name`, must conform to the pattern `/[^\/]+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'device_id' in params:
            path_params['deviceId'] = params['device_id']  # noqa: E501
        if 'name' in params:
            path_params['name'] = params['name']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices/{deviceId}/properties/{name}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_escalation_chain_by_id(self, id, **kwargs):  # noqa: E501
        """delete escalation chain  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_escalation_chain_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_escalation_chain_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_escalation_chain_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_escalation_chain_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete escalation chain  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_escalation_chain_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_escalation_chain_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_escalation_chain_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_escalation_chain_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/alert/chains/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_netscan_by_id(self, id, **kwargs):  # noqa: E501
        """delete a netscan  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_netscan_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_netscan_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_netscan_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_netscan_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete a netscan  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_netscan_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_netscan_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_netscan_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_netscan_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/netscans/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_ops_note_by_id(self, id, **kwargs):  # noqa: E501
        """delete opsnote  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_ops_note_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_ops_note_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_ops_note_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_ops_note_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete opsnote  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_ops_note_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_ops_note_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_ops_note_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'[^\/]+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_ops_note_by_id`, must conform to the pattern `/[^\/]+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/opsnotes/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_recipient_group_by_id(self, id, **kwargs):  # noqa: E501
        """delete recipient group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_recipient_group_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_recipient_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_recipient_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_recipient_group_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete recipient group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_recipient_group_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_recipient_group_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_recipient_group_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_recipient_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/recipientgroups/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_report_by_id(self, id, **kwargs):  # noqa: E501
        """delete report  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_report_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_report_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_report_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_report_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete report  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_report_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_report_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_report_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_report_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/report/reports/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_report_group_by_id(self, id, **kwargs):  # noqa: E501
        """delete report group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_report_group_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_report_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_report_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_report_group_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete report group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_report_group_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_report_group_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_report_group_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_report_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/report/groups/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_role_by_id(self, id, **kwargs):  # noqa: E501
        """delete role  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_role_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_role_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_role_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_role_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete role  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_role_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_role_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_role_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_role_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/roles/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_sdt_by_id(self, id, **kwargs):  # noqa: E501
        """delete SDT  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_sdt_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_sdt_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_sdt_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_sdt_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete SDT  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_sdt_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_sdt_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_sdt_by_id`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/sdt/sdts/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_website_by_id(self, id, **kwargs):  # noqa: E501
        """delete website  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_website_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_website_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_website_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_website_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete website  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_website_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_website_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_website_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_website_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/website/websites/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_website_group_by_id(self, id, **kwargs):  # noqa: E501
        """delete website group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_website_group_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param int delete_children:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_website_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_website_group_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_website_group_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete website group  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_website_group_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param int delete_children:
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'delete_children']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_website_group_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_website_group_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_website_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'delete_children' in params:
            query_params.append(('deleteChildren', params['delete_children']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/website/groups/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def delete_widget_by_id(self, id, **kwargs):  # noqa: E501
        """delete widget  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_widget_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_widget_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_widget_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_widget_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """delete widget  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_widget_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_widget_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `delete_widget_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `delete_widget_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/dashboard/widgets/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='object',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def execute_debug_command(self, **kwargs):  # noqa: E501
        """Execute a Collector debug command  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.execute_debug_command(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param Debug body:
        :param int collector_id:
        :return: Debug
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.execute_debug_command_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.execute_debug_command_with_http_info(**kwargs)  # noqa: E501
            return data

    def execute_debug_command_with_http_info(self, **kwargs):  # noqa: E501
        """Execute a Collector debug command  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.execute_debug_command_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param Debug body:
        :param int collector_id:
        :return: Debug
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['body', 'collector_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method execute_debug_command" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'collector_id' in params:
            query_params.append(('collectorId', params['collector_id']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/debug', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Debug',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def generate_report_by_id(self, id, **kwargs):  # noqa: E501
        """run a report  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.generate_report_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param GenerateReportRequest body:
        :return: GenerateReportResult
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.generate_report_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.generate_report_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def generate_report_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """run a report  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.generate_report_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param GenerateReportRequest body:
        :return: GenerateReportResult
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method generate_report_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `generate_report_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `generate_report_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/report/reports/{id}/executions', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='GenerateReportResult',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_admin_by_id(self, id, **kwargs):  # noqa: E501
        """get user  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_admin_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str fields:
        :return: Admin
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_admin_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_admin_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_admin_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """get user  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_admin_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str fields:
        :return: Admin
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_admin_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_admin_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_admin_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/admins/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Admin',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_admin_list(self, **kwargs):  # noqa: E501
        """get user list  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_admin_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: AdminPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_admin_list_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_admin_list_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_admin_list_with_http_info(self, **kwargs):  # noqa: E501
        """get user list  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_admin_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: AdminPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_admin_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/admins', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AdminPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_alert_by_id(self, id, **kwargs):  # noqa: E501
        """get alert  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_alert_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param bool need_message:
        :param str custom_columns:
        :param str fields:
        :return: Alert
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_alert_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_alert_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_alert_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """get alert  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_alert_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param bool need_message:
        :param str custom_columns:
        :param str fields:
        :return: Alert
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'need_message', 'custom_columns', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_alert_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_alert_by_id`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'need_message' in params:
            query_params.append(('needMessage', params['need_message']))  # noqa: E501
        if 'custom_columns' in params:
            query_params.append(('customColumns', params['custom_columns']))  # noqa: E501
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/alert/alerts/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Alert',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_alert_list(self, **kwargs):  # noqa: E501
        """get alert list  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_alert_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str custom_columns:
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: AlertPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_alert_list_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_alert_list_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_alert_list_with_http_info(self, **kwargs):  # noqa: E501
        """get alert list  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_alert_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str custom_columns:
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: AlertPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['custom_columns', 'fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_alert_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'custom_columns' in params:
            query_params.append(('customColumns', params['custom_columns']))  # noqa: E501
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/alert/alerts', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='AlertPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

def get_alert_list_by_device_group_id(self, id, **kwargs): # noqa: E501
"""get device group alerts # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_list_by_device_group_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool need_message:
:param str custom_columns:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_alert_list_by_device_group_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_alert_list_by_device_group_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_alert_list_by_device_group_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get device group alerts # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_list_by_device_group_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool need_message:
:param str custom_columns:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'need_message', 'custom_columns', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_alert_list_by_device_group_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_alert_list_by_device_group_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_alert_list_by_device_group_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'need_message' in params:
query_params.append(('needMessage', params['need_message'])) # noqa: E501
if 'custom_columns' in params:
query_params.append(('customColumns', params['custom_columns'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{id}/alerts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
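# Usage sketch for the method above (illustrative only: `api` stands for a
# configured instance of this API class, and GROUP_ID is a hypothetical
# device group id — neither name comes from this file):
#
#     # synchronous call: returns the deserialized AlertPaginationResponse
#     resp = api.get_alert_list_by_device_group_id(GROUP_ID, size=50)
#
#     # asynchronous call: returns a thread; .get() blocks for the result
#     thread = api.get_alert_list_by_device_group_id(GROUP_ID, async_req=True)
#     resp = thread.get()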
def get_alert_list_by_device_id(self, id, **kwargs): # noqa: E501
"""get alerts # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_list_by_device_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param bool need_message:
:param str custom_columns:
:param str bound:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_alert_list_by_device_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_alert_list_by_device_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_alert_list_by_device_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get alerts # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_list_by_device_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param bool need_message:
:param str custom_columns:
:param str bound:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter', 'need_message', 'custom_columns', 'bound', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_alert_list_by_device_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_alert_list_by_device_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_alert_list_by_device_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'need_message' in params:
query_params.append(('needMessage', params['need_message'])) # noqa: E501
if 'custom_columns' in params:
query_params.append(('customColumns', params['custom_columns'])) # noqa: E501
if 'bound' in params:
query_params.append(('bound', params['bound'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/alerts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_alert_rule_by_id(self, id, **kwargs): # noqa: E501
"""get alert rule by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_rule_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_alert_rule_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_alert_rule_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_alert_rule_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get alert rule by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_rule_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_alert_rule_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_alert_rule_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_alert_rule_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/rules/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertRule', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_alert_rule_list(self, **kwargs): # noqa: E501
"""get alert rule list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_rule_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertRulePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_alert_rule_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_alert_rule_list_with_http_info(**kwargs) # noqa: E501
return data
def get_alert_rule_list_with_http_info(self, **kwargs): # noqa: E501
"""get alert rule list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_alert_rule_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertRulePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_alert_rule_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/rules', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertRulePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
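# Pagination sketch for the list method above (assumption: the returned
# AlertRulePaginationResponse exposes `items` and `total` attributes, as
# LogicMonitor pagination responses typically do — verify against the model
# class before relying on these names):
#
#     offset, size = 0, 100
#     rules = []
#     while True:
#         page = api.get_alert_rule_list(size=size, offset=offset)
#         rules.extend(page.items)
#         if not page.items or len(rules) >= page.total:
#             break
#         offset += size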
def get_all_sdt_list_by_device_id(self, id, **kwargs): # noqa: E501
"""get SDTs for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_sdt_list_by_device_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_all_sdt_list_by_device_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_all_sdt_list_by_device_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_all_sdt_list_by_device_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get SDTs for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_sdt_list_by_device_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_all_sdt_list_by_device_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_all_sdt_list_by_device_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_all_sdt_list_by_device_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/sdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SDTPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_all_sdt_list_by_website_group_id(self, id, **kwargs): # noqa: E501
"""get a list of SDTs for a website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_sdt_list_by_website_group_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_all_sdt_list_by_website_group_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_all_sdt_list_by_website_group_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_all_sdt_list_by_website_group_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get a list of SDTs for a website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_sdt_list_by_website_group_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_all_sdt_list_by_website_group_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_all_sdt_list_by_website_group_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_all_sdt_list_by_website_group_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/groups/{id}/sdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SDTPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_api_token_list(self, **kwargs): # noqa: E501
"""get a list of api tokens across users # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_api_token_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: ApiTokenPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_api_token_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_api_token_list_with_http_info(**kwargs) # noqa: E501
return data
def get_api_token_list_with_http_info(self, **kwargs): # noqa: E501
"""get a list of api tokens across users # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_api_token_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: ApiTokenPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_api_token_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/apitokens', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ApiTokenPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_api_token_list_by_admin_id(self, admin_id, **kwargs): # noqa: E501
"""get api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_api_token_list_by_admin_id(admin_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: ApiTokenPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_api_token_list_by_admin_id_with_http_info(admin_id, **kwargs) # noqa: E501
else:
(data) = self.get_api_token_list_by_admin_id_with_http_info(admin_id, **kwargs) # noqa: E501
return data
def get_api_token_list_by_admin_id_with_http_info(self, admin_id, **kwargs): # noqa: E501
"""get api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_api_token_list_by_admin_id_with_http_info(admin_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: ApiTokenPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['admin_id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_api_token_list_by_admin_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'admin_id' is set
if ('admin_id' not in params or
params['admin_id'] is None):
raise ValueError("Missing the required parameter `admin_id` when calling `get_api_token_list_by_admin_id`") # noqa: E501
if 'admin_id' in params and not re.search(r'\d+', params['admin_id'] if type(params['admin_id']) is str else str(params['admin_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `admin_id` when calling `get_api_token_list_by_admin_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'admin_id' in params:
path_params['adminId'] = params['admin_id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/{adminId}/apitokens', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ApiTokenPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_associated_device_list_by_data_source_id(self, id, **kwargs): # noqa: E501
"""get devices associated with a datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_associated_device_list_by_data_source_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDataSourceAssociatedPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_associated_device_list_by_data_source_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_associated_device_list_by_data_source_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_associated_device_list_by_data_source_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get devices associated with a datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_associated_device_list_by_data_source_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDataSourceAssociatedPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_associated_device_list_by_data_source_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_associated_device_list_by_data_source_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_associated_device_list_by_data_source_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/datasources/{id}/devices', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceAssociatedPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_audit_log_by_id(self, id, **kwargs): # noqa: E501
"""Get audit log by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_log_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:return: AuditLog
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_audit_log_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_audit_log_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_audit_log_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""Get audit log by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_log_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:return: AuditLog
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_log_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_audit_log_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/accesslogs/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AuditLog', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_audit_log_list(self, **kwargs): # noqa: E501
"""Get audit logs # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_log_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str format:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AccessLogPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_audit_log_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_audit_log_list_with_http_info(**kwargs) # noqa: E501
return data
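# Example usage (a hypothetical sketch: the `logicmonitor_sdk` module path,
# `Configuration` fields, and filter value below are assumptions based on
# typical swagger-codegen client layout, not confirmed by this file):
#
#     import logicmonitor_sdk
#     cfg = logicmonitor_sdk.Configuration()
#     cfg.company = 'yourcompany'
#     api = logicmonitor_sdk.LMApi(logicmonitor_sdk.ApiClient(cfg))
#     logs = api.get_audit_log_list(size=50, filter='username:"admin"')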
def get_audit_log_list_with_http_info(self, **kwargs): # noqa: E501
"""Get audit logs # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_audit_log_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str format:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AccessLogPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['format', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_audit_log_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/accesslogs', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AccessLogPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_aws_external_id(self, **kwargs): # noqa: E501
"""Get AWS external id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_aws_external_id(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: AwsExternalId
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_aws_external_id_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_aws_external_id_with_http_info(**kwargs) # noqa: E501
return data
def get_aws_external_id_with_http_info(self, **kwargs): # noqa: E501
"""Get AWS external id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_aws_external_id_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: AwsExternalId
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_aws_external_id" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/aws/externalId', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AwsExternalId', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_collector_by_id(self, id, **kwargs): # noqa: E501
"""get collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_collector_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_collector_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_collector_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_collector_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_collector_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_collector_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/collectors/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collector', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_collector_group_by_id(self, id, **kwargs): # noqa: E501
"""get collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_group_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_collector_group_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_collector_group_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_collector_group_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_group_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_collector_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_collector_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_collector_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/groups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CollectorGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_collector_group_list(self, **kwargs): # noqa: E501
"""get collector group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_group_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: CollectorGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_collector_group_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_collector_group_list_with_http_info(**kwargs) # noqa: E501
return data
def get_collector_group_list_with_http_info(self, **kwargs): # noqa: E501
"""get collector group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_group_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: CollectorGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_collector_group_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/groups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CollectorGroupPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_collector_installer(self, collector_id, os_and_arch, **kwargs): # noqa: E501
"""get collector installer # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_installer(collector_id, os_and_arch, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int collector_id: (required)
:param str os_and_arch: (required)
:param int collector_version: The version of the installer you'd like to download. This defaults to the latest GD Collector, unless useEA is true
:param str token:
:param bool monitor_others:
:param str collector_size: The size of the Collector you'd like to install. Options are nano, small (requires 2GB memory), medium (requires 4GB memory), large (requires 8GB memory). Requires collector version 22.180 or higher. Defaults to small
:param bool use_ea: If true, the latest EA Collector version will be used. Defaults to false
:return: file
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_collector_installer_with_http_info(collector_id, os_and_arch, **kwargs) # noqa: E501
else:
data = self.get_collector_installer_with_http_info(collector_id, os_and_arch, **kwargs) # noqa: E501
return data
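# Example usage (a hypothetical sketch: the `api` client variable and the
# collector id / osAndArch values below are illustrative assumptions, not
# values confirmed by this file):
#
#     installer = api.get_collector_installer(42, 'linux64',
#                                             collector_size='medium')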
def get_collector_installer_with_http_info(self, collector_id, os_and_arch, **kwargs): # noqa: E501
"""get collector installer # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_installer_with_http_info(collector_id, os_and_arch, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int collector_id: (required)
:param str os_and_arch: (required)
:param int collector_version: The version of the installer you'd like to download. This defaults to the latest GD Collector, unless useEA is true
:param str token:
:param bool monitor_others:
:param str collector_size: The size of the Collector you'd like to install. Options are nano, small (requires 2GB memory), medium (requires 4GB memory), large (requires 8GB memory). Requires collector version 22.180 or higher. Defaults to small
:param bool use_ea: If true, the latest EA Collector version will be used. Defaults to false
:return: file
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['collector_id', 'os_and_arch', 'collector_version', 'token', 'monitor_others', 'collector_size', 'use_ea'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_collector_installer" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'collector_id' is set
if ('collector_id' not in params or
params['collector_id'] is None):
raise ValueError("Missing the required parameter `collector_id` when calling `get_collector_installer`") # noqa: E501
# verify the required parameter 'os_and_arch' is set
if ('os_and_arch' not in params or
params['os_and_arch'] is None):
raise ValueError("Missing the required parameter `os_and_arch` when calling `get_collector_installer`") # noqa: E501
if 'collector_id' in params and not re.search(r'\d+', params['collector_id'] if type(params['collector_id']) is str else str(params['collector_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `collector_id` when calling `get_collector_installer`, must conform to the pattern `/\d+/`") # noqa: E501
if 'os_and_arch' in params and not re.search('.+', params['os_and_arch'] if type(params['os_and_arch']) is str else str(params['os_and_arch'])): # noqa: E501
raise ValueError("Invalid value for parameter `os_and_arch` when calling `get_collector_installer`, must conform to the pattern `/.+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'collector_id' in params:
path_params['collectorId'] = params['collector_id'] # noqa: E501
if 'os_and_arch' in params:
path_params['osAndArch'] = params['os_and_arch'] # noqa: E501
query_params = []
if 'collector_version' in params:
query_params.append(('collectorVersion', params['collector_version'])) # noqa: E501
if 'token' in params:
query_params.append(('token', params['token'])) # noqa: E501
if 'monitor_others' in params:
query_params.append(('monitorOthers', params['monitor_others'])) # noqa: E501
if 'collector_size' in params:
query_params.append(('collectorSize', params['collector_size'])) # noqa: E501
if 'use_ea' in params:
query_params.append(('useEA', params['use_ea'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/collectors/{collectorId}/installers/{osAndArch}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='file', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_collector_list(self, **kwargs): # noqa: E501
"""get collector list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: CollectorPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_collector_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_collector_list_with_http_info(**kwargs) # noqa: E501
return data
def get_collector_list_with_http_info(self, **kwargs): # noqa: E501
"""get collector list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_collector_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: CollectorPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_collector_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/collectors', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CollectorPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_dashboard_by_id(self, id, **kwargs): # noqa: E501
"""get dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_dashboard_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool template:
:param str format:
:param str fields:
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_dashboard_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_dashboard_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_dashboard_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_dashboard_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool template:
:param str format:
:param str fields:
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'template', 'format', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_dashboard_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_dashboard_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_dashboard_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'template' in params:
query_params.append(('template', params['template'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/dashboards/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Dashboard', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_dashboard_group_by_id(self, id, **kwargs): # noqa: E501
"""get dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_dashboard_group_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool template:
:param str fields:
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_dashboard_group_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_dashboard_group_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_dashboard_group_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_dashboard_group_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool template:
:param str fields:
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'template', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_dashboard_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_dashboard_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_dashboard_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'template' in params:
query_params.append(('template', params['template'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/groups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DashboardGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_dashboard_group_list(self, **kwargs): # noqa: E501
"""get dashboard group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_dashboard_group_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DashboardGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_dashboard_group_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_dashboard_group_list_with_http_info(**kwargs) # noqa: E501
return data

    def get_dashboard_group_list_with_http_info(self, **kwargs):  # noqa: E501
        """get dashboard group list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_dashboard_group_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DashboardGroupPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_dashboard_group_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/dashboard/groups', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DashboardGroupPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_dashboard_list(self, **kwargs):  # noqa: E501
        """get dashboard list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_dashboard_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DashboardPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_dashboard_list_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_dashboard_list_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_dashboard_list_with_http_info(self, **kwargs):  # noqa: E501
        """get dashboard list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_dashboard_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DashboardPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_dashboard_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/dashboard/dashboards', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DashboardPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
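
    # --- Hypothetical usage sketch (illustration only, not generated code) ---
    # Every endpoint in this client comes as a pair: the plain method returns
    # the deserialized response body, while the `*_with_http_info` variant can
    # also expose the HTTP status and headers. Passing async_req=True to
    # either returns a thread whose .get() yields the same result. Assuming
    # `api` is an already-configured instance of this client class:
    #
    #     page = api.get_dashboard_list(size=50, offset=0)   # synchronous call
    #     thread = api.get_dashboard_list(async_req=True)    # asynchronous call
    #     page = thread.get()                                # join and fetch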

    def get_data_source_overview_graph_by_id(self, ds_id, id, **kwargs):  # noqa: E501
        """get datasource overview graph by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_source_overview_graph_by_id(ds_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int ds_id: (required)
        :param int id: (required)
        :return: DataSourceOverviewGraph
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_data_source_overview_graph_by_id_with_http_info(ds_id, id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_data_source_overview_graph_by_id_with_http_info(ds_id, id, **kwargs)  # noqa: E501
            return data

    def get_data_source_overview_graph_by_id_with_http_info(self, ds_id, id, **kwargs):  # noqa: E501
        """get datasource overview graph by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_source_overview_graph_by_id_with_http_info(ds_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int ds_id: (required)
        :param int id: (required)
        :return: DataSourceOverviewGraph
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['ds_id', 'id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_data_source_overview_graph_by_id" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'ds_id' is set
        if ('ds_id' not in params or
                params['ds_id'] is None):
            raise ValueError("Missing the required parameter `ds_id` when calling `get_data_source_overview_graph_by_id`")  # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_data_source_overview_graph_by_id`")  # noqa: E501

        if 'ds_id' in params and not re.search(r'\d+', params['ds_id'] if type(params['ds_id']) is str else str(params['ds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `ds_id` when calling `get_data_source_overview_graph_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_data_source_overview_graph_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'ds_id' in params:
            path_params['dsId'] = params['ds_id']  # noqa: E501
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/datasources/{dsId}/ographs/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DataSourceOverviewGraph',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_data_source_overview_graph_list(self, ds_id, **kwargs):  # noqa: E501
        """get datasource overview graph list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_source_overview_graph_list(ds_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int ds_id: (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DatasourceOverviewGraphPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_data_source_overview_graph_list_with_http_info(ds_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_data_source_overview_graph_list_with_http_info(ds_id, **kwargs)  # noqa: E501
            return data

    def get_data_source_overview_graph_list_with_http_info(self, ds_id, **kwargs):  # noqa: E501
        """get datasource overview graph list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_source_overview_graph_list_with_http_info(ds_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int ds_id: (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DatasourceOverviewGraphPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['ds_id', 'fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_data_source_overview_graph_list" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'ds_id' is set
        if ('ds_id' not in params or
                params['ds_id'] is None):
            raise ValueError("Missing the required parameter `ds_id` when calling `get_data_source_overview_graph_list`")  # noqa: E501

        if 'ds_id' in params and not re.search(r'\d+', params['ds_id'] if type(params['ds_id']) is str else str(params['ds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `ds_id` when calling `get_data_source_overview_graph_list`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'ds_id' in params:
            path_params['dsId'] = params['ds_id']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/datasources/{dsId}/ographs', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DatasourceOverviewGraphPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_datasource_by_id(self, id, **kwargs):  # noqa: E501
        """get datasource by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_datasource_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str format:
        :param str fields:
        :return: DataSource
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_datasource_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_datasource_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_datasource_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """get datasource by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_datasource_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str format:
        :param str fields:
        :return: DataSource
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'format', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_datasource_by_id" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_datasource_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_datasource_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'format' in params:
            query_params.append(('format', params['format']))  # noqa: E501
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/datasources/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DataSource',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_datasource_list(self, **kwargs):  # noqa: E501
        """get datasource list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_datasource_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str format:
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DatasourcePaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_datasource_list_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_datasource_list_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_datasource_list_with_http_info(self, **kwargs):  # noqa: E501
        """get datasource list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_datasource_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str format:
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DatasourcePaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['format', 'fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_datasource_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'format' in params:
            query_params.append(('format', params['format']))  # noqa: E501
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/datasources', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DatasourcePaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_debug_command_result(self, id, **kwargs):  # noqa: E501
        """Get the result of a Collector debug command  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_debug_command_result(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param int collector_id:
        :return: Debug
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_debug_command_result_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_debug_command_result_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_debug_command_result_with_http_info(self, id, **kwargs):  # noqa: E501
        """Get the result of a Collector debug command  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_debug_command_result_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param int collector_id:
        :return: Debug
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'collector_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_debug_command_result" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_debug_command_result`")  # noqa: E501

        if 'id' in params and not re.search('.*', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError("Invalid value for parameter `id` when calling `get_debug_command_result`, must conform to the pattern `/.*/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'collector_id' in params:
            query_params.append(('collectorId', params['collector_id']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/debug/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Debug',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_device_by_id(self, id, **kwargs):  # noqa: E501
        """get device by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param int start:
        :param int end:
        :param str netflow_filter:
        :param str fields:
        :return: Device
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_device_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_device_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """get device by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param int start:
        :param int end:
        :param str netflow_filter:
        :param str fields:
        :return: Device
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'start', 'end', 'netflow_filter', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_by_id" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_device_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_device_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'start' in params:
            query_params.append(('start', params['start']))  # noqa: E501
        if 'end' in params:
            query_params.append(('end', params['end']))  # noqa: E501
        if 'netflow_filter' in params:
            query_params.append(('netflowFilter', params['netflow_filter']))  # noqa: E501
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Device',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_device_config_source_config_by_id(self, device_id, hds_id, instance_id, id, **kwargs):  # noqa: E501
        """get a config for a device  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_config_source_config_by_id(device_id, hds_id, instance_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param int hds_id: (required)
        :param int instance_id: (required)
        :param str id: (required)
        :param str format:
        :param int start_epoch:
        :param str fields:
        :return: DeviceDataSourceInstanceConfig
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_config_source_config_by_id_with_http_info(device_id, hds_id, instance_id, id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_device_config_source_config_by_id_with_http_info(device_id, hds_id, instance_id, id, **kwargs)  # noqa: E501
            return data

    def get_device_config_source_config_by_id_with_http_info(self, device_id, hds_id, instance_id, id, **kwargs):  # noqa: E501
        """get a config for a device  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_config_source_config_by_id_with_http_info(device_id, hds_id, instance_id, id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param int hds_id: (required)
        :param int instance_id: (required)
        :param str id: (required)
        :param str format:
        :param int start_epoch:
        :param str fields:
        :return: DeviceDataSourceInstanceConfig
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['device_id', 'hds_id', 'instance_id', 'id', 'format', 'start_epoch', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_config_source_config_by_id" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'device_id' is set
        if ('device_id' not in params or
                params['device_id'] is None):
            raise ValueError("Missing the required parameter `device_id` when calling `get_device_config_source_config_by_id`")  # noqa: E501
        # verify the required parameter 'hds_id' is set
        if ('hds_id' not in params or
                params['hds_id'] is None):
            raise ValueError("Missing the required parameter `hds_id` when calling `get_device_config_source_config_by_id`")  # noqa: E501
        # verify the required parameter 'instance_id' is set
        if ('instance_id' not in params or
                params['instance_id'] is None):
            raise ValueError("Missing the required parameter `instance_id` when calling `get_device_config_source_config_by_id`")  # noqa: E501
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_device_config_source_config_by_id`")  # noqa: E501

        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_config_source_config_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if type(params['hds_id']) is str else str(params['hds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_config_source_config_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if type(params['instance_id']) is str else str(params['instance_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `instance_id` when calling `get_device_config_source_config_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'id' in params and not re.search('[-_a-zA-Z0-9]+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError("Invalid value for parameter `id` when calling `get_device_config_source_config_by_id`, must conform to the pattern `/[-_a-zA-Z0-9]+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'device_id' in params:
            path_params['deviceId'] = params['device_id']  # noqa: E501
        if 'hds_id' in params:
            path_params['hdsId'] = params['hds_id']  # noqa: E501
        if 'instance_id' in params:
            path_params['instanceId'] = params['instance_id']  # noqa: E501
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'format' in params:
            query_params.append(('format', params['format']))  # noqa: E501
        if 'start_epoch' in params:
            query_params.append(('startEpoch', params['start_epoch']))  # noqa: E501
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{instanceId}/config/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DeviceDataSourceInstanceConfig',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_device_config_source_config_list(self, device_id, hds_id, instance_id, **kwargs):  # noqa: E501
        """get config instances for a configsource  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_config_source_config_list(device_id, hds_id, instance_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param int hds_id: (required)
        :param int instance_id: (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DeviceDatasourceInstanceConfigPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_config_source_config_list_with_http_info(device_id, hds_id, instance_id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_device_config_source_config_list_with_http_info(device_id, hds_id, instance_id, **kwargs)  # noqa: E501
            return data
def get_device_config_source_config_list_with_http_info(self, device_id, hds_id, instance_id, **kwargs): # noqa: E501
"""get config instances for a configsource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_config_source_config_list_with_http_info(device_id, hds_id, instance_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: (required)
:param int instance_id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDatasourceInstanceConfigPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'instance_id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_config_source_config_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_config_source_config_list`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_config_source_config_list`") # noqa: E501
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params or
params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `get_device_config_source_config_list`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_config_source_config_list`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_config_source_config_list`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if isinstance(params['instance_id'], str) else str(params['instance_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `instance_id` when calling `get_device_config_source_config_list`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{instanceId}/config', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDatasourceInstanceConfigPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

def get_device_datasource_by_id(self, device_id, id, **kwargs):  # noqa: E501
"""get device datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_by_id(device_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSource
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_by_id_with_http_info(device_id, id, **kwargs) # noqa: E501
else:
data = self.get_device_datasource_by_id_with_http_info(device_id, id, **kwargs)  # noqa: E501
return data

def get_device_datasource_by_id_with_http_info(self, device_id, id, **kwargs):  # noqa: E501
"""get device datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_by_id_with_http_info(device_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSource
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSource', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

def get_device_datasource_data_by_id(self, device_id, id, **kwargs):  # noqa: E501
"""get device datasource data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_data_by_id(device_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int id: (required)
:param float period:
:param int start:
:param int end:
:param str datapoints:
:param str format:
:return: DeviceDataSourceData
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_data_by_id_with_http_info(device_id, id, **kwargs) # noqa: E501
else:
data = self.get_device_datasource_data_by_id_with_http_info(device_id, id, **kwargs)  # noqa: E501
return data

def get_device_datasource_data_by_id_with_http_info(self, device_id, id, **kwargs):  # noqa: E501
"""get device datasource data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_data_by_id_with_http_info(device_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int id: (required)
:param float period:
:param int start:
:param int end:
:param str datapoints:
:param str format:
:return: DeviceDataSourceData
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'id', 'period', 'start', 'end', 'datapoints', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_data_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_data_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_data_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_data_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_data_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'period' in params:
query_params.append(('period', params['period'])) # noqa: E501
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'datapoints' in params:
query_params.append(('datapoints', params['datapoints'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{id}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceData', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

def get_device_datasource_instance_alert_setting_by_id(self, device_id, hds_id, instance_id, id, **kwargs):  # noqa: E501
"""get device instance alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_alert_setting_by_id(device_id, hds_id, instance_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSourceInstanceAlertSetting
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, **kwargs) # noqa: E501
else:
data = self.get_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, **kwargs)  # noqa: E501
return data

def get_device_datasource_instance_alert_setting_by_id_with_http_info(self, device_id, hds_id, instance_id, id, **kwargs):  # noqa: E501
"""get device instance alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSourceInstanceAlertSetting
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'instance_id', 'id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_alert_setting_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params or
params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `get_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_instance_alert_setting_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if isinstance(params['instance_id'], str) else str(params['instance_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `instance_id` when calling `get_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{instanceId}/alertsettings/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceAlertSetting', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

def get_device_datasource_instance_alert_setting_list(self, device_id, hds_id, instance_id, **kwargs):  # noqa: E501
"""get a list of alert settings for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_alert_setting_list(device_id, hds_id, instance_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:return: DeviceDataSourceInstanceAlertSettingPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_alert_setting_list_with_http_info(device_id, hds_id, instance_id, **kwargs) # noqa: E501
else:
data = self.get_device_datasource_instance_alert_setting_list_with_http_info(device_id, hds_id, instance_id, **kwargs)  # noqa: E501
return data

def get_device_datasource_instance_alert_setting_list_with_http_info(self, device_id, hds_id, instance_id, **kwargs):  # noqa: E501
"""get a list of alert settings for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_alert_setting_list_with_http_info(device_id, hds_id, instance_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:return: DeviceDataSourceInstanceAlertSettingPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'instance_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_alert_setting_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_alert_setting_list`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_datasource_instance_alert_setting_list`") # noqa: E501
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params or
params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `get_device_datasource_instance_alert_setting_list`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_alert_setting_list`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_datasource_instance_alert_setting_list`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if isinstance(params['instance_id'], str) else str(params['instance_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `instance_id` when calling `get_device_datasource_instance_alert_setting_list`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{instanceId}/alertsettings', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceAlertSettingPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

def get_device_datasource_instance_by_id(self, device_id, hds_id, id, **kwargs):  # noqa: E501
"""get device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_by_id(device_id, hds_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, **kwargs) # noqa: E501
else:
data = self.get_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, **kwargs)  # noqa: E501
return data

def get_device_datasource_instance_by_id_with_http_info(self, device_id, hds_id, id, **kwargs):  # noqa: E501
"""get device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_instance_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstance', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

def get_device_datasource_instance_data(self, device_id, hds_id, id, **kwargs):  # noqa: E501
"""get device instance data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_data(device_id, hds_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param float period:
:param int start:
:param int end:
:param str datapoints:
:param str format:
:return: DeviceDataSourceInstanceData
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_data_with_http_info(device_id, hds_id, id, **kwargs) # noqa: E501
else:
data = self.get_device_datasource_instance_data_with_http_info(device_id, hds_id, id, **kwargs)  # noqa: E501
return data

def get_device_datasource_instance_data_with_http_info(self, device_id, hds_id, id, **kwargs):  # noqa: E501
"""get device instance data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_data_with_http_info(device_id, hds_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param float period:
:param int start:
:param int end:
:param str datapoints:
:param str format:
:return: DeviceDataSourceInstanceData
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'id', 'period', 'start', 'end', 'datapoints', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_data" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_data`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_datasource_instance_data`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_instance_data`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_data`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_datasource_instance_data`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_instance_data`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'period' in params:
query_params.append(('period', params['period'])) # noqa: E501
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'datapoints' in params:
query_params.append(('datapoints', params['datapoints'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{id}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceData', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_datasource_instance_graph_data(self, device_id, hds_id, id, graph_id, **kwargs): # noqa: E501
"""get device instance graph data # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_graph_data(device_id, hds_id, id, graph_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param int graph_id: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_graph_data_with_http_info(device_id, hds_id, id, graph_id, **kwargs) # noqa: E501
else:
            data = self.get_device_datasource_instance_graph_data_with_http_info(device_id, hds_id, id, graph_id, **kwargs)  # noqa: E501
            return data
def get_device_datasource_instance_graph_data_with_http_info(self, device_id, hds_id, id, graph_id, **kwargs): # noqa: E501
"""get device instance graph data # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_graph_data_with_http_info(device_id, hds_id, id, graph_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param int graph_id: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'id', 'graph_id', 'start', 'end', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_graph_data" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_graph_data`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_datasource_instance_graph_data`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_instance_graph_data`") # noqa: E501
# verify the required parameter 'graph_id' is set
if ('graph_id' not in params or
params['graph_id'] is None):
raise ValueError("Missing the required parameter `graph_id` when calling `get_device_datasource_instance_graph_data`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_datasource_instance_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_instance_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'graph_id' in params and not re.search(r'-?\d+', params['graph_id'] if isinstance(params['graph_id'], str) else str(params['graph_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `graph_id` when calling `get_device_datasource_instance_graph_data`, must conform to the pattern `/-?\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
if 'graph_id' in params:
path_params['graphId'] = params['graph_id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{id}/graphs/{graphId}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GraphPlot', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_datasource_instance_group_by_id(self, device_id, device_ds_id, id, **kwargs): # noqa: E501
"""get device datasource instance group # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_group_by_id(device_id, device_ds_id, id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
        :param int device_ds_id: The ID of the device-datasource the instance group belongs to (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, **kwargs) # noqa: E501
else:
            data = self.get_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, **kwargs)  # noqa: E501
            return data
def get_device_datasource_instance_group_by_id_with_http_info(self, device_id, device_ds_id, id, **kwargs): # noqa: E501
"""get device datasource instance group # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
        :param int device_ds_id: The ID of the device-datasource the instance group belongs to (required)
:param int id: (required)
:param str fields:
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'device_ds_id', 'id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'device_ds_id' is set
if ('device_ds_id' not in params or
params['device_ds_id'] is None):
raise ValueError("Missing the required parameter `device_ds_id` when calling `get_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_instance_group_by_id`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'device_ds_id' in params and not re.search(r'\d+', params['device_ds_id'] if isinstance(params['device_ds_id'], str) else str(params['device_ds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_ds_id` when calling `get_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'device_ds_id' in params:
path_params['deviceDsId'] = params['device_ds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{deviceDsId}/groups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_datasource_instance_group_list(self, device_id, device_ds_id, **kwargs): # noqa: E501
"""get device datasource instance group list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_group_list(device_id, device_ds_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
        :param int device_ds_id: The ID of the device-datasource the instance groups belong to (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDatasourceInstanceGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_group_list_with_http_info(device_id, device_ds_id, **kwargs) # noqa: E501
else:
            data = self.get_device_datasource_instance_group_list_with_http_info(device_id, device_ds_id, **kwargs)  # noqa: E501
            return data
def get_device_datasource_instance_group_list_with_http_info(self, device_id, device_ds_id, **kwargs): # noqa: E501
"""get device datasource instance group list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_group_list_with_http_info(device_id, device_ds_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
        :param int device_ds_id: The ID of the device-datasource the instance groups belong to (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDatasourceInstanceGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'device_ds_id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_group_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_group_list`") # noqa: E501
# verify the required parameter 'device_ds_id' is set
if ('device_ds_id' not in params or
params['device_ds_id'] is None):
raise ValueError("Missing the required parameter `device_ds_id` when calling `get_device_datasource_instance_group_list`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_group_list`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'device_ds_id' in params and not re.search(r'\d+', params['device_ds_id'] if isinstance(params['device_ds_id'], str) else str(params['device_ds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_ds_id` when calling `get_device_datasource_instance_group_list`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'device_ds_id' in params:
path_params['deviceDsId'] = params['device_ds_id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{deviceDsId}/groups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDatasourceInstanceGroupPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_datasource_instance_group_overview_graph_data(self, device_id, device_ds_id, dsig_id, ograph_id, **kwargs): # noqa: E501
"""get device instance group overview graph data # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_group_overview_graph_data(device_id, device_ds_id, dsig_id, ograph_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
        :param int device_ds_id: The ID of the device-datasource the instance group belongs to (required)
:param int dsig_id: (required)
:param int ograph_id: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_group_overview_graph_data_with_http_info(device_id, device_ds_id, dsig_id, ograph_id, **kwargs) # noqa: E501
else:
            data = self.get_device_datasource_instance_group_overview_graph_data_with_http_info(device_id, device_ds_id, dsig_id, ograph_id, **kwargs)  # noqa: E501
            return data
def get_device_datasource_instance_group_overview_graph_data_with_http_info(self, device_id, device_ds_id, dsig_id, ograph_id, **kwargs): # noqa: E501
"""get device instance group overview graph data # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_group_overview_graph_data_with_http_info(device_id, device_ds_id, dsig_id, ograph_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
        :param int device_ds_id: The ID of the device-datasource the instance group belongs to (required)
:param int dsig_id: (required)
:param int ograph_id: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'device_ds_id', 'dsig_id', 'ograph_id', 'start', 'end', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_group_overview_graph_data" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_group_overview_graph_data`") # noqa: E501
# verify the required parameter 'device_ds_id' is set
if ('device_ds_id' not in params or
params['device_ds_id'] is None):
raise ValueError("Missing the required parameter `device_ds_id` when calling `get_device_datasource_instance_group_overview_graph_data`") # noqa: E501
# verify the required parameter 'dsig_id' is set
if ('dsig_id' not in params or
params['dsig_id'] is None):
raise ValueError("Missing the required parameter `dsig_id` when calling `get_device_datasource_instance_group_overview_graph_data`") # noqa: E501
# verify the required parameter 'ograph_id' is set
if ('ograph_id' not in params or
params['ograph_id'] is None):
raise ValueError("Missing the required parameter `ograph_id` when calling `get_device_datasource_instance_group_overview_graph_data`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_group_overview_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'device_ds_id' in params and not re.search(r'\d+', params['device_ds_id'] if isinstance(params['device_ds_id'], str) else str(params['device_ds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_ds_id` when calling `get_device_datasource_instance_group_overview_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'dsig_id' in params and not re.search(r'\d+', params['dsig_id'] if isinstance(params['dsig_id'], str) else str(params['dsig_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `dsig_id` when calling `get_device_datasource_instance_group_overview_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'ograph_id' in params and not re.search(r'\d+', params['ograph_id'] if isinstance(params['ograph_id'], str) else str(params['ograph_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `ograph_id` when calling `get_device_datasource_instance_group_overview_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'device_ds_id' in params:
path_params['deviceDsId'] = params['device_ds_id'] # noqa: E501
if 'dsig_id' in params:
path_params['dsigId'] = params['dsig_id'] # noqa: E501
if 'ograph_id' in params:
path_params['ographId'] = params['ograph_id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{deviceDsId}/groups/{dsigId}/graphs/{ographId}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GraphPlot', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_datasource_instance_list(self, device_id, hds_id, **kwargs): # noqa: E501
"""get device instance list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_list(device_id, hds_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:return: DeviceDatasourceInstancePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_list_with_http_info(device_id, hds_id, **kwargs) # noqa: E501
else:
            data = self.get_device_datasource_instance_list_with_http_info(device_id, hds_id, **kwargs)  # noqa: E501
            return data
def get_device_datasource_instance_list_with_http_info(self, device_id, hds_id, **kwargs): # noqa: E501
"""get device instance list # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_list_with_http_info(device_id, hds_id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:return: DeviceDatasourceInstancePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_list`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_datasource_instance_list`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_list`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_datasource_instance_list`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDatasourceInstancePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_datasource_instance_sdt_history(self, device_id, hds_id, id, **kwargs): # noqa: E501
"""get device instance SDT history # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_sdt_history(device_id, hds_id, id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_instance_sdt_history_with_http_info(device_id, hds_id, id, **kwargs) # noqa: E501
else:
            data = self.get_device_datasource_instance_sdt_history_with_http_info(device_id, hds_id, id, **kwargs)  # noqa: E501
            return data
def get_device_datasource_instance_sdt_history_with_http_info(self, device_id, hds_id, id, **kwargs): # noqa: E501
"""get device instance SDT history # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_instance_sdt_history_with_http_info(device_id, hds_id, id, async_req=True)
>>> result = thread.get()
        :param bool async_req:
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_instance_sdt_history" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_instance_sdt_history`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `get_device_datasource_instance_sdt_history`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_datasource_instance_sdt_history`") # noqa: E501
        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_instance_sdt_history`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `hds_id` when calling `get_device_datasource_instance_sdt_history`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_device_datasource_instance_sdt_history`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{id}/historysdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupSDTHistoryPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
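# Usage sketch (illustrative comment, not generated code): each getter in this
# class blocks and returns the parsed model by default; passing async_req=True
# instead returns a worker thread whose .get() yields the same result. Assumes
# `api` is a configured instance of this API class.
#
#   sdts = api.get_device_datasource_instance_sdt_history(42, 7, 1)
#   thread = api.get_device_datasource_instance_sdt_history(42, 7, 1, async_req=True)
#   sdts = thread.get()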
def get_device_datasource_list(self, device_id, **kwargs): # noqa: E501
"""get device datasource list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_list(device_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDatasourcePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_datasource_list_with_http_info(device_id, **kwargs) # noqa: E501
else:
data = self.get_device_datasource_list_with_http_info(device_id, **kwargs) # noqa: E501
return data
def get_device_datasource_list_with_http_info(self, device_id, **kwargs): # noqa: E501
"""get device datasource list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_datasource_list_with_http_info(device_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDatasourcePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_datasource_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_device_datasource_list`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_datasource_list`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDatasourcePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_group_by_id(self, id, **kwargs): # noqa: E501
"""get device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_by_id(id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int id: (required)
:param str fields:
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_group_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_device_group_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_device_group_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int id: (required)
:param str fields:
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_group_cluster_alert_conf_by_id(self, device_group_id, id, **kwargs): # noqa: E501
"""Get cluster alert configuration by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_cluster_alert_conf_by_id(device_group_id, id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param int id: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, **kwargs) # noqa: E501
else:
data = self.get_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, **kwargs) # noqa: E501
return data
def get_device_group_cluster_alert_conf_by_id_with_http_info(self, device_group_id, id, **kwargs): # noqa: E501
"""Get cluster alert configuration by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param int id: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_group_cluster_alert_conf_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `get_device_group_cluster_alert_conf_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_group_cluster_alert_conf_by_id`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `get_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/clusterAlertConf/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceClusterAlertConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_group_cluster_alert_conf_list(self, device_group_id, **kwargs): # noqa: E501
"""get a list of cluster alert configurations for a device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_cluster_alert_conf_list(device_group_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceClusterAlertConfigPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_group_cluster_alert_conf_list_with_http_info(device_group_id, **kwargs) # noqa: E501
else:
data = self.get_device_group_cluster_alert_conf_list_with_http_info(device_group_id, **kwargs) # noqa: E501
return data
def get_device_group_cluster_alert_conf_list_with_http_info(self, device_group_id, **kwargs): # noqa: E501
"""get a list of cluster alert configurations for a device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_cluster_alert_conf_list_with_http_info(device_group_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceClusterAlertConfigPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_group_cluster_alert_conf_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `get_device_group_cluster_alert_conf_list`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `get_device_group_cluster_alert_conf_list`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/clusterAlertConf', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceClusterAlertConfigPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_group_datasource_alert_setting(self, device_group_id, ds_id, **kwargs): # noqa: E501
"""get device group datasource alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_datasource_alert_setting(device_group_id, ds_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param int ds_id: (required)
:param str fields:
:return: DeviceGroupDataSourceAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, **kwargs) # noqa: E501
else:
data = self.get_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, **kwargs) # noqa: E501
return data
def get_device_group_datasource_alert_setting_with_http_info(self, device_group_id, ds_id, **kwargs): # noqa: E501
"""get device group datasource alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param int ds_id: (required)
:param str fields:
:return: DeviceGroupDataSourceAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'ds_id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_group_datasource_alert_setting" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `get_device_group_datasource_alert_setting`") # noqa: E501
# verify the required parameter 'ds_id' is set
if ('ds_id' not in params or
params['ds_id'] is None):
raise ValueError("Missing the required parameter `ds_id` when calling `get_device_group_datasource_alert_setting`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `get_device_group_datasource_alert_setting`, must conform to the pattern `/\d+/`") # noqa: E501
if 'ds_id' in params and not re.search(r'\d+', params['ds_id'] if isinstance(params['ds_id'], str) else str(params['ds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `ds_id` when calling `get_device_group_datasource_alert_setting`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
if 'ds_id' in params:
path_params['dsId'] = params['ds_id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/datasources/{dsId}/alertsettings', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupDataSourceAlertConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_group_datasource_by_id(self, device_group_id, id, **kwargs): # noqa: E501
"""get device group datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_datasource_by_id(device_group_id, id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param int id: (required)
:param str fields:
:return: DeviceGroupDataSource
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_group_datasource_by_id_with_http_info(device_group_id, id, **kwargs) # noqa: E501
else:
data = self.get_device_group_datasource_by_id_with_http_info(device_group_id, id, **kwargs) # noqa: E501
return data
def get_device_group_datasource_by_id_with_http_info(self, device_group_id, id, **kwargs): # noqa: E501
"""get device group datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_datasource_by_id_with_http_info(device_group_id, id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param int id: (required)
:param str fields:
:return: DeviceGroupDataSource
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_group_datasource_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `get_device_group_datasource_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_device_group_datasource_by_id`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `get_device_group_datasource_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_device_group_datasource_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/datasources/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupDataSource', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_group_datasource_list(self, device_group_id, **kwargs): # noqa: E501
"""get device group datasource list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_datasource_list(device_group_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param bool include_disabled_data_source_without_instance:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupDatasourcePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_group_datasource_list_with_http_info(device_group_id, **kwargs) # noqa: E501
else:
data = self.get_device_group_datasource_list_with_http_info(device_group_id, **kwargs) # noqa: E501
return data
def get_device_group_datasource_list_with_http_info(self, device_group_id, **kwargs): # noqa: E501
"""get device group datasource list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_datasource_list_with_http_info(device_group_id, async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param int device_group_id: (required)
:param bool include_disabled_data_source_without_instance:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupDatasourcePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'include_disabled_data_source_without_instance', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_group_datasource_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `get_device_group_datasource_list`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `get_device_group_datasource_list`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
query_params = []
if 'include_disabled_data_source_without_instance' in params:
query_params.append(('includeDisabledDataSourceWithoutInstance', params['include_disabled_data_source_without_instance'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/datasources', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupDatasourcePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_device_group_list(self, **kwargs): # noqa: E501
"""get device group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_list(async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_device_group_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_device_group_list_with_http_info(**kwargs) # noqa: E501
return data
def get_device_group_list_with_http_info(self, **kwargs): # noqa: E501
"""get device group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_device_group_list_with_http_info(async_req=True)
>>> result = thread.get()
:param bool async_req: execute request asynchronously
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_device_group_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
    def get_device_group_property_by_name(self, gid, name, **kwargs):  # noqa: E501
        """get device group property by name  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_group_property_by_name(gid, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int gid: group ID (required)
        :param str name: (required)
        :param str fields:
        :return: EntityProperty
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_group_property_by_name_with_http_info(gid, name, **kwargs)  # noqa: E501
        else:
            data = self.get_device_group_property_by_name_with_http_info(gid, name, **kwargs)  # noqa: E501
            return data

    def get_device_group_property_by_name_with_http_info(self, gid, name, **kwargs):  # noqa: E501
        """get device group property by name  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_group_property_by_name_with_http_info(gid, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int gid: group ID (required)
        :param str name: (required)
        :param str fields:
        :return: EntityProperty
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['gid', 'name', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_group_property_by_name" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'gid' is set
        if ('gid' not in params or
                params['gid'] is None):
            raise ValueError("Missing the required parameter `gid` when calling `get_device_group_property_by_name`")  # noqa: E501
        # verify the required parameter 'name' is set
        if ('name' not in params or
                params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `get_device_group_property_by_name`")  # noqa: E501

        if 'gid' in params and not re.search(r'\d+', params['gid'] if type(params['gid']) is str else str(params['gid'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `gid` when calling `get_device_group_property_by_name`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'name' in params and not re.search(r'[^\/]+', params['name'] if type(params['name']) is str else str(params['name'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `name` when calling `get_device_group_property_by_name`, must conform to the pattern `/[^\/]+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'gid' in params:
            path_params['gid'] = params['gid']  # noqa: E501
        if 'name' in params:
            path_params['name'] = params['name']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/groups/{gid}/properties/{name}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='EntityProperty',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_device_group_property_list(self, gid, **kwargs):  # noqa: E501
        """get device group properties  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_group_property_list(gid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int gid: group ID (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: PropertyPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_group_property_list_with_http_info(gid, **kwargs)  # noqa: E501
        else:
            data = self.get_device_group_property_list_with_http_info(gid, **kwargs)  # noqa: E501
            return data

    def get_device_group_property_list_with_http_info(self, gid, **kwargs):  # noqa: E501
        """get device group properties  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_group_property_list_with_http_info(gid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int gid: group ID (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: PropertyPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['gid', 'fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_group_property_list" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'gid' is set
        if ('gid' not in params or
                params['gid'] is None):
            raise ValueError("Missing the required parameter `gid` when calling `get_device_group_property_list`")  # noqa: E501

        if 'gid' in params and not re.search(r'\d+', params['gid'] if type(params['gid']) is str else str(params['gid'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `gid` when calling `get_device_group_property_list`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'gid' in params:
            path_params['gid'] = params['gid']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/groups/{gid}/properties', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PropertyPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_device_group_sdt_list(self, id, **kwargs):  # noqa: E501
        """get device group SDTs  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_group_sdt_list(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: SDTPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_group_sdt_list_with_http_info(id, **kwargs)  # noqa: E501
        else:
            data = self.get_device_group_sdt_list_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_device_group_sdt_list_with_http_info(self, id, **kwargs):  # noqa: E501
        """get device group SDTs  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_group_sdt_list_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: SDTPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_group_sdt_list" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_device_group_sdt_list`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_device_group_sdt_list`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/groups/{id}/sdts', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='SDTPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_device_instance_graph_data_only_by_instance_id(self, instance_id, graph_id, **kwargs):  # noqa: E501
        """get device instance data  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_instance_graph_data_only_by_instance_id(instance_id, graph_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int instance_id: (required)
        :param int graph_id: (required)
        :param int start:
        :param int end:
        :param str format:
        :return: GraphPlot
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_instance_graph_data_only_by_instance_id_with_http_info(instance_id, graph_id, **kwargs)  # noqa: E501
        else:
            data = self.get_device_instance_graph_data_only_by_instance_id_with_http_info(instance_id, graph_id, **kwargs)  # noqa: E501
            return data

    def get_device_instance_graph_data_only_by_instance_id_with_http_info(self, instance_id, graph_id, **kwargs):  # noqa: E501
        """get device instance data  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_instance_graph_data_only_by_instance_id_with_http_info(instance_id, graph_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int instance_id: (required)
        :param int graph_id: (required)
        :param int start:
        :param int end:
        :param str format:
        :return: GraphPlot
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['instance_id', 'graph_id', 'start', 'end', 'format']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_instance_graph_data_only_by_instance_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'instance_id' is set
        if ('instance_id' not in params or
                params['instance_id'] is None):
            raise ValueError("Missing the required parameter `instance_id` when calling `get_device_instance_graph_data_only_by_instance_id`")  # noqa: E501
        # verify the required parameter 'graph_id' is set
        if ('graph_id' not in params or
                params['graph_id'] is None):
            raise ValueError("Missing the required parameter `graph_id` when calling `get_device_instance_graph_data_only_by_instance_id`")  # noqa: E501

        if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if type(params['instance_id']) is str else str(params['instance_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `instance_id` when calling `get_device_instance_graph_data_only_by_instance_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'graph_id' in params and not re.search(r'-?\d+', params['graph_id'] if type(params['graph_id']) is str else str(params['graph_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `graph_id` when calling `get_device_instance_graph_data_only_by_instance_id`, must conform to the pattern `/-?\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'instance_id' in params:
            path_params['instanceId'] = params['instance_id']  # noqa: E501
        if 'graph_id' in params:
            path_params['graphId'] = params['graph_id']  # noqa: E501

        query_params = []
        if 'start' in params:
            query_params.append(('start', params['start']))  # noqa: E501
        if 'end' in params:
            query_params.append(('end', params['end']))  # noqa: E501
        if 'format' in params:
            query_params.append(('format', params['format']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devicedatasourceinstances/{instanceId}/graphs/{graphId}/data', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='GraphPlot',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_device_list(self, **kwargs):  # noqa: E501
        """get device list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int start:
        :param int end:
        :param str netflow_filter:
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DevicePaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_list_with_http_info(**kwargs)  # noqa: E501
        else:
            data = self.get_device_list_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_device_list_with_http_info(self, **kwargs):  # noqa: E501
        """get device list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int start:
        :param int end:
        :param str netflow_filter:
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: DevicePaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['start', 'end', 'netflow_filter', 'fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'start' in params:
            query_params.append(('start', params['start']))  # noqa: E501
        if 'end' in params:
            query_params.append(('end', params['end']))  # noqa: E501
        if 'netflow_filter' in params:
            query_params.append(('netflowFilter', params['netflow_filter']))  # noqa: E501
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DevicePaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_device_property_by_name(self, device_id, name, **kwargs):  # noqa: E501
        """get device property by name  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_property_by_name(device_id, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param str name: (required)
        :param str fields:
        :return: EntityProperty
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_property_by_name_with_http_info(device_id, name, **kwargs)  # noqa: E501
        else:
            data = self.get_device_property_by_name_with_http_info(device_id, name, **kwargs)  # noqa: E501
            return data

    def get_device_property_by_name_with_http_info(self, device_id, name, **kwargs):  # noqa: E501
        """get device property by name  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_property_by_name_with_http_info(device_id, name, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param str name: (required)
        :param str fields:
        :return: EntityProperty
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['device_id', 'name', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_property_by_name" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'device_id' is set
        if ('device_id' not in params or
                params['device_id'] is None):
            raise ValueError("Missing the required parameter `device_id` when calling `get_device_property_by_name`")  # noqa: E501
        # verify the required parameter 'name' is set
        if ('name' not in params or
                params['name'] is None):
            raise ValueError("Missing the required parameter `name` when calling `get_device_property_by_name`")  # noqa: E501

        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_property_by_name`, must conform to the pattern `/\d+/`")  # noqa: E501
        if 'name' in params and not re.search(r'[^\/]+', params['name'] if type(params['name']) is str else str(params['name'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `name` when calling `get_device_property_by_name`, must conform to the pattern `/[^\/]+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'device_id' in params:
            path_params['deviceId'] = params['device_id']  # noqa: E501
        if 'name' in params:
            path_params['name'] = params['name']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices/{deviceId}/properties/{name}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='EntityProperty',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_device_property_list(self, device_id, **kwargs):  # noqa: E501
        """get device properties  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_property_list(device_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: PropertyPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_device_property_list_with_http_info(device_id, **kwargs)  # noqa: E501
        else:
            data = self.get_device_property_list_with_http_info(device_id, **kwargs)  # noqa: E501
            return data

    def get_device_property_list_with_http_info(self, device_id, **kwargs):  # noqa: E501
        """get device properties  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_device_property_list_with_http_info(device_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int device_id: (required)
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: PropertyPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['device_id', 'fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_device_property_list" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'device_id' is set
        if ('device_id' not in params or
                params['device_id'] is None):
            raise ValueError("Missing the required parameter `device_id` when calling `get_device_property_list`")  # noqa: E501

        if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `device_id` when calling `get_device_property_list`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'device_id' in params:
            path_params['deviceId'] = params['device_id']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/device/devices/{deviceId}/properties', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='PropertyPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_escalation_chain_by_id(self, id, **kwargs):  # noqa: E501
        """get escalation chain by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_escalation_chain_by_id(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str fields:
        :return: EscalatingChain
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_escalation_chain_by_id_with_http_info(id, **kwargs)  # noqa: E501
        else:
            data = self.get_escalation_chain_by_id_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_escalation_chain_by_id_with_http_info(self, id, **kwargs):  # noqa: E501
        """get escalation chain by id  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_escalation_chain_by_id_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param str fields:
        :return: EscalatingChain
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'fields']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_escalation_chain_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `get_escalation_chain_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_escalation_chain_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/alert/chains/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='EscalatingChain',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_escalation_chain_list(self, **kwargs):  # noqa: E501
        """get escalation chain list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_escalation_chain_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: EscalationChainPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_escalation_chain_list_with_http_info(**kwargs)  # noqa: E501
        else:
            data = self.get_escalation_chain_list_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_escalation_chain_list_with_http_info(self, **kwargs):  # noqa: E501
        """get escalation chain list  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_escalation_chain_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str fields:
        :param int size:
        :param int offset:
        :param str filter:
        :return: EscalationChainPaginationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['fields', 'size', 'offset', 'filter']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_escalation_chain_list" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'fields' in params:
            query_params.append(('fields', params['fields']))  # noqa: E501
        if 'size' in params:
            query_params.append(('size', params['size']))  # noqa: E501
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'filter' in params:
            query_params.append(('filter', params['filter']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/alert/chains', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='EscalationChainPaginationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
def get_immediate_device_list_by_device_group_id(self, id, **kwargs): # noqa: E501
"""get immediate devices under group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_immediate_device_list_by_device_group_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DevicePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_immediate_device_list_by_device_group_id_with_http_info(id, **kwargs) # noqa: E501
else:
            data = self.get_immediate_device_list_by_device_group_id_with_http_info(id, **kwargs)  # noqa: E501
return data
def get_immediate_device_list_by_device_group_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get immediate devices under group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_immediate_device_list_by_device_group_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DevicePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_immediate_device_list_by_device_group_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_immediate_device_list_by_device_group_id`") # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_immediate_device_list_by_device_group_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{id}/devices', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DevicePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_immediate_website_list_by_website_group_id(self, id, **kwargs): # noqa: E501
"""get a list of websites for a group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_immediate_website_list_by_website_group_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WebsitePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_immediate_website_list_by_website_group_id_with_http_info(id, **kwargs) # noqa: E501
else:
            data = self.get_immediate_website_list_by_website_group_id_with_http_info(id, **kwargs)  # noqa: E501
return data
def get_immediate_website_list_by_website_group_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get a list of websites for a group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_immediate_website_list_by_website_group_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WebsitePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_immediate_website_list_by_website_group_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_immediate_website_list_by_website_group_id`") # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_immediate_website_list_by_website_group_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/groups/{id}/websites', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WebsitePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_netflow_endpoint_list(self, id, **kwargs): # noqa: E501
"""get netflow endpoint list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netflow_endpoint_list(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str port:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: EndpointPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_netflow_endpoint_list_with_http_info(id, **kwargs) # noqa: E501
else:
            data = self.get_netflow_endpoint_list_with_http_info(id, **kwargs)  # noqa: E501
return data
def get_netflow_endpoint_list_with_http_info(self, id, **kwargs): # noqa: E501
"""get netflow endpoint list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netflow_endpoint_list_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str port:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: EndpointPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter', 'port', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_netflow_endpoint_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_netflow_endpoint_list`") # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_netflow_endpoint_list`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'port' in params:
query_params.append(('port', params['port'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/endpoints', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EndpointPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_netflow_flow_list(self, id, **kwargs): # noqa: E501
"""get netflow flow list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netflow_flow_list(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: FlowRecordPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_netflow_flow_list_with_http_info(id, **kwargs) # noqa: E501
else:
            data = self.get_netflow_flow_list_with_http_info(id, **kwargs)  # noqa: E501
return data
def get_netflow_flow_list_with_http_info(self, id, **kwargs): # noqa: E501
"""get netflow flow list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netflow_flow_list_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: FlowRecordPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_netflow_flow_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_netflow_flow_list`") # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_netflow_flow_list`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/flows', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='FlowRecordPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_netflow_port_list(self, id, **kwargs): # noqa: E501
"""get netflow port list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netflow_port_list(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str ip:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: PortPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_netflow_port_list_with_http_info(id, **kwargs) # noqa: E501
else:
            data = self.get_netflow_port_list_with_http_info(id, **kwargs)  # noqa: E501
return data
def get_netflow_port_list_with_http_info(self, id, **kwargs): # noqa: E501
"""get netflow port list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netflow_port_list_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str ip:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: PortPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter', 'ip', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_netflow_port_list" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_netflow_port_list`") # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_netflow_port_list`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'ip' in params:
query_params.append(('ip', params['ip'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/ports', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PortPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_netscan_by_id(self, id, **kwargs): # noqa: E501
"""get netscan by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netscan_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_netscan_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
            data = self.get_netscan_by_id_with_http_info(id, **kwargs)  # noqa: E501
return data
def get_netscan_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get netscan by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netscan_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_netscan_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_netscan_by_id`") # noqa: E501
        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_netscan_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/netscans/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Netscan', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_netscan_list(self, **kwargs): # noqa: E501
"""get netscan list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netscan_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: NetscanPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_netscan_list_with_http_info(**kwargs) # noqa: E501
else:
            data = self.get_netscan_list_with_http_info(**kwargs)  # noqa: E501
return data
def get_netscan_list_with_http_info(self, **kwargs): # noqa: E501
"""get netscan list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_netscan_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: NetscanPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_netscan_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/netscans', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='NetscanPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_ops_note_by_id(self, id, **kwargs): # noqa: E501
"""get opsnote by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_ops_note_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param str fields:
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_ops_note_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
            data = self.get_ops_note_by_id_with_http_info(id, **kwargs)  # noqa: E501
return data
def get_ops_note_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get opsnote by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_ops_note_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param str fields:
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_ops_note_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_ops_note_by_id`") # noqa: E501
        if 'id' in params and not re.search(r'[^\/]+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `get_ops_note_by_id`, must conform to the pattern `/[^\/]+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/opsnotes/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OpsNote', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_ops_note_list(self, **kwargs): # noqa: E501
"""get opsnote list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_ops_note_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: OpsNotePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_ops_note_list_with_http_info(**kwargs) # noqa: E501
else:
            data = self.get_ops_note_list_with_http_info(**kwargs)  # noqa: E501
return data
def get_ops_note_list_with_http_info(self, **kwargs): # noqa: E501
"""get opsnote list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_ops_note_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: OpsNotePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_ops_note_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/opsnotes', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OpsNotePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_recipient_group_by_id(self, id, **kwargs): # noqa: E501
"""get recipient group by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_recipient_group_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_recipient_group_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_recipient_group_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_recipient_group_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get recipient group by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_recipient_group_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_recipient_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_recipient_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_recipient_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/recipientgroups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RecipientGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_recipient_group_list(self, **kwargs): # noqa: E501
"""get recipient group List # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_recipient_group_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: RecipientGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_recipient_group_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_recipient_group_list_with_http_info(**kwargs) # noqa: E501
return data
def get_recipient_group_list_with_http_info(self, **kwargs): # noqa: E501
"""get recipient group List # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_recipient_group_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: RecipientGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_recipient_group_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/recipientgroups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RecipientGroupPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_report_by_id(self, id, **kwargs): # noqa: E501
"""get report by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_report_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_report_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_report_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get report by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_report_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_report_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_report_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/reports/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportBase', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_report_group_by_id(self, id, **kwargs): # noqa: E501
"""get report group by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_group_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_report_group_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_report_group_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_report_group_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get report group by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_group_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_report_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_report_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_report_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/groups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_report_group_list(self, **kwargs): # noqa: E501
"""get report group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_group_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: ReportGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_report_group_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_report_group_list_with_http_info(**kwargs) # noqa: E501
return data
def get_report_group_list_with_http_info(self, **kwargs): # noqa: E501
"""get report group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_group_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: ReportGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_report_group_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/groups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportGroupPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_report_list(self, **kwargs): # noqa: E501
"""get report list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: ReportPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_report_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_report_list_with_http_info(**kwargs) # noqa: E501
return data
def get_report_list_with_http_info(self, **kwargs): # noqa: E501
"""get report list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_report_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: ReportPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_report_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/reports', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_role_by_id(self, id, **kwargs): # noqa: E501
"""get role by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_role_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_role_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_role_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_role_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get role by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_role_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_role_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_role_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_role_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/roles/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Role', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_role_list(self, **kwargs): # noqa: E501
"""get role list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_role_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: RolePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_role_list_with_http_info(**kwargs) # noqa: E501
else:
data = self.get_role_list_with_http_info(**kwargs) # noqa: E501
return data
def get_role_list_with_http_info(self, **kwargs): # noqa: E501
"""get role list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_role_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: RolePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_role_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/roles', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RolePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_sdt_by_id(self, id, **kwargs): # noqa: E501
"""get SDT by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param str fields:
:return: SDT
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_sdt_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.get_sdt_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_sdt_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get SDT by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param str fields:
:return: SDT
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_sdt_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_sdt_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/sdt/sdts/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SDT', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_sdt_history_by_device_data_source_id(self, device_id, id, **kwargs): # noqa: E501
"""get SDT history for the device dataSource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_history_by_device_data_source_id(device_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDataSourceSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_sdt_history_by_device_data_source_id_with_http_info(device_id, id, **kwargs) # noqa: E501
else:
data = self.get_sdt_history_by_device_data_source_id_with_http_info(device_id, id, **kwargs) # noqa: E501
return data
def get_sdt_history_by_device_data_source_id_with_http_info(self, device_id, id, **kwargs): # noqa: E501
"""get SDT history for the device dataSource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_history_by_device_data_source_id_with_http_info(device_id, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceDataSourceSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_sdt_history_by_device_data_source_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `get_sdt_history_by_device_data_source_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_sdt_history_by_device_data_source_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `get_sdt_history_by_device_data_source_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_sdt_history_by_device_data_source_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{id}/historysdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceSDTHistoryPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_sdt_history_by_device_group_id(self, id, **kwargs): # noqa: E501
"""get SDT history for the group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_history_by_device_group_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_sdt_history_by_device_group_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_sdt_history_by_device_group_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_sdt_history_by_device_group_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get SDT history for the group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_history_by_device_group_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceGroupSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_sdt_history_by_device_group_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_sdt_history_by_device_group_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
    raise ValueError(r"Invalid value for parameter `id` when calling `get_sdt_history_by_device_group_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{id}/historysdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupSDTHistoryPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
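# Usage sketch (hypothetical values; `api` is an instance of this API class,
# as in the docstring examples above, and 42 is a placeholder group id):
#
#     history = api.get_sdt_history_by_device_group_id(42, size=50)
#     for sdt in history.items:  # `items` assumed from the pagination response model
#         print(sdt)
#
# With async_req=True the call returns a thread instead of the data:
#
#     thread = api.get_sdt_history_by_device_group_id(42, async_req=True)
#     history = thread.get()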
def get_sdt_history_by_device_id(self, id, **kwargs): # noqa: E501
"""get SDT history for the device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_history_by_device_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_sdt_history_by_device_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_sdt_history_by_device_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_sdt_history_by_device_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get SDT history for the device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_history_by_device_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DeviceSDTHistoryPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_sdt_history_by_device_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_sdt_history_by_device_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
    raise ValueError(r"Invalid value for parameter `id` when calling `get_sdt_history_by_device_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/historysdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceSDTHistoryPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_sdt_list(self, **kwargs): # noqa: E501
"""get SDT list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_sdt_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_sdt_list_with_http_info(**kwargs) # noqa: E501
return data
def get_sdt_list_with_http_info(self, **kwargs): # noqa: E501
"""get SDT list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_sdt_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_sdt_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/sdt/sdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SDTPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
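# Usage sketch (hypothetical filter expression; `api` as in the docstring
# examples above). The `filter` grammar shown is an assumption, not verified:
#
#     sdts = api.get_sdt_list(size=25, filter='type:"ResourceSDT"')
#     print(sdts.total)  # `total` assumed from the pagination response model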
def get_site_monitor_check_point_list(self, **kwargs): # noqa: E501
"""get website checkpoint list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_site_monitor_check_point_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SiteMonitorCheckPointPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_site_monitor_check_point_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_site_monitor_check_point_list_with_http_info(**kwargs) # noqa: E501
return data
def get_site_monitor_check_point_list_with_http_info(self, **kwargs): # noqa: E501
"""get website checkpoint list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_site_monitor_check_point_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SiteMonitorCheckPointPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_site_monitor_check_point_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/smcheckpoints', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SiteMonitorCheckPointPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_top_talkers_graph(self, id, **kwargs): # noqa: E501
"""get top talkers graph # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_top_talkers_graph(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str format:
:param str keyword:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_top_talkers_graph_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_top_talkers_graph_with_http_info(id, **kwargs) # noqa: E501
return data
def get_top_talkers_graph_with_http_info(self, id, **kwargs): # noqa: E501
"""get top talkers graph # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_top_talkers_graph_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str format:
:param str keyword:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter', 'format', 'keyword'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_top_talkers_graph" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_top_talkers_graph`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
    raise ValueError(r"Invalid value for parameter `id` when calling `get_top_talkers_graph`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
if 'keyword' in params:
query_params.append(('keyword', params['keyword'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/topTalkersGraph', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GraphPlot', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
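# Usage sketch (hypothetical device id and epoch-second time range; `api` as
# in the docstring examples above):
#
#     plot = api.get_top_talkers_graph(42, start=1600000000, end=1600003600)
#     # `plot` is a GraphPlot model; inspect its attributes for the series data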
def get_unmonitored_device_list(self, **kwargs): # noqa: E501
"""get unmonitored device list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_unmonitored_device_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: UnmonitoredDevicePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_unmonitored_device_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_unmonitored_device_list_with_http_info(**kwargs) # noqa: E501
return data
def get_unmonitored_device_list_with_http_info(self, **kwargs): # noqa: E501
"""get unmonitored device list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_unmonitored_device_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: UnmonitoredDevicePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_unmonitored_device_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/unmonitoreddevices', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UnmonitoredDevicePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_update_reason_list_by_data_source_id(self, id, **kwargs): # noqa: E501
"""get update history for a datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_update_reason_list_by_data_source_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DataSourceUpdateReasonsPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_update_reason_list_by_data_source_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_update_reason_list_by_data_source_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_update_reason_list_by_data_source_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get update history for a datasource # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_update_reason_list_by_data_source_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: DataSourceUpdateReasonsPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_update_reason_list_by_data_source_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_update_reason_list_by_data_source_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
    raise ValueError(r"Invalid value for parameter `id` when calling `get_update_reason_list_by_data_source_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/datasources/{id}/updatereasons', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DataSourceUpdateReasonsPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_alert_list_by_website_id(self, id, **kwargs): # noqa: E501
"""get alerts for a website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_alert_list_by_website_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool need_message:
:param str custom_columns:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_alert_list_by_website_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_website_alert_list_by_website_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_website_alert_list_by_website_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get alerts for a website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_alert_list_by_website_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param bool need_message:
:param str custom_columns:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: AlertPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'need_message', 'custom_columns', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_alert_list_by_website_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_website_alert_list_by_website_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
    raise ValueError(r"Invalid value for parameter `id` when calling `get_website_alert_list_by_website_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'need_message' in params:
query_params.append(('needMessage', params['need_message'])) # noqa: E501
if 'custom_columns' in params:
query_params.append(('customColumns', params['custom_columns'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{id}/alerts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_by_id(self, id, **kwargs): # noqa: E501
"""get website by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str format:
:return: Website
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_website_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_website_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get website by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str format:
:return: Website
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_website_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
    raise ValueError(r"Invalid value for parameter `id` when calling `get_website_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Website', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
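# Usage sketch (hypothetical website id; `api` as in the docstring examples
# above):
#
#     website = api.get_website_by_id(123)
#
# Calling the _with_http_info variant directly also yields the HTTP status
# and headers (tuple shape assumed from the generated-client convention):
#
#     data, status, headers = api.get_website_by_id_with_http_info(123)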
def get_website_checkpoint_data_by_id(self, srv_id, check_id, **kwargs): # noqa: E501
"""get data for a website checkpoint # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_checkpoint_data_by_id(srv_id, check_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int srv_id: (required)
:param int check_id: (required)
:param float period:
:param int start:
:param int end:
:param str datapoints:
:param str format:
:return: WebsiteCheckpointRawData
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_checkpoint_data_by_id_with_http_info(srv_id, check_id, **kwargs) # noqa: E501
else:
(data) = self.get_website_checkpoint_data_by_id_with_http_info(srv_id, check_id, **kwargs) # noqa: E501
return data
def get_website_checkpoint_data_by_id_with_http_info(self, srv_id, check_id, **kwargs): # noqa: E501
"""get data for a website checkpoint # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_checkpoint_data_by_id_with_http_info(srv_id, check_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int srv_id: (required)
:param int check_id: (required)
:param float period:
:param int start:
:param int end:
:param str datapoints:
:param str format:
:return: WebsiteCheckpointRawData
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['srv_id', 'check_id', 'period', 'start', 'end', 'datapoints', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_checkpoint_data_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'srv_id' is set
if ('srv_id' not in params or
params['srv_id'] is None):
raise ValueError("Missing the required parameter `srv_id` when calling `get_website_checkpoint_data_by_id`") # noqa: E501
# verify the required parameter 'check_id' is set
if ('check_id' not in params or
params['check_id'] is None):
raise ValueError("Missing the required parameter `check_id` when calling `get_website_checkpoint_data_by_id`") # noqa: E501
if 'srv_id' in params and not re.search(r'\d+', params['srv_id'] if type(params['srv_id']) is str else str(params['srv_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `srv_id` when calling `get_website_checkpoint_data_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'check_id' in params and not re.search(r'\d+', params['check_id'] if type(params['check_id']) is str else str(params['check_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `check_id` when calling `get_website_checkpoint_data_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'srv_id' in params:
path_params['srvId'] = params['srv_id'] # noqa: E501
if 'check_id' in params:
path_params['checkId'] = params['check_id'] # noqa: E501
query_params = []
if 'period' in params:
query_params.append(('period', params['period'])) # noqa: E501
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'datapoints' in params:
query_params.append(('datapoints', params['datapoints'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{srvId}/checkpoints/{checkId}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WebsiteCheckpointRawData', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_data_by_graph_name(self, id, graph_name, **kwargs): # noqa: E501
"""get website data by graph name # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_data_by_graph_name(id, graph_name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str graph_name: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_data_by_graph_name_with_http_info(id, graph_name, **kwargs) # noqa: E501
else:
(data) = self.get_website_data_by_graph_name_with_http_info(id, graph_name, **kwargs) # noqa: E501
return data
def get_website_data_by_graph_name_with_http_info(self, id, graph_name, **kwargs): # noqa: E501
"""get website data by graph name # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_data_by_graph_name_with_http_info(id, graph_name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str graph_name: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'graph_name', 'start', 'end', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_data_by_graph_name" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_website_data_by_graph_name`") # noqa: E501
# verify the required parameter 'graph_name' is set
if ('graph_name' not in params or
params['graph_name'] is None):
raise ValueError("Missing the required parameter `graph_name` when calling `get_website_data_by_graph_name`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_website_data_by_graph_name`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'graph_name' in params and not re.search('.+', params['graph_name'] if type(params['graph_name']) is str else str(params['graph_name'])): # noqa: E501
raise ValueError("Invalid value for parameter `graph_name` when calling `get_website_data_by_graph_name`, must conform to the pattern `/.+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
if 'graph_name' in params:
path_params['graphName'] = params['graph_name'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{id}/graphs/{graphName}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GraphPlot', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_graph_data(self, website_id, checkpoint_id, graph_name, **kwargs): # noqa: E501
"""get website graph data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_graph_data(website_id, checkpoint_id, graph_name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int website_id: (required)
:param int checkpoint_id: (required)
:param str graph_name: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_graph_data_with_http_info(website_id, checkpoint_id, graph_name, **kwargs) # noqa: E501
else:
(data) = self.get_website_graph_data_with_http_info(website_id, checkpoint_id, graph_name, **kwargs) # noqa: E501
return data
def get_website_graph_data_with_http_info(self, website_id, checkpoint_id, graph_name, **kwargs): # noqa: E501
"""get website graph data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_graph_data_with_http_info(website_id, checkpoint_id, graph_name, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int website_id: (required)
:param int checkpoint_id: (required)
:param str graph_name: (required)
:param int start:
:param int end:
:param str format:
:return: GraphPlot
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['website_id', 'checkpoint_id', 'graph_name', 'start', 'end', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_graph_data" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'website_id' is set
if ('website_id' not in params or
params['website_id'] is None):
raise ValueError("Missing the required parameter `website_id` when calling `get_website_graph_data`") # noqa: E501
# verify the required parameter 'checkpoint_id' is set
if ('checkpoint_id' not in params or
params['checkpoint_id'] is None):
raise ValueError("Missing the required parameter `checkpoint_id` when calling `get_website_graph_data`") # noqa: E501
# verify the required parameter 'graph_name' is set
if ('graph_name' not in params or
params['graph_name'] is None):
raise ValueError("Missing the required parameter `graph_name` when calling `get_website_graph_data`") # noqa: E501
if 'website_id' in params and not re.search(r'\d+', params['website_id'] if type(params['website_id']) is str else str(params['website_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `website_id` when calling `get_website_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'checkpoint_id' in params and not re.search(r'\d+', params['checkpoint_id'] if type(params['checkpoint_id']) is str else str(params['checkpoint_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `checkpoint_id` when calling `get_website_graph_data`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'graph_name' in params and not re.search('.+', params['graph_name'] if type(params['graph_name']) is str else str(params['graph_name'])): # noqa: E501
raise ValueError("Invalid value for parameter `graph_name` when calling `get_website_graph_data`, must conform to the pattern `/.+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'website_id' in params:
path_params['websiteId'] = params['website_id'] # noqa: E501
if 'checkpoint_id' in params:
path_params['checkpointId'] = params['checkpoint_id'] # noqa: E501
if 'graph_name' in params:
path_params['graphName'] = params['graph_name'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{websiteId}/checkpoints/{checkpointId}/graphs/{graphName}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='GraphPlot', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_group_by_id(self, id, **kwargs): # noqa: E501
"""get website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_group_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: WebsiteGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_group_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_website_group_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_website_group_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_group_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: WebsiteGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_website_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_website_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/groups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WebsiteGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_group_list(self, **kwargs): # noqa: E501
"""get website group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_group_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WebsiteGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_group_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_website_group_list_with_http_info(**kwargs) # noqa: E501
return data
def get_website_group_list_with_http_info(self, **kwargs): # noqa: E501
"""get website group list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_group_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WebsiteGroupPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_group_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/groups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WebsiteGroupPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_list(self, **kwargs): # noqa: E501
"""get website list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str collector_ids:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WebsitePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_website_list_with_http_info(**kwargs) # noqa: E501
return data
def get_website_list_with_http_info(self, **kwargs): # noqa: E501
"""get website list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str collector_ids:
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WebsitePaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['collector_ids', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'collector_ids' in params:
query_params.append(('collectorIds', params['collector_ids'])) # noqa: E501
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WebsitePaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_property_list_by_website_id(self, id, **kwargs): # noqa: E501
"""get a list of properties for a website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_property_list_by_website_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: PropertyPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_property_list_by_website_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_website_property_list_by_website_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_website_property_list_by_website_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get a list of properties for a website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_property_list_by_website_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: PropertyPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_property_list_by_website_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_website_property_list_by_website_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_website_property_list_by_website_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{id}/properties', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PropertyPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_website_sdt_list_by_website_id(self, id, **kwargs): # noqa: E501
"""get a list of SDTs for a website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_sdt_list_by_website_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_website_sdt_list_by_website_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_website_sdt_list_by_website_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_website_sdt_list_by_website_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get a list of SDTs for a website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_website_sdt_list_by_website_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: SDTPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_website_sdt_list_by_website_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_website_sdt_list_by_website_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_website_sdt_list_by_website_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{id}/sdts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SDTPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_widget_by_id(self, id, **kwargs): # noqa: E501
"""get widget by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: Widget
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_widget_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_widget_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_widget_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get widget by id # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:return: Widget
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_widget_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_widget_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_widget_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/widgets/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Widget', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_widget_data_by_id(self, id, **kwargs): # noqa: E501
"""get widget data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_data_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str format:
:return: WidgetData
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_widget_data_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_widget_data_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_widget_data_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get widget data # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_data_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str format:
:return: WidgetData
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'format'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_widget_data_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_widget_data_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_widget_data_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'format' in params:
query_params.append(('format', params['format'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/widgets/{id}/data', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WidgetData', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_widget_list(self, **kwargs): # noqa: E501
"""get widget list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WidgetPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_widget_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_widget_list_with_http_info(**kwargs) # noqa: E501
return data
def get_widget_list_with_http_info(self, **kwargs): # noqa: E501
"""get widget list # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WidgetPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_widget_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/widgets', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WidgetPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_widget_list_by_dashboard_id(self, id, **kwargs): # noqa: E501
"""get widget list by DashboardId # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_list_by_dashboard_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WidgetPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_widget_list_by_dashboard_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_widget_list_by_dashboard_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_widget_list_by_dashboard_id_with_http_info(self, id, **kwargs): # noqa: E501
"""get widget list by DashboardId # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_widget_list_by_dashboard_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param str fields:
:param int size:
:param int offset:
:param str filter:
:return: WidgetPaginationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'fields', 'size', 'offset', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_widget_list_by_dashboard_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `get_widget_list_by_dashboard_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `get_widget_list_by_dashboard_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'fields' in params:
query_params.append(('fields', params['fields'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'filter' in params:
query_params.append(('filter', params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/dashboards/{id}/widgets', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WidgetPaginationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def import_batch_job(self, file, **kwargs): # noqa: E501
"""import batch job via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_batch_job(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.import_batch_job_with_http_info(file, **kwargs) # noqa: E501
else:
(data) = self.import_batch_job_with_http_info(file, **kwargs) # noqa: E501
return data
def import_batch_job_with_http_info(self, file, **kwargs): # noqa: E501
"""import batch job via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_batch_job_with_http_info(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method import_batch_job" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'file' is set
if ('file' not in params or
params['file'] is None):
raise ValueError("Missing the required parameter `file` when calling `import_batch_job`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in params:
local_var_files['file'] = params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/batchjobs/importxml', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def import_config_source(self, file, **kwargs): # noqa: E501
"""import config source via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_config_source(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.import_config_source_with_http_info(file, **kwargs) # noqa: E501
else:
(data) = self.import_config_source_with_http_info(file, **kwargs) # noqa: E501
return data
def import_config_source_with_http_info(self, file, **kwargs): # noqa: E501
"""import config source via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_config_source_with_http_info(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method import_config_source" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'file' is set
if ('file' not in params or
params['file'] is None):
raise ValueError("Missing the required parameter `file` when calling `import_config_source`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in params:
local_var_files['file'] = params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/configsources/importxml', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def import_data_source(self, file, **kwargs): # noqa: E501
"""import datasource via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_data_source(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.import_data_source_with_http_info(file, **kwargs) # noqa: E501
else:
(data) = self.import_data_source_with_http_info(file, **kwargs) # noqa: E501
return data
def import_data_source_with_http_info(self, file, **kwargs): # noqa: E501
"""import datasource via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_data_source_with_http_info(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method import_data_source" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'file' is set
if ('file' not in params or
params['file'] is None):
raise ValueError("Missing the required parameter `file` when calling `import_data_source`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in params:
local_var_files['file'] = params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/datasources/importxml', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def import_event_source(self, file, **kwargs): # noqa: E501
"""import eventsource via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_event_source(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.import_event_source_with_http_info(file, **kwargs) # noqa: E501
else:
(data) = self.import_event_source_with_http_info(file, **kwargs) # noqa: E501
return data
def import_event_source_with_http_info(self, file, **kwargs): # noqa: E501
"""import eventsource via xml # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.import_event_source_with_http_info(file, async_req=True)
>>> result = thread.get()
:param async_req bool
:param file file: (required)
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['file'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method import_event_source" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'file' is set
if ('file' not in params or
params['file'] is None):
raise ValueError("Missing the required parameter `file` when calling `import_event_source`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in params:
local_var_files['file'] = params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/eventsources/importxml', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_admin_by_id(self, id, body, **kwargs): # noqa: E501
"""update user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_admin_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Admin body: (required)
:param bool change_password:
:return: Admin
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_admin_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_admin_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_admin_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_admin_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Admin body: (required)
:param bool change_password:
:return: Admin
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'change_password'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_admin_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_admin_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_admin_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_admin_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'change_password' in params:
query_params.append(('changePassword', params['change_password'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Admin', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_alert_rule_by_id(self, id, body, **kwargs): # noqa: E501
"""update alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_alert_rule_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param AlertRule body: (required)
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_alert_rule_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_alert_rule_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_alert_rule_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_alert_rule_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param AlertRule body: (required)
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_alert_rule_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_alert_rule_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_alert_rule_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_alert_rule_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/rules/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertRule', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_api_token_by_admin_id(self, admin_id, apitoken_id, body, **kwargs): # noqa: E501
"""update api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_api_token_by_admin_id(admin_id, apitoken_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param int apitoken_id: (required)
:param APIToken body: (required)
:return: APIToken
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_api_token_by_admin_id_with_http_info(admin_id, apitoken_id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_api_token_by_admin_id_with_http_info(admin_id, apitoken_id, body, **kwargs) # noqa: E501
return data
def patch_api_token_by_admin_id_with_http_info(self, admin_id, apitoken_id, body, **kwargs): # noqa: E501
"""update api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_api_token_by_admin_id_with_http_info(admin_id, apitoken_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param int apitoken_id: (required)
:param APIToken body: (required)
:return: APIToken
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['admin_id', 'apitoken_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_api_token_by_admin_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'admin_id' is set
if ('admin_id' not in params or
params['admin_id'] is None):
raise ValueError("Missing the required parameter `admin_id` when calling `patch_api_token_by_admin_id`") # noqa: E501
# verify the required parameter 'apitoken_id' is set
if ('apitoken_id' not in params or
params['apitoken_id'] is None):
raise ValueError("Missing the required parameter `apitoken_id` when calling `patch_api_token_by_admin_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_api_token_by_admin_id`") # noqa: E501
if 'admin_id' in params and not re.search(r'\d+', params['admin_id'] if type(params['admin_id']) is str else str(params['admin_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `admin_id` when calling `patch_api_token_by_admin_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'apitoken_id' in params and not re.search(r'\d+', params['apitoken_id'] if type(params['apitoken_id']) is str else str(params['apitoken_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `apitoken_id` when calling `patch_api_token_by_admin_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'admin_id' in params:
path_params['adminId'] = params['admin_id'] # noqa: E501
if 'apitoken_id' in params:
path_params['apitokenId'] = params['apitoken_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/{adminId}/apitokens/{apitokenId}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='APIToken', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
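# Usage sketch for the method above (hedged: the `logicmonitor_sdk` package,
# `Configuration`/`ApiClient`/`LMApi` names, and the host value are assumptions
# based on the usual swagger-codegen layout, not taken from this file):
#
#     import logicmonitor_sdk
#     conf = logicmonitor_sdk.Configuration()
#     conf.host = 'https://ACCOUNT.logicmonitor.com/santaba/rest'
#     api = logicmonitor_sdk.LMApi(logicmonitor_sdk.ApiClient(conf))
#
#     # Synchronous call: returns the deserialized APIToken directly.
#     token = api.patch_api_token_by_admin_id(admin_id, apitoken_id, body)
#
#     # With async_req=True the same call returns a thread-like object;
#     # call .get() to block for the result.
#     thread = api.patch_api_token_by_admin_id(admin_id, apitoken_id, body,
#                                              async_req=True)
#     token = thread.get()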
def patch_collector_by_id(self, id, body, **kwargs): # noqa: E501
"""update collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_collector_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Collector body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_collector_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_collector_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_collector_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_collector_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Collector body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'collector_load_balanced', 'force_update_failed_over_devices'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_collector_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_collector_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_collector_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_collector_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'collector_load_balanced' in params:
query_params.append(('collectorLoadBalanced', params['collector_load_balanced'])) # noqa: E501
if 'force_update_failed_over_devices' in params:
query_params.append(('forceUpdateFailedOverDevices', params['force_update_failed_over_devices'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/collectors/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collector', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_collector_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_collector_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param CollectorGroup body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_collector_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_collector_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_collector_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_collector_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param CollectorGroup body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'collector_load_balanced', 'force_update_failed_over_devices'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_collector_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_collector_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_collector_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_collector_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'collector_load_balanced' in params:
query_params.append(('collectorLoadBalanced', params['collector_load_balanced'])) # noqa: E501
if 'force_update_failed_over_devices' in params:
query_params.append(('forceUpdateFailedOverDevices', params['force_update_failed_over_devices'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/groups/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CollectorGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_dashboard_by_id(self, id, body, **kwargs): # noqa: E501
"""update dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_dashboard_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Dashboard body: (required)
:param bool overwrite_group_fields:
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_dashboard_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_dashboard_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_dashboard_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_dashboard_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Dashboard body: (required)
:param bool overwrite_group_fields:
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'overwrite_group_fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_dashboard_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_dashboard_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_dashboard_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_dashboard_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'overwrite_group_fields' in params:
query_params.append(('overwriteGroupFields', params['overwrite_group_fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/dashboards/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Dashboard', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
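# Usage sketch for the optional keyword above (hedged: the semantics of
# `overwrite_group_fields` are inferred from the generated signature, not
# documented in this file):
#
#     dashboard = api.patch_dashboard_by_id(dash_id, body,
#                                           overwrite_group_fields=True)
#
# The keyword is checked against `all_params` and serialized as the
# `overwriteGroupFields` query parameter on the PATCH request.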
def patch_dashboard_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_dashboard_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DashboardGroup body: (required)
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_dashboard_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_dashboard_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_dashboard_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_dashboard_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DashboardGroup body: (required)
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_dashboard_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_dashboard_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_dashboard_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_dashboard_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/groups/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DashboardGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_device(self, id, body, **kwargs): # noqa: E501
"""update a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Device body: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str op_type:
:return: Device
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_device_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_device_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Device body: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str op_type:
:return: Device
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'start', 'end', 'netflow_filter', 'op_type'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_device`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_device`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'op_type' in params:
query_params.append(('opType', params['op_type'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Device', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
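# Usage sketch for the `op_type` keyword above (hedged: the accepted values
# such as 'replace', 'refresh', or 'add' are an assumption drawn from
# LogicMonitor's REST PATCH conventions; verify against your API version):
#
#     device = api.patch_device(device_id, body, op_type='replace')
#
# `op_type` is serialized as the `opType` query parameter and controls how
# the fields in `body` are merged into the existing device record.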
def patch_device_datasource_instance_alert_setting_by_id(self, device_id, hds_id, instance_id, id, body, **kwargs): # noqa: E501
"""update device instance alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_datasource_instance_alert_setting_by_id(device_id, hds_id, instance_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:param int id: (required)
:param DeviceDataSourceInstanceAlertSetting body: (required)
:return: DeviceDataSourceInstanceAlertSetting
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, body, **kwargs) # noqa: E501
return data
def patch_device_datasource_instance_alert_setting_by_id_with_http_info(self, device_id, hds_id, instance_id, id, body, **kwargs): # noqa: E501
"""update device instance alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:param int id: (required)
:param DeviceDataSourceInstanceAlertSetting body: (required)
:return: DeviceDataSourceInstanceAlertSetting
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'instance_id', 'id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_datasource_instance_alert_setting_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `patch_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `patch_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params or
params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `patch_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_datasource_instance_alert_setting_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `patch_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if type(params['hds_id']) is str else str(params['hds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `patch_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if type(params['instance_id']) is str else str(params['instance_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `instance_id` when calling `patch_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{instanceId}/alertsettings/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceAlertSetting', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_device_datasource_instance_by_id(self, device_id, hds_id, id, body, **kwargs): # noqa: E501
"""update device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_datasource_instance_by_id(device_id, hds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param DeviceDataSourceInstance body: (required)
:param str op_type:
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, body, **kwargs) # noqa: E501
return data
def patch_device_datasource_instance_by_id_with_http_info(self, device_id, hds_id, id, body, **kwargs): # noqa: E501
"""update device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param DeviceDataSourceInstance body: (required)
:param str op_type:
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'id', 'body', 'op_type'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_datasource_instance_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `patch_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `patch_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_datasource_instance_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if type(params['device_id']) is str else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `patch_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if type(params['hds_id']) is str else str(params['hds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `patch_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'op_type' in params:
query_params.append(('opType', params['op_type'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstance', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
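# Note: each `*_with_http_info` helper in this module validates numeric path
# parameters with `re.search('\d+', ...)`, which succeeds if a digit appears
# *anywhere* in the value, so a string like 'abc123' slips through into the
# request URL. A minimal sketch of a stricter raw-string variant of that check
# (the helper name below is illustrative, not part of the generated client):

```python
import re


def validate_numeric_path_param(name, value):
    # Coerce to str the same way the generated code does.
    text = value if isinstance(value, str) else str(value)
    # fullmatch(r'\d+') requires the *entire* value to be digits,
    # whereas search(r'\d+') only needs one digit somewhere in it.
    if not re.fullmatch(r'\d+', text):
        raise ValueError(
            "Invalid value for parameter `%s`, "
            "must conform to the pattern `/\\d+/`" % name)
    return text
```

# With this check, `validate_numeric_path_param('device_id', 42)` returns
# '42', while 'abc123' raises ValueError instead of reaching the URL.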
def patch_device_datasource_instance_group_by_id(self, device_id, device_ds_id, id, body, **kwargs): # noqa: E501
"""update device datasource instance group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_datasource_instance_group_by_id(device_id, device_ds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int device_ds_id: The device-datasource ID whose instance group you'd like to update (required)
:param int id: (required)
:param DeviceDataSourceInstanceGroup body: (required)
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, body, **kwargs) # noqa: E501
else:
data = self.patch_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, body, **kwargs) # noqa: E501
return data
def patch_device_datasource_instance_group_by_id_with_http_info(self, device_id, device_ds_id, id, body, **kwargs): # noqa: E501
"""update device datasource instance group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int device_ds_id: The device-datasource ID whose instance group you'd like to update (required)
:param int id: (required)
:param DeviceDataSourceInstanceGroup body: (required)
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'device_ds_id', 'id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_datasource_instance_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `patch_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'device_ds_id' is set
if ('device_ds_id' not in params or
params['device_ds_id'] is None):
raise ValueError("Missing the required parameter `device_ds_id` when calling `patch_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_datasource_instance_group_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `patch_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'device_ds_id' in params and not re.search(r'\d+', params['device_ds_id'] if isinstance(params['device_ds_id'], str) else str(params['device_ds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_ds_id` when calling `patch_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'device_ds_id' in params:
path_params['deviceDsId'] = params['device_ds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{deviceDsId}/groups/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_device_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DeviceGroup body: (required)
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.patch_device_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_device_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DeviceGroup body: (required)
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_device_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_device_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_device_group_cluster_alert_conf_by_id(self, device_group_id, id, body, **kwargs): # noqa: E501
"""Update cluster alert configuration # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_cluster_alert_conf_by_id(device_group_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int id: (required)
:param DeviceClusterAlertConfig body: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, body, **kwargs) # noqa: E501
else:
data = self.patch_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, body, **kwargs) # noqa: E501
return data
def patch_device_group_cluster_alert_conf_by_id_with_http_info(self, device_group_id, id, body, **kwargs): # noqa: E501
"""Update cluster alert configuration # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int id: (required)
:param DeviceClusterAlertConfig body: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_group_cluster_alert_conf_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `patch_device_group_cluster_alert_conf_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_device_group_cluster_alert_conf_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_group_cluster_alert_conf_by_id`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `patch_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/clusterAlertConf/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceClusterAlertConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_device_group_datasource_alert_setting(self, device_group_id, ds_id, body, **kwargs): # noqa: E501
"""update device group datasource alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_datasource_alert_setting(device_group_id, ds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int ds_id: (required)
:param DeviceGroupDataSourceAlertConfig body: (required)
:return: DeviceGroupDataSourceAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, body, **kwargs) # noqa: E501
else:
data = self.patch_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, body, **kwargs) # noqa: E501
return data
def patch_device_group_datasource_alert_setting_with_http_info(self, device_group_id, ds_id, body, **kwargs): # noqa: E501
"""update device group datasource alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int ds_id: (required)
:param DeviceGroupDataSourceAlertConfig body: (required)
:return: DeviceGroupDataSourceAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'ds_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_group_datasource_alert_setting" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `patch_device_group_datasource_alert_setting`") # noqa: E501
# verify the required parameter 'ds_id' is set
if ('ds_id' not in params or
params['ds_id'] is None):
raise ValueError("Missing the required parameter `ds_id` when calling `patch_device_group_datasource_alert_setting`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_group_datasource_alert_setting`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `patch_device_group_datasource_alert_setting`, must conform to the pattern `/\d+/`") # noqa: E501
if 'ds_id' in params and not re.search(r'\d+', params['ds_id'] if isinstance(params['ds_id'], str) else str(params['ds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `ds_id` when calling `patch_device_group_datasource_alert_setting`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
if 'ds_id' in params:
path_params['dsId'] = params['ds_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/datasources/{dsId}/alertsettings', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupDataSourceAlertConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
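# Every endpoint in this section authenticates with `auth_settings = ['LMv1']`.
# As a rough sketch (verify the authoritative scheme against LogicMonitor's
# REST API documentation), an LMv1 Authorization header is built by signing
# verb + epoch-millis + request body + resource path with HMAC-SHA256, taking
# the hex digest, and base64-encoding that digest string:

```python
import base64
import hashlib
import hmac
import time


def lmv1_auth_header(access_id, access_key, http_verb, resource_path, data=''):
    # Epoch timestamp in milliseconds, as a string.
    epoch = str(int(time.time() * 1000))
    # Concatenate the pieces in the order the signature expects.
    message = http_verb + epoch + data + resource_path
    # HMAC-SHA256 with the access key, hex-encoded...
    digest = hmac.new(access_key.encode('utf-8'),
                      msg=message.encode('utf-8'),
                      digestmod=hashlib.sha256).hexdigest()
    # ...then base64-encode the hex digest string itself.
    signature = base64.b64encode(digest.encode('utf-8')).decode('utf-8')
    return 'LMv1 %s:%s:%s' % (access_id, signature, epoch)
```

# e.g. lmv1_auth_header('ID', 'KEY', 'PATCH', '/device/groups/7/datasources/3/alertsettings', body_json)
# yields a header of the form 'LMv1 <AccessId>:<Signature>:<Epoch>'.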
def patch_device_group_property_by_name(self, gid, name, body, **kwargs): # noqa: E501
"""update device group property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_property_by_name(gid, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int gid: group ID (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_group_property_by_name_with_http_info(gid, name, body, **kwargs) # noqa: E501
else:
data = self.patch_device_group_property_by_name_with_http_info(gid, name, body, **kwargs) # noqa: E501
return data
def patch_device_group_property_by_name_with_http_info(self, gid, name, body, **kwargs): # noqa: E501
"""update device group property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_group_property_by_name_with_http_info(gid, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int gid: group ID (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['gid', 'name', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_group_property_by_name" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'gid' is set
if ('gid' not in params or
params['gid'] is None):
raise ValueError("Missing the required parameter `gid` when calling `patch_device_group_property_by_name`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `patch_device_group_property_by_name`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_group_property_by_name`") # noqa: E501
if 'gid' in params and not re.search(r'\d+', params['gid'] if isinstance(params['gid'], str) else str(params['gid'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `gid` when calling `patch_device_group_property_by_name`, must conform to the pattern `/\d+/`") # noqa: E501
if 'name' in params and not re.search(r'[^\/]+', params['name'] if isinstance(params['name'], str) else str(params['name'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `name` when calling `patch_device_group_property_by_name`, must conform to the pattern `/[^\/]+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'gid' in params:
path_params['gid'] = params['gid'] # noqa: E501
if 'name' in params:
path_params['name'] = params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{gid}/properties/{name}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EntityProperty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_device_property_by_name(self, device_id, name, body, **kwargs): # noqa: E501
"""update device property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_property_by_name(device_id, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_device_property_by_name_with_http_info(device_id, name, body, **kwargs) # noqa: E501
else:
data = self.patch_device_property_by_name_with_http_info(device_id, name, body, **kwargs) # noqa: E501
return data
def patch_device_property_by_name_with_http_info(self, device_id, name, body, **kwargs): # noqa: E501
"""update device property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_device_property_by_name_with_http_info(device_id, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'name', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_device_property_by_name" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `patch_device_property_by_name`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `patch_device_property_by_name`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_device_property_by_name`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `patch_device_property_by_name`, must conform to the pattern `/\d+/`") # noqa: E501
if 'name' in params and not re.search(r'[^\/]+', params['name'] if isinstance(params['name'], str) else str(params['name'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `name` when calling `patch_device_property_by_name`, must conform to the pattern `/[^\/]+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'name' in params:
path_params['name'] = params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/properties/{name}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EntityProperty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_escalation_chain_by_id(self, id, body, **kwargs): # noqa: E501
"""update escalation chain # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_escalation_chain_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param EscalatingChain body: (required)
:return: EscalatingChain
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_escalation_chain_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.patch_escalation_chain_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_escalation_chain_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update escalation chain # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_escalation_chain_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param EscalatingChain body: (required)
:return: EscalatingChain
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_escalation_chain_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_escalation_chain_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_escalation_chain_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_escalation_chain_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/chains/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EscalatingChain', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
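# The docstrings above all describe the same dispatch: a synchronous call
# returns the deserialized data, while async_req=True returns a thread-pool
# AsyncResult whose .get() blocks for the response. A self-contained sketch
# of that pattern (MiniApi and fake_patch_call are illustrative stand-ins,
# not part of this SDK):

```python
from multiprocessing.pool import ThreadPool


def fake_patch_call(id, body):
    # Stand-in for a *_with_http_info call; the real client performs
    # the HTTP PATCH request here and deserializes the response.
    return {'id': id, 'patched': body}


class MiniApi:
    def __init__(self):
        self.pool = ThreadPool(1)

    def patch(self, id, body, **kwargs):
        if kwargs.get('async_req'):
            # Async path: hand the work to the pool and return an
            # AsyncResult; callers block on .get() when they need it.
            return self.pool.apply_async(fake_patch_call, (id, body))
        # Sync path: call directly and return the data.
        return fake_patch_call(id, body)


api = MiniApi()
thread = api.patch(7, {'name': 'x'}, async_req=True)
result = thread.get()
```

# The generated methods follow the same shape: kwargs['_return_http_data_only'] = True,
# then either return the AsyncResult or unwrap and return the data.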
def patch_netscan(self, id, **kwargs): # noqa: E501
"""update a netscan # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_netscan(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Netscan body:
:param str reason:
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_netscan_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.patch_netscan_with_http_info(id, **kwargs) # noqa: E501
return data
def patch_netscan_with_http_info(self, id, **kwargs): # noqa: E501
"""update a netscan # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_netscan_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Netscan body:
:param str reason:
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'reason'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_netscan" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_netscan`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_netscan`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'reason' in params:
query_params.append(('reason', params['reason'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/netscans/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Netscan', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_ops_note_by_id(self, id, body, **kwargs): # noqa: E501
"""update opsnote # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_ops_note_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param OpsNote body: (required)
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_ops_note_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_ops_note_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_ops_note_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update opsnote # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_ops_note_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param OpsNote body: (required)
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_ops_note_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_ops_note_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_ops_note_by_id`") # noqa: E501
if 'id' in params and not re.search(r'[^\/]+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_ops_note_by_id`, must conform to the pattern `/[^\/]+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/opsnotes/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OpsNote', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_recipient_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update recipient group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_recipient_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param RecipientGroup body: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_recipient_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_recipient_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_recipient_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update recipient group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_recipient_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param RecipientGroup body: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_recipient_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_recipient_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_recipient_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_recipient_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/recipientgroups/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RecipientGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_report_by_id(self, id, body, **kwargs): # noqa: E501
"""update report # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_report_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportBase body: (required)
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_report_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_report_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_report_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update report # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_report_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportBase body: (required)
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_report_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_report_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_report_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_report_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/reports/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportBase', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_report_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update report group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_report_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportGroup body: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_report_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_report_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_report_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update report group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_report_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportGroup body: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_report_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_report_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_report_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_report_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/groups/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_role_by_id(self, id, body, **kwargs): # noqa: E501
"""update role # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_role_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Role body: (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_role_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_role_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_role_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update role # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_role_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Role body: (required)
:return: Role
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_role_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_role_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_role_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_role_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/roles/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Role', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_sdt_by_id(self, id, body, **kwargs): # noqa: E501
"""update SDT # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_sdt_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param SDT body: (required)
:return: SDT
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_sdt_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_sdt_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_sdt_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update SDT # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_sdt_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param SDT body: (required)
:return: SDT
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_sdt_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_sdt_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_sdt_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/sdt/sdts/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SDT', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_website_by_id(self, id, body, **kwargs): # noqa: E501
"""update website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_website_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Website body: (required)
:return: Website
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_website_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_website_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_website_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update website # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_website_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Website body: (required)
:return: Website
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_website_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_website_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_website_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_website_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/websites/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Website', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_website_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_website_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param WebsiteGroup body: (required)
:return: WebsiteGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_website_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_website_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_website_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update website group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_website_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param WebsiteGroup body: (required)
:return: WebsiteGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_website_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_website_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_website_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_website_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/website/groups/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='WebsiteGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def patch_widget_by_id(self, id, body, **kwargs): # noqa: E501
"""update widget # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_widget_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Widget body: (required)
:return: Widget
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.patch_widget_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.patch_widget_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def patch_widget_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update widget # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.patch_widget_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Widget body: (required)
:return: Widget
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method patch_widget_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `patch_widget_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `patch_widget_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `patch_widget_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/widgets/{id}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Widget', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
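# Usage sketch (illustrative only; `api` is assumed to be an instance of this
# generated API class and `Widget` its corresponding model class):
#
#     body = Widget()
#     widget = api.patch_widget_by_id(42, body)                  # synchronous
#     thread = api.patch_widget_by_id(42, body, async_req=True)  # asynchronous
#     widget = thread.get()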
def schedule_auto_discovery_by_device_id(self, id, **kwargs): # noqa: E501
"""schedule active discovery for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.schedule_auto_discovery_by_device_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:return: object
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.schedule_auto_discovery_by_device_id_with_http_info(id, **kwargs) # noqa: E501
else:
data = self.schedule_auto_discovery_by_device_id_with_http_info(id, **kwargs)  # noqa: E501
return data
def schedule_auto_discovery_by_device_id_with_http_info(self, id, **kwargs): # noqa: E501
"""schedule active discovery for a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.schedule_auto_discovery_by_device_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param int start:
:param int end:
:param str netflow_filter:
:return: object
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'start', 'end', 'netflow_filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method schedule_auto_discovery_by_device_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `schedule_auto_discovery_by_device_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `schedule_auto_discovery_by_device_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}/scheduleAutoDiscovery', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='object', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_admin_by_id(self, id, body, **kwargs): # noqa: E501
"""update user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_admin_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Admin body: (required)
:param bool change_password:
:return: Admin
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_admin_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.update_admin_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
return data
def update_admin_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_admin_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Admin body: (required)
:param bool change_password:
:return: Admin
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'change_password'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_admin_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_admin_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_admin_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_admin_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'change_password' in params:
query_params.append(('changePassword', params['change_password'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Admin', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_alert_rule_by_id(self, id, body, **kwargs): # noqa: E501
"""update alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_alert_rule_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param AlertRule body: (required)
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_alert_rule_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.update_alert_rule_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
return data
def update_alert_rule_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update alert rule # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_alert_rule_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param AlertRule body: (required)
:return: AlertRule
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_alert_rule_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_alert_rule_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_alert_rule_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_alert_rule_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/rules/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertRule', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_api_token_by_admin_id(self, admin_id, apitoken_id, body, **kwargs): # noqa: E501
"""update api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_api_token_by_admin_id(admin_id, apitoken_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param int apitoken_id: (required)
:param APIToken body: (required)
:return: APIToken
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_api_token_by_admin_id_with_http_info(admin_id, apitoken_id, body, **kwargs) # noqa: E501
else:
data = self.update_api_token_by_admin_id_with_http_info(admin_id, apitoken_id, body, **kwargs)  # noqa: E501
return data
def update_api_token_by_admin_id_with_http_info(self, admin_id, apitoken_id, body, **kwargs): # noqa: E501
"""update api tokens for a user # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_api_token_by_admin_id_with_http_info(admin_id, apitoken_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int admin_id: (required)
:param int apitoken_id: (required)
:param APIToken body: (required)
:return: APIToken
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['admin_id', 'apitoken_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_api_token_by_admin_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'admin_id' is set
if ('admin_id' not in params or
params['admin_id'] is None):
raise ValueError("Missing the required parameter `admin_id` when calling `update_api_token_by_admin_id`") # noqa: E501
# verify the required parameter 'apitoken_id' is set
if ('apitoken_id' not in params or
params['apitoken_id'] is None):
raise ValueError("Missing the required parameter `apitoken_id` when calling `update_api_token_by_admin_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_api_token_by_admin_id`") # noqa: E501
if 'admin_id' in params and not re.search(r'\d+', params['admin_id'] if isinstance(params['admin_id'], str) else str(params['admin_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `admin_id` when calling `update_api_token_by_admin_id`, must conform to the pattern `/\d+/`")  # noqa: E501
if 'apitoken_id' in params and not re.search(r'\d+', params['apitoken_id'] if isinstance(params['apitoken_id'], str) else str(params['apitoken_id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `apitoken_id` when calling `update_api_token_by_admin_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'admin_id' in params:
path_params['adminId'] = params['admin_id'] # noqa: E501
if 'apitoken_id' in params:
path_params['apitokenId'] = params['apitoken_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/admins/{adminId}/apitokens/{apitokenId}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='APIToken', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_collector_by_id(self, id, body, **kwargs): # noqa: E501
"""update collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_collector_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Collector body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_collector_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.update_collector_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
return data
def update_collector_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update collector # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_collector_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Collector body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: Collector
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'collector_load_balanced', 'force_update_failed_over_devices'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_collector_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_collector_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_collector_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_collector_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'collector_load_balanced' in params:
query_params.append(('collectorLoadBalanced', params['collector_load_balanced'])) # noqa: E501
if 'force_update_failed_over_devices' in params:
query_params.append(('forceUpdateFailedOverDevices', params['force_update_failed_over_devices'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/collectors/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Collector', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_collector_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_collector_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param CollectorGroup body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_collector_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.update_collector_group_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
return data
def update_collector_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update collector group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_collector_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param CollectorGroup body: (required)
:param bool collector_load_balanced:
:param bool force_update_failed_over_devices:
:return: CollectorGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'collector_load_balanced', 'force_update_failed_over_devices'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_collector_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_collector_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_collector_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_collector_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'collector_load_balanced' in params:
query_params.append(('collectorLoadBalanced', params['collector_load_balanced'])) # noqa: E501
if 'force_update_failed_over_devices' in params:
query_params.append(('forceUpdateFailedOverDevices', params['force_update_failed_over_devices'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/collector/groups/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CollectorGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_dashboard_by_id(self, id, body, **kwargs): # noqa: E501
"""update dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_dashboard_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Dashboard body: (required)
:param bool overwrite_group_fields:
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_dashboard_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.update_dashboard_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
return data
def update_dashboard_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update dashboard # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_dashboard_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Dashboard body: (required)
:param bool overwrite_group_fields:
:return: Dashboard
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'overwrite_group_fields'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_dashboard_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_dashboard_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_dashboard_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_dashboard_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'overwrite_group_fields' in params:
query_params.append(('overwriteGroupFields', params['overwrite_group_fields'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/dashboards/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Dashboard', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_dashboard_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_dashboard_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DashboardGroup body: (required)
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_dashboard_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
data = self.update_dashboard_group_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
return data
def update_dashboard_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update dashboard group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_dashboard_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DashboardGroup body: (required)
:return: DashboardGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_dashboard_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_dashboard_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_dashboard_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])):  # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_dashboard_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/dashboard/groups/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DashboardGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device(self, id, body, **kwargs): # noqa: E501
"""update a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Device body: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str op_type:
:return: Device
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_with_http_info(id, body, **kwargs) # noqa: E501
return data
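# Usage sketch (hypothetical IDs and field values; assumes a configured
# client, e.g. `api = logicmonitor_sdk.LMApi(logicmonitor_sdk.ApiClient(conf))`,
# and the companion `get_device_by_id` getter):
#   device = api.get_device_by_id(42)
#   device.display_name = 'web-server-01'
#   updated = api.update_device(42, device, op_type='replace')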
def update_device_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update a device # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Device body: (required)
:param int start:
:param int end:
:param str netflow_filter:
:param str op_type:
:return: Device
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'start', 'end', 'netflow_filter', 'op_type'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_device`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_device`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'start' in params:
query_params.append(('start', params['start'])) # noqa: E501
if 'end' in params:
query_params.append(('end', params['end'])) # noqa: E501
if 'netflow_filter' in params:
query_params.append(('netflowFilter', params['netflow_filter'])) # noqa: E501
if 'op_type' in params:
query_params.append(('opType', params['op_type'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Device', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_datasource_instance_alert_setting_by_id(self, device_id, hds_id, instance_id, id, body, **kwargs): # noqa: E501
"""update device instance alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_datasource_instance_alert_setting_by_id(device_id, hds_id, instance_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:param int id: (required)
:param DeviceDataSourceInstanceAlertSetting body: (required)
:return: DeviceDataSourceInstanceAlertSetting
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, body, **kwargs) # noqa: E501
return data
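# Usage sketch (the IDs and the `alert_expr` field name are assumptions):
# fetch the current setting with the matching getter, adjust it, PUT it back:
#   setting = api.get_device_datasource_instance_alert_setting_by_id(42, 7, 13, 5)
#   setting.alert_expr = '> 90 95 98'
#   api.update_device_datasource_instance_alert_setting_by_id(42, 7, 13, 5, setting)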
def update_device_datasource_instance_alert_setting_by_id_with_http_info(self, device_id, hds_id, instance_id, id, body, **kwargs): # noqa: E501
"""update device instance alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_datasource_instance_alert_setting_by_id_with_http_info(device_id, hds_id, instance_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: Device-DataSource ID (required)
:param int instance_id: (required)
:param int id: (required)
:param DeviceDataSourceInstanceAlertSetting body: (required)
:return: DeviceDataSourceInstanceAlertSetting
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'instance_id', 'id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_datasource_instance_alert_setting_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `update_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `update_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'instance_id' is set
if ('instance_id' not in params or
params['instance_id'] is None):
raise ValueError("Missing the required parameter `instance_id` when calling `update_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_device_datasource_instance_alert_setting_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_datasource_instance_alert_setting_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `update_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `update_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'instance_id' in params and not re.search(r'\d+', params['instance_id'] if isinstance(params['instance_id'], str) else str(params['instance_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `instance_id` when calling `update_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_device_datasource_instance_alert_setting_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'instance_id' in params:
path_params['instanceId'] = params['instance_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{instanceId}/alertsettings/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceAlertSetting', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_datasource_instance_by_id(self, device_id, hds_id, id, body, **kwargs): # noqa: E501
"""update device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_datasource_instance_by_id(device_id, hds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param DeviceDataSourceInstance body: (required)
:param str op_type:
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, body, **kwargs) # noqa: E501
return data
def update_device_datasource_instance_by_id_with_http_info(self, device_id, hds_id, id, body, **kwargs): # noqa: E501
"""update device instance # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_datasource_instance_by_id_with_http_info(device_id, hds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int hds_id: The device-datasource ID (required)
:param int id: (required)
:param DeviceDataSourceInstance body: (required)
:param str op_type:
:return: DeviceDataSourceInstance
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'hds_id', 'id', 'body', 'op_type'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_datasource_instance_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `update_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'hds_id' is set
if ('hds_id' not in params or
params['hds_id'] is None):
raise ValueError("Missing the required parameter `hds_id` when calling `update_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_device_datasource_instance_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_datasource_instance_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `update_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'hds_id' in params and not re.search(r'\d+', params['hds_id'] if isinstance(params['hds_id'], str) else str(params['hds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `hds_id` when calling `update_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_device_datasource_instance_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'hds_id' in params:
path_params['hdsId'] = params['hds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'op_type' in params:
query_params.append(('opType', params['op_type'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{hdsId}/instances/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstance', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_datasource_instance_group_by_id(self, device_id, device_ds_id, id, body, **kwargs): # noqa: E501
"""update device datasource instance group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_datasource_instance_group_by_id(device_id, device_ds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int device_ds_id: The device-datasource ID you'd like to add an instance group for (required)
:param int id: (required)
:param DeviceDataSourceInstanceGroup body: (required)
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, body, **kwargs) # noqa: E501
return data
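# Usage sketch (IDs and constructor kwargs are assumptions): rename an
# instance group by sending a fresh model as the PUT body:
#   grp = logicmonitor_sdk.DeviceDataSourceInstanceGroup(name='prod-disks')
#   api.update_device_datasource_instance_group_by_id(42, 7, 3, grp)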
def update_device_datasource_instance_group_by_id_with_http_info(self, device_id, device_ds_id, id, body, **kwargs): # noqa: E501
"""update device datasource instance group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_datasource_instance_group_by_id_with_http_info(device_id, device_ds_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param int device_ds_id: The device-datasource ID you'd like to add an instance group for (required)
:param int id: (required)
:param DeviceDataSourceInstanceGroup body: (required)
:return: DeviceDataSourceInstanceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'device_ds_id', 'id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_datasource_instance_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `update_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'device_ds_id' is set
if ('device_ds_id' not in params or
params['device_ds_id'] is None):
raise ValueError("Missing the required parameter `device_ds_id` when calling `update_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_device_datasource_instance_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_datasource_instance_group_by_id`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `update_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'device_ds_id' in params and not re.search(r'\d+', params['device_ds_id'] if isinstance(params['device_ds_id'], str) else str(params['device_ds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_ds_id` when calling `update_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_device_datasource_instance_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'device_ds_id' in params:
path_params['deviceDsId'] = params['device_ds_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/devicedatasources/{deviceDsId}/groups/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceDataSourceInstanceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DeviceGroup body: (required)
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
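# Usage sketch of the async_req pattern from the docstrings above
# (hypothetical ID; `group` is a DeviceGroup model): the call returns a
# thread whose .get() blocks until the PUT completes:
#   thread = api.update_device_group_by_id(8, group, async_req=True)
#   group = thread.get()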
def update_device_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update device group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param DeviceGroup body: (required)
:return: DeviceGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_device_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_device_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_group_cluster_alert_conf_by_id(self, device_group_id, id, body, **kwargs): # noqa: E501
"""Update cluster alert configuration # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_cluster_alert_conf_by_id(device_group_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int id: (required)
:param DeviceClusterAlertConfig body: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, body, **kwargs) # noqa: E501
return data
def update_device_group_cluster_alert_conf_by_id_with_http_info(self, device_group_id, id, body, **kwargs): # noqa: E501
"""Update cluster alert configuration # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_cluster_alert_conf_by_id_with_http_info(device_group_id, id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int id: (required)
:param DeviceClusterAlertConfig body: (required)
:return: DeviceClusterAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_group_cluster_alert_conf_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `update_device_group_cluster_alert_conf_by_id`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_device_group_cluster_alert_conf_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_group_cluster_alert_conf_by_id`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `update_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_device_group_cluster_alert_conf_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/clusterAlertConf/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceClusterAlertConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_group_datasource_alert_setting(self, device_group_id, ds_id, body, **kwargs): # noqa: E501
"""update device group datasource alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_datasource_alert_setting(device_group_id, ds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int ds_id: (required)
:param DeviceGroupDataSourceAlertConfig body: (required)
:return: DeviceGroupDataSourceAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, body, **kwargs) # noqa: E501
return data
def update_device_group_datasource_alert_setting_with_http_info(self, device_group_id, ds_id, body, **kwargs): # noqa: E501
"""update device group datasource alert setting # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_datasource_alert_setting_with_http_info(device_group_id, ds_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_group_id: (required)
:param int ds_id: (required)
:param DeviceGroupDataSourceAlertConfig body: (required)
:return: DeviceGroupDataSourceAlertConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_group_id', 'ds_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_group_datasource_alert_setting" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_group_id' is set
if ('device_group_id' not in params or
params['device_group_id'] is None):
raise ValueError("Missing the required parameter `device_group_id` when calling `update_device_group_datasource_alert_setting`") # noqa: E501
# verify the required parameter 'ds_id' is set
if ('ds_id' not in params or
params['ds_id'] is None):
raise ValueError("Missing the required parameter `ds_id` when calling `update_device_group_datasource_alert_setting`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_group_datasource_alert_setting`") # noqa: E501
if 'device_group_id' in params and not re.search(r'\d+', params['device_group_id'] if isinstance(params['device_group_id'], str) else str(params['device_group_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_group_id` when calling `update_device_group_datasource_alert_setting`, must conform to the pattern `/\d+/`") # noqa: E501
if 'ds_id' in params and not re.search(r'\d+', params['ds_id'] if isinstance(params['ds_id'], str) else str(params['ds_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `ds_id` when calling `update_device_group_datasource_alert_setting`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_group_id' in params:
path_params['deviceGroupId'] = params['device_group_id'] # noqa: E501
if 'ds_id' in params:
path_params['dsId'] = params['ds_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{deviceGroupId}/datasources/{dsId}/alertsettings', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeviceGroupDataSourceAlertConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_group_property_by_name(self, gid, name, body, **kwargs): # noqa: E501
"""update device group property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_property_by_name(gid, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int gid: group ID (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_group_property_by_name_with_http_info(gid, name, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_group_property_by_name_with_http_info(gid, name, body, **kwargs) # noqa: E501
return data
def update_device_group_property_by_name_with_http_info(self, gid, name, body, **kwargs): # noqa: E501
"""update device group property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_group_property_by_name_with_http_info(gid, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int gid: group ID (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['gid', 'name', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_group_property_by_name" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'gid' is set
if ('gid' not in params or
params['gid'] is None):
raise ValueError("Missing the required parameter `gid` when calling `update_device_group_property_by_name`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `update_device_group_property_by_name`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_group_property_by_name`") # noqa: E501
if 'gid' in params and not re.search(r'\d+', params['gid'] if isinstance(params['gid'], str) else str(params['gid'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `gid` when calling `update_device_group_property_by_name`, must conform to the pattern `/\d+/`") # noqa: E501
if 'name' in params and not re.search(r'[^\/]+', params['name'] if isinstance(params['name'], str) else str(params['name'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `name` when calling `update_device_group_property_by_name`, must conform to the pattern `/[^\/]+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'gid' in params:
path_params['gid'] = params['gid'] # noqa: E501
if 'name' in params:
path_params['name'] = params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/groups/{gid}/properties/{name}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EntityProperty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_device_property_by_name(self, device_id, name, body, **kwargs): # noqa: E501
"""update device property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_property_by_name(device_id, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_device_property_by_name_with_http_info(device_id, name, body, **kwargs) # noqa: E501
else:
(data) = self.update_device_property_by_name_with_http_info(device_id, name, body, **kwargs) # noqa: E501
return data
def update_device_property_by_name_with_http_info(self, device_id, name, body, **kwargs): # noqa: E501
"""update device property # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_device_property_by_name_with_http_info(device_id, name, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int device_id: (required)
:param str name: (required)
:param EntityProperty body: (required)
:return: EntityProperty
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['device_id', 'name', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_device_property_by_name" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'device_id' is set
if ('device_id' not in params or
params['device_id'] is None):
raise ValueError("Missing the required parameter `device_id` when calling `update_device_property_by_name`") # noqa: E501
# verify the required parameter 'name' is set
if ('name' not in params or
params['name'] is None):
raise ValueError("Missing the required parameter `name` when calling `update_device_property_by_name`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_device_property_by_name`") # noqa: E501
if 'device_id' in params and not re.search(r'\d+', params['device_id'] if isinstance(params['device_id'], str) else str(params['device_id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `device_id` when calling `update_device_property_by_name`, must conform to the pattern `/\d+/`") # noqa: E501
if 'name' in params and not re.search(r'[^\/]+', params['name'] if isinstance(params['name'], str) else str(params['name'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `name` when calling `update_device_property_by_name`, must conform to the pattern `/[^\/]+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'device_id' in params:
path_params['deviceId'] = params['device_id'] # noqa: E501
if 'name' in params:
path_params['name'] = params['name'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/device/devices/{deviceId}/properties/{name}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EntityProperty', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_escalation_chain_by_id(self, id, body, **kwargs): # noqa: E501
"""update escalation chain # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_escalation_chain_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param EscalatingChain body: (required)
:return: EscalatingChain
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_escalation_chain_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.update_escalation_chain_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def update_escalation_chain_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update escalation chain # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_escalation_chain_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param EscalatingChain body: (required)
:return: EscalatingChain
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_escalation_chain_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_escalation_chain_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_escalation_chain_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_escalation_chain_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/alert/chains/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EscalatingChain', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_netscan(self, id, **kwargs): # noqa: E501
"""update a netscan # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_netscan(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Netscan body:
:param str reason:
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_netscan_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.update_netscan_with_http_info(id, **kwargs) # noqa: E501
return data
def update_netscan_with_http_info(self, id, **kwargs): # noqa: E501
"""update a netscan # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_netscan_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param Netscan body:
:param str reason:
:return: Netscan
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body', 'reason'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_netscan" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_netscan`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_netscan`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
if 'reason' in params:
query_params.append(('reason', params['reason'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/netscans/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Netscan', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_ops_note_by_id(self, id, body, **kwargs): # noqa: E501
"""update opsnote # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_ops_note_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param OpsNote body: (required)
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_ops_note_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.update_ops_note_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def update_ops_note_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update opsnote # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_ops_note_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: (required)
:param OpsNote body: (required)
:return: OpsNote
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_ops_note_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_ops_note_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_ops_note_by_id`") # noqa: E501
if 'id' in params and not re.search(r'[^\/]+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_ops_note_by_id`, must conform to the pattern `/[^\/]+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/opsnotes/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OpsNote', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_recipient_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update recipient group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_recipient_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param RecipientGroup body: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_recipient_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.update_recipient_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def update_recipient_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update recipient group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_recipient_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param RecipientGroup body: (required)
:return: RecipientGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_recipient_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_recipient_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_recipient_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_recipient_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/setting/recipientgroups/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='RecipientGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_report_by_id(self, id, body, **kwargs): # noqa: E501
"""update report # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_report_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportBase body: (required)
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_report_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.update_report_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def update_report_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update report # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_report_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportBase body: (required)
:return: ReportBase
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_report_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_report_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_report_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_report_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/reports/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportBase', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_report_group_by_id(self, id, body, **kwargs): # noqa: E501
"""update report group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_report_group_by_id(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportGroup body: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_report_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
else:
(data) = self.update_report_group_by_id_with_http_info(id, body, **kwargs) # noqa: E501
return data
def update_report_group_by_id_with_http_info(self, id, body, **kwargs): # noqa: E501
"""update report group # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_report_group_by_id_with_http_info(id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:param ReportGroup body: (required)
:return: ReportGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_report_group_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_report_group_by_id`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_report_group_by_id`") # noqa: E501
if 'id' in params and not re.search(r'\d+', params['id'] if isinstance(params['id'], str) else str(params['id'])): # noqa: E501
raise ValueError(r"Invalid value for parameter `id` when calling `update_report_group_by_id`, must conform to the pattern `/\d+/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['LMv1'] # noqa: E501
return self.api_client.call_api(
'/report/groups/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ReportGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
    def update_role_by_id(self, id, body, **kwargs):  # noqa: E501
        """update role  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_role_by_id(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param Role body: (required)
        :return: Role
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_role_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
        else:
            (data) = self.update_role_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
            return data

    def update_role_by_id_with_http_info(self, id, body, **kwargs):  # noqa: E501
        """update role  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_role_by_id_with_http_info(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param Role body: (required)
        :return: Role
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_role_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `update_role_by_id`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `update_role_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `update_role_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/setting/roles/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Role',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_sdt_by_id(self, id, body, **kwargs):  # noqa: E501
        """update SDT  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_sdt_by_id(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param SDT body: (required)
        :return: SDT
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_sdt_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
        else:
            (data) = self.update_sdt_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
            return data

    def update_sdt_by_id_with_http_info(self, id, body, **kwargs):  # noqa: E501
        """update SDT  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_sdt_by_id_with_http_info(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param SDT body: (required)
        :return: SDT
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_sdt_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `update_sdt_by_id`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `update_sdt_by_id`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/sdt/sdts/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='SDT',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_website_by_id(self, id, body, **kwargs):  # noqa: E501
        """update website  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_website_by_id(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param Website body: (required)
        :return: Website
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_website_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
        else:
            (data) = self.update_website_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
            return data

    def update_website_by_id_with_http_info(self, id, body, **kwargs):  # noqa: E501
        """update website  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_website_by_id_with_http_info(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param Website body: (required)
        :return: Website
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_website_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `update_website_by_id`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `update_website_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `update_website_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/website/websites/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Website',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_website_group_by_id(self, id, body, **kwargs):  # noqa: E501
        """update website group  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_website_group_by_id(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param WebsiteGroup body: (required)
        :return: WebsiteGroup
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_website_group_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
        else:
            (data) = self.update_website_group_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
            return data

    def update_website_group_by_id_with_http_info(self, id, body, **kwargs):  # noqa: E501
        """update website group  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_website_group_by_id_with_http_info(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param WebsiteGroup body: (required)
        :return: WebsiteGroup
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_website_group_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `update_website_group_by_id`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `update_website_group_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `update_website_group_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/website/groups/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='WebsiteGroup',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_widget_by_id(self, id, body, **kwargs):  # noqa: E501
        """update widget  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_widget_by_id(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param Widget body: (required)
        :return: Widget
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_widget_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
        else:
            (data) = self.update_widget_by_id_with_http_info(id, body, **kwargs)  # noqa: E501
            return data

    def update_widget_by_id_with_http_info(self, id, body, **kwargs):  # noqa: E501
        """update widget  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_widget_by_id_with_http_info(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: (required)
        :param Widget body: (required)
        :return: Widget
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_widget_by_id" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `update_widget_by_id`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `update_widget_by_id`")  # noqa: E501

        if 'id' in params and not re.search(r'\d+', params['id'] if type(params['id']) is str else str(params['id'])):  # noqa: E501
            raise ValueError(r"Invalid value for parameter `id` when calling `update_widget_by_id`, must conform to the pattern `/\d+/`")  # noqa: E501
        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['LMv1']  # noqa: E501

        return self.api_client.call_api(
            '/dashboard/widgets/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='Widget',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
31d86259968c095f8ce3f0c5ff7fe09071e2d415 | 10,516 | py | Python | tests/test_models.py | FactomProject/factom-did-driver | c79ff5e4a6d6d33bf96739d7b69504d0948aba20 | [
"MIT"
] | null | null | null | tests/test_models.py | FactomProject/factom-did-driver | c79ff5e4a6d6d33bf96739d7b69504d0948aba20 | [
"MIT"
] | null | null | null | tests/test_models.py | FactomProject/factom-did-driver | c79ff5e4a6d6d33bf96739d7b69504d0948aba20 | [
"MIT"
] | null | null | null | import unittest
import identitykeys
import json

from src import consts
from src.models import Identity, IdentityNotFoundException


class TestIdentityCreation(unittest.TestCase):

    def test_valid_input(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        public_keys = [identitykeys.generate_key_pair()[1].to_string() for _ in range(3)]
        content = {'version': 1, 'keys': public_keys}

        identity = Identity(did, chain_id)
        identity.process_creation(
            entry_hash=b'\0' * 32,
            external_ids=external_ids,
            content=json.dumps(content, separators=(',', ':')).encode(),
            stage='factom',
            height=123456
        )

        self.assertEqual(1, identity.version)
        for i, external_id in enumerate(external_ids[1:]):
            self.assertEqual(external_id.decode(), identity.name[i])
        for k in public_keys:
            self.assertIn(k, identity.active_keys)
            self.assertIn(k, identity.all_keys)
        for k in identity.active_keys.values():
            self.assertIn('id', k)
            self.assertIn('controller', k)
            self.assertIn('type', k)
            self.assertIn('publicKeyHex', k)
            self.assertIn('activatedHeight', k)
            self.assertIn('retiredHeight', k)
            self.assertIn('priority', k)
            self.assertIn('entryHash', k)

    def test_bad_content_json(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        content = b'BAD{BAD/BAD'

        identity = Identity(did, chain_id)
        with self.assertRaises(IdentityNotFoundException):
            identity.process_creation(
                entry_hash=b'\0' * 32,
                external_ids=external_ids,
                content=content,
                stage='factom',
                height=123456
            )

    def test_bad_keys(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        public_keys = ['BAD' for _ in range(3)]
        content = {'version': 1, 'keys': public_keys}

        identity = Identity(did, chain_id)
        with self.assertRaises(IdentityNotFoundException):
            identity.process_creation(
                entry_hash=b'\0' * 32,
                external_ids=external_ids,
                content=json.dumps(content, separators=(',', ':')).encode(),
                stage='factom',
                height=123456
            )

    def test_bad_version(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        public_keys = [identitykeys.generate_key_pair()[1].to_string() for _ in range(3)]
        content = {'version': 'BAD', 'keys': public_keys}

        identity = Identity(did, chain_id)
        with self.assertRaises(IdentityNotFoundException):
            identity.process_creation(
                entry_hash=b'\0' * 32,
                external_ids=external_ids,
                content=json.dumps(content, separators=(',', ':')).encode(),
                stage='factom',
                height=123456
            )


class TestIdentityKeyReplacement(unittest.TestCase):

    def test_valid_replacement(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        key_pairs = [identitykeys.generate_key_pair() for _ in range(3)]
        public_keys = [pub.to_string() for _, pub in key_pairs]
        content = {'version': 1, 'keys': public_keys}

        identity = Identity(did, chain_id)
        identity.process_creation(
            entry_hash=b'\0' * 32,
            external_ids=external_ids,
            content=json.dumps(content, separators=(',', ':')).encode(),
            stage='factom',
            height=123456
        )

        _, old_pub = key_pairs[-1]
        signer_priv, signer_pub = key_pairs[-2]
        new_priv, new_pub = identitykeys.generate_key_pair()
        message = chain_id.encode() + old_pub.to_string().encode() + new_pub.to_string().encode()
        external_ids = [
            b'ReplaceKey',
            old_pub.to_string().encode(),
            new_pub.to_string().encode(),
            signer_priv.sign(message),
            signer_pub.to_string().encode()
        ]
        result = identity.process_key_replacement(entry_hash=b'\0' * 32, external_ids=external_ids, height=123457)
        self.assertTrue(result)

    def test_bad_external_ids(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        key_pairs = [identitykeys.generate_key_pair() for _ in range(3)]
        public_keys = [pub.to_string() for _, pub in key_pairs]
        content = {'version': 1, 'keys': public_keys}

        identity = Identity(did, chain_id)
        identity.process_creation(
            entry_hash=b'\0' * 32,
            external_ids=external_ids,
            content=json.dumps(content, separators=(',', ':')).encode(),
            stage='factom',
            height=123456
        )

        _, old_pub = key_pairs[-1]
        signer_priv, signer_pub = key_pairs[-2]
        new_priv, new_pub = identitykeys.generate_key_pair()
        message = chain_id.encode() + old_pub.to_string().encode() + new_pub.to_string().encode()
        external_ids = [b'BAD', b'BAD', b'BAD', b'BAD', b'BAD']
        result = identity.process_key_replacement(entry_hash=b'\0' * 32, external_ids=external_ids, height=123457)
        self.assertFalse(result)

    def test_bad_new_key(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        key_pairs = [identitykeys.generate_key_pair() for _ in range(3)]
        public_keys = [pub.to_string() for _, pub in key_pairs]
        content = {'version': 1, 'keys': public_keys}

        identity = Identity(did, chain_id)
        identity.process_creation(
            entry_hash=b'\0' * 32,
            external_ids=external_ids,
            content=json.dumps(content, separators=(',', ':')).encode(),
            stage='factom',
            height=123456
        )

        _, old_pub = key_pairs[-1]
        signer_priv, signer_pub = key_pairs[-2]
        new_pub = 'BAD'
        message = chain_id.encode() + old_pub.to_string().encode() + new_pub.encode()
        external_ids = [
            b'ReplaceKey',
            old_pub.to_string().encode(),
            new_pub.encode(),
            signer_priv.sign(message),
            signer_pub.to_string().encode()
        ]
        result = identity.process_key_replacement(entry_hash=b'\0' * 32, external_ids=external_ids, height=123457)
        self.assertFalse(result)

    def test_bad_signature(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        key_pairs = [identitykeys.generate_key_pair() for _ in range(3)]
        public_keys = [pub.to_string() for _, pub in key_pairs]
        content = {'version': 1, 'keys': public_keys}

        identity = Identity(did, chain_id)
        identity.process_creation(
            entry_hash=b'\0' * 32,
            external_ids=external_ids,
            content=json.dumps(content, separators=(',', ':')).encode(),
            stage='factom',
            height=123456
        )

        _, old_pub = key_pairs[-1]
        signer_priv, signer_pub = key_pairs[-2]
        new_priv, new_pub = identitykeys.generate_key_pair()
        message = chain_id.encode() + old_pub.to_string().encode() + new_pub.to_string().encode()
        external_ids = [
            b'ReplaceKey',
            old_pub.to_string().encode(),
            new_pub.to_string().encode(),
            b'\0' * 64,
            signer_pub.to_string().encode()
        ]
        result = identity.process_key_replacement(entry_hash=b'\0' * 32, external_ids=external_ids, height=123457)
        self.assertFalse(result)

    def test_bad_signer_priority(self):
        did = 'did:factom:f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        chain_id = 'f26e1c422c657521861ced450442d0c664702f49480aec67805822edfcfee758'
        external_ids = [consts.IDENTITY_CHAIN_TAG, b'Test', b'v1']
        key_pairs = [identitykeys.generate_key_pair() for _ in range(3)]
        public_keys = [pub.to_string() for _, pub in key_pairs]
        content = {'version': 1, 'keys': public_keys}

        identity = Identity(did, chain_id)
        identity.process_creation(
            entry_hash=b'\0' * 32,
            external_ids=external_ids,
            content=json.dumps(content, separators=(',', ':')).encode(),
            stage='factom',
            height=123456
        )

        _, old_pub = key_pairs[-2]
        signer_priv, signer_pub = key_pairs[-1]
        new_priv, new_pub = identitykeys.generate_key_pair()
        message = chain_id.encode() + old_pub.to_string().encode() + new_pub.to_string().encode()
        external_ids = [
            b'ReplaceKey',
            old_pub.to_string().encode(),
            new_pub.to_string().encode(),
            signer_priv.sign(message),
            signer_pub.to_string().encode()
        ]
        result = identity.process_key_replacement(entry_hash=b'\0' * 32, external_ids=external_ids, height=123457)
        self.assertFalse(result)
9ec5078811475d6aaf5c4359026e4ab521432bb0 | 7,119 | py | Python | tests/plugins/legacy/test_ddp_plugin.py | peblair/pytorch-lightning | e676ff96b16224331297dbd0e5ecd5cf364965b8 | [
"Apache-2.0"
] | 1 | 2021-02-12T04:15:31.000Z | 2021-02-12T04:15:31.000Z | tests/plugins/legacy/test_ddp_plugin.py | peblair/pytorch-lightning | e676ff96b16224331297dbd0e5ecd5cf364965b8 | [
"Apache-2.0"
] | null | null | null | tests/plugins/legacy/test_ddp_plugin.py | peblair/pytorch-lightning | e676ff96b16224331297dbd0e5ecd5cf364965b8 | [
"Apache-2.0"
] | null | null | null | import os
import platform
from unittest import mock

import pytest

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import Callback
from pytorch_lightning.plugins.legacy.ddp_plugin import DDPPlugin
from pytorch_lightning.plugins.legacy.sharded_plugin import DDPShardedPlugin
from pytorch_lightning.utilities import _FAIRSCALE_AVAILABLE
from pytorch_lightning.utilities.exceptions import MisconfigurationException
from tests.helpers.boring_model import BoringModel


@mock.patch.dict(
    os.environ,
    {
        "CUDA_VISIBLE_DEVICES": "0,1",
        "SLURM_NTASKS": "2",
        "SLURM_JOB_NAME": "SOME_NAME",
        "SLURM_NODEID": "0",
        "LOCAL_RANK": "0",
        "SLURM_LOCALID": "0",
    },
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
    ["ddp_backend", "gpus", "num_processes"],
    [("ddp_cpu", None, 2), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_choice_default_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):

    class CB(Callback):

        def on_fit_start(self, trainer, pl_module):
            assert isinstance(trainer.accelerator_backend.ddp_plugin, DDPPlugin)
            raise RuntimeError('finished plugin check')

    model = BoringModel()
    trainer = Trainer(
        fast_dev_run=True,
        gpus=gpus,
        num_processes=num_processes,
        accelerator=ddp_backend,
        callbacks=[CB()],
    )
    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)


@mock.patch.dict(
    os.environ,
    {
        "CUDA_VISIBLE_DEVICES": "0,1",
        "SLURM_NTASKS": "2",
        "SLURM_JOB_NAME": "SOME_NAME",
        "SLURM_NODEID": "0",
        "LOCAL_RANK": "0",
        "SLURM_LOCALID": "0",
    },
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
    ["ddp_backend", "gpus", "num_processes"],
    [("ddp_cpu", None, 2), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_choice_custom_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):

    class MyDDP(DDPPlugin):
        pass

    class CB(Callback):

        def on_fit_start(self, trainer, pl_module):
            assert isinstance(trainer.accelerator_backend.ddp_plugin, MyDDP)
            raise RuntimeError('finished plugin check')

    model = BoringModel()
    trainer = Trainer(
        fast_dev_run=True,
        gpus=gpus,
        num_processes=num_processes,
        accelerator=ddp_backend,
        plugins=[MyDDP()],
        callbacks=[CB()],
    )
    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)


@mock.patch.dict(
    os.environ,
    {
        "CUDA_VISIBLE_DEVICES": "0,1",
        "SLURM_NTASKS": "2",
        "SLURM_JOB_NAME": "SOME_NAME",
        "SLURM_NODEID": "0",
        "LOCAL_RANK": "0",
        "SLURM_LOCALID": "0",
    },
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
    ["ddp_backend", "gpus", "num_processes"],
    [("ddp_cpu", None, 2), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
@pytest.mark.skipif(platform.system() == "Windows", reason="Distributed sharded plugin is not supported on Windows")
@pytest.mark.skipif(not _FAIRSCALE_AVAILABLE, reason="Fairscale is not available")
def test_ddp_choice_string_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):

    class CB(Callback):

        def on_fit_start(self, trainer, pl_module):
            assert isinstance(trainer.accelerator_backend.ddp_plugin, DDPShardedPlugin)
            raise RuntimeError('finished plugin check')

    model = BoringModel()
    trainer = Trainer(
        fast_dev_run=True,
        gpus=gpus,
        num_processes=num_processes,
        accelerator=ddp_backend,
        plugins='ddp_sharded',
        callbacks=[CB()],
    )
    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)


@mock.patch.dict(
    os.environ,
    {
        "CUDA_VISIBLE_DEVICES": "0,1",
        "SLURM_NTASKS": "2",
        "SLURM_JOB_NAME": "SOME_NAME",
        "SLURM_NODEID": "0",
        "LOCAL_RANK": "0",
        "SLURM_LOCALID": "0",
    },
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
    ["ddp_backend", "gpus", "num_processes"],
    [("ddp_cpu", None, 2), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_invalid_choice_string_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):
    with pytest.raises(MisconfigurationException, match='not a supported lightning custom plugin'):
        Trainer(
            fast_dev_run=True,
            gpus=gpus,
            num_processes=num_processes,
            accelerator=ddp_backend,
            plugins='invalid',
        )


@mock.patch.dict(
    os.environ,
    {
        "CUDA_VISIBLE_DEVICES": "0,1",
        "SLURM_NTASKS": "2",
        "SLURM_JOB_NAME": "SOME_NAME",
        "SLURM_NODEID": "0",
        "LOCAL_RANK": "0",
        "SLURM_LOCALID": "0",
    },
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
    ["ddp_backend", "gpus", "num_processes"],
    [("ddp_cpu", None, 2), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
@pytest.mark.skipif(platform.system() == "Windows", reason="Distributed sharded plugin is not supported on Windows")
@pytest.mark.skipif(not _FAIRSCALE_AVAILABLE, reason="Fairscale is not available")
def test_ddp_invalid_choice_string_and_custom_ddp_cpu(tmpdir, ddp_backend, gpus, num_processes):
    """
    Test passing a lightning custom ddp plugin and a default ddp plugin throws an error.
    """

    class MyDDP(DDPPlugin):
        pass

    with pytest.raises(MisconfigurationException, match='you can only use one DDP plugin in plugins'):
        Trainer(
            fast_dev_run=True,
            gpus=gpus,
            num_processes=num_processes,
            accelerator=ddp_backend,
            plugins=['ddp_sharded', MyDDP()],
        )


@mock.patch.dict(
    os.environ,
    {
        "CUDA_VISIBLE_DEVICES": "0,1",
        "SLURM_NTASKS": "2",
        "SLURM_JOB_NAME": "SOME_NAME",
        "SLURM_NODEID": "0",
        "LOCAL_RANK": "0",
        "SLURM_LOCALID": "0",
    },
)
@mock.patch("torch.cuda.device_count", return_value=2)
@pytest.mark.parametrize(
    ["ddp_backend", "gpus", "num_processes"],
    [("ddp_cpu", None, 2), ("ddp", 2, 0), ("ddp2", 2, 0), ("ddp_spawn", 2, 0)],
)
def test_ddp_choice_custom_ddp_cpu_custom_args(tmpdir, ddp_backend, gpus, num_processes):

    class MyDDP(DDPPlugin):
        pass

    class CB(Callback):

        def on_fit_start(self, trainer, pl_module):
            assert isinstance(trainer.accelerator_backend.ddp_plugin, MyDDP)
            raise RuntimeError('finished plugin check')

    model = BoringModel()
    trainer = Trainer(
        fast_dev_run=True,
        gpus=gpus,
        num_processes=num_processes,
        accelerator=ddp_backend,
        plugins=[MyDDP(broadcast_buffers=False, find_unused_parameters=True)],
        callbacks=[CB()],
    )
    with pytest.raises(RuntimeError, match='finished plugin check'):
        trainer.fit(model)
| 30.165254 | 116 | 0.638713 | 860 | 7,119 | 5.045349 | 0.140698 | 0.066375 | 0.066375 | 0.047015 | 0.85711 | 0.820696 | 0.814704 | 0.814704 | 0.814704 | 0.804563 | 0 | 0.016248 | 0.221941 | 7,119 | 235 | 117 | 30.293617 | 0.767106 | 0.011799 | 0 | 0.717172 | 0 | 0 | 0.210571 | 0.019661 | 0 | 0 | 0 | 0 | 0.020202 | 1 | 0.050505 | false | 0.015152 | 0.055556 | 0 | 0.141414 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9ecded65952b51616e1cb3d5cfe6ca9c98a71c33 | 136 | py | Python | app/user_profile/__init__.py | theposter/food-server | d6a1a9e1300d35ff4642463f0a73074b1440c648 | ["MIT"] | null | null | null | app/user_profile/__init__.py | theposter/food-server | d6a1a9e1300d35ff4642463f0a73074b1440c648 | ["MIT"] | null | null | null | app/user_profile/__init__.py | theposter/food-server | d6a1a9e1300d35ff4642463f0a73074b1440c648 | ["MIT"] | null | null | null | from flask import Blueprint
user_profile_blueprint = Blueprint('user_profile_blueprint', __name__)
from app.user_profile import routes | 27.2 | 70 | 0.852941 | 18 | 136 | 5.944444 | 0.5 | 0.308411 | 0.373832 | 0.542056 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095588 | 136 | 5 | 71 | 27.2 | 0.869919 | 0 | 0 | 0 | 0 | 0 | 0.160584 | 0.160584 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 8 |
9ed08772bc43d42dac33a0dd62117d5691106f83 | 26,981 | py | Python | tests/integration/dashboard_tests.py | brianherman/data-act-broker-backend | 80eb055b9d245046192f7ad4fd0be7d0e11d2dec | ["CC0-1.0"] | null | null | null | tests/integration/dashboard_tests.py | brianherman/data-act-broker-backend | 80eb055b9d245046192f7ad4fd0be7d0e11d2dec | ["CC0-1.0"] | null | null | null | tests/integration/dashboard_tests.py | brianherman/data-act-broker-backend | 80eb055b9d245046192f7ad4fd0be7d0e11d2dec | ["CC0-1.0"] | 1 | 2020-07-17T23:50:56.000Z | 2020-07-17T23:50:56.000Z | from dataactcore.interfaces.db import GlobalDB
from dataactcore.models.userModel import User
from dataactvalidator.health_check import create_app
from tests.integration.baseTestAPI import BaseTestAPI
from tests.integration.integration_test_helper import insert_submission
class DashboardTests(BaseTestAPI):
    """ Test dashboard routes. """

    @classmethod
    def setUpClass(cls):
        """ Set up class-wide resources (test data) """
        super(DashboardTests, cls).setUpClass()
        # TODO: refactor into a pytest fixture
        with create_app().app_context():
            # get the submission test user
            sess = GlobalDB.db().session
            cls.session = sess

            submission_user = sess.query(User).filter(User.email == cls.test_users['admin_user']).one()
            cls.submission_user_id = submission_user.user_id

            other_user = sess.query(User).filter(User.email == cls.test_users['agency_user']).one()
            cls.other_user_id = other_user.user_id

            no_submissions_user = sess.query(User).filter(User.email == cls.test_users['no_permissions_user']).one()
            cls.no_submissions_user_email = no_submissions_user.email
            cls.no_submissions_user_id = no_submissions_user.user_id

            cls.quarter_sub = insert_submission(cls.session, cls.submission_user_id, cgac_code='SYS',
                                                start_date='01/2017', end_date='03/2017', is_quarter=True)

    def setUp(self):
        """ Test set-up. """
        super(DashboardTests, self).setUp()
        self.login_admin_user()

    def test_post_dabs_summary(self):
        """ Test successfully getting the dabs summary """
        # Basic passing test
        dabs_summary_json = {'filters': {'quarters': [], 'fys': [], 'agencies': []}}
        response = self.app.post_json('/v1/historic_dabs_summary/', dabs_summary_json,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertEqual([{'agency_name': 'Admin Agency', 'submissions': []},
                          {'agency_name': 'Example Agency', 'submissions': []}],
                         sorted(response.json, key=lambda agency: agency['agency_name']))

    def test_post_dabs_summary_fail(self):
        """ Test failing getting the dabs summary """
        # Not including any required filters
        dabs_summary_json = {'filters': {}}
        response = self.app.post_json('/v1/historic_dabs_summary/', dabs_summary_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'The following filters were not provided: quarters, fys, agencies')

        # Not including some required filters
        dabs_summary_json = {'filters': {'quarters': []}}
        response = self.app.post_json('/v1/historic_dabs_summary/', dabs_summary_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'The following filters were not provided: fys, agencies')

        # Wrong quarter
        dabs_summary_json = {'filters': {'quarters': [6], 'fys': [], 'agencies': []}}
        response = self.app.post_json('/v1/historic_dabs_summary/', dabs_summary_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Quarters must be a list of integers, each ranging 1-4,'
                                                   ' or an empty list.')

        # Wrong fys
        dabs_summary_json = {'filters': {'quarters': [], 'fys': [2011], 'agencies': []}}
        response = self.app.post_json('/v1/historic_dabs_summary/', dabs_summary_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Fiscal Years must be a list of integers, each ranging from 2017'
                                                   ' through the current fiscal year, or an empty list.')

        # Wrong agencies - integer instead of a string
        dabs_summary_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [90]}}
        response = self.app.post_json('/v1/historic_dabs_summary/', dabs_summary_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Agencies must be a list of strings, or an empty list.')

    def test_get_rule_labels(self):
        """ Test successfully getting a list of rule labels. """
        # Getting all FABS warnings
        rule_label_json = {'files': [], 'fabs': True, 'error_level': 'warning'}
        response = self.app.post_json('/v1/get_rule_labels/', rule_label_json,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertIn('labels', response.json)

        # Getting all DABS errors
        params = {'files': [], 'fabs': False, 'error_level': 'error'}
        response = self.app.post_json('/v1/get_rule_labels/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertIn('labels', response.json)

        # Leaving all non-required params out
        params = {'files': []}
        response = self.app.post_json('/v1/get_rule_labels/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertIn('labels', response.json)

        # Specifying a few DABS files with mixed results
        params = {'files': ['C', 'cross-AB'], 'error_level': 'mixed'}
        response = self.app.post_json('/v1/get_rule_labels/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertIn('labels', response.json)

        # Getting the labels with no permissions
        self.logout()
        self.login_user(username=self.no_submissions_user_email)
        params = {'files': [], 'fabs': True, 'error_level': 'warning'}
        response = self.app.post_json('/v1/get_rule_labels/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertIn('labels', response.json)

        # Getting the labels while logged out
        self.logout()
        params = {'files': [], 'fabs': True, 'error_level': 'warning'}
        response = self.app.post_json('/v1/get_rule_labels/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertIn('labels', response.json)

    def test_get_rule_labels_fail(self):
        """ Test failing to get a list of rule labels. """
        # Invalid error level
        params = {'files': [], 'error_level': 'bad'}
        response = self.app.post_json('/v1/get_rule_labels/', params, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'error_level: Must be either warning, error, or mixed')

        # Missing files param
        params = {'error_level': 'warning'}
        response = self.app.post_json('/v1/get_rule_labels/', params, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'files: Missing data for required field.')

    def test_post_dabs_graphs(self):
        """ Test successfully getting the dabs graphs """
        # Basic passing test
        dabs_summary_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_summary_json,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertEqual({'A': [], 'B': [], 'C': [], 'cross-AB': [], 'cross-BC': [], 'cross-CD1': [],
                          'cross-CD2': []}, response.json)

    def test_post_dabs_graphs_fail(self):
        """ Test failing getting the dabs graphs """
        # Not including any required filters
        dabs_graphs_json = {'filters': {}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'The following filters were not provided: quarters, fys, agencies,'
                                                   ' files, rules')

        # Not including some required filters
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': []}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'The following filters were not provided: files, rules')

        # Wrong quarter
        dabs_graphs_json = {'filters': {'quarters': [6], 'fys': [], 'agencies': [], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Quarters must be a list of integers, each ranging 1-4,'
                                                   ' or an empty list.')

        # Wrong fys
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [2011], 'agencies': [], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Fiscal Years must be a list of integers, each ranging from 2017'
                                                   ' through the current fiscal year, or an empty list.')

        # Wrong agencies - integer instead of a string
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [90], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Agencies must be a list of strings, or an empty list.')

        # Wrong agencies - non-existent agency
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': ['999'], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'All codes in the agency_codes filter must be valid agency codes')

        # Wrong files
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [], 'files': ['cross-AC'], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Files must be a list of one or more of the following,'
                                                   ' or an empty list: A, B, C, cross-AB, cross-BC, cross-CD1,'
                                                   ' cross-CD2')

        # Wrong rules
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [], 'files': [], 'rules': [9]}}
        response = self.app.post_json('/v1/historic_dabs_graphs/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Rules must be a list of strings, or an empty list.')

    def test_historic_dabs_table(self):
        """ Test successfully getting the historic dabs table """
        # Basic passing test
        dabs_summary_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [], 'files': [], 'rules': []},
                             'page': 1, 'limit': 10, 'sort': 'rule_label', 'order': 'asc'}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_summary_json,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertEqual({'results': [], 'page_metadata': {'total': 0, 'page': 1, 'limit': 10}}, response.json)

        # Test with none of the optional content
        dabs_summary_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_summary_json,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)
        self.assertEqual({'results': [], 'page_metadata': {'total': 0, 'page': 1, 'limit': 5}}, response.json)

    def test_historic_dabs_table_fail(self):
        """ Test failing getting the historic dabs table """
        # Not including any required filters
        dabs_graphs_json = {'filters': {}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'The following filters were not provided: quarters, fys, agencies,'
                                                   ' files, rules')

        # Not including some required filters
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': []}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'The following filters were not provided: files, rules')

        # Wrong quarter
        dabs_graphs_json = {'filters': {'quarters': [6], 'fys': [], 'agencies': [], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Quarters must be a list of integers, each ranging 1-4,'
                                                   ' or an empty list.')

        # Wrong fys
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [2011], 'agencies': [], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Fiscal Years must be a list of integers, each ranging from 2017'
                                                   ' through the current fiscal year, or an empty list.')

        # Wrong agencies - integer instead of a string
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [90], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Agencies must be a list of strings, or an empty list.')

        # Wrong agencies - non-existent agency
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': ['999'], 'files': [], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'All codes in the agency_codes filter must be valid agency codes')

        # Wrong files
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [], 'files': ['cross-AC'], 'rules': []}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Files must be a list of one or more of the following,'
                                                   ' or an empty list: A, B, C, cross-AB, cross-BC, cross-CD1,'
                                                   ' cross-CD2')

        # Wrong rules
        dabs_graphs_json = {'filters': {'quarters': [], 'fys': [], 'agencies': [], 'files': [], 'rules': [9]}}
        response = self.app.post_json('/v1/historic_dabs_table/', dabs_graphs_json, expect_errors=True,
                                      headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'Rules must be a list of strings, or an empty list.')

    def test_active_submission_overview(self):
        """ Test successfully getting the active submission overview """
        # Error type not specified
        params = {'submission_id': self.quarter_sub, 'file': 'A'}
        response = self.app.get('/v1/active_submission_overview/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)

        # Error type specified
        params = {'submission_id': self.quarter_sub, 'file': 'A', 'error_level': 'mixed'}
        response = self.app.get('/v1/active_submission_overview/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)

    def test_active_submission_overview_fail(self):
        """ Test failing to get the active submission overview """
        # Invalid submission ID
        params = {'submission_id': -1, 'file': 'A'}
        response = self.app.get('/v1/active_submission_overview/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'No such submission')

        # Invalid file type
        params = {'submission_id': self.quarter_sub, 'file': 'Q'}
        response = self.app.get('/v1/active_submission_overview/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'file: Must be A, B, C, cross-AB, cross-BC, cross-CD1, or cross-CD2')

        # Invalid error level
        params = {'submission_id': self.quarter_sub, 'file': 'A', 'error_level': 'all'}
        response = self.app.get('/v1/active_submission_overview/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'error_level: Must be either warning, error, or mixed')

        # Missing submission ID
        params = {'file': 'A'}
        response = self.app.get('/v1/active_submission_overview/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'submission_id is required')

        # Missing file
        params = {'submission_id': self.quarter_sub}
        response = self.app.get('/v1/active_submission_overview/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'file: Missing data for required field.')

    def test_get_impact_counts(self):
        """ Test successfully getting the impact counts for a submission """
        # Error type not specified
        params = {'submission_id': self.quarter_sub, 'file': 'A'}
        response = self.app.get('/v1/get_impact_counts/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)

        # Error type specified
        params = {'submission_id': self.quarter_sub, 'file': 'A', 'error_level': 'mixed'}
        response = self.app.get('/v1/get_impact_counts/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)

    def test_get_impact_counts_fail(self):
        """ Test failing to get impact counts for a submission """
        # Invalid submission ID
        params = {'submission_id': -1, 'file': 'A'}
        response = self.app.get('/v1/get_impact_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'No such submission')

        # Invalid file type
        params = {'submission_id': self.quarter_sub, 'file': 'Q'}
        response = self.app.get('/v1/get_impact_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'file: Must be A, B, C, cross-AB, cross-BC, cross-CD1, or cross-CD2')

        # Invalid error level
        params = {'submission_id': self.quarter_sub, 'file': 'A', 'error_level': 'all'}
        response = self.app.get('/v1/get_impact_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'error_level: Must be either warning, error, or mixed')

        # Missing submission ID
        params = {'file': 'A'}
        response = self.app.get('/v1/get_impact_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'submission_id is required')

        # Missing file
        params = {'submission_id': self.quarter_sub}
        response = self.app.get('/v1/get_impact_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'file: Missing data for required field.')

    def test_get_significance_counts(self):
        """ Test successfully getting the significance counts for a submission """
        # Error type not specified
        params = {'submission_id': self.quarter_sub, 'file': 'A'}
        response = self.app.get('/v1/get_significance_counts/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)

        # Error type specified
        params = {'submission_id': self.quarter_sub, 'file': 'A', 'error_level': 'mixed'}
        response = self.app.get('/v1/get_significance_counts/', params, headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 200)

    def test_get_significance_counts_fail(self):
        """ Test failing to get significance counts for a submission """
        # Invalid submission ID
        params = {'submission_id': -1, 'file': 'A'}
        response = self.app.get('/v1/get_significance_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'No such submission')

        # Invalid file type
        params = {'submission_id': self.quarter_sub, 'file': 'Q'}
        response = self.app.get('/v1/get_significance_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'file: Must be A, B, C, cross-AB, cross-BC, cross-CD1, or cross-CD2')

        # Invalid error level
        params = {'submission_id': self.quarter_sub, 'file': 'A', 'error_level': 'all'}
        response = self.app.get('/v1/get_significance_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'error_level: Must be either warning, error, or mixed')

        # Missing submission ID
        params = {'file': 'A'}
        response = self.app.get('/v1/get_significance_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'submission_id is required')

        # Missing file
        params = {'submission_id': self.quarter_sub}
        response = self.app.get('/v1/get_significance_counts/', params, expect_errors=True,
                                headers={'x-session-id': self.session_id})
        self.assertEqual(response.status_code, 400)
        self.assertEqual(response.json['message'], 'file: Missing data for required field.')
9ed2f8b1067ae859c6571cfc45692312ea6ab5d8 | 1,607 | py | Python | Day-03/Day-03-LogicalOperators.py | ShazidHasanRiam/100DaysOfPython | 9f71ffb3e6e797b01e14377d3433c036adf9c3f3 | ["MIT"] | null | null | null | Day-03/Day-03-LogicalOperators.py | ShazidHasanRiam/100DaysOfPython | 9f71ffb3e6e797b01e14377d3433c036adf9c3f3 | ["MIT"] | null | null | null | Day-03/Day-03-LogicalOperators.py | ShazidHasanRiam/100DaysOfPython | 9f71ffb3e6e797b01e14377d3433c036adf9c3f3 | ["MIT"] | null | null | null | #Day-03 of 100 Days of Coding
#November 20, 2020
#Shazid Hasan Riam

print("Welcome to the Rollercoaster")
height = int(input("What is your height in CM? "))

if height >= 120:
    print("Congratulations! You can ride the Rollercoaster.")
    age = int(input("What is your age? "))
    if age < 12:
        bill = 5
        print("Child tickets are $5.")
    elif age <= 18:
        bill = 7
        print("Youth tickets are $7.")
    else:
        bill = 12
        print("Adult tickets are $12.")

    wants_photo = (input("Do You want a photo taken? Y or N\n"))
    if wants_photo == "Y":
        bill += 3

    print(f"Your Final Bill Is: {bill}")
else:
    print("Sorry, You have to grow taller before you can ride.")

#Day-03 of 100 Days of Coding
#November 20, 2020
#Shazid Hasan Riam

print("Welcome to the Rollercoaster")
height = int(input("What is your height in CM? "))

if height >= 120:
    print("Congratulations! You can ride the Rollercoaster.")
    age = int(input("What is your age? "))
    if age < 12:
        bill = 5
        print("Child tickets are $5.")
    elif age <= 18:
        bill = 7
        print("Youth tickets are $7.")
    elif age >= 45 and age <= 55:
        bill = 0  # free ride, so the final bill below is still defined
        print("Everything is going to be okay. Have a free ride on us.")
    else:
        bill = 12
        print("Adult tickets are $12.")

    wants_photo = (input("Do You want a photo taken? Y or N\n"))
    if wants_photo == "Y":
        bill += 3

    print(f"Your Final Bill Is: {bill}")
else:
    print("Sorry, You have to grow taller before you can ride.")
| 28.192982 | 73 | 0.574984 | 239 | 1,607 | 3.849372 | 0.297071 | 0.065217 | 0.052174 | 0.06087 | 0.930435 | 0.930435 | 0.930435 | 0.930435 | 0.930435 | 0.930435 | 0 | 0.051925 | 0.304916 | 1,607 | 56 | 74 | 28.696429 | 0.77171 | 0.077162 | 0 | 0.952381 | 0 | 0 | 0.458128 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.357143 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9ee2b78c6a025e562e56db3d6653b30a1549ab8d | 20,022 | py | Python | potion/algorithms/safexplore.py | T3p/policy-optimization | 77006545779823737c4ca3b19e9d80506015c132 | ["MIT"] | null | null | null | potion/algorithms/safexplore.py | T3p/policy-optimization | 77006545779823737c4ca3b19e9d80506015c132 | ["MIT"] | null | null | null | potion/algorithms/safexplore.py | T3p/policy-optimization | 77006545779823737c4ca3b19e9d80506015c132 | ["MIT"] | 1 | 2019-09-08T15:11:55.000Z | 2019-09-08T15:11:55.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Wed Jan 16 16:18:45 2019
@author: matteo
"""
from potion.simulation.trajectory_generators import generate_batch
from potion.common.misc_utils import performance, avg_horizon
from potion.estimation.gradients import gpomdp_estimator
from potion.estimation.metagradients import mixed_estimator, metagrad
from potion.common.logger import Logger
from potion.common.misc_utils import clip, seed_all_agent
import torch
import math
from potion.meta.safety_requirements import MonotonicImprovement, Budget, FixedThreshold
import scipy.stats as sts
def sepg(env, policy,
horizon,
batchsize = 100,
iterations = 1000,
gamma = 0.99,
rmax = 1.,
phimax = 1.,
safety_requirement = 'mi',
delta = 1.,
confidence_schedule = None,
clip_at = 100,
test_batchsize = False,
render = False,
seed = None,
baseline = 'peters',
shallow = True,
action_filter = None,
parallel = False,
logger = Logger(name='SEPG'),
save_params = 1000,
log_params = True,
verbose = True):
"""
Only for SIMPLE Gaussian policy w/ scalar variance
Policy must have learn_std = False, as std is META-learned
"""
#Defaults
assert policy.learn_std
if action_filter is None:
action_filter = clip(env)
#Seed agent
if seed is not None:
seed_all_agent(seed)
#Prepare logger
algo_info = {'Algorithm': 'SEPG',
'Environment': str(env),
'BatchSize': batchsize,
'Max horizon': horizon,
'Iterations': iterations,
'gamma': gamma,
'actionFilter': action_filter,
'rmax': rmax,
'phimax': phimax}
logger.write_info({**algo_info, **policy.info()})
log_keys = ['Perf', 'UPerf', 'AvgHorizon',
'Alpha', 'BatchSize', 'Exploration', 'Eta',
'ThetaGradNorm', 'OmegaGrad', 'OmegaMetagrad',
'Penalty', 'MetaPenalty',
'IterationKind',
'ThetaGradNorm', 'Eps', 'Up', 'Down', 'C', 'Cmax', 'Delta'] #0: theta, 1: omega
if log_params:
log_keys += ['param%d' % i for i in range(policy.num_params())]
if test_batchsize:
log_keys.append('DetPerf')
log_row = dict.fromkeys(log_keys)
logger.open(log_row.keys())
#Safety requirements
if safety_requirement == 'mi':
thresholder = MonotonicImprovement()
elif safety_requirement == 'budget':
batch = generate_batch(env, policy, horizon, batchsize, action_filter)
thresholder = Budget(performance(batch, gamma))
else:
thresholder = FixedThreshold(float(safety_requirement))
#Learning loop
omega_grad = float('nan')
omega_metagrad = float('nan')
metapenalty = float('nan')
eta = float('nan')
it = 0
while(it < iterations):
#Begin iteration
if verbose:
print('\nIteration ', it)
if verbose:
print('Params: ', policy.get_flat())
#Test mean parameters on deterministic policy
if test_batchsize:
test_batch = generate_batch(env, policy, horizon, test_batchsize,
action_filter=action_filter,
seed=seed,
njobs=parallel,
deterministic=True)
log_row['DetPerf'] = performance(test_batch, gamma)
#Render behavior
if render:
generate_batch(env, policy, horizon, 1, action_filter, render=True)
#
if it % 2 == 0:
#Std update
omega = policy.get_scale_params()
sigma = torch.exp(omega).item()
batch = generate_batch(env, policy, horizon, batchsize,
action_filter=action_filter,
njobs=parallel,
seed=seed)
if confidence_schedule is not None:
delta = confidence_schedule.next(it)
log_row['Delta'] = delta
if delta <1:
grad, grad_var = simple_gpomdp_estimator(batch, gamma, policy, baseline, result='moments')
omega_grad = grad[0]
omega_grad_var = grad_var[0]
omega_metagrad, omega_metagrad_var = metagrad(batch, gamma, policy, alpha, clip_at, baseline, result='moments')
                quant = 2 * sts.t.interval(1 - delta, batchsize - 1, loc=0., scale=1.)[1]
eps = torch.tensor(quant * torch.sqrt(omega_grad_var / batchsize), dtype=torch.float)
log_row['Eps'] = torch.norm(eps).item()
metaeps = torch.tensor(quant * torch.sqrt(omega_metagrad_var / batchsize), dtype=torch.float)
                if torch.sign(omega_grad).item() >= 0 and torch.sign(omega_metagrad).item() >= 0:
                    up = torch.clamp(torch.abs(omega_grad) - eps, min=0.) * torch.clamp(torch.abs(omega_metagrad) - metaeps, min=0.)
                elif torch.sign(omega_grad).item() >= 0 and torch.sign(omega_metagrad).item() < 0:
                    up = (omega_grad + eps) * (omega_metagrad - metaeps)
                elif torch.sign(omega_grad).item() < 0 and torch.sign(omega_metagrad).item() >= 0:
                    up = (omega_grad - eps) * (omega_metagrad + metaeps)
                else:
                    up = torch.abs(omega_grad + eps) * torch.abs(omega_metagrad + metaeps)
                down = omega_metagrad + metaeps * torch.sign(omega_metagrad)
log_row['Up'] = up.item()
log_row['Down'] = down.item()
metapenalty = rmax / (1 - gamma)**2 * (0.53 * avol / (2 * sigma) + gamma / (1 - gamma))
                eta_star = (up / (2 * metapenalty * down**2 + 1e-12)).item()
                Cmax = (up**2 / (4 * metapenalty * down**2 + 1e-12)).item()
else:
log_row['Eps'] = 0
grad = gpomdp_estimator(batch, gamma, policy,
baselinekind=baseline,
shallow=shallow)
theta_grad = grad[1:]
omega_grad = grad[0]
#->
mixed, _ = mixed_estimator(batch, gamma, policy, baseline, theta_grad)
norm_grad = 2 * theta_grad.dot(mixed)
A = omega_grad
B = 2 * alpha * torch.norm(theta_grad)**2
C = sigma * alpha * norm_grad
C = torch.clamp(C, min=-clip_at, max=clip_at)
omega_metagrad = A + B + C
metapenalty = rmax / (1 - gamma)**2 * (0.53 * avol / (2 * sigma) + gamma / (1 - gamma))
                eta_star = (omega_grad / (2 * metapenalty * omega_metagrad + 1e-12)).item()
                Cmax = (omega_grad ** 2 / (4 * metapenalty)).item()
                log_row['Up'] = float(omega_grad)
                log_row['Down'] = float(omega_metagrad)
perf = performance(batch, gamma)
Co = thresholder.next(perf)
Co = min(Co, Cmax)
log_row['C'] = Co
log_row['Cmax'] = Cmax
eta = eta_star + abs(eta_star) * math.sqrt(1 - Co / (Cmax + 1e-12) + 1e-12)
new_omega = omega + eta * omega_metagrad
policy.set_scale_params(new_omega)
###
else:
#Mean update
omega = policy.get_scale_params()
sigma = torch.exp(omega).item()
batch = generate_batch(env, policy, horizon, batchsize,
action_filter=action_filter,
n_jobs=parallel,
seed=seed)
if confidence_schedule is not None:
delta = confidence_schedule.next(it)
log_row['Delta'] = delta
if delta < 1:
grad, grad_var = simple_gpomdp_estimator(batch, gamma, policy, baseline, result='moments')
theta_grad = grad[1:]
theta_grad_var = grad_var[1:]
                quant = 2 * sts.t.interval(1 - delta, batchsize - 1, loc=0., scale=1.)[1]
eps = quant * torch.sqrt(theta_grad_var / batchsize)
log_row['Eps'] = torch.norm(eps).item()
norm2 = torch.norm(torch.clamp(torch.abs(theta_grad) - eps, min=0.))
norm1 = torch.sum(torch.abs(theta_grad) + eps)
log_row['Up'] = norm1.item()
log_row['Down'] = norm2.item()
else:
log_row['Eps'] = 0
grad = simple_gpomdp_estimator(batch, gamma, policy, baseline)
theta_grad = grad[1:]
norm2 = torch.norm(theta_grad)
norm1 = torch.sum(torch.abs(theta_grad))
log_row['Up'] = norm1.item()
log_row['Down'] = norm2.item()
penalty = rmax * phimax**2 / (1-gamma)**2 * (avol / (sigma * math.sqrt(2*math.pi)) + gamma / (2*(1-gamma)))
alpha_star = sigma ** 2 * norm2 ** 2 / (2 * penalty * norm1 ** 2 + 1e-12)
Cmax = (alpha_star * norm2**2 / 2).item()
perf = performance(batch, gamma)
Co = thresholder.next(perf)
Co = min(Co, Cmax)
log_row['C'] = Co
log_row['Cmax'] = Cmax
alpha = alpha_star * (1 + math.sqrt(1 - Co / (Cmax + 1e-12) + 1e-12))
theta = policy.get_loc_params()
new_theta = theta + alpha * theta_grad
policy.set_loc_params(new_theta)
###
# Log
        log_row['IterationKind'] = it % 2
        log_row['ThetaGradNorm'] = torch.norm(theta_grad).item()
        log_row['Alpha'] = alpha.item() if torch.is_tensor(alpha) else alpha
        log_row['Eta'] = eta
        log_row['Penalty'] = penalty
        log_row['MetaPenalty'] = metapenalty
        log_row['OmegaGrad'] = float(omega_grad)
        log_row['OmegaMetagrad'] = float(omega_metagrad)
        log_row['BatchSize'] = batchsize
        log_row['Exploration'] = policy.exploration()
log_row['Perf'] = perf
log_row['UPerf'] = performance(batch, 1.)
log_row['AvgHorizon'] = avg_horizon(batch)
params = policy.get_flat()
if log_params:
for i in range(policy.num_params()):
log_row['param%d' % i] = params[i].item()
logger.write_row(log_row, it)
if save_params and it % save_params == 0:
logger.save_params(params, it)
# Next iteration
it += 1
# Final policy
if save_params:
logger.save_params(params, it)
def adastep(env, policy,
horizon,
batchsize = 100,
iterations = 1000,
gamma = 0.99,
rmax = 1.,
phimax = 1.,
greedy = True,
delta = 1.,
test_det = True,
render = False,
seed = None,
baseline = 'peters',
action_filter = None,
parallel = False,
n_jobs = 4,
logger = Logger(name='test_sunday'),
save_params = 1000,
log_params = True,
verbose = True):
"""
    Adaptive-step-size policy gradient.
    Only for SIMPLE Gaussian policies w/ scalar variance;
    the std is held fixed during learning (only the mean is updated).
"""
# Defaults
assert policy.learn_std
if action_filter is None:
action_filter = clip(env)
# Seeding agent
if seed is not None:
seed_all_agent(seed)
# Preparing logger
algo_info = {'Algorithm': 'ADASTEP',
'Environment': str(env),
'BatchSize': batchsize,
'Max horizon': horizon,
'Iterations': iterations,
'gamma': gamma,
'actionFilter': action_filter,
'rmax': rmax,
'phimax': phimax,
'greedy': greedy}
logger.write_info({**algo_info, **policy.info()})
log_keys = ['Perf', 'UPerf', 'AvgHorizon',
'Alpha', 'BatchSize', 'Exploration',
'ThetaGradNorm',
'Penalty']
if log_params:
log_keys += ['param%d' % i for i in range(policy.num_params())]
if test_det:
log_keys.append('DetPerf')
log_row = dict.fromkeys(log_keys)
logger.open(log_row.keys())
# Learning
avol = torch.tensor(env.action_space.high - env.action_space.low).item()
it = 0
    while it < iterations:
# Begin iteration
        if verbose:
            print('\nIteration ', it)
            print('Params: ', policy.get_flat())
# Test
if test_det:
omega = policy.get_scale_params()
policy.set_scale_params(-100.)
batch = generate_batch(env, policy, horizon, 1, action_filter)
policy.set_scale_params(omega)
log_row['DetPerf'] = performance(batch, gamma)
if render:
generate_batch(env, policy, horizon, 1, action_filter, render=True)
omega = policy.get_scale_params()
sigma = torch.exp(omega).item()
batch = generate_batch(env, policy, horizon, batchsize, action_filter, parallel=parallel, n_jobs=n_jobs, seed=seed)
if delta < 1:
grad, grad_var = simple_gpomdp_estimator(batch, gamma, policy, baseline, result='moments')
theta_grad = grad[1:]
theta_grad_var = grad_var[1:]
            quant = 2 * sts.t.interval(1 - delta, batchsize - 1, loc=0., scale=1.)[1]
eps = quant * torch.sqrt(theta_grad_var / batchsize + 1e-12)
norm2 = torch.norm(torch.clamp(torch.abs(theta_grad) - eps, min=0.))
norm1 = torch.sum(torch.abs(theta_grad) + eps)
else:
grad = simple_gpomdp_estimator(batch, gamma, policy, baseline)
theta_grad = grad[1:]
norm2 = torch.norm(theta_grad)
norm1 = torch.sum(torch.abs(theta_grad))
penalty = rmax * phimax**2 / (1-gamma)**2 * (avol / (sigma * math.sqrt(2*math.pi)) + gamma / (2*(1-gamma)))
alpha_star = sigma ** 2 * norm2 ** 2 / (2 * penalty * norm1 ** 2 + 1e-12)
Cmax = alpha_star * norm2**2 / 2
if greedy:
C = Cmax
else:
C = 0
alpha = alpha_star * (1 + math.sqrt(1 - C / (Cmax + 1e-12) + 1e-12))
theta = policy.get_loc_params()
new_theta = theta + alpha * theta_grad
policy.set_loc_params(new_theta)
# Log
        log_row['Alpha'] = alpha.item() if torch.is_tensor(alpha) else alpha
        log_row['Penalty'] = penalty
        log_row['ThetaGradNorm'] = torch.norm(theta_grad).item()
        log_row['BatchSize'] = batchsize
        log_row['Exploration'] = policy.exploration()
log_row['Perf'] = performance(batch, gamma)
log_row['UPerf'] = performance(batch, 1.)
log_row['AvgHorizon'] = avg_horizon(batch)
params = policy.get_flat()
if log_params:
for i in range(policy.num_params()):
log_row['param%d' % i] = params[i].item()
logger.write_row(log_row, it)
if save_params and it % save_params == 0:
logger.save_params(params, it)
# Next iteration
it += 1
# Final policy
if save_params:
logger.save_params(params, it)
def adabatch(env, policy,
horizon,
batchsize = 100,
iterations = 1000,
gamma = 0.99,
rmax = 1.,
phimax = 1.,
safety_requirement = MonotonicImprovement(),
test_det = True,
render = False,
seed = None,
baseline = 'peters',
action_filter = None,
parallel = False,
n_jobs = 4,
logger = Logger(name='test_sunday'),
save_params = 1000,
log_params = True,
verbose = True):
"""
Only for SIMPLE Gaussian policy w/ scalar variance
"""
# Defaults
assert policy.learn_std
if action_filter is None:
action_filter = clip(env)
# Seeding agent
if seed is not None:
seed_all_agent(seed)
# Preparing logger
    algo_info = {'Algorithm': 'ADABATCH',
'Environment': str(env),
'BatchSize': batchsize,
'Max horizon': horizon,
'Iterations': iterations,
'gamma': gamma,
'actionFilter': action_filter,
'rmax': rmax,
'phimax': phimax}
logger.write_info({**algo_info, **policy.info()})
log_keys = ['Perf', 'UPerf', 'AvgHorizon',
'Alpha', 'BatchSize', 'Exploration',
'ThetaGradNorm',
'Penalty', 'Coordinate']
if log_params:
log_keys += ['param%d' % i for i in range(policy.num_params())]
if test_det:
log_keys.append('DetPerf')
log_row = dict.fromkeys(log_keys)
logger.open(log_row.keys())
# Learning
avol = torch.tensor(env.action_space.high - env.action_space.low).item()
it = 0
    while it < iterations:
# Begin iteration
        if verbose:
            print('\nIteration ', it)
            print('Params: ', policy.get_flat())
# Test
if test_det:
omega = policy.get_scale_params()
policy.set_scale_params(-100.)
batch = generate_batch(env, policy, horizon, 1, action_filter)
policy.set_scale_params(omega)
log_row['DetPerf'] = performance(batch, gamma)
if render:
generate_batch(env, policy, horizon, 1, action_filter, render=True)
omega = policy.get_scale_params()
sigma = torch.exp(omega).item()
batch = generate_batch(env, policy, horizon, batchsize, action_filter, parallel=parallel, n_jobs=n_jobs, seed=seed)
grad = simple_gpomdp_estimator(batch, gamma, policy, baseline)
theta_grad = grad[1:]
norminf = torch.max(torch.abs(theta_grad))
k = torch.argmax(torch.abs(theta_grad))
penalty = rmax * phimax**2 / (1-gamma)**2 * (avol / (sigma * math.sqrt(2*math.pi)) + gamma / (2*(1-gamma)))
        alpha_star = sigma ** 2 / (2 * penalty)
        Cmax = alpha_star * norminf ** 2 / 2
C = safety_requirement.next()
alpha = alpha_star * (1 + math.sqrt(1 - C / (Cmax + 1e-12) + 1e-12))
theta = policy.get_loc_params()
        new_theta = theta.clone()  # copy so the policy's parameters are not mutated in place
        new_theta[k] += alpha * theta_grad[k]
policy.set_loc_params(new_theta)
# Log
        log_row['Coordinate'] = k.item()
        log_row['Alpha'] = alpha.item() if torch.is_tensor(alpha) else alpha
        log_row['Penalty'] = penalty
        log_row['ThetaGradNorm'] = torch.norm(theta_grad).item()
        log_row['BatchSize'] = batchsize
        log_row['Exploration'] = policy.exploration()
log_row['Perf'] = performance(batch, gamma)
log_row['UPerf'] = performance(batch, 1.)
log_row['AvgHorizon'] = avg_horizon(batch)
params = policy.get_flat()
if log_params:
for i in range(policy.num_params()):
log_row['param%d' % i] = params[i].item()
logger.write_row(log_row, it)
if save_params and it % save_params == 0:
logger.save_params(params, it)
# Next iteration
it += 1
# Final policy
if save_params:
logger.save_params(params, it)
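# The three algorithms above (SEPG's mean update, adastep, adabatch) all
# instantiate the same safe step-size rule: a step alpha_star guarantees a
# maximal expected improvement Cmax = alpha_star * ||grad||**2 / 2, and
# demanding only a smaller improvement C <= Cmax permits the larger step
# alpha = alpha_star * (1 + sqrt(1 - C / Cmax)). Below is a minimal
# standalone sketch of that rule (plain Python with illustrative numbers;
# not part of the library itself).

```python
import math

def safe_step_size(alpha_star, grad_norm, C):
    """Largest step size whose guaranteed improvement is still at least C.

    alpha_star yields the maximal guaranteed improvement
    Cmax = alpha_star * grad_norm**2 / 2; requiring only C <= Cmax
    stretches the step by a factor 1 + sqrt(1 - C / Cmax).
    """
    Cmax = alpha_star * grad_norm ** 2 / 2
    C = min(C, Cmax)  # the requirement can never exceed the guarantee
    return alpha_star * (1 + math.sqrt(1 - C / (Cmax + 1e-12) + 1e-12))

# C = 0 (no safety requirement) doubles the step; C = Cmax (greedy
# safety requirement) recovers alpha_star itself.
print(safe_step_size(0.1, 2.0, 0.0))  # ~0.2
print(safe_step_size(0.1, 2.0, 0.2))  # ~0.1
```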
# ============================================================================
# File: torch-test/mpich-3.4.3/modules/yaksa/src/backend/gencomm.py
# Repo: alchemy315/NoPFS (BSD-3-Clause)
# ============================================================================
] | null | null | null | #! /usr/bin/env python3
##
## Copyright (C) by Argonne National Laboratory
## See COPYRIGHT in top-level directory
##
import sys
sys.path.append('maint/')
import yutils
## loop through the derived and basic types to generate individual
## pack functions
derived_types = [ "hvector", "blkhindx", "hindexed", "contig", "resized" ]
########################################################################################
##### Switch statement generation for pup function selection
########################################################################################
def child_type_str(typelist):
s = "type"
for x in typelist:
s = s + "->u.%s.child" % x
return s
def switcher_builtin_element(backend, OUTFILE, blklens, typelist, pupstr, key, val):
yutils.display(OUTFILE, "case %s:\n" % key.upper())
if (len(typelist) == 0):
t = ""
else:
t = typelist.pop()
if (t == ""):
nesting_level = 0
else:
nesting_level = len(typelist) + 1
if ((t == "hvector" or t == "blkhindx") and (len(blklens) > 1)):
yutils.display(OUTFILE, "switch (%s->u.%s.blocklength) {\n" % (child_type_str(typelist), t))
for blklen in blklens:
if (blklen != "generic"):
yutils.display(OUTFILE, "case %s:\n" % blklen)
else:
yutils.display(OUTFILE, "default:\n")
yutils.display(OUTFILE, "if (max_nesting_level >= %d) {\n" % nesting_level)
yutils.display(OUTFILE, "%s->pack = yaksuri_%si_%s_blklen_%s_%s;\n" % (backend, backend, pupstr, blklen, val))
yutils.display(OUTFILE, "%s->unpack = yaksuri_%si_un%s_blklen_%s_%s;\n" % (backend, backend, pupstr, blklen, val))
yutils.display(OUTFILE, "}\n")
yutils.display(OUTFILE, "break;\n")
yutils.display(OUTFILE, "}\n")
else:
yutils.display(OUTFILE, "if (max_nesting_level >= %d) {\n" % nesting_level)
yutils.display(OUTFILE, "%s->pack = yaksuri_%si_%s_%s;\n" % (backend, backend, pupstr, val))
yutils.display(OUTFILE, "%s->unpack = yaksuri_%si_un%s_%s;\n" % (backend, backend, pupstr, val))
yutils.display(OUTFILE, "}\n")
if (t != ""):
typelist.append(t)
yutils.display(OUTFILE, "break;\n")
def switcher_builtin(backend, OUTFILE, blklens, builtin_types, builtin_maps, typelist, pupstr):
yutils.display(OUTFILE, "switch (%s->u.builtin.handle) {\n" % child_type_str(typelist))
for b in builtin_types:
switcher_builtin_element(backend, OUTFILE, blklens, typelist, pupstr, "YAKSA_TYPE__%s" % b.replace(" ", "_"), b.replace(" ", "_"))
for key in builtin_maps:
switcher_builtin_element(backend, OUTFILE, blklens, typelist, pupstr, key, builtin_maps[key])
yutils.display(OUTFILE, "default:\n")
yutils.display(OUTFILE, " break;\n")
yutils.display(OUTFILE, "}\n")
def switcher(backend, OUTFILE, blklens, builtin_types, builtin_maps, typelist, pupstr, nests):
yutils.display(OUTFILE, "switch (%s->kind) {\n" % child_type_str(typelist))
for x in range(len(derived_types)):
d = derived_types[x]
if (nests > 1):
yutils.display(OUTFILE, "case YAKSI_TYPE_KIND__%s:\n" % d.upper())
typelist.append(d)
switcher(backend, OUTFILE, blklens, builtin_types, builtin_maps, typelist, pupstr + "_%s" % d, nests - 1)
typelist.pop()
yutils.display(OUTFILE, "break;\n")
if (len(typelist)):
yutils.display(OUTFILE, "case YAKSI_TYPE_KIND__BUILTIN:\n")
switcher_builtin(backend, OUTFILE, blklens, builtin_types, builtin_maps, typelist, pupstr)
yutils.display(OUTFILE, "break;\n")
yutils.display(OUTFILE, "default:\n")
yutils.display(OUTFILE, " break;\n")
yutils.display(OUTFILE, "}\n")
########################################################################################
##### main function
########################################################################################
def populate_pupfns(pup_max_nesting, backend, blklens, builtin_types, builtin_maps):
##### generate the switching logic to select pup functions
filename = "src/backend/%s/pup/yaksuri_%si_populate_pupfns.c" % (backend, backend)
yutils.copyright_c(filename)
OUTFILE = open(filename, "a")
yutils.display(OUTFILE, "#include <stdio.h>\n")
yutils.display(OUTFILE, "#include \"yaksi.h\"\n")
yutils.display(OUTFILE, "#include \"yaksu.h\"\n")
yutils.display(OUTFILE, "#include \"yaksuri_%si.h\"\n" % backend)
yutils.display(OUTFILE, "#include \"yaksuri_%si_populate_pupfns.h\"\n" % backend)
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "int yaksuri_%si_populate_pupfns(yaksi_type_s * type)\n" % backend)
yutils.display(OUTFILE, "{\n")
yutils.display(OUTFILE, "int rc = YAKSA_SUCCESS;\n")
yutils.display(OUTFILE, "yaksuri_%si_type_s *%s = (yaksuri_%si_type_s *) type->backend.%s.priv;\n" \
% (backend, backend, backend, backend))
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "%s->pack = YAKSURI_KERNEL_NULL;\n" % backend)
yutils.display(OUTFILE, "%s->unpack = YAKSURI_KERNEL_NULL;\n" % backend)
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "switch (type->kind) {\n")
for dtype1 in derived_types:
yutils.display(OUTFILE, "case YAKSI_TYPE_KIND__%s:\n" % dtype1.upper())
yutils.display(OUTFILE, "switch (type->u.%s.child->kind) {\n" % dtype1)
for dtype2 in derived_types:
yutils.display(OUTFILE, "case YAKSI_TYPE_KIND__%s:\n" % dtype2.upper())
yutils.display(OUTFILE, "rc = yaksuri_%si_populate_pupfns_%s_%s(type);\n" % (backend, dtype1, dtype2))
yutils.display(OUTFILE, "break;\n")
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "case YAKSI_TYPE_KIND__BUILTIN:\n")
yutils.display(OUTFILE, "rc = yaksuri_%si_populate_pupfns_%s_builtin(type);\n" % (backend, dtype1))
yutils.display(OUTFILE, "break;\n")
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "default:\n")
yutils.display(OUTFILE, " break;\n")
yutils.display(OUTFILE, "}\n")
yutils.display(OUTFILE, "break;\n")
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "default:\n")
yutils.display(OUTFILE, " break;\n")
yutils.display(OUTFILE, "}\n")
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, " return rc;\n")
    yutils.display(OUTFILE, "}\n")
OUTFILE.close()
for dtype1 in derived_types:
for dtype2 in derived_types:
filename = "src/backend/%s/pup/yaksuri_%si_populate_pupfns_%s_%s.c" % (backend, backend, dtype1, dtype2)
yutils.copyright_c(filename)
OUTFILE = open(filename, "a")
yutils.display(OUTFILE, "#include <stdio.h>\n")
yutils.display(OUTFILE, "#include <stdlib.h>\n")
yutils.display(OUTFILE, "#include <wchar.h>\n")
yutils.display(OUTFILE, "#include \"yaksi.h\"\n")
yutils.display(OUTFILE, "#include \"yaksu.h\"\n")
yutils.display(OUTFILE, "#include \"yaksuri_%si.h\"\n" % backend)
yutils.display(OUTFILE, "#include \"yaksuri_%si_populate_pupfns.h\"\n" % backend)
yutils.display(OUTFILE, "#include \"yaksuri_%si_pup.h\"\n" % backend)
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "int yaksuri_%si_populate_pupfns_%s_%s(yaksi_type_s * type)\n" % (backend, dtype1, dtype2))
yutils.display(OUTFILE, "{\n")
yutils.display(OUTFILE, "int rc = YAKSA_SUCCESS;\n")
yutils.display(OUTFILE, "yaksuri_%si_type_s *%s = (yaksuri_%si_type_s *) type->backend.%s.priv;\n" \
% (backend, backend, backend, backend))
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "char *str = getenv(\"YAKSA_ENV_MAX_NESTING_LEVEL\");\n")
yutils.display(OUTFILE, "int max_nesting_level;\n")
yutils.display(OUTFILE, "if (str) {\n")
yutils.display(OUTFILE, "max_nesting_level = atoi(str);\n")
yutils.display(OUTFILE, "} else {\n")
yutils.display(OUTFILE, "max_nesting_level = YAKSI_ENV_DEFAULT_NESTING_LEVEL;\n")
yutils.display(OUTFILE, "}\n")
yutils.display(OUTFILE, "\n")
pupstr = "pack_%s_%s" % (dtype1, dtype2)
typelist = [ dtype1, dtype2 ]
switcher(backend, OUTFILE, blklens, builtin_types, builtin_maps, typelist, pupstr, pup_max_nesting - 1)
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "return rc;\n")
yutils.display(OUTFILE, "}\n")
OUTFILE.close()
filename = "src/backend/%s/pup/yaksuri_%si_populate_pupfns_%s_builtin.c" % (backend, backend, dtype1)
yutils.copyright_c(filename)
OUTFILE = open(filename, "a")
yutils.display(OUTFILE, "#include <stdio.h>\n")
yutils.display(OUTFILE, "#include <stdlib.h>\n")
yutils.display(OUTFILE, "#include <wchar.h>\n")
yutils.display(OUTFILE, "#include \"yaksi.h\"\n")
yutils.display(OUTFILE, "#include \"yaksu.h\"\n")
yutils.display(OUTFILE, "#include \"yaksuri_%si.h\"\n" % backend)
yutils.display(OUTFILE, "#include \"yaksuri_%si_populate_pupfns.h\"\n" % backend)
yutils.display(OUTFILE, "#include \"yaksuri_%si_pup.h\"\n" % backend)
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "int yaksuri_%si_populate_pupfns_%s_builtin(yaksi_type_s * type)\n" % (backend, dtype1))
yutils.display(OUTFILE, "{\n")
yutils.display(OUTFILE, "int rc = YAKSA_SUCCESS;\n")
yutils.display(OUTFILE, "yaksuri_%si_type_s *%s = (yaksuri_%si_type_s *) type->backend.%s.priv;\n" \
% (backend, backend, backend, backend))
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "char *str = getenv(\"YAKSA_ENV_MAX_NESTING_LEVEL\");\n")
yutils.display(OUTFILE, "int max_nesting_level;\n")
yutils.display(OUTFILE, "if (str) {\n")
yutils.display(OUTFILE, "max_nesting_level = atoi(str);\n")
yutils.display(OUTFILE, "} else {\n")
yutils.display(OUTFILE, "max_nesting_level = YAKSI_ENV_DEFAULT_NESTING_LEVEL;\n")
yutils.display(OUTFILE, "}\n")
yutils.display(OUTFILE, "\n")
pupstr = "pack_%s" % dtype1
typelist = [ dtype1 ]
switcher_builtin(backend, OUTFILE, blklens, builtin_types, builtin_maps, typelist, pupstr)
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "return rc;\n")
yutils.display(OUTFILE, "}\n")
OUTFILE.close()
##### generate the Makefile for the pup function selection functions
filename = "src/backend/%s/pup/Makefile.populate_pupfns.mk" % backend
yutils.copyright_makefile(filename)
OUTFILE = open(filename, "a")
yutils.display(OUTFILE, "libyaksa_la_SOURCES += \\\n")
for dtype1 in derived_types:
for dtype2 in derived_types:
yutils.display(OUTFILE, "\tsrc/backend/%s/pup/yaksuri_%si_populate_pupfns_%s_%s.c \\\n" % (backend, backend, dtype1, dtype2))
yutils.display(OUTFILE, "\tsrc/backend/%s/pup/yaksuri_%si_populate_pupfns_%s_builtin.c \\\n" % (backend, backend, dtype1))
yutils.display(OUTFILE, "\tsrc/backend/%s/pup/yaksuri_%si_populate_pupfns.c\n" % (backend, backend))
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "noinst_HEADERS += \\\n")
yutils.display(OUTFILE, "\tsrc/backend/%s/pup/yaksuri_%si_populate_pupfns.h\n" % (backend, backend))
OUTFILE.close()
##### generate the header file for the pup function selection functions
filename = "src/backend/%s/pup/yaksuri_%si_populate_pupfns.h" % (backend, backend)
yutils.copyright_c(filename)
OUTFILE = open(filename, "a")
yutils.display(OUTFILE, "#ifndef YAKSURI_%sI_POPULATE_PUPFNS_H_INCLUDED\n" % backend.upper())
yutils.display(OUTFILE, "#define YAKSURI_%sI_POPULATE_PUPFNS_H_INCLUDED\n" % backend.upper())
yutils.display(OUTFILE, "\n")
for dtype1 in derived_types:
for dtype2 in derived_types:
yutils.display(OUTFILE, "int yaksuri_%si_populate_pupfns_%s_%s(yaksi_type_s * type);\n" % (backend, dtype1, dtype2))
yutils.display(OUTFILE, "int yaksuri_%si_populate_pupfns_%s_builtin(yaksi_type_s * type);\n" % (backend, dtype1))
yutils.display(OUTFILE, "\n")
yutils.display(OUTFILE, "#endif /* YAKSURI_%sI_POPULATE_PUPFNS_H_INCLUDED */\n" % backend.upper())
OUTFILE.close()
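# The switcher helper above builds arbitrarily deep C `switch` nesting by
# recursing over the derived-type list until a builtin leaf is reached.
# Below is the core recursion reduced to a self-contained sketch (the
# KIND__* names and the two-entry type list are illustrative; the real
# generator routes output through yutils.display and tracks blocklengths).

```python
DERIVED = ["hvector", "contig"]

def emit_switch(lines, nests):
    """Append a nested C switch over type kinds to `lines`."""
    lines.append("switch (type->kind) {")
    if nests > 1:
        for d in DERIVED:
            lines.append("case KIND__%s:" % d.upper())
            emit_switch(lines, nests - 1)  # recurse one nesting level deeper
            lines.append("break;")
    lines.append("case KIND__BUILTIN: /* leaf: select a pack/unpack kernel */ break;")
    lines.append("default: break;")
    lines.append("}")
    return lines

out = emit_switch([], 2)
# Two nesting levels produce one outer switch plus one inner switch per
# derived kind.
print(len(out))  # 16
```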
# ============================================================================
# File: Curso_de_Python_3_do_Basico_Ao_Avancado_Udemy/desafios/desafioValidaCNPJ/__init__.py
# Repo: DanilooSilva/Cursos_de_Python (MIT)
# ============================================================================
] | null | null | null | from desafioValidaCNPJ.cnpj import validador
from desafioValidaCNPJ.cnpj import validadorSegundoDigito
from desafioValidaCNPJ.geradorCNPJ import get_CNPJ
__all__ = ['validador', 'get_CNPJ', 'validadorSegundoDigito']
# ============================================================================
# File: sdk/storage/azure-storage-blob/tests/test_cpk.py
# Repo: rsdoherty/azure-sdk-for-python (MIT)
# ============================================================================
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
import unittest
import pytest
from datetime import datetime, timedelta
from azure.core.exceptions import HttpResponseError
from azure.storage.blob import (
BlobServiceClient,
BlobType,
BlobBlock,
CustomerProvidedEncryptionKey,
BlobSasPermissions,
generate_blob_sas
)
from _shared.testcase import GlobalStorageAccountPreparer
from devtools_testutils import ResourceGroupPreparer, StorageAccountPreparer
from devtools_testutils.storage import StorageTestCase
# ------------------------------------------------------------------------------
TEST_ENCRYPTION_KEY = CustomerProvidedEncryptionKey(key_value="MDEyMzQ1NjcwMTIzNDU2NzAxMjM0NTY3MDEyMzQ1Njc=",
key_hash="3QFFFpRA5+XANHqwwbT4yXDmrT/2JaLt/FKHjzhOdoE=")
# ------------------------------------------------------------------------------
class StorageCPKTest(StorageTestCase):
def _setup(self, bsc):
self.config = bsc._config
self.container_name = self.get_resource_name('utcontainer')
# prep some test data so that they can be used in upload tests
self.byte_data = self.get_random_bytes(64 * 1024)
if self.is_live:
bsc.create_container(self.container_name)
def _teardown(self, bsc):
if self.is_live:
try:
bsc.delete_container(self.container_name)
            except Exception:
pass
return super(StorageCPKTest, self).tearDown()
# --Helpers-----------------------------------------------------------------
def _get_blob_reference(self):
return self.get_resource_name("cpk")
def _create_block_blob(self, bsc, blob_name=None, data=None, cpk=None, max_concurrency=1):
blob_name = blob_name if blob_name else self._get_blob_reference()
blob_client = bsc.get_blob_client(self.container_name, blob_name)
data = data if data else b''
resp = blob_client.upload_blob(data, cpk=cpk, max_concurrency=max_concurrency)
return blob_client, resp
def _create_append_blob(self, bsc, cpk=None):
blob_name = self._get_blob_reference()
blob = bsc.get_blob_client(
self.container_name,
blob_name)
blob.create_append_blob(cpk=cpk)
return blob
def _create_page_blob(self, bsc, cpk=None):
blob_name = self._get_blob_reference()
blob = bsc.get_blob_client(
self.container_name,
blob_name)
blob.create_page_blob(1024 * 1024, cpk=cpk)
return blob
# -- Test cases for APIs supporting CPK ----------------------------------------------
@GlobalStorageAccountPreparer()
def test_put_block_and_put_block_list(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
blob_client, _ = self._create_block_blob(bsc)
blob_client.stage_block('1', b'AAA', cpk=TEST_ENCRYPTION_KEY)
blob_client.stage_block('2', b'BBB', cpk=TEST_ENCRYPTION_KEY)
blob_client.stage_block('3', b'CCC', cpk=TEST_ENCRYPTION_KEY)
# Act
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
put_block_list_resp = blob_client.commit_block_list(block_list,
cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(put_block_list_resp['etag'])
self.assertIsNotNone(put_block_list_resp['last_modified'])
self.assertTrue(put_block_list_resp['request_server_encrypted'])
self.assertEqual(put_block_list_resp['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act get the blob content without cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), b'AAABBBCCC')
self.assertEqual(blob.properties.etag, put_block_list_resp['etag'])
self.assertEqual(blob.properties.last_modified, put_block_list_resp['last_modified'])
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
def test_create_block_blob_with_chunks(self, resource_group, location, storage_account, storage_account_key):
# parallel operation
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
# Arrange
# to force the in-memory chunks to be used
self.config.use_byte_buffer = True
# Act
# create_blob_from_bytes forces the in-memory chunks to be used
blob_client, upload_response = self._create_block_blob(bsc, data=self.byte_data, cpk=TEST_ENCRYPTION_KEY,
max_concurrency=2)
# Assert
self.assertIsNotNone(upload_response['etag'])
self.assertIsNotNone(upload_response['last_modified'])
self.assertTrue(upload_response['request_server_encrypted'])
self.assertEqual(upload_response['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act get the blob content without cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data)
self.assertEqual(blob.properties.etag, upload_response['etag'])
self.assertEqual(blob.properties.last_modified, upload_response['last_modified'])
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
def test_create_block_blob_with_sub_streams(self, resource_group, location, storage_account, storage_account_key):
# A problem with the recording framework means this test can only run live
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
# Act
# create_blob_from_bytes forces the in-memory chunks to be used
blob_client, upload_response = self._create_block_blob(bsc, data=self.byte_data, cpk=TEST_ENCRYPTION_KEY,
max_concurrency=2)
# Assert
self.assertIsNotNone(upload_response['etag'])
self.assertIsNotNone(upload_response['last_modified'])
self.assertTrue(upload_response['request_server_encrypted'])
self.assertEqual(upload_response['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data)
self.assertEqual(blob.properties.etag, upload_response['etag'])
self.assertEqual(blob.properties.last_modified, upload_response['last_modified'])
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@GlobalStorageAccountPreparer()
def test_create_block_blob_with_single_chunk(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
data = b'AAABBBCCC'
# create_blob_from_bytes forces the in-memory chunks to be used
blob_client, upload_response = self._create_block_blob(bsc, data=data, cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(upload_response['etag'])
self.assertIsNotNone(upload_response['last_modified'])
self.assertTrue(upload_response['request_server_encrypted'])
self.assertEqual(upload_response['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), data)
self.assertEqual(blob.properties.etag, upload_response['etag'])
self.assertEqual(blob.properties.last_modified, upload_response['last_modified'])
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
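Every test in this class rebuilds a BlobServiceClient with the same 1 KiB transfer limits so the chunked code paths run quickly. As an illustration only (the helper name and the `client_cls` parameter are hypothetical, not part of the suite), the repeated keyword set could be centralized:

```python
# Shared 1 KiB limits used throughout these tests to force chunked transfers.
SMALL_CHUNK_KWARGS = dict(
    connection_data_block_size=1024,
    max_single_put_size=1024,
    min_large_block_upload_threshold=1024,
    max_block_size=1024,
    max_page_size=1024,
)

def make_small_chunk_client(account_url, credential, client_cls):
    # client_cls would be BlobServiceClient in the real suite; it is passed in
    # here so this sketch stays import-free.
    return client_cls(account_url, credential=credential, **SMALL_CHUNK_KWARGS)
```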
@GlobalStorageAccountPreparer()
def test_put_block_from_url_and_commit_with_cpk(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
# create source blob and get source blob url
source_blob_name = self.get_resource_name("sourceblob")
self.config.use_byte_buffer = True  # Make sure chunked upload is used so the request can be recorded
source_blob_client, _ = self._create_block_blob(bsc, blob_name=source_blob_name, data=self.byte_data)
source_blob_sas = generate_blob_sas(
source_blob_client.account_name,
source_blob_client.container_name,
source_blob_client.blob_name,
snapshot=source_blob_client.snapshot,
account_key=source_blob_client.credential.account_key,
permission=BlobSasPermissions(read=True),
expiry=datetime.utcnow() + timedelta(hours=1)
)
source_blob_url = source_blob_client.url + "?" + source_blob_sas
# create destination blob
self.config.use_byte_buffer = False
destination_blob_client, _ = self._create_block_blob(bsc, cpk=TEST_ENCRYPTION_KEY)
# Act part 1: make put block from url calls
destination_blob_client.stage_block_from_url(block_id=1, source_url=source_blob_url,
source_offset=0, source_length=4 * 1024,
cpk=TEST_ENCRYPTION_KEY)
destination_blob_client.stage_block_from_url(block_id=2, source_url=source_blob_url,
source_offset=4 * 1024, source_length=4 * 1024,
cpk=TEST_ENCRYPTION_KEY)
# Assert blocks
committed, uncommitted = destination_blob_client.get_block_list('all')
self.assertEqual(len(uncommitted), 2)
self.assertEqual(len(committed), 0)
# committing the blocks without the cpk should fail
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2')]
with self.assertRaises(HttpResponseError):
destination_blob_client.commit_block_list(block_list)
# Act: committing the blocks with the cpk should succeed
put_block_list_resp = destination_blob_client.commit_block_list(block_list,
cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(put_block_list_resp['etag'])
self.assertIsNotNone(put_block_list_resp['last_modified'])
self.assertTrue(put_block_list_resp['request_server_encrypted'])
self.assertEqual(put_block_list_resp['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act get the blob content
blob = destination_blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data[0: 8 * 1024])
self.assertEqual(blob.properties.etag, put_block_list_resp['etag'])
self.assertEqual(blob.properties.last_modified, put_block_list_resp['last_modified'])
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@GlobalStorageAccountPreparer()
def test_append_block(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
blob_client = self._create_append_blob(bsc, cpk=TEST_ENCRYPTION_KEY)
# Act
for content in [b'AAA', b'BBB', b'CCC']:
append_blob_prop = blob_client.append_block(content, cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(append_blob_prop['etag'])
self.assertIsNotNone(append_blob_prop['last_modified'])
self.assertTrue(append_blob_prop['request_server_encrypted'])
self.assertEqual(append_blob_prop['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), b'AAABBBCCC')
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@GlobalStorageAccountPreparer()
def test_append_block_from_url(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
source_blob_name = self.get_resource_name("sourceblob")
self.config.use_byte_buffer = True # chunk upload
source_blob_client, _ = self._create_block_blob(bsc, blob_name=source_blob_name, data=self.byte_data)
source_blob_sas = generate_blob_sas(
source_blob_client.account_name,
source_blob_client.container_name,
source_blob_client.blob_name,
snapshot=source_blob_client.snapshot,
account_key=source_blob_client.credential.account_key,
permission=BlobSasPermissions(read=True),
expiry=datetime.utcnow() + timedelta(hours=1)
)
source_blob_url = source_blob_client.url + "?" + source_blob_sas
self.config.use_byte_buffer = False
destination_blob_client = self._create_append_blob(bsc, cpk=TEST_ENCRYPTION_KEY)
# Act
append_blob_prop = destination_blob_client.append_block_from_url(source_blob_url,
source_offset=0,
source_length=4 * 1024,
cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(append_blob_prop['etag'])
self.assertIsNotNone(append_blob_prop['last_modified'])
# TODO: verify that the swagger is correct, header wasn't added for the response
# self.assertTrue(append_blob_prop['request_server_encrypted'])
self.assertEqual(append_blob_prop['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
destination_blob_client.download_blob()
# Act get the blob content
blob = destination_blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data[0: 4 * 1024])
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@GlobalStorageAccountPreparer()
def test_create_append_blob_with_chunks(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
blob_client = self._create_append_blob(bsc, cpk=TEST_ENCRYPTION_KEY)
# Act
append_blob_prop = blob_client.upload_blob(self.byte_data,
blob_type=BlobType.AppendBlob, cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(append_blob_prop['etag'])
self.assertIsNotNone(append_blob_prop['last_modified'])
self.assertTrue(append_blob_prop['request_server_encrypted'])
self.assertEqual(append_blob_prop['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data)
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@GlobalStorageAccountPreparer()
def test_update_page(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
blob_client = self._create_page_blob(bsc, cpk=TEST_ENCRYPTION_KEY)
# Act
page_blob_prop = blob_client.upload_page(self.byte_data,
offset=0,
length=len(self.byte_data),
cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(page_blob_prop['etag'])
self.assertIsNotNone(page_blob_prop['last_modified'])
self.assertTrue(page_blob_prop['request_server_encrypted'])
self.assertEqual(page_blob_prop['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(offset=0,
length=len(self.byte_data),
cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data)
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@GlobalStorageAccountPreparer()
def test_update_page_from_url(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
source_blob_name = self.get_resource_name("sourceblob")
self.config.use_byte_buffer = True  # Make sure chunked upload is used so the request can be recorded
source_blob_client, _ = self._create_block_blob(bsc, blob_name=source_blob_name, data=self.byte_data)
source_blob_sas = generate_blob_sas(
source_blob_client.account_name,
source_blob_client.container_name,
source_blob_client.blob_name,
snapshot=source_blob_client.snapshot,
account_key=source_blob_client.credential.account_key,
permission=BlobSasPermissions(read=True),
expiry=datetime.utcnow() + timedelta(hours=1)
)
source_blob_url = source_blob_client.url + "?" + source_blob_sas
self.config.use_byte_buffer = False
blob_client = self._create_page_blob(bsc, cpk=TEST_ENCRYPTION_KEY)
# Act
page_blob_prop = blob_client.upload_pages_from_url(source_blob_url,
offset=0,
length=len(self.byte_data),
source_offset=0,
cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(page_blob_prop['etag'])
self.assertIsNotNone(page_blob_prop['last_modified'])
self.assertTrue(page_blob_prop['request_server_encrypted'])
# TODO: FIX SWAGGER
# self.assertEqual(page_blob_prop['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(offset=0,
length=len(self.byte_data),
cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data)
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
@pytest.mark.live_test_only
@GlobalStorageAccountPreparer()
def test_create_page_blob_with_chunks(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
blob_client = bsc.get_blob_client(self.container_name, self._get_blob_reference())
page_blob_prop = blob_client.upload_blob(self.byte_data,
blob_type=BlobType.PageBlob,
max_concurrency=2,
cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(page_blob_prop['etag'])
self.assertIsNotNone(page_blob_prop['last_modified'])
self.assertTrue(page_blob_prop['request_server_encrypted'])
self.assertEqual(page_blob_prop['encryption_key_sha256'], TEST_ENCRYPTION_KEY.key_hash)
# Act: getting the blob content without the cpk should fail
with self.assertRaises(HttpResponseError):
blob_client.download_blob()
# Act get the blob content
blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# Assert content was retrieved with the cpk
self.assertEqual(blob.readall(), self.byte_data)
self.assertEqual(blob.properties.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
self._teardown(bsc)
# TODO: verify why clear page works without providing cpk
# @record
# def test_clear_page(self):
# # Arrange
# blob_client = bsc.get_blob_client(self.container_name, self._get_blob_reference())
# data = self.get_random_bytes(1024)
# blob_client.upload_blob(data, blob_type=BlobType.PageBlob, cpk=TEST_ENCRYPTION_KEY)
#
# # Act
# blob = blob_client.download_blob(cpk=TEST_ENCRYPTION_KEY)
# self.assertEqual(blob.readall(), data)
#
# # with self.assertRaises(HttpResponseError):
# # blob_client.clear_page(0, 511)
#
# resp = blob_client.clear_page(0, 511, cpk=TEST_ENCRYPTION_KEY)
# blob = blob_client.download_blob(0, 511, cpk=TEST_ENCRYPTION_KEY)
#
# # Assert
# self.assertIsNotNone(resp.get('etag'))
# self.assertIsNotNone(resp.get('last_modified'))
# self.assertIsNotNone(resp.get('blob_sequence_number'))
# self.assertEqual(blob.readall(), b'\x00' * 512)
#
# blob = blob_client.download_blob(512, 1023, cpk=TEST_ENCRYPTION_KEY)
# self.assertEqual(blob.readall(), data[512:])
@GlobalStorageAccountPreparer()
def test_get_set_blob_metadata(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
blob_client, _ = self._create_block_blob(bsc, data=b'AAABBBCCC', cpk=TEST_ENCRYPTION_KEY)
# Act without the encryption key should fail
with self.assertRaises(HttpResponseError):
blob_client.get_blob_properties()
# Act
blob_props = blob_client.get_blob_properties(cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertTrue(blob_props.server_encrypted)
self.assertEqual(blob_props.encryption_key_sha256, TEST_ENCRYPTION_KEY.key_hash)
# Act set blob properties
metadata = {'hello': 'world', 'number': '42', 'up': 'upval'}
with self.assertRaises(HttpResponseError):
blob_client.set_blob_metadata(
metadata=metadata,
)
blob_client.set_blob_metadata(metadata=metadata, cpk=TEST_ENCRYPTION_KEY)
# Assert
blob_props = blob_client.get_blob_properties(cpk=TEST_ENCRYPTION_KEY)
md = blob_props.metadata
self.assertEqual(3, len(md))
self.assertEqual(md['hello'], 'world')
self.assertEqual(md['number'], '42')
self.assertEqual(md['up'], 'upval')
self.assertFalse('Up' in md)
self._teardown(bsc)
@GlobalStorageAccountPreparer()
def test_snapshot_blob(self, resource_group, location, storage_account, storage_account_key):
# Arrange
# test chunking functionality by reducing the size of each chunk,
# otherwise the tests would take too long to execute
bsc = BlobServiceClient(
self.account_url(storage_account, "blob"),
credential=storage_account_key,
connection_data_block_size=1024,
max_single_put_size=1024,
min_large_block_upload_threshold=1024,
max_block_size=1024,
max_page_size=1024)
self._setup(bsc)
blob_client, _ = self._create_block_blob(bsc, data=b'AAABBBCCC', cpk=TEST_ENCRYPTION_KEY)
# Act without cpk should not work
with self.assertRaises(HttpResponseError):
blob_client.create_snapshot()
# Act with cpk should work
blob_snapshot = blob_client.create_snapshot(cpk=TEST_ENCRYPTION_KEY)
# Assert
self.assertIsNotNone(blob_snapshot)
self._teardown(bsc)
# ------------------------------------------------------------------------------
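Every assertion on `encryption_key_sha256` above compares the service's echoed hash against `TEST_ENCRYPTION_KEY.key_hash`. As a standalone sketch of how such a key/hash pair is typically built for the customer-provided-key headers (base64-encoded key plus base64-encoded SHA-256 digest; the helper name is hypothetical and this is an assumption about the convention, not code from the suite):

```python
import base64
import hashlib
import os

def make_cpk_fields(key_bytes=None):
    # Build the two values a customer-provided-key object carries: the key
    # itself and its SHA-256 digest, both base64-encoded (mirroring the
    # x-ms-encryption-key / x-ms-encryption-key-sha256 header convention).
    key_bytes = key_bytes if key_bytes is not None else os.urandom(32)
    key = base64.b64encode(key_bytes).decode()
    key_hash = base64.b64encode(hashlib.sha256(key_bytes).digest()).decode()
    return key, key_hash
```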
# --- tests/bugs/core_2006_test.py (repo: reevespaul/firebird-qa, license: MIT) ---
#coding:utf-8
#
# id: bugs.core_2006
# title: SUBSTRING with regular expression (SIMILAR TO) capability
# description:
# tracker_id: CORE-2006
# min_versions: ['3.0']
# versions: 3.0
# qmid: None
import pytest
from firebird.qa import db_factory, isql_act, Action
# version: 3.0
# resources: None
substitutions_1 = []
init_script_1 = """"""
db_1 = db_factory(sql_dialect=3, init=init_script_1)
test_script_1 = """
set list on;
------------------------------------------------------------------------------
-- Test for matching with percent characters before and after pattern:
select
trim(str) str
,trim(ptn) ptn
,iif( trim(str) similar to '%'||trim(ptn)||'%', 1, 0 ) "str similar to %ptn%"
,substring( trim(str) similar '%\\"' || trim(ptn) || '\\"%' escape '\\' ) "subs(str similar to %ptn%)"
from(
select
'WDWDWDWD' str
,'((DW)|(WD)){4}' ptn
from rdb$database
union all
select
'AAXYAAXYAAAAXYAAAXYAA' str
,'(AAXY|AAAX){2,}' ptn
from rdb$database
union all
select
'YZZXYZZ0Z0YZZYZZYYZZZYZZ0Z0YZZ'
,'(0Z0(Y|Z)*){2}'
from rdb$database
union all
select
'AARARARAARARAR'
,'RA(AR){3}'
from rdb$database
union all
select
'eiavieieav' str
,'(ie){2,}' ptn
from rdb$database
union all
select
'avieieavav' str
,'(av|ie){2,}' ptn
from rdb$database
union all
select
'avieieieav' str
,'((av)|(ie)){2,}' ptn
from rdb$database
);
----------------------
-- Test for exact matching to pattern:
select
trim(str) str
,trim(ptn) ptn
,iif( trim(str) similar to trim(ptn), 1, 0 ) "str similar to ptn"
,substring( trim(str) similar '\\"' || trim(ptn) || '\\"' escape '\\' ) "subs(str similar to ptn)"
from(
select ----------- core-2389
'x/t' str
,'%[/]t' ptn
from rdb$database
union all
select ------------------- core-2756
'2015-04-13' str
,'[[:DIGIT:]]{4}[-][[:DIGIT:]]{2}[-][[:DIGIT:]]{2}' ptn
from rdb$database
union all
select ------------------- core-2780
'WI-T3.0.0.31780 Firebird 3.0 Beta 2'
,'%[0-9]+.[0-9]+.[0-9]+((.?[0-9]+)*)[[:WHITESPACE:]]%'
from rdb$database
union all
select ----------- core-3523
'm'
,'[p-k]'
from rdb$database
union all
------------------- core-3754
select '1', '(1|2){0,}' from rdb$database union all select
'1', '(1|2){0,1}' from rdb$database union all select
'1', '(1|2){1}' from rdb$database union all select
'123', '(1|12[3]?){1}' from rdb$database union all select
'123', '(1|12[3]?)+' from rdb$database union all select
------------- core-0769
'ab', 'ab|cd|efg' from rdb$database union all select
'efg', 'ab|cd|efg' from rdb$database union all select
'a', 'ab|cd|efg' from rdb$database union all select -- 0
'', 'a*' from rdb$database union all select
'a', 'a*' from rdb$database union all select
'aaa', 'a*' from rdb$database union all select
'', 'a+' from rdb$database union all select -- 0
'a', 'a+' from rdb$database union all select
'aaa', 'a+' from rdb$database union all select
'', 'a?' from rdb$database union all select
'a', 'a?' from rdb$database union all select
'aaa', 'a?' from rdb$database union all select -- 0
'', 'a{2,}' from rdb$database union all select -- 0
'a', 'a{2,}' from rdb$database union all select -- 0
'aa', 'a{2,}' from rdb$database union all select
'aaa', 'a{2,}' from rdb$database union all select
'', 'a{2,4}' from rdb$database union all select -- 0
'a', 'a{2,4}' from rdb$database union all select -- 0
'aa', 'a{2,4}' from rdb$database union all select
'aaa', 'a{2,4}' from rdb$database union all select
'aaaa', 'a{2,4}' from rdb$database union all select
'aaaaa', 'a{2,4}' from rdb$database union all select -- 0
'', '_' from rdb$database union all select -- 0
'a', '_' from rdb$database union all select
'1', '_' from rdb$database union all select
'a1', '_' from rdb$database union all select -- 0
'', '%' from rdb$database union all select
'az', 'a%z' from rdb$database union all select
'a123z', 'a%z' from rdb$database union all select
'azx', 'a%z' from rdb$database union all select -- 0
'ab', '(ab){2}' from rdb$database union all select -- 0
'aabb', '(ab){2}' from rdb$database union all select -- 0
'abab', '(ab){2}' from rdb$database union all select
'b', '[abc]' from rdb$database union all select
'd', '[abc]' from rdb$database union all select -- 0
'9', '[0-9]' from rdb$database union all select
'9', '[0-8]' from rdb$database union all select -- 0
'b', '[^abc]' from rdb$database union all select -- 0
'd', '[^abc]' from rdb$database union all select
'3', '[[:DIGIT:]^3]' from rdb$database union all select -- 0
'4', '[[:DIGIT:]^3]' from rdb$database union all select
'4', '[[:DIGIT:]]' from rdb$database union all select
'a', '[[:DIGIT:]]' from rdb$database union all select -- 0
'4', '[^[:DIGIT:]]' from rdb$database union all select -- 0
'a', '[^[:DIGIT:]]' from rdb$database
);
"""
act_1 = isql_act('db_1', test_script_1, substitutions=substitutions_1)
expected_stdout_1 = """
STR WDWDWDWD
PTN ((DW)|(WD)){4}
str similar to %ptn% 1
subs(str similar to %ptn%) WDWDWDWD
STR AAXYAAXYAAAAXYAAAXYAA
PTN (AAXY|AAAX){2,}
str similar to %ptn% 1
subs(str similar to %ptn%) AAXYAAXY
STR YZZXYZZ0Z0YZZYZZYYZZZYZZ0Z0YZZ
PTN (0Z0(Y|Z)*){2}
str similar to %ptn% 1
subs(str similar to %ptn%) 0Z0YZZYZZYYZZZYZZ0Z0YZZ
STR AARARARAARARAR
PTN RA(AR){3}
str similar to %ptn% 1
subs(str similar to %ptn%) RAARARAR
STR eiavieieav
PTN (ie){2,}
str similar to %ptn% 1
subs(str similar to %ptn%) ieie
STR avieieavav
PTN (av|ie){2,}
str similar to %ptn% 1
subs(str similar to %ptn%) avieieavav
STR avieieieav
PTN ((av)|(ie)){2,}
str similar to %ptn% 1
subs(str similar to %ptn%) avieieieav
STR x/t
PTN %[/]t
str similar to ptn 1
subs(str similar to ptn) x/t
STR 2015-04-13
PTN [[:DIGIT:]]{4}[-][[:DIGIT:]]{2}[-][[:DIGIT:]]{2}
str similar to ptn 1
subs(str similar to ptn) 2015-04-13
STR WI-T3.0.0.31780 Firebird 3.0 Beta 2
PTN %[0-9]+.[0-9]+.[0-9]+((.?[0-9]+)*)[[:WHITESPACE:]]%
str similar to ptn 1
subs(str similar to ptn) WI-T3.0.0.31780 Firebird 3.0 Beta 2
STR m
PTN [p-k]
str similar to ptn 0
subs(str similar to ptn) <null>
STR 1
PTN (1|2){0,}
str similar to ptn 1
subs(str similar to ptn) 1
STR 1
PTN (1|2){0,1}
str similar to ptn 1
subs(str similar to ptn) 1
STR 1
PTN (1|2){1}
str similar to ptn 1
subs(str similar to ptn) 1
STR 123
PTN (1|12[3]?){1}
str similar to ptn 1
subs(str similar to ptn) 123
STR 123
PTN (1|12[3]?)+
str similar to ptn 1
subs(str similar to ptn) 123
STR ab
PTN ab|cd|efg
str similar to ptn 1
subs(str similar to ptn) ab
STR efg
PTN ab|cd|efg
str similar to ptn 1
subs(str similar to ptn) efg
STR a
PTN ab|cd|efg
str similar to ptn 0
subs(str similar to ptn) <null>
STR
PTN a*
str similar to ptn 1
subs(str similar to ptn)
STR a
PTN a*
str similar to ptn 1
subs(str similar to ptn) a
STR aaa
PTN a*
str similar to ptn 1
subs(str similar to ptn) aaa
STR
PTN a+
str similar to ptn 0
subs(str similar to ptn) <null>
STR a
PTN a+
str similar to ptn 1
subs(str similar to ptn) a
STR aaa
PTN a+
str similar to ptn 1
subs(str similar to ptn) aaa
STR
PTN a?
str similar to ptn 1
subs(str similar to ptn)
STR a
PTN a?
str similar to ptn 1
subs(str similar to ptn) a
STR aaa
PTN a?
str similar to ptn 0
subs(str similar to ptn) <null>
STR
PTN a{2,}
str similar to ptn 0
subs(str similar to ptn) <null>
STR a
PTN a{2,}
str similar to ptn 0
subs(str similar to ptn) <null>
STR aa
PTN a{2,}
str similar to ptn 1
subs(str similar to ptn) aa
STR aaa
PTN a{2,}
str similar to ptn 1
subs(str similar to ptn) aaa
STR
PTN a{2,4}
str similar to ptn 0
subs(str similar to ptn) <null>
STR a
PTN a{2,4}
str similar to ptn 0
subs(str similar to ptn) <null>
STR aa
PTN a{2,4}
str similar to ptn 1
subs(str similar to ptn) aa
STR aaa
PTN a{2,4}
str similar to ptn 1
subs(str similar to ptn) aaa
STR aaaa
PTN a{2,4}
str similar to ptn 1
subs(str similar to ptn) aaaa
STR aaaaa
PTN a{2,4}
str similar to ptn 0
subs(str similar to ptn) <null>
STR
PTN _
str similar to ptn 0
subs(str similar to ptn) <null>
STR a
PTN _
str similar to ptn 1
subs(str similar to ptn) a
STR 1
PTN _
str similar to ptn 1
subs(str similar to ptn) 1
STR a1
PTN _
str similar to ptn 0
subs(str similar to ptn) <null>
STR
PTN %
str similar to ptn 1
subs(str similar to ptn)
STR az
PTN a%z
str similar to ptn 1
subs(str similar to ptn) az
STR a123z
PTN a%z
str similar to ptn 1
subs(str similar to ptn) a123z
STR azx
PTN a%z
str similar to ptn 0
subs(str similar to ptn) <null>
STR ab
PTN (ab){2}
str similar to ptn 0
subs(str similar to ptn) <null>
STR aabb
PTN (ab){2}
str similar to ptn 0
subs(str similar to ptn) <null>
STR abab
PTN (ab){2}
str similar to ptn 1
subs(str similar to ptn) abab
STR b
PTN [abc]
str similar to ptn 1
subs(str similar to ptn) b
STR d
PTN [abc]
str similar to ptn 0
subs(str similar to ptn) <null>
STR 9
PTN [0-9]
str similar to ptn 1
subs(str similar to ptn) 9
STR 9
PTN [0-8]
str similar to ptn 0
subs(str similar to ptn) <null>
STR b
PTN [^abc]
str similar to ptn 0
subs(str similar to ptn) <null>
STR d
PTN [^abc]
str similar to ptn 1
subs(str similar to ptn) d
STR 3
PTN [[:DIGIT:]^3]
str similar to ptn 0
subs(str similar to ptn) <null>
STR 4
PTN [[:DIGIT:]^3]
str similar to ptn 1
subs(str similar to ptn) 4
STR 4
PTN [[:DIGIT:]]
str similar to ptn 1
subs(str similar to ptn) 4
STR a
PTN [[:DIGIT:]]
str similar to ptn 0
subs(str similar to ptn) <null>
STR 4
PTN [^[:DIGIT:]]
str similar to ptn 0
subs(str similar to ptn) <null>
STR a
PTN [^[:DIGIT:]]
str similar to ptn 1
subs(str similar to ptn) a
"""
@pytest.mark.version('>=3.0')
def test_1(act_1: Action):
act_1.expected_stdout = expected_stdout_1
act_1.execute()
assert act_1.clean_expected_stdout == act_1.clean_stdout
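As a cross-check of the expected values above, a few of the exact-match SIMILAR TO cases can be mirrored in Python, since `re.fullmatch` anchors the whole string the way SIMILAR TO does. Only patterns whose syntax is valid in both regex dialects are used, and the equivalence is an illustrative assumption, not part of the test:

```python
import re

# SIMILAR TO performs whole-string matching, like re.fullmatch. These rows are
# taken from test_script_1 and paired with the 0/1 results in expected_stdout_1.
cases = [
    ("aaa", "a{2,4}", True),
    ("aaaaa", "a{2,4}", False),
    ("abab", "(ab){2}", True),
    ("aabb", "(ab){2}", False),
    ("efg", "ab|cd|efg", True),
    ("a", "ab|cd|efg", False),
]

for text, pattern, expected in cases:
    assert bool(re.fullmatch(pattern, text)) is expected
```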
# --- halotools_ia/correlation_functions/tests/test_ii_plus_projected.py (repo: nvanalfen/halotools_ia, license: BSD-3-Clause) ---
"""
Module providing unit-testing for the `ii_plus_projected` function.
"""
from __future__ import absolute_import, division, print_function, unicode_literals
import numpy as np
import warnings
import pytest
from astropy.utils.misc import NumpyRNGContext
from ..ii_plus_projected import ii_plus_projected
from halotools.custom_exceptions import HalotoolsError
slow = pytest.mark.slow
__all__ = ('test_w_plus_returned_shape', 'test_w_plus_threading', 'test_orientation_usage', 'test_round_result', 'test_position_usage')
fixed_seed = 43
def test_w_plus_returned_shape():
"""
make sure the result that is returned has the correct shape
"""
ND = 100
NR = 100
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
randoms = np.random.random((NR, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins = np.linspace(0.001, 0.3, 5)
pi_max = 0.2
random_orientation = np.random.random((len(sample1), 2))
random_ellipticities = np.random.random((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, period=period, num_threads=1)
assert np.shape(result_1) == (len(rp_bins)-1, )
result_2 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, period=period, num_threads=3)
assert np.shape(result_2) == (len(rp_bins)-1, )
# real randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, randoms1=randoms, randoms2=randoms, period=period, num_threads=1)
assert np.shape(result_1) == (len(rp_bins)-1, )
result_2 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, randoms1=randoms, randoms2=randoms, period=period, num_threads=3)
assert np.shape(result_2) == (len(rp_bins)-1, )
def test_w_plus_threading():
"""
test to make sure the results are consistent when num_threads=1 or >1
"""
ND = 100
NR = 100
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
randoms = np.random.random((NR, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins = np.linspace(0.001, 0.3, 5)
pi_max = 0.2
random_orientation = np.random.random((len(sample1), 2))
random_ellipticities = np.random.random((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, period=period, num_threads=1)
result_2 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, period=period, num_threads=3)
assert np.allclose(result_1, result_2)
# real randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, randoms1=randoms, randoms2=randoms, period=period, num_threads=1)
result_2 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, randoms1=randoms, randoms2=randoms, period=period, num_threads=3)
assert np.allclose(result_1, result_2)
def test_orientation_usage():
"""
test to make sure the results are sensitive to the orientations passed in
"""
ND = 1000
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins = np.linspace(0.001, 0.3, 5)
pi_max = 0.2
random_orientation_1 = np.random.random((len(sample1), 2))
random_orientation_2 = np.random.random((len(sample1), 2))
random_ellipticities = np.random.random((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation_1, random_ellipticities, sample1,
random_orientation_1, random_ellipticities, rp_bins, pi_max, period=period, num_threads=1, estimator='Natural')
result_2 = ii_plus_projected(sample1, random_orientation_2, random_ellipticities, sample1,
random_orientation_2, random_ellipticities, rp_bins, pi_max, period=period, num_threads=1, estimator='Natural')
assert not np.allclose(result_1, result_2)
def test_ellipticity_usage():
"""
test to make sure the results are sensitive to the ellipticities passed in
"""
ND = 1000
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins = np.linspace(0.001, 0.3, 5)
pi_max = 0.2
random_orientation = np.random.random((len(sample1), 2))
random_ellipticities_1 = np.random.random((len(sample1)))
random_ellipticities_2 = np.random.random((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities_1, sample1,
random_orientation, random_ellipticities_1, rp_bins, pi_max, period=period, num_threads=1, estimator='Natural')
result_2 = ii_plus_projected(sample1, random_orientation, random_ellipticities_2, sample1,
random_orientation, random_ellipticities_2, rp_bins, pi_max, period=period, num_threads=1, estimator='Natural')
assert not np.allclose(result_1, result_2)
def test_rpbinning_usage():
"""
test to make sure the results are sensitive to the rp bin edges passed in
"""
ND = 1000
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins_1 = np.linspace(0.001, 0.3, 5)
rp_bins_2 = np.linspace(0.001, 0.1, 5)
pi_max = 0.2
random_orientation = np.random.random((len(sample1), 2))
random_ellipticities = np.random.random((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins_1, pi_max, period=period, num_threads=1, estimator='Natural')
result_2 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins_2, pi_max, period=period, num_threads=1, estimator='Natural')
assert not np.allclose(result_1, result_2)
def test_integration_range():
"""
test to make sure the results are sensitive to the line of sight integration range
"""
ND = 1000
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins = np.linspace(0.001, 0.3, 5)
pi_max_1 = 0.05
pi_max_2 = 0.2
random_orientation = np.random.random((len(sample1), 2))
random_ellipticities = np.random.random((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max_1, period=period, num_threads=1, estimator='Natural')
result_2 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max_2, period=period, num_threads=1, estimator='Natural')
assert not np.allclose(result_1, result_2)
def test_round_result():
"""
test to make sure the projected correlation comes out as zero in the case of non-elliptical input
"""
ND = 100
NR = 100
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
sample2 = np.random.random((ND, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins = np.linspace(0.001, 0.3, 5)
pi_max = 0.2
random_orientation_1 = np.random.random((len(sample1), 2))
random_orientation_2 = np.random.random((len(sample1), 2))
zero_ellipticities = np.zeros((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation_1, zero_ellipticities, sample2,
random_orientation_2, zero_ellipticities, rp_bins, pi_max, period=period, num_threads=1)
assert np.allclose(result_1, 0.0)
def test_position_usage():
"""
test to make sure the results are sensitive to the position coordinates passed in
"""
ND = 1000
with NumpyRNGContext(fixed_seed):
sample1 = np.random.random((ND, 3))
sample2 = np.random.random((ND, 3))
period = np.array([1.0, 1.0, 1.0])
rp_bins = np.linspace(0.001, 0.3, 5)
pi_max = 0.2
random_orientation = np.random.random((len(sample1), 2))
random_ellipticities = np.random.random((len(sample1)))
# analytic randoms
result_1 = ii_plus_projected(sample1, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, period=period, num_threads=1, estimator='Natural')
result_2 = ii_plus_projected(sample2, random_orientation, random_ellipticities, sample1,
random_orientation, random_ellipticities, rp_bins, pi_max, period=period, num_threads=1, estimator='Natural')
assert not np.allclose(result_1, result_2)
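Every test above draws its samples inside `NumpyRNGContext(fixed_seed)`, which is what makes the random positions, orientations, and ellipticities reproducible across runs and across the paired calls being compared. A minimal stdlib-only sketch of the same save/reseed/restore pattern (the `fixed_seed_context` name is an illustration here, not astropy's implementation):

```python
import random
from contextlib import contextmanager

@contextmanager
def fixed_seed_context(seed):
    # Save the global RNG state, reseed deterministically, restore on exit.
    state = random.getstate()
    random.seed(seed)
    try:
        yield
    finally:
        random.setstate(state)

with fixed_seed_context(43):
    a = [random.random() for _ in range(3)]
with fixed_seed_context(43):
    b = [random.random() for _ in range(3)]
assert a == b  # identical draws under the same seed
```

Because both blocks reseed identically, the draws match exactly, which is what lets assertions like `np.allclose(result_1, result_2)` compare two otherwise independent computations deterministically.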
| 35.081784 | 134 | 0.715588 | 1,314 | 9,437 | 4.885845 | 0.094368 | 0.127103 | 0.134579 | 0.179439 | 0.865732 | 0.852181 | 0.820249 | 0.813396 | 0.812305 | 0.812305 | 0 | 0.045823 | 0.176751 | 9,437 | 268 | 135 | 35.212687 | 0.780538 | 0.09272 | 0 | 0.715278 | 0 | 0 | 0.020658 | 0.008073 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.055556 | false | 0 | 0.048611 | 0 | 0.104167 | 0.006944 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c3353fa1d5c9765195ff9a75b9c60dbe09551e6b | 2,709 | py | Python | insights/parsers/tests/test_proc_stat.py | lhuett/insights-core | 1c84eeffc037f85e2bbf60c9a302c83aa1a50cf8 | [
"Apache-2.0"
] | 121 | 2017-05-30T20:23:25.000Z | 2022-03-23T12:52:15.000Z | insights/parsers/tests/test_proc_stat.py | lhuett/insights-core | 1c84eeffc037f85e2bbf60c9a302c83aa1a50cf8 | [
"Apache-2.0"
] | 1,977 | 2017-05-26T14:36:03.000Z | 2022-03-31T10:38:53.000Z | insights/parsers/tests/test_proc_stat.py | lhuett/insights-core | 1c84eeffc037f85e2bbf60c9a302c83aa1a50cf8 | [
"Apache-2.0"
] | 244 | 2017-05-30T20:22:57.000Z | 2022-03-26T10:09:39.000Z | import doctest
from insights.parsers import proc_stat
from insights.parsers.proc_stat import ProcStat
from insights.tests import context_wrap
PROC_STAT = """
cpu 32270961 89036 23647730 1073132344 1140756 0 1522035 18738206 0 0
cpu0 3547155 11248 2563031 135342787 113432 0 199615 2199379 0 0
cpu1 4660934 10954 3248126 132271933 120282 0 279870 2660186 0 0
cpu2 4421035 10729 3306081 132914999 126705 0 194141 2505565 0 0
cpu3 4224551 10633 3139695 133634676 121035 0 181213 2380738 0 0
cpu4 3985452 11151 2946570 134064686 205568 0 165839 2478471 0 0
cpu5 3914912 11396 2896447 134635676 117341 0 164794 2260011 0 0
cpu6 3802544 11418 2817453 134878674 222855 0 182738 2150276 0 0
cpu7 3714375 11503 2730323 135388911 113534 0 153821 2103576 0 0
intr 21359029 22 106 0 0 0 0 3 0 1 0 16 155 357 0 0 671261 0 0 0 0 0 0 0 0 0 0 0 32223 0 4699385 2 0 8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
ctxt 17852681
btime 1542179825
processes 19212
procs_running 1
procs_blocked 0
softirq 11867930 1 3501158 6 4705528 368244 0 79 2021509 0 1271405
""".strip()
def test_proc_stat():
proc_stat = ProcStat(context_wrap(PROC_STAT))
assert proc_stat.btime == '1542179825'
assert proc_stat.softirq_total == 11867930
assert proc_stat.cpu_percentage == '6.73%'
assert proc_stat.ctxt == 17852681
assert proc_stat.intr_total == 21359029
assert proc_stat.processes == 19212
assert proc_stat.procs_running == 1
assert proc_stat.procs_blocked == 0
def test_proc_stat_doc_examples():
env = {
'proc_stat': ProcStat(
context_wrap(PROC_STAT))
}
failed, total = doctest.testmod(proc_stat, globs=env)
assert failed == 0
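The derived values checked above (totals, `cpu_percentage`) all come from the whitespace-separated counters in `/proc/stat`. A hedged sketch of that derivation — `parse_proc_stat` and the busy-percentage formula below are illustrative, not the insights-core implementation:

```python
def parse_proc_stat(text):
    """Parse /proc/stat-style text into a dict of {label: [int counters]}."""
    data = {}
    for line in text.strip().splitlines():
        label, *fields = line.split()
        data[label] = [int(f) for f in fields]
    return data

sample = """\
cpu 100 0 50 850 0 0 0 0 0 0
ctxt 12345
btime 1542179825
"""
stats = parse_proc_stat(sample)
total = sum(stats["cpu"])      # all jiffies accumulated by the CPU
idle = stats["cpu"][3]         # fourth field of the cpu line is idle time
busy_pct = 100.0 * (total - idle) / total
print(f"{busy_pct:.2f}%")      # prints "15.00%"
```

The fourth field of the `cpu` line is idle time, so one minus its share of the total gives a busy percentage of the same shape as the parser's `'6.73%'`.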
| 58.891304 | 1,140 | 0.662975 | 788 | 2,709 | 2.23731 | 0.163706 | 0.613727 | 0.898469 | 1.191151 | 0.389109 | 0.389109 | 0.389109 | 0.349404 | 0.300624 | 0.300624 | 0 | 0.626667 | 0.307863 | 2,709 | 45 | 1,141 | 60.2 | 0.3136 | 0 | 0 | 0.05 | 0 | 0.025 | 0.706165 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.05 | false | 0 | 0.1 | 0 | 0.15 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c34335215cc8419f919dc87cf2480c4e206e9d74 | 2,518 | py | Python | energyinanutshell.py | GamerCallsdev/Random-Python-Code-GC | 0f8a5b2b397daf5300ac48e2564890d5a615b7bb | [
"CC0-1.0"
] | null | null | null | energyinanutshell.py | GamerCallsdev/Random-Python-Code-GC | 0f8a5b2b397daf5300ac48e2564890d5a615b7bb | [
"CC0-1.0"
] | null | null | null | energyinanutshell.py | GamerCallsdev/Random-Python-Code-GC | 0f8a5b2b397daf5300ac48e2564890d5a615b7bb | [
"CC0-1.0"
] | null | null | null | import time
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
print("positive energy used")
time.sleep(1)
print("positive turns into negative")
time.sleep(1)
print("left over energy from positive goes to negative")
time.sleep(1)
print("negative turns into positive")
time.sleep(1)
| 26.787234 | 57 | 0.737887 | 387 | 2,518 | 4.801034 | 0.041344 | 0.213132 | 0.236814 | 0.347147 | 0.994618 | 0.994618 | 0.994618 | 0.994618 | 0.994618 | 0.994618 | 0 | 0.020314 | 0.139793 | 2,518 | 93 | 58 | 27.075269 | 0.837488 | 0 | 0 | 0.988764 | 0 | 0 | 0.558399 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.011236 | 0 | 0.011236 | 0.494382 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 12 |
c3614680beb5bfe18b5848bebbf9aaf5300ce931 | 23,319 | py | Python | release/stubs/Microsoft/Win32/SafeHandles.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | release/stubs/Microsoft/Win32/SafeHandles.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | release/stubs/Microsoft/Win32/SafeHandles.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | # encoding: utf-8
# module Microsoft.Win32.SafeHandles calls itself SafeHandles
# from mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
# by generator 1.145
""" NamespaceTracker represent a CLS namespace. """
# no imports
# no functions
# classes
class CriticalHandleMinusOneIsInvalid(CriticalHandle, IDisposable):
""" Provides a base class for Win32 critical handle implementations in which the value of -1 indicates an invalid handle. """
def Dispose(self):
"""
Dispose(self: CriticalHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.CriticalHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
"""
ReleaseHandle(self: CriticalHandle) -> bool
When overridden in a derived class, executes the code required to free the handle.
Returns: true if the handle is released successfully; otherwise, in the event of a catastrophic failure,
false. In this case, it generates a releaseHandleFailed MDA Managed Debugging Assistant.
"""
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: CriticalHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
IsInvalid = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value that indicates whether the handle is invalid.
Get: IsInvalid(self: CriticalHandleMinusOneIsInvalid) -> bool
"""
handle = None
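The `Dispose`/`__enter__`/`__exit__` docstrings above describe the IDisposable lifecycle: entering a `with` block hands back the object, and leaving it triggers disposal, after which the handle reads as invalid. A toy Python sketch of that lifecycle (`ManagedHandle` is a made-up illustration, not a CLR type):

```python
class ManagedHandle:
    """Toy IDisposable-style wrapper: -1 (or 0) marks an invalid handle."""

    def __init__(self, value):
        self.handle = value
        self.disposed = False

    @property
    def is_invalid(self):
        return self.handle in (0, -1)

    def dispose(self):
        # Release the underlying resource and invalidate the handle.
        self.disposed = True
        self.handle = -1

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, exc_back):
        self.dispose()

with ManagedHandle(42) as h:
    assert not h.is_invalid   # live inside the with block
assert h.disposed and h.is_invalid  # released on exit
```

Disposal happens in `__exit__` even when the block raises, mirroring the guarantee the critical-handle classes provide for unmanaged resources.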
class CriticalHandleZeroOrMinusOneIsInvalid(CriticalHandle, IDisposable):
""" Provides a base class for Win32 critical handle implementations in which the value of either 0 or -1 indicates an invalid handle. """
def Dispose(self):
"""
Dispose(self: CriticalHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.CriticalHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
"""
ReleaseHandle(self: CriticalHandle) -> bool
When overridden in a derived class, executes the code required to free the handle.
Returns: true if the handle is released successfully; otherwise, in the event of a catastrophic failure,
false. In this case, it generates a releaseHandleFailed MDA Managed Debugging Assistant.
"""
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: CriticalHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
IsInvalid = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value that indicates whether the handle is invalid.
Get: IsInvalid(self: CriticalHandleZeroOrMinusOneIsInvalid) -> bool
"""
handle = None
class SafeAccessTokenHandle(SafeHandle, IDisposable):
""" SafeAccessTokenHandle(handle: IntPtr) """
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
""" ReleaseHandle(self: SafeAccessTokenHandle) -> bool """
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, handle):
""" __new__(cls: type, handle: IntPtr) """
pass
IsInvalid = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Get: IsInvalid(self: SafeAccessTokenHandle) -> bool
"""
handle = None
InvalidHandle = None
class SafeHandleZeroOrMinusOneIsInvalid(SafeHandle, IDisposable):
""" Provides a base class for Win32 safe handle implementations in which the value of either 0 or -1 indicates an invalid handle. """
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
"""
ReleaseHandle(self: SafeHandle) -> bool
When overridden in a derived class, executes the code required to free the handle.
Returns: true if the handle is released successfully; otherwise, in the event of a catastrophic failure,
false. In this case, it generates a releaseHandleFailed MDA Managed Debugging Assistant.
"""
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, *args): # cannot find CLR constructor
""" __new__(cls: type, ownsHandle: bool) """
pass
IsInvalid = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value that indicates whether the handle is invalid.
Get: IsInvalid(self: SafeHandleZeroOrMinusOneIsInvalid) -> bool
"""
handle = None
class SafeFileHandle(SafeHandleZeroOrMinusOneIsInvalid, IDisposable):
"""
Represents a wrapper class for a file handle.
SafeFileHandle(preexistingHandle: IntPtr, ownsHandle: bool)
"""
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
""" ReleaseHandle(self: SafeFileHandle) -> bool """
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, preexistingHandle, ownsHandle):
""" __new__(cls: type, preexistingHandle: IntPtr, ownsHandle: bool) """
pass
handle = None
class SafeHandleMinusOneIsInvalid(SafeHandle, IDisposable):
""" Provides a base class for Win32 safe handle implementations in which the value of -1 indicates an invalid handle. """
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
"""
ReleaseHandle(self: SafeHandle) -> bool
When overridden in a derived class, executes the code required to free the handle.
Returns: true if the handle is released successfully; otherwise, in the event of a catastrophic failure,
false. In this case, it generates a releaseHandleFailed MDA Managed Debugging Assistant.
"""
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, *args): # cannot find CLR constructor
""" __new__(cls: type, ownsHandle: bool) """
pass
IsInvalid = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value that indicates whether the handle is invalid.
Get: IsInvalid(self: SafeHandleMinusOneIsInvalid) -> bool
"""
handle = None
class SafeProcessHandle(SafeHandleZeroOrMinusOneIsInvalid, IDisposable):
""" SafeProcessHandle(existingHandle: IntPtr, ownsHandle: bool) """
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
""" ReleaseHandle(self: SafeProcessHandle) -> bool """
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, existingHandle, ownsHandle):
""" __new__(cls: type, existingHandle: IntPtr, ownsHandle: bool) """
pass
handle = None
class SafeRegistryHandle(SafeHandleZeroOrMinusOneIsInvalid, IDisposable):
"""
Represents a safe handle to the Windows registry.
SafeRegistryHandle(preexistingHandle: IntPtr, ownsHandle: bool)
"""
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
""" ReleaseHandle(self: SafeRegistryHandle) -> bool """
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, preexistingHandle, ownsHandle):
""" __new__(cls: type, preexistingHandle: IntPtr, ownsHandle: bool) """
pass
handle = None
class SafeWaitHandle(SafeHandleZeroOrMinusOneIsInvalid, IDisposable):
"""
Represents a wrapper class for a wait handle.
SafeWaitHandle(existingHandle: IntPtr, ownsHandle: bool)
"""
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
""" ReleaseHandle(self: SafeWaitHandle) -> bool """
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, existingHandle, ownsHandle):
""" __new__(cls: type, existingHandle: IntPtr, ownsHandle: bool) """
pass
handle = None
class SafeX509ChainHandle(SafeHandleZeroOrMinusOneIsInvalid, IDisposable):
# no doc
def Dispose(self):
"""
Dispose(self: SafeHandle, disposing: bool)
Releases the unmanaged resources used by the System.Runtime.InteropServices.SafeHandle class
specifying whether to perform a normal dispose operation.
disposing: true for a normal dispose operation; false to finalize the handle.
"""
pass
def ReleaseHandle(self, *args): # cannot find CLR method
""" ReleaseHandle(self: SafeX509ChainHandle) -> bool """
pass
def SetHandle(self, *args): # cannot find CLR method
"""
SetHandle(self: SafeHandle, handle: IntPtr)
Sets the handle to the specified pre-existing handle.
handle: The pre-existing handle to use.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
handle = None
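The `__enter__`/`__exit__` stubs above implement the IDisposable context-manager protocol, which is what lets these handle wrappers be used in `with` blocks. A minimal pure-Python sketch of that protocol (the `FakeSafeHandle` class and its attribute names are illustrative stand-ins, not part of the .NET API):

```python
class FakeSafeHandle:
    """Illustrative stand-in mimicking the SafeHandle context-manager protocol."""

    def __init__(self, handle, owns_handle=True):
        self.handle = handle
        self.owns_handle = owns_handle
        self.released = False

    def release_handle(self):
        # Analogue of ReleaseHandle(): free the underlying native resource once.
        self.released = True
        return True

    def __enter__(self):
        # Like IDisposable.__enter__: hand the object back for use in the block.
        return self

    def __exit__(self, exc_type, exc_value, exc_back):
        # Like IDisposable.__exit__: dispose the handle when the block exits.
        if self.owns_handle and not self.released:
            self.release_handle()
        return False  # do not suppress exceptions


with FakeSafeHandle(handle=42) as h:
    released_inside_block = h.released
```

Leaving the `with` block disposes the handle, exactly as an explicit `Dispose` call would.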
| 30.522251 | 221 | 0.613019 | 2,411 | 23,319 | 5.616342 | 0.066777 | 0.030722 | 0.053763 | 0.069123 | 0.916402 | 0.916402 | 0.916402 | 0.916402 | 0.905915 | 0.897792 | 0 | 0.003651 | 0.307089 | 23,319 | 763 | 222 | 30.562254 | 0.834385 | 0.606801 | 0 | 0.903955 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.378531 | false | 0.378531 | 0 | 0 | 0.525424 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 10 |
5ede2395746378680c445dabc1de8cf46e9c3b68 | 14,412 | py | Python | test/pyaz/vmss/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | test/pyaz/vmss/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | 9 | 2021-09-24T16:37:24.000Z | 2021-12-24T00:39:19.000Z | test/pyaz/vmss/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | import json, subprocess
from ..pyaz_utils import get_cli_name, get_params
def create(name, resource_group, image=None, disable_overprovision=None, instance_count=None, location=None, tags=None, upgrade_policy_mode=None, validate=None, admin_username=None, admin_password=None, authentication_type=None, vm_sku=None, ssh_dest_key_path=None, ssh_key_values=None, generate_ssh_keys=None, load_balancer=None, lb_sku=None, app_gateway=None, app_gateway_subnet_address_prefix=None, app_gateway_sku=None, app_gateway_capacity=None, backend_pool_name=None, lb_nat_pool_name=None, backend_port=None, health_probe=None, public_ip_address=None, public_ip_address_allocation=None, public_ip_address_dns_name=None, accelerated_networking=None, public_ip_per_vm=None, vm_domain_name=None, dns_servers=None, nsg=None, os_disk_caching=None, data_disk_caching=None, storage_container_name=None, storage_sku=None, os_type=None, os_disk_name=None, use_unmanaged_disk=None, data_disk_sizes_gb=None, __DISK_INFO=None, vnet_name=None, vnet_address_prefix=None, subnet=None, subnet_address_prefix=None, __OS_OFFER=None, __OS_PUBLISHER=None, __OS_SKU=None, __OS_VERSION=None, __LOAD_BALANCER_TYPE=None, __APP_GATEWAY_TYPE=None, __VNET_TYPE=None, __PUBLIC_IP_ADDRESS_TYPE=None, __STORAGE_PROFILE=None, single_placement_group=None, custom_data=None, secrets=None, platform_fault_domain_count=None, plan_name=None, plan_product=None, plan_publisher=None, plan_promotion_code=None, license_type=None, assign_identity=None, scope=None, role=None, __IDENTITY_ROLE_ID=None, zones=None, priority=None, eviction_policy=None, asgs=None, ultra_ssd_enabled=None, ephemeral_os_disk=None, ppg=None, __AUX_SUBSCRIPTIONS=None, terminate_notification_time=None, max_price=None, computer_name_prefix=None, orchestration_mode=None, scale_in_policy=None, os_disk_encryption_set=None, data_disk_encryption_sets=None, data_disk_iops=None, data_disk_mbps=None, automatic_repairs_grace_period=None, specialized=None, os_disk_size_gb=None, encryption_at_host=None, host_group=None, max_batch_instance_percent=None, 
max_unhealthy_instance_percent=None, max_unhealthy_upgraded_instance_percent=None, pause_time_between_batches=None, enable_cross_zone_upgrade=None, prioritize_unhealthy_instances=None, edge_zone=None, user_data=None, network_api_version=None, enable_spot_restore=None, spot_restore_timeout=None, capacity_reservation_group=None, no_wait=None):
params = get_params(locals())
command = "az vmss create " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def deallocate(resource_group, name, instance_ids=None, no_wait=None):
params = get_params(locals())
command = "az vmss deallocate " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def delete(resource_group, name, force_deletion=None, no_wait=None):
params = get_params(locals())
command = "az vmss delete " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def delete_instances(resource_group, name, instance_ids, no_wait=None):
params = get_params(locals())
command = "az vmss delete-instances " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def get_instance_view(resource_group, name, instance_id=None):
params = get_params(locals())
command = "az vmss get-instance-view " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list(resource_group=None):
params = get_params(locals())
command = "az vmss list " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list_instances(resource_group, name, filter=None, select=None, expand=None):
params = get_params(locals())
command = "az vmss list-instances " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list_instance_connection_info(resource_group, name):
params = get_params(locals())
command = "az vmss list-instance-connection-info " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list_instance_public_ips(resource_group, name):
params = get_params(locals())
command = "az vmss list-instance-public-ips " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list_skus(resource_group, name):
params = get_params(locals())
command = "az vmss list-skus " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def reimage(resource_group, name, instance_id=None, no_wait=None):
params = get_params(locals())
command = "az vmss reimage " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def perform_maintenance(resource_group, name, vm_instance_i_ds=None):
params = get_params(locals())
command = "az vmss perform-maintenance " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def restart(resource_group, name, instance_ids=None, no_wait=None):
params = get_params(locals())
command = "az vmss restart " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def scale(resource_group, name, new_capacity, no_wait=None):
params = get_params(locals())
command = "az vmss scale " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def show(resource_group, name, instance_id=None, include_user_data=None):
params = get_params(locals())
command = "az vmss show " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def simulate_eviction(resource_group, name, instance_id):
params = get_params(locals())
command = "az vmss simulate-eviction " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def start(resource_group, name, instance_ids=None, no_wait=None):
params = get_params(locals())
command = "az vmss start " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def stop(resource_group, name, instance_ids=None, skip_shutdown=None, no_wait=None):
params = get_params(locals())
command = "az vmss stop " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def update(resource_group, name, instance_id=None, license_type=None, protect_from_scale_in=None, protect_from_scale_set_actions=None, enable_terminate_notification=None, terminate_notification_time=None, ultra_ssd_enabled=None, scale_in_policy=None, priority=None, max_price=None, ppg=None, enable_automatic_repairs=None, automatic_repairs_grace_period=None, max_batch_instance_percent=None, max_unhealthy_instance_percent=None, max_unhealthy_upgraded_instance_percent=None, pause_time_between_batches=None, enable_cross_zone_upgrade=None, prioritize_unhealthy_instances=None, user_data=None, enable_spot_restore=None, spot_restore_timeout=None, capacity_reservation_group=None, set=None, add=None, remove=None, force_string=None, no_wait=None):
params = get_params(locals())
command = "az vmss update " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def update_instances(resource_group, name, instance_ids, no_wait=None):
params = get_params(locals())
command = "az vmss update-instances " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def wait(resource_group, name, instance_id=None, __INCLUDE_USER_DATA=None, timeout=None, interval=None, deleted=None, created=None, updated=None, exists=None, custom=None):
params = get_params(locals())
command = "az vmss wait " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def get_os_upgrade_history(resource_group, name):
params = get_params(locals())
command = "az vmss get-os-upgrade-history " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def set_orchestration_service_state(resource_group, name, service_name, action, no_wait=None):
params = get_params(locals())
command = "az vmss set-orchestration-service-state " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
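Every wrapper in this module has the same shape: build an `az vmss ...` command string from the function's locals, run it with `subprocess.run`, decode stdout/stderr, then parse stdout as JSON on success or raise on failure. The pattern can be exercised without the Azure CLI by substituting any command that prints JSON; `run_json_command` below is a hypothetical helper written only to illustrate the pattern, not part of this module:

```python
import json
import subprocess
import sys

def run_json_command(argv):
    """Run a command and parse its stdout as JSON, mirroring the wrappers above."""
    output = subprocess.run(argv, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout = output.stdout.decode("utf-8")
    stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    raise Exception(stderr)

# Stand-in for `az vmss show ...`: the current interpreter just echoes JSON.
result = run_json_command([
    sys.executable, "-c",
    "import json; print(json.dumps({'name': 'vmss-demo', 'capacity': 3}))",
])
```

Passing the command as an argument list (rather than `shell=True` with a formatted string, as above) avoids shell-quoting pitfalls when parameters contain spaces.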
| 44.208589 | 2,335 | 0.706009 | 1,868 | 14,412 | 5.26606 | 0.115096 | 0.065467 | 0.046762 | 0.0491 | 0.765579 | 0.747992 | 0.729389 | 0.725933 | 0.710278 | 0.701738 | 0 | 0.003907 | 0.183111 | 14,412 | 325 | 2,336 | 44.344615 | 0.831649 | 0 | 0 | 0.840532 | 0 | 0 | 0.049889 | 0.007355 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076412 | false | 0.003322 | 0.006645 | 0 | 0.159468 | 0.229236 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6f386358e887210ff4ec131e19757324b09874e5 | 145 | py | Python | devcon_iv_ethpm/solidity_manifest.py | njgheorghita/devcon-iv-ethpm | 3cbd1dd64fdbfb787f89cd369acb6f3d36893817 | [
"MIT"
] | 4 | 2018-11-01T12:17:09.000Z | 2018-11-01T13:58:27.000Z | devcon_iv_ethpm/solidity_manifest.py | njgheorghita/devcon-iv-ethpm | 3cbd1dd64fdbfb787f89cd369acb6f3d36893817 | [
"MIT"
] | null | null | null | devcon_iv_ethpm/solidity_manifest.py | njgheorghita/devcon-iv-ethpm | 3cbd1dd64fdbfb787f89cd369acb6f3d36893817 | [
"MIT"
] | null | null | null | # To generate a manifest for your solidity contracts, follow the guide in `devcon_iv_ethpm/solidity_contracts/escrow/escrow_manifest_creation.py
| 72.5 | 144 | 0.855172 | 22 | 145 | 5.409091 | 0.818182 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096552 | 145 | 1 | 145 | 145 | 0.908397 | 0.97931 | 0 | null | 1 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6f41cabefd6df9da5216b5a6ee9faa2f29501ca9 | 1,767 | py | Python | python/phonenumbers/shortdata/region_TR.py | squadrun/python-phonenumbers | b60ebefd6e2fbe058b742a5fd527e5aaa6cf3203 | [
"Apache-2.0"
] | 1 | 2020-03-25T20:40:32.000Z | 2020-03-25T20:40:32.000Z | python/phonenumbers/shortdata/region_TR.py | squadrun/python-phonenumbers | b60ebefd6e2fbe058b742a5fd527e5aaa6cf3203 | [
"Apache-2.0"
] | 5 | 2020-03-24T16:37:25.000Z | 2021-06-10T21:24:54.000Z | upibo-venv/Lib/site-packages/phonenumbers/shortdata/region_TR.py | smbpgroup/upibo | 625dcda9f9692c62aeb9fe8f7123a5d407c610ae | [
"BSD-3-Clause"
] | 1 | 2020-09-08T14:45:34.000Z | 2020-09-08T14:45:34.000Z | """Auto-generated file, do not edit by hand. TR metadata"""
from ..phonemetadata import NumberFormat, PhoneNumberDesc, PhoneMetadata
PHONE_METADATA_TR = PhoneMetadata(id='TR', country_code=None, international_prefix=None,
general_desc=PhoneNumberDesc(national_number_pattern='[1-9]\\d{2,4}', possible_length=(3, 4, 5)),
toll_free=PhoneNumberDesc(national_number_pattern='1(?:22|3[126]|4[04]|5[16-9]|6[18]|77|83)', example_number='183', possible_length=(3,)),
emergency=PhoneNumberDesc(national_number_pattern='1(?:1[02]|55)', example_number='112', possible_length=(3,)),
short_code=PhoneNumberDesc(national_number_pattern='1(?:1(?:[0239]|811)|2[126]|3(?:[126]|37?|[58]6|65)|4(?:[014]|71)|5(?:[135-9]|07|78)|6(?:[02]6|[1389]|99)|7[0-79]|8(?:\\d|63|95))|2(?:077|268|4(?:17|23)|5(?:7[26]|82)|6[14]4|8\\d{2}|9(?:30|89))|3(?:0(?:05|72)|353|4(?:06|30|64)|502|674|747|851|9(?:1[29]|60))|4(?:0(?:25|3[12]|[47]2)|3(?:3[13]|[89]1)|439|5(?:43|55)|717|832)|5(?:145|290|[4-6]\\d{2}|772|833|9(?:[06]1|92))|6(?:236|6(?:12|39|8[59])|769)|7890|8(?:688|7(?:28|65)|85[06])|9(?:159|290)', example_number='112', possible_length=(3, 4, 5)),
standard_rate=PhoneNumberDesc(national_number_pattern='2850|5420', example_number='5420', possible_length=(4,)),
sms_services=PhoneNumberDesc(national_number_pattern='1(?:3(?:37|[58]6|65)|4(?:4|71)|5(?:07|78)|6(?:[02]6|99)|8(?:3|63|95))|2(?:077|268|4(?:17|23)|5(?:7[26]|82)|6[14]4|8\\d{2}|9(?:30|89))|3(?:0(?:05|72)|353|4(?:06|30|64)|502|674|747|851|9(?:1[29]|60))|4(?:0(?:25|3[12]|[47]2)|3(?:3[13]|[89]1)|439|5(?:43|55)|717|832)|5(?:145|290|[4-6]\\d{2}|772|833|9(?:[06]1|92))|6(?:236|6(?:12|39|8[59])|769)|7890|8(?:688|7(?:28|65)|85[06])|9(?:159|290)', example_number='5420', possible_length=(3, 4)),
short_data=True)
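The `national_number_pattern` fields above are ordinary regular expressions matched against the whole short code (the doubled backslashes are Python string escapes for a single regex backslash). As an illustration outside the phonenumbers API, the TR emergency and toll-free patterns can be exercised directly with `re`; `is_full_match` is an illustrative helper, not library code:

```python
import re

# Patterns copied from the metadata above.
emergency = re.compile(r"1(?:1[02]|55)")
toll_free = re.compile(r"1(?:22|3[126]|4[04]|5[16-9]|6[18]|77|83)")

def is_full_match(pattern, number):
    """PhoneNumberDesc patterns apply to the entire short code, so use fullmatch."""
    return pattern.fullmatch(number) is not None

emergency_hits = [n for n in ("110", "112", "155", "123") if is_full_match(emergency, n)]
```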
| 147.25 | 551 | 0.632145 | 352 | 1,767 | 3.079545 | 0.34375 | 0.127306 | 0.160517 | 0.199262 | 0.630996 | 0.436347 | 0.321033 | 0.321033 | 0.321033 | 0.321033 | 0 | 0.274336 | 0.040747 | 1,767 | 11 | 552 | 160.636364 | 0.365192 | 0.029994 | 0 | 0 | 1 | 0.333333 | 0.539227 | 0.507611 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6f4fb9b94d6dad9864405598b215e5170479ed43 | 8,323 | py | Python | VL-T5/src/classification_model.py | ylsung/VL_adapter | 287409f383f89a11764fc45806864693a4d3e498 | [
"MIT"
] | 41 | 2021-12-14T02:50:16.000Z | 2022-03-30T07:41:19.000Z | VL-T5/src/classification_model.py | ylsung/VL_adapter | 287409f383f89a11764fc45806864693a4d3e498 | [
"MIT"
] | 1 | 2022-01-07T03:31:47.000Z | 2022-03-25T00:31:53.000Z | VL-T5/src/classification_model.py | ylsung/VL_adapter | 287409f383f89a11764fc45806864693a4d3e498 | [
"MIT"
] | 2 | 2021-12-14T03:10:18.000Z | 2022-03-29T04:59:23.000Z | from pathlib import Path
import torch
import torch.nn as nn
import torch.nn.functional as F
import numpy as np
from modeling_t5 import VLT5
class VLT5Classification(VLT5):
def __init__(self, config):
super().__init__(config)
if config.classifier:
self.answer_head = nn.Sequential(
nn.Linear(config.d_model, config.d_model * 2),
nn.GELU(),
nn.LayerNorm(config.d_model * 2),
                nn.Linear(config.d_model * 2, config.num_answers)  # bare 'num_answers' was an undefined name; assuming it is carried on config
)
self.bce_loss = nn.BCEWithLogitsLoss()
def train_step(self, batch):
device = next(self.parameters()).device
batch = self.vis_forward(batch, device)
task = batch["task"]
vis_feats = batch['vis_feats'].to(device)
input_ids = batch['input_ids'].to(device)
vis_pos = batch['boxes'].to(device)
if self.config.classifier:
B = len(input_ids)
decoder_input_ids = torch.ones(
B, 1, dtype=torch.long, device=device) * self.config.decoder_start_token_id
output = self(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
decoder_input_ids=decoder_input_ids,
output_hidden_states=True,
return_dict=True,
task=task
)
target = batch['targets'].to(device)
last_layer_hidden_state = output.decoder_hidden_states[-1]
last_hidden_state = last_layer_hidden_state.view(B, -1, self.config.d_model)[:, -1]
# [B, num_answers]
logit = self.answer_head(last_hidden_state)
loss = self.bce_loss(logit, target)
else:
lm_labels = batch["target_ids"].to(device)
output = self(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
labels=lm_labels,
return_dict=True,
task=task
)
assert 'loss' in output
lm_mask = (lm_labels != -100).float()
B, L = lm_labels.size()
loss = output['loss']
loss = loss.view(B, L) * lm_mask
loss = loss.sum(dim=1) / lm_mask.sum(dim=1).clamp(min=1) # B
loss = loss.mean()
result = {
'loss': loss
}
return result
@torch.no_grad()
def test_step(self, batch, **kwargs):
self.eval()
device = next(self.parameters()).device
batch = self.vis_forward(batch, device)
task = batch["task"]
vis_feats = batch['vis_feats'].to(device)
input_ids = batch['input_ids'].to(device)
vis_pos = batch['boxes'].to(device)
result = {}
if self.config.classifier:
B = len(input_ids)
decoder_input_ids = torch.ones(
B, 1, dtype=torch.long, device=device) * self.config.decoder_start_token_id
output = self(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
decoder_input_ids=decoder_input_ids,
output_hidden_states=True,
return_dict=True,
task=task
)
last_layer_hidden_state = output.decoder_hidden_states[-1]
last_hidden_state = last_layer_hidden_state.view(B, -1, self.config.d_model)[:, -1]
# [B, num_answers]
logit = self.answer_head(last_hidden_state)
score, pred_ans_id = logit.max(1)
pred_ans_id = pred_ans_id.cpu().numpy()
pred_ans = [self.label2ans[ans_id] for ans_id in pred_ans_id]
result['pred_ans'] = pred_ans
else:
output = self.generate(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
task=task,
**kwargs
)
generated_sents = self.tokenizer.batch_decode(output, skip_special_tokens=True)
result['token_ids'] = output
result['pred_ans'] = generated_sents
return result
from modeling_bart import VLBart
class VLBartClassification(VLBart):
def __init__(self, config):
super().__init__(config)
if config.classifier:
self.answer_head = nn.Sequential(
nn.Linear(config.d_model, config.d_model * 2),
nn.GELU(),
nn.LayerNorm(config.d_model * 2),
                nn.Linear(config.d_model * 2, config.num_answers)  # bare 'num_answers' was an undefined name; assuming it is carried on config
)
self.bce_loss = nn.BCEWithLogitsLoss()
def train_step(self, batch):
device = next(self.parameters()).device
batch = self.vis_forward(batch, device)
task = batch["task"]
vis_feats = batch['vis_feats'].to(device)
input_ids = batch['input_ids'].to(device)
vis_pos = batch['boxes'].to(device)
if self.config.classifier:
B = len(input_ids)
decoder_input_ids = torch.tensor(
[self.config.decoder_start_token_id, self.config.bos_token_id],
dtype=torch.long, device=device).unsqueeze(0).expand(B, 2)
output = self(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
decoder_input_ids=decoder_input_ids,
output_hidden_states=True,
return_dict=True,
task=task
)
target = batch['targets'].to(device)
last_layer_hidden_state = output.decoder_hidden_states[-1]
last_hidden_state = last_layer_hidden_state.view(B, -1, self.config.d_model)[:, -1]
# [B, num_answers]
logit = self.answer_head(last_hidden_state)
loss = self.bce_loss(logit, target)
else:
lm_labels = batch["target_ids"].to(device)
output = self(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
labels=lm_labels,
return_dict=True,
task=task
)
assert 'loss' in output
lm_mask = (lm_labels != -100).float()
B, L = lm_labels.size()
loss = output['loss']
loss = loss.view(B, L) * lm_mask
loss = loss.sum(dim=1) / lm_mask.sum(dim=1).clamp(min=1) # B
loss = loss.mean()
result = {
'loss': loss
}
return result
@torch.no_grad()
def test_step(self, batch, **kwargs):
self.eval()
device = next(self.parameters()).device
batch = self.vis_forward(batch, device)
vis_feats = batch['vis_feats'].to(device)
input_ids = batch['input_ids'].to(device)
vis_pos = batch['boxes'].to(device)
task = batch["task"]
result = {}
if self.config.classifier:
B = len(input_ids)
decoder_input_ids = torch.tensor(
[self.config.decoder_start_token_id, self.config.bos_token_id],
dtype=torch.long, device=device).unsqueeze(0).expand(B, 2)
output = self(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
decoder_input_ids=decoder_input_ids,
output_hidden_states=True,
return_dict=True,
task=task
)
last_layer_hidden_state = output.decoder_hidden_states[-1]
last_hidden_state = last_layer_hidden_state.view(B, -1, self.config.d_model)[:, -1]
# [B, num_answers]
logit = self.answer_head(last_hidden_state)
score, pred_ans_id = logit.max(1)
pred_ans_id = pred_ans_id.cpu().numpy()
pred_ans = [self.label2ans[ans_id] for ans_id in pred_ans_id]
result['pred_ans'] = pred_ans
else:
output = self.generate(
input_ids=input_ids,
vis_inputs=(vis_feats, vis_pos),
task=task,
**kwargs
)
generated_sents = self.tokenizer.batch_decode(output, skip_special_tokens=True)
result['token_ids'] = output
result['pred_ans'] = generated_sents
return result
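In the generative branch of both `train_step` methods above, the per-token LM loss is masked where `lm_labels == -100`, summed per sequence, divided by the token count clamped to at least 1, then averaged over the batch. The arithmetic can be sketched in plain Python without torch; `masked_mean_loss` is an illustrative helper, not part of the model code:

```python
def masked_mean_loss(per_token_loss, labels, ignore_index=-100):
    """Average per-token losses, ignoring positions labelled with ignore_index."""
    per_example = []
    for row_loss, row_labels in zip(per_token_loss, labels):
        mask = [1.0 if t != ignore_index else 0.0 for t in row_labels]
        total = sum(l * m for l, m in zip(row_loss, mask))
        count = max(sum(mask), 1.0)  # mirrors lm_mask.sum(dim=1).clamp(min=1)
        per_example.append(total / count)
    return sum(per_example) / len(per_example)
```

With tensors, the same result comes from `((loss.view(B, L) * lm_mask).sum(dim=1) / lm_mask.sum(dim=1).clamp(min=1)).mean()`, as written in the code above.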
| 30.599265 | 95 | 0.547879 | 977 | 8,323 | 4.382805 | 0.122825 | 0.074731 | 0.033629 | 0.037366 | 0.950257 | 0.950257 | 0.950257 | 0.950257 | 0.950257 | 0.950257 | 0 | 0.008108 | 0.347951 | 8,323 | 271 | 96 | 30.712177 | 0.78091 | 0.008531 | 0 | 0.894472 | 0 | 0 | 0.026198 | 0 | 0 | 0 | 0 | 0 | 0.01005 | 1 | 0.030151 | false | 0 | 0.035176 | 0 | 0.095477 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
48ae2c4832be4dac5ece481a9883e29cdaad79ab | 87 | py | Python | layer/util_layer/__init__.py | chutien/zpp-mem | 470dec89dda475f7272b876f191cef9f8266a6dc | [
"MIT"
] | 1 | 2019-10-22T11:33:23.000Z | 2019-10-22T11:33:23.000Z | layer/util_layer/__init__.py | chutien/zpp-mem | 470dec89dda475f7272b876f191cef9f8266a6dc | [
"MIT"
] | null | null | null | layer/util_layer/__init__.py | chutien/zpp-mem | 470dec89dda475f7272b876f191cef9f8266a6dc | [
"MIT"
] | null | null | null | from layer.util_layer.batch_normalization import *
from layer.util_layer.pool import *
| 29 | 50 | 0.83908 | 13 | 87 | 5.384615 | 0.538462 | 0.257143 | 0.371429 | 0.514286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.091954 | 87 | 2 | 51 | 43.5 | 0.886076 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
48b0f71cb8744c32d9c0a5fdf1ffaae1f9f0b67b | 1,884 | py | Python | Criptografia/algoritmos/views/sha3_view.py | jorgeluis098/proyecto_cripto | 825d7b0c4c9a39af6fa8aa892475235ab75d8142 | [
"MIT"
] | null | null | null | Criptografia/algoritmos/views/sha3_view.py | jorgeluis098/proyecto_cripto | 825d7b0c4c9a39af6fa8aa892475235ab75d8142 | [
"MIT"
] | null | null | null | Criptografia/algoritmos/views/sha3_view.py | jorgeluis098/proyecto_cripto | 825d7b0c4c9a39af6fa8aa892475235ab75d8142 | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.views import View
# For SHA-3 with 384- and 512-bit digests
from Cryptodome.Hash import SHA3_384, SHA3_512
class SHA2_384(View):
template_name = 'dummy.html'
    # You can think of the "get" function as a main
def get(self, request, *args, **kwargs):
        # code
        # code
        # code
context = {
            # Note: The context holds the variables that will be sent to the web page
            # For now, if you like, we won't send anything to the web page and will handle everything through the console
            # The variables you decide to print at the end are the ones that will be sent in the context to the web page
            # For now you can leave it blank
            # The structure is as follows:
            # 'nombre_variable_html' : nombre_variable_python,
}
return render(request, self.template_name, context)
def funcion_2(self, arg1, arg2):
        # code
        # code
        # code
return
class SHA2_512(View):
template_name = 'dummy.html'
    # You can think of the "get" function as a main
def get(self, request, *args, **kwargs):
        # code
        # code
        # code
context = {
            # Note: The context holds the variables that will be sent to the web page
            # For now, if you like, we won't send anything to the web page and will handle everything through the console
            # The variables you decide to print at the end are the ones that will be sent in the context to the web page
            # For now you can leave it blank
            # The structure is as follows:
            # 'nombre_variable_html' : nombre_variable_python,
}
return render(request, self.template_name, context)
def funcion_2(self, arg1, arg2):
        # code
        # code
        # code
        return
| 32.482759 | 113 | 0.617304 | 264 | 1884 | 4.337121 | 0.318182 | 0.020961 | 0.062882 | 0.041921 | 0.878603 | 0.878603 | 0.878603 | 0.878603 | 0.878603 | 0.878603 | 0 | 0.022853 | 0.326433 | 1884 | 58 | 114 | 32.482759 | 0.879433 | 0.509554 | 0 | 0.631579 | 0 | 0 | 0.022272 | 0 | 0 | 0 | 0 | 0.017241 | 0 | 1 | 0.210526 | false | 0 | 0.157895 | 0.105263 | 0.789474 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 10
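PyCryptodome's `SHA3_384`/`SHA3_512` imported in the view module above are one option; the same digests also exist in the standard library's `hashlib`, which is enough to sketch the hashing step those views would perform. The `sha3_hex` helper is illustrative, not part of the view code:

```python
import hashlib

def sha3_hex(message, bits=384):
    """Return the SHA-3 hex digest (384 or 512 bits) of a UTF-8 string."""
    h = hashlib.sha3_384() if bits == 384 else hashlib.sha3_512()
    h.update(message.encode("utf-8"))
    return h.hexdigest()

digest_384 = sha3_hex("hola", bits=384)
digest_512 = sha3_hex("hola", bits=512)
```

A hex digest has two characters per byte, so SHA3-384 yields 96 hex characters and SHA3-512 yields 128.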
# Source: proteus/tests/cylinder2D/ibm_rans2p_3D/my_domain.py
# (repo: burgreen/proteus, license: MIT)
from proteus import Domain
# |y
# |
# |
# +-------+
# /| /|
# +-+----+x1|
# | |x0 | |
# | +----+--+---------x
# |/ | /
# +------+
# /
# /
# / z
#===============================================================================
# One box
#===============================================================================
def get_domain_one_box(x0=(0.0,0.0,0.0),L=(1.0,1.0,1.0),he=0.001):
boundaries = ['left', 'right', 'bottom', 'top', 'front', 'back']
boundaryTags = dict([(key, i + 1) for (i, key) in enumerate(boundaries)])
x1 = (x0[0]+L[0],x0[1]+L[1],x0[2]+L[2])
vertices = [(x0[0],x0[1],x0[2]),
(x1[0],x0[1],x0[2]),
(x1[0],x1[1],x0[2]),
(x0[0],x1[1],x0[2]),
(x0[0],x0[1],x1[2]),
(x1[0],x0[1],x1[2]),
(x1[0],x1[1],x1[2]),
(x0[0],x1[1],x1[2])]
vertexFlags=[boundaryTags['bottom'],boundaryTags['bottom'],boundaryTags['top'],boundaryTags['top'],
boundaryTags['bottom'],boundaryTags['bottom'],boundaryTags['top'],boundaryTags['top'],
]
planes = []
planeFlags = []
planeHolds = []
planes.extend([[[0,1,2,3],],
[[4,5,6,7],],
[[0,3,7,4]],
[[1,2,6,5]],
[[0,1,5,4]],
[[2,3,7,6]]])
planeFlags.extend([boundaryTags['back'],boundaryTags['front'],
boundaryTags['left'],boundaryTags['right'],
boundaryTags['bottom'],boundaryTags['top'],])
planeHolds.extend([[],[],[],[],[],[]])
regions = [[0.5*(x0[0]+x1[0]),0.5*(x0[1]+x1[1]),0.5*(x0[2]+x1[2])],
]
regionFlags = [1,]
regionConstraints=[0.5*he**2]
domain = Domain.PiecewiseLinearComplexDomain(vertices=vertices,facets=planes,facetFlags=planeFlags,facetHoles=planeHolds,
regions=regions,regionFlags=regionFlags,regionConstraints=regionConstraints)
return domain,boundaryTags,x0,x1
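The vertex list in `get_domain_one_box` enumerates the eight corners of the axis-aligned box: the four corners of the z = x0[2] face counterclockwise, then the matching corners of the z = x1[2] face. The same enumeration can be reproduced standalone, without proteus:

```python
def box_corners(x0, L):
    """Corner ordering used by get_domain_one_box: the z = x0[2] face
    first, the z = x1[2] face second, counterclockwise in each face."""
    x1 = (x0[0] + L[0], x0[1] + L[1], x0[2] + L[2])
    return [(x0[0], x0[1], x0[2]),
            (x1[0], x0[1], x0[2]),
            (x1[0], x1[1], x0[2]),
            (x0[0], x1[1], x0[2]),
            (x0[0], x0[1], x1[2]),
            (x1[0], x0[1], x1[2]),
            (x1[0], x1[1], x1[2]),
            (x0[0], x1[1], x1[2])]

corners = box_corners((0.0, 0.0, 0.0), (2.0, 1.0, 1.0))
```

Facet [0,1,2,3] in the code above is therefore the z = x0[2] face, the one tagged 'back'.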
#===============================================================================
# Two boxes
# the 2nd box is inside the 1st box;
# the 2nd box cannot touch the 1st box;
#===============================================================================
def get_domain_two_box(x0=(0.0,0.0,0.0),L=(1.0,1.0,1.0),he=0.001,he2=None,x2=None,x3=None):
boundaries = ['left', 'right', 'bottom', 'top', 'front', 'back']
boundaryTags = dict([(key, i + 1) for (i, key) in enumerate(boundaries)])
x1 = (x0[0]+L[0],x0[1]+L[1],x0[2]+L[2])
    if x2 is None:
        x2 = (0.25*(x0[0]+x1[0]), 0.25*(x0[1]+x1[1]), 0.25*(x0[2]+x1[2]))
    if x3 is None:
        x3 = (0.75*(x0[0]+x1[0]), 0.75*(x0[1]+x1[1]), 0.75*(x0[2]+x1[2]))
    if he2 is None:
        he2 = he
vertices = [(x0[0],x0[1],x0[2]),
(x1[0],x0[1],x0[2]),
(x1[0],x1[1],x0[2]),
(x0[0],x1[1],x0[2]),
(x0[0],x0[1],x1[2]),
(x1[0],x0[1],x1[2]),
(x1[0],x1[1],x1[2]),
(x0[0],x1[1],x1[2]),
(x2[0],x2[1],x2[2]),
(x3[0],x2[1],x2[2]),
(x3[0],x3[1],x2[2]),
(x2[0],x3[1],x2[2]),
(x2[0],x2[1],x3[2]),
(x3[0],x2[1],x3[2]),
(x3[0],x3[1],x3[2]),
(x2[0],x3[1],x3[2])]
vertexFlags=[boundaryTags['bottom'],boundaryTags['bottom'],boundaryTags['top'],boundaryTags['top'],
boundaryTags['bottom'],boundaryTags['bottom'],boundaryTags['top'],boundaryTags['top'],
0,0,0,0,
0,0,0,0]
planes = []
planeFlags = []
planeHolds = []
planes.extend([[[0,1,2,3],],
[[4,5,6,7],],
[[0,3,7,4]],
[[1,2,6,5]],
[[0,1,5,4]],
[[2,3,7,6]]])
planeFlags.extend([boundaryTags['back'],boundaryTags['front'],
boundaryTags['left'],boundaryTags['right'],
boundaryTags['bottom'],boundaryTags['top'],])
planeHolds.extend([[],[],[],[],[],[]])
planes.extend([[[8,9,10,11]],
[[12,13,14,15]],
[[8,11,15,12]],
[[9,10,14,13]],
[[8,9,13,12]],
[[10,11,15,14]]])
planeFlags.extend([0,0,0,0,0,0])
planeHolds.extend([[],[],[],[],[],[]])
regions = [(0.5*(x2[0]+x3[0]), 0.5*(x2[1]+x3[1]), 0.5*(x2[2]+x3[2])),
(0.5*(x3[0]+x1[0]), 0.5*(x3[1]+x1[1]), 0.5*(x3[2]+x1[2]))]
regionFlags = [1,2]
regionConstraints=[0.5*(he2)**2,0.5*he**2]
domain = Domain.PiecewiseLinearComplexDomain(vertices=vertices,facets=planes,facetFlags=planeFlags,facetHoles=planeHolds,
regions=regions,regionFlags=regionFlags,regionConstraints=regionConstraints)
return domain,boundaryTags,x0,x1,x2,x3
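When `x2`/`x3` are omitted, the inner box defaults to the 0.25/0.75 blends of the outer corners computed above, which keeps it strictly inside the outer box (the banner comment's requirement that the two boxes not touch). For the unit box:

```python
x0 = (0.0, 0.0, 0.0)
x1 = (1.0, 1.0, 1.0)
# Same arithmetic as the x2/x3 defaults in get_domain_two_box:
x2 = tuple(0.25 * (a + b) for a, b in zip(x0, x1))
x3 = tuple(0.75 * (a + b) for a, b in zip(x0, x1))
```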
#===============================================================================
# One box with one shelf
#===============================================================================
def get_domain_one_box_with_one_shelf(x0=(0.0,0.0,0.0),L=(1.0,1.0,1.0),he=0.001,he2=None,he3=None,L1=None,L2=None):
boundaries = ['left', 'right', 'bottom', 'top', 'front', 'back']
boundaryTags = dict([(key, i + 1) for (i, key) in enumerate(boundaries)])
x1 = (x0[0]+L[0],x0[1]+L[1],x0[2]+L[2])
    if L1 is None:
        L1 = 0.4*L[1]
    if L2 is None:
        L2 = 0.6*L[1]
    x2 = (x0[0],x0[1]+L1,x0[2])
    x3 = (x1[0],x0[1]+L2,x1[2])
    if he2 is None:
        he2 = he
    if he3 is None:
        he3 = he2
vertices = [(x0[0],x0[1],x0[2]),
(x1[0],x0[1],x0[2]),
(x1[0],x1[1],x0[2]),
(x0[0],x1[1],x0[2]),
(x0[0],x0[1],x1[2]),
(x1[0],x0[1],x1[2]),
(x1[0],x1[1],x1[2]),
(x0[0],x1[1],x1[2]),
(x2[0],x2[1],x2[2]),
(x3[0],x2[1],x2[2]),
(x3[0],x3[1],x2[2]),
(x2[0],x3[1],x2[2]),
(x2[0],x2[1],x3[2]),
(x3[0],x2[1],x3[2]),
(x3[0],x3[1],x3[2]),
(x2[0],x3[1],x3[2])]
vertexFlags=[boundaryTags['bottom'],boundaryTags['bottom'],boundaryTags['top'],boundaryTags['top'],
boundaryTags['bottom'],boundaryTags['bottom'],boundaryTags['top'],boundaryTags['top'],
0,0,0,0,
0,0,0,0,]
planes = []
planeFlags = []
planeHolds = []
planes.extend([[[0,1,9,10,2,3,11,8],[8,9],[10,11]],
[[4,5,13,14,6,7,15,12],[12,13],[14,15]],
[[0,8,11,3,7,15,12,4],[8,12],[15,11]],
[[1,9,10,2,6,14,13,5],[9,13],[14,10]],
[[0,1,5,4]],
[[2,3,7,6]]])
planeFlags.extend([boundaryTags['back'],boundaryTags['front'],
boundaryTags['left'],boundaryTags['right'],
boundaryTags['bottom'],boundaryTags['top'],])
planeHolds.extend([[],[],[],[],[],[]])
planes.extend([[[10,11,15,14]],
[[8,9,13,12]],
])
planeFlags.extend([0,0])
planeHolds.extend([[],[],])
regions = [(0.5*(x0[0]+x1[0]), 0.5*(x0[1]+x2[1]), 0.5*(x0[2]+x1[2])),
(0.5*(x0[0]+x1[0]), 0.5*(x2[1]+x3[1]), 0.5*(x0[2]+x1[2])),
(0.5*(x0[0]+x1[0]), 0.5*(x3[1]+x1[1]), 0.5*(x0[2]+x1[2]))]
regionFlags = [1,2,3]
regionConstraints=[0.5*he**2,0.5*(he2)**2,0.5*(he3)**2,]
domain = Domain.PiecewiseLinearComplexDomain(vertices=vertices,facets=planes,facetFlags=planeFlags,facetHoles=planeHolds,
regions=regions,regionFlags=regionFlags,regionConstraints=regionConstraints)
return domain,boundaryTags,x0,x1,x2,x3
#===============================================================================
# For 2D cylinder problems
#===============================================================================
import math
from proteus import Domain
def circular_cross_section(center,radius,theta):
return (radius*math.sin(theta)+center[0],
radius*math.cos(theta)+center[1])
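`circular_cross_section` uses `sin` for x and `cos` for y, so `theta = 0` lands at the top of the circle and increasing `theta` walks it clockwise. Sampling it the way the obstacle loops below do:

```python
import math

def circular_cross_section(center, radius, theta):
    # Same convention as above: sin for x, cos for y.
    return (radius * math.sin(theta) + center[0],
            radius * math.cos(theta) + center[1])

# n evenly spaced points around the obstacle, as in the domain builders:
n = 4
pts = [circular_cross_section((0.5, 0.5), 0.1, 2.0 * math.pi * k / n)
       for k in range(n)]
```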
def get_pseudo_3D_cylinder_domain(
x0=(0.0,0.0,0.0),L=(1.0,1.0,1.0),
radius=0.1,
center=(0.5,0.5),
n_points_on_obstacle=2*21-2,
cross_section=circular_cross_section,
thetaOffset=0.0,
he=1.0,
he2=None):
    if he2 is None:
        he2 = he
x1 = (x0[0]+L[0],x0[1]+L[1],x0[2]+L[2])
boundaries=['left',
'right',
'bottom',
'top',
'front',
'back',]
boundaryTags=dict([(key,i+1) for (i,key) in enumerate(boundaries)])
#work around the domain from (0.0,0.0) going counterclockwise
vertexKeys = ['left_bottom',
'right_bottom',
'right_top',
'left_top']
vertices = [[x0[0],x0[1]],
[x1[0],x0[1]],
[x1[0],x1[1]],
[x0[0],x1[1]]]
vertexFlags = [boundaryTags['bottom'],
boundaryTags['bottom'],
boundaryTags['top'],
boundaryTags['top']]
nv = len(vertices)
#cylinder
theta=thetaOffset
pb = cross_section(center,radius,theta)
vertices.append([pb[0],pb[1]])
    vertexKeys.append('obstacle_' + str(0))
vertexFlags.append(boundaryTags['back'])
for gb in range(1,n_points_on_obstacle):
theta = float(gb)/float(n_points_on_obstacle)*2.0*math.pi+thetaOffset
pb = cross_section(center,radius,theta)
        vertexKeys.append('obstacle_' + str(gb))
vertices.append([pb[0],pb[1]])
vertexFlags.append(boundaryTags['back'])
#convert to 3D
vertices3dDict={}
vertices3d=[]
vertexFlags3d=[]
facets3d=[]
facetFlags3d=[]
facetHoles3d=[]
front_cylinder=[]
back_cylinder=[]
for vN,v in enumerate(vertices):
vertices3dDict[vertexKeys[vN]+'_back'] = vN
vertices3d.append([v[0],v[1],x0[2]])
vertexFlags3d.append(boundaryTags['back'])
if 'obstacle' in vertexKeys[vN]:
back_cylinder.append(vN)
for vN,v in enumerate(vertices):
vertices3dDict[vertexKeys[vN]+'_front']=vN+len(vertices)
vertices3d.append([v[0],v[1],x1[2]])
vertexFlags3d.append(boundaryTags['front'])#note that the original tag is back
if 'obstacle' in vertexKeys[vN]:
front_cylinder.append(vN+len(vertices))
#left
facets3d.append([[vertices3dDict['left_bottom_front'],
vertices3dDict['left_bottom_back'],
vertices3dDict['left_top_back'],
vertices3dDict['left_top_front']]])
facetFlags3d.append(boundaryTags['left'])
facetHoles3d.append([])
#right
facets3d.append([[vertices3dDict['right_bottom_front'],
vertices3dDict['right_bottom_back'],
vertices3dDict['right_top_back'],
vertices3dDict['right_top_front']]])
facetFlags3d.append(boundaryTags['right'])
facetHoles3d.append([])
#top
facets3d.append([[vertices3dDict['left_top_front'],
vertices3dDict['right_top_front'],
vertices3dDict['right_top_back'],
vertices3dDict['left_top_back']]])
facetFlags3d.append(boundaryTags['top'])
facetHoles3d.append([])
#bottom
facets3d.append([[vertices3dDict['left_bottom_front'],
vertices3dDict['right_bottom_front'],
vertices3dDict['right_bottom_back'],
vertices3dDict['left_bottom_back']]])
facetFlags3d.append(boundaryTags['bottom'])
facetHoles3d.append([])
#front
facets3d.append([[vertices3dDict['left_bottom_front'],
vertices3dDict['right_bottom_front'],
vertices3dDict['right_top_front'],
vertices3dDict['left_top_front']],
front_cylinder])#add points on the front circle
facetFlags3d.append(boundaryTags['front'])
facetHoles3d.append([])
#back
facets3d.append([[vertices3dDict['left_bottom_back'],
vertices3dDict['right_bottom_back'],
vertices3dDict['right_top_back'],
vertices3dDict['left_top_back']],
back_cylinder])#add points on the back circle
facetFlags3d.append(boundaryTags['back'])
facetHoles3d.append([])
#sides of cylinder
for fN in range(n_points_on_obstacle-1):
facets3d.append([[front_cylinder[fN],
back_cylinder[fN],
back_cylinder[fN+1],
front_cylinder[fN+1]]])
facetFlags3d.append(0)
facetHoles3d.append([])
facets3d.append([[front_cylinder[-1],
back_cylinder[-1],
back_cylinder[0],
front_cylinder[0]]])
facetFlags3d.append(0)
facetHoles3d.append([])
#region
regions = [(center[0],center[1], 0.5*(x0[2]+x1[2])),
(0.1*x0[0]+0.9*x1[0],0.1*x0[1]+0.9*x1[1],0.1*x0[2]+0.9*x1[2])]
regionFlags = [1,2,]
regionConstraints=[0.5*he2**2,0.5*(he)**2]
# make domain
domain = Domain.PiecewiseLinearComplexDomain(vertices=vertices3d,
vertexFlags=vertexFlags3d,
facets=facets3d,
facetFlags=facetFlags3d,
facetHoles=facetHoles3d,
regions=regions,regionFlags=regionFlags,regionConstraints=regionConstraints)
domain.boundaryTags = boundaryTags
return domain,boundaryTags
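All of these builders assign boundary tags by position in the `boundaries` list, starting at 1; flag 0 is what untagged interior entities receive (e.g. the cylinder side facets above). Reproducing the mapping:

```python
boundaries = ['left', 'right', 'bottom', 'top', 'front', 'back']
# Same construction as in the domain functions; tags start at 1 so that
# 0 remains available for untagged (interior) facets and vertices.
boundaryTags = {key: i + 1 for i, key in enumerate(boundaries)}
```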
#===============================================================================
# For 2D cylinder + box around this cylinder
#===============================================================================
def get_pseudo_3D_cylinder_box_domain(
x0=(0.0,0.0,0.0),L=(1.0,1.0,1.0),
x2=None,x3=None,
radius=0.1,
center=(0.5,0.5),
n_points_on_obstacle=2*21-2,
cross_section=circular_cross_section,
thetaOffset=0.0,
he=1.0,
he2=None,
he3=None):
    if he2 is None:
        he2 = he
    if he3 is None:
        he3 = he2
x1 = (x0[0]+L[0],x0[1]+L[1],x0[2]+L[2])
    if x2 is None:
        x2 = (0.25*(x0[0]+x1[0]), 0.25*(x0[1]+x1[1]))
    if x3 is None:
        x3 = (0.75*(x0[0]+x1[0]), 0.75*(x0[1]+x1[1]))
boundaries=['left',
'right',
'bottom',
'top',
'front',
'back',]
boundaryTags=dict([(key,i+1) for (i,key) in enumerate(boundaries)])
#work around the domain from (0.0,0.0) going counterclockwise
vertexKeys = ['left_bottom',
'right_bottom',
'right_top',
'left_top']
vertices = [[x0[0],x0[1]],
[x1[0],x0[1]],
[x1[0],x1[1]],
[x0[0],x1[1]]]
vertexFlags = [boundaryTags['bottom'],
boundaryTags['bottom'],
boundaryTags['top'],
boundaryTags['top']]
#cylinder
theta=thetaOffset
pb = cross_section(center,radius,theta)
vertices.append([pb[0],pb[1]])
    vertexKeys.append('obstacle_' + str(0))
vertexFlags.append(boundaryTags['back'])
for gb in range(1,n_points_on_obstacle):
theta = float(gb)/float(n_points_on_obstacle)*2.0*math.pi+thetaOffset
pb = cross_section(center,radius,theta)
        vertexKeys.append('obstacle_' + str(gb))
vertices.append([pb[0],pb[1]])
vertexFlags.append(boundaryTags['back'])
#box
vertexKeys.extend(['box_1','box_2','box_3','box_4'])
vertices.extend([[x2[0],x2[1]],
[x3[0],x2[1]],
[x3[0],x3[1]],
[x2[0],x3[1]]])
vertexFlags.extend([boundaryTags['back'],
boundaryTags['back'],
boundaryTags['back'],
boundaryTags['back']])
#convert to 3D
vertices3dDict={}
vertices3d=[]
vertexFlags3d=[]
facets3d=[]
facetFlags3d=[]
facetHoles3d=[]
front_cylinder=[]
back_cylinder=[]
front_box=[]
back_box=[]
for vN,v in enumerate(vertices):
vertices3dDict[vertexKeys[vN]+'_back'] = vN
vertices3d.append([v[0],v[1],x0[2]])
vertexFlags3d.append(boundaryTags['back'])#note that the original tag is back
if 'obstacle' in vertexKeys[vN]:
back_cylinder.append(vN)
if 'box' in vertexKeys[vN]:
back_box.append(vN)
for vN,v in enumerate(vertices):
vertices3dDict[vertexKeys[vN]+'_front']=vN+len(vertices)
vertices3d.append([v[0],v[1],x1[2]])
vertexFlags3d.append(boundaryTags['front'])#note that the original tag is back
if 'obstacle' in vertexKeys[vN]:
front_cylinder.append(vN+len(vertices))
if 'box' in vertexKeys[vN]:
front_box.append(vN+len(vertices))
#left
facets3d.append([[vertices3dDict['left_bottom_front'],
vertices3dDict['left_bottom_back'],
vertices3dDict['left_top_back'],
vertices3dDict['left_top_front']]])
facetFlags3d.append(boundaryTags['left'])
facetHoles3d.append([])
#right
facets3d.append([[vertices3dDict['right_bottom_front'],
vertices3dDict['right_bottom_back'],
vertices3dDict['right_top_back'],
vertices3dDict['right_top_front']]])
facetFlags3d.append(boundaryTags['right'])
facetHoles3d.append([])
#top
facets3d.append([[vertices3dDict['left_top_front'],
vertices3dDict['right_top_front'],
vertices3dDict['right_top_back'],
vertices3dDict['left_top_back']]])
facetFlags3d.append(boundaryTags['top'])
facetHoles3d.append([])
#bottom
facets3d.append([[vertices3dDict['left_bottom_front'],
vertices3dDict['right_bottom_front'],
vertices3dDict['right_bottom_back'],
vertices3dDict['left_bottom_back']]])
facetFlags3d.append(boundaryTags['bottom'])
facetHoles3d.append([])
#front
facets3d.append([[vertices3dDict['left_bottom_front'],
vertices3dDict['right_bottom_front'],
vertices3dDict['right_top_front'],
vertices3dDict['left_top_front']],
front_cylinder,
front_box])#add points on the front circle
facetFlags3d.append(boundaryTags['front'])
facetHoles3d.append([])
#back
facets3d.append([[vertices3dDict['left_bottom_back'],
vertices3dDict['right_bottom_back'],
vertices3dDict['right_top_back'],
vertices3dDict['left_top_back']],
back_cylinder,
back_box])#add points on the back circle
facetFlags3d.append(boundaryTags['back'])
facetHoles3d.append([])
#cylinder
for fN in range(n_points_on_obstacle-1):
facets3d.append([[front_cylinder[fN],
back_cylinder[fN],
back_cylinder[fN+1],
front_cylinder[fN+1]]])
facetFlags3d.append(0)
facetHoles3d.append([])
facets3d.append([[front_cylinder[-1],
back_cylinder[-1],
back_cylinder[0],
front_cylinder[0]]])
facetFlags3d.append(0)
facetHoles3d.append([])
#sides of box
for fN in range(3):
facets3d.append([[front_box[fN],
back_box[fN],
back_box[fN+1],
front_box[fN+1]]])
facetFlags3d.append(0)
facetHoles3d.append([])
facets3d.append([[front_box[-1],
back_box[-1],
back_box[0],
front_box[0]]])
facetFlags3d.append(0)
facetHoles3d.append([])
#region
regions = [(center[0],center[1], 0.5*(x0[2]+x1[2])),
(0.1*x2[0]+0.9*x3[0],0.1*x2[1]+0.9*x3[1],0.5*(x0[2]+x1[2])),
(0.1*x0[0]+0.9*x1[0],0.1*x0[1]+0.9*x1[1],0.5*(x0[2]+x1[2]))]
regionFlags = [1,2,3]
regionConstraints=[0.5*he2**2,0.5*he3**2,0.5*(he)**2]
# make domain
domain = Domain.PiecewiseLinearComplexDomain(vertices=vertices3d,
vertexFlags=vertexFlags3d,
facets=facets3d,
facetFlags=facetFlags3d,
facetHoles=facetHoles3d,
regions=regions,regionFlags=regionFlags,regionConstraints=regionConstraints)
domain.boundaryTags = boundaryTags
    return domain,boundaryTags
# Source: Tensile/Tests/extended/convolution_config/test_forward_nchw.py
# (repo: cgmb/Tensile, license: MIT)
################################################################################
# Copyright 2020-2021 Advanced Micro Devices, Inc. All rights reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell cop-
# ies of the Software, and to permit persons to whom the Software is furnished
# to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IM-
# PLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
# FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
# COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
# IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNE-
# CTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
################################################################################
import logging,pytest
from Tensile.SolutionStructs import Convolution
from YamlBuilder.YamlBuilder import defaultSizes, resnetSizes, inceptionSizes
log = logging.getLogger("testlog")
def test_nchw_defaults(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==[0, 3, 2])
assert(z['IndexAssignmentsB']==[3, 1, 2])
assert(not z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution)
@pytest.mark.parametrize("problemSizes", [defaultSizes, resnetSizes, inceptionSizes])
def test_nchw_filter1x1(tensile_state, run_convolution_level, problemSizes):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'Filter': '1x1',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==[0, 3, 2])
assert(z['IndexAssignmentsB']==[3, 1, 2])
assert(not z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution, problemSizes[0], problemSizes[1])
def test_nchw_packed_spatial0(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'PackedSpatialDims': 0
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==4)
assert(z['IndexAssignmentsA']==[0, 1, 4, 3])
assert(z['IndexAssignmentsB']==[4, 2, 3])
assert(not z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_nchw_tbd_strides(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'Stride': 'NxN',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==4)
assert(z['IndexAssignmentsA']==[0, 1, 4, 3])
assert(z['IndexAssignmentsB']==[4, 2, 3])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_nchw_const_strides(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'Stride': '2x2',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==4)
assert(z['IndexAssignmentsA']==[0, 1, 4, 3])
assert(z['IndexAssignmentsB']==[4, 2, 3])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:2})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_nchw_const_use_initial_strides(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'Stride': '2x3',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==4)
assert(z['IndexAssignmentsA']==[0, 1, 4, 3])
assert(z['IndexAssignmentsB']==[4, 2, 3])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:3})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution)
@pytest.mark.parametrize("unrollOnChannel", [0, 1], ids=["unrollOnChannel0", "unrollOnChannel1"])
def test_nchw_filter2x2(tensile_state, run_convolution_level, unrollOnChannel):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'TensorBFormat': 'KCYX',
'UnrollOnChannel': unrollOnChannel,
'Filter': '2x2',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==3)
filterDims = [4, 3] if conv.unrollOnChannel else [5,4]
cdim = 5 if conv.unrollOnChannel else 3
assert(z['IndexAssignmentsA']==filterDims + [0, cdim, 2])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 1, 2])
assert(not z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:1,2:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,4:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:2, filterDims[1]:2})
run_convolution_level.func(conv, z, run_convolution_level.solution)
@pytest.mark.parametrize("problemSizes", [defaultSizes, resnetSizes, inceptionSizes])
def test_nchw_filter2x1(tensile_state, run_convolution_level, problemSizes):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'TensorBFormat': 'KCYX',
'Filter': '2x1',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
filterDims = [3] if conv.unrollOnChannel else [4]
cdim = 4 if conv.unrollOnChannel else 3
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==filterDims + [0, cdim, 2])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 1, 2])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {1:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {3:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:2})
run_convolution_level.func(conv, z, run_convolution_level.solution, problemSizes[0], problemSizes[1])
@pytest.mark.parametrize("problemSizes", [defaultSizes, resnetSizes, inceptionSizes])
def test_nchw_filter7x1(tensile_state, run_convolution_level, problemSizes):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'TensorBFormat': 'KCYX',
'Filter': '7x1',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
filterDims = [3] if conv.unrollOnChannel else [4]
cdim = 4 if conv.unrollOnChannel else 3
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==filterDims + [0, cdim, 2])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 1, 2])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {1:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {3:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:7})
run_convolution_level.func(conv, z, run_convolution_level.solution, problemSizes[0], problemSizes[1])
def test_nchw_filter2x1_dilation(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'TensorBFormat': 'KCYX',
'Filter': '2x1',
'Dilation': '1x2',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
filterDims = [3] if conv.unrollOnChannel else [4]
cdim = 4 if conv.unrollOnChannel else 3
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==filterDims + [0, cdim, 2])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 1, 2])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {1:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {3:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:2})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_nchw_filter1x2(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'TensorBFormat': 'KCYX',
'Filter': '1x2',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
filterDims = [3] if conv.unrollOnChannel else [4]
cdim = 4 if conv.unrollOnChannel else 3
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==filterDims + [0, cdim, 2])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 1, 2])
assert(not z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:1,1:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,3:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:2})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_nchw_filter1x2_dilation(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'TensorBFormat': 'KCYX',
'Filter': '1x2',
'Dilation': '1x2',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
filterDims = [3] if conv.unrollOnChannel else [4]
cdim = 4 if conv.unrollOnChannel else 3
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==filterDims + [0, cdim, 2])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 1, 2])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:2,1:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,3:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:2})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_nchw_dilation2x2(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'Dilation': '2x2',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
(cdim, filterDims) = (5,[4,3]) if conv.unrollOnChannel else (5,[4,3])
assert(z['NumIndicesC']==3)
assert(z['IndexAssignmentsA']==filterDims + [0, cdim, 2])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 1, 2])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:2,2:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,filterDims[0]:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:1, filterDims[1]:1})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_nchw_stride_filter(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCHW',
'Stride': 'NxN',
'Filter': '2x2',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
filterDims = [5,4] if conv.unrollOnChannel else [6,5]
cdim = 6 if conv.unrollOnChannel else 3
assert(z['NumIndicesC']==4)
assert(z['IndexAssignmentsA']==filterDims + [0, 1, cdim, 3])
assert(z['IndexAssignmentsB']==filterDims + [cdim, 2, 3])
assert(not z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {0:1})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,4:0})
assert(conv.solutionParms["AssertSizeEqual"] == {filterDims[0]:2, filterDims[1]:2})
run_convolution_level.func(conv, z, run_convolution_level.solution)
def test_ncdhw_packed_strides3d_defaults(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCDHW',
'Stride': 'NxNxN',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==5)
assert(z['IndexAssignmentsA']==[0, 1, 2, 5, 4])
assert(z['IndexAssignmentsB']==[5, 3, 4])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution)
@pytest.mark.skip(reason="out of registers in asm runs")
def test_ncdhw_packed_strides_filter3d(tensile_state, run_convolution_level):
z={} # problemType definition
conv = Convolution(z, 'ConvolutionForward',
config={'TensorAFormat': 'NCDHW',
'TensorBFormat': 'KCZYX',
'TensorDFormat': 'NCDHW',
'Stride': 'NxNxN',
})
log.debug(conv.printUsage(z))
if not tensile_state.args["no_conv_assertions"]:
assert(z['NumIndicesC']==5)
assert(z['IndexAssignmentsA']==[0, 1, 2, 5, 4])
assert(z['IndexAssignmentsB']==[5, 3, 4])
assert(z['UseInitialStridesAB'])
assert(conv.solutionParms["AssertStrideAEqual"] == {})
assert(conv.solutionParms["AssertStrideBEqual"] == {0:1,2:0})
assert(conv.solutionParms["AssertSizeEqual"] == {})
run_convolution_level.func(conv, z, run_convolution_level.solution)
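Every test above repeats the same assertion block. A hypothetical helper (a sketch only, not part of the Tensile suite) could factor it out; here it is exercised with the values asserted in `test_nchw_defaults`:

```python
def check_conv(z, solution_parms, num_c, idx_a, idx_b,
               stride_a, stride_b, size_equal, initial_strides):
    # Mirrors the assertion block repeated in the tests above.
    assert z['NumIndicesC'] == num_c
    assert z['IndexAssignmentsA'] == idx_a
    assert z['IndexAssignmentsB'] == idx_b
    assert z['UseInitialStridesAB'] == initial_strides
    assert solution_parms["AssertStrideAEqual"] == stride_a
    assert solution_parms["AssertStrideBEqual"] == stride_b
    assert solution_parms["AssertSizeEqual"] == size_equal
    return True

ok = check_conv(
    {'NumIndicesC': 3, 'IndexAssignmentsA': [0, 3, 2],
     'IndexAssignmentsB': [3, 1, 2], 'UseInitialStridesAB': False},
    {"AssertStrideAEqual": {0: 1}, "AssertStrideBEqual": {0: 1, 2: 0},
     "AssertSizeEqual": {}},
    num_c=3, idx_a=[0, 3, 2], idx_b=[3, 1, 2],
    stride_a={0: 1}, stride_b={0: 1, 2: 0},
    size_equal={}, initial_strides=False)
```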
# Source: bing.py (repo: Masood-M/yalih, license: Apache-2.0)
#! /usr/bin/env python
# -*- coding: utf-8 -*-
import urllib
import urllib2
import json
import honeypotconfig
from BeautifulSoup import BeautifulSoup
def searchBing(keyword):
    """Query the Bing Search API (Azure Datamarket) for `keyword` and write
    the result URLs to list/searchresult.txt, paging via $top/$skip."""
    queryBingFor = "'%s'" % keyword  # the apostrophes are required as that is the format the API Url expects.
    quoted_query = urllib.quote(queryBingFor)
    account_key = "Unolx7kLAlLp1NSwmPyis9df+ecjQeN9pqGe57sW/D8="
    username = "masood.mansoori@live.com"
    rootURL = "https://api.datamarket.azure.com/Bing/Search/"

    def fetch(top, skip):
        """Fetch one page of results as a list of result dicts."""
        searchURL = (rootURL + "Web?$format=json&Query=" + quoted_query +
                     "&$top=" + str(top) + "&$skip=" + str(skip))
        password_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
        password_mgr.add_password(None, searchURL, username, account_key)
        handler = urllib2.HTTPBasicAuthHandler(password_mgr)
        opener = urllib2.build_opener(handler)
        urllib2.install_opener(opener)
        response_data = urllib2.urlopen(searchURL).read()
        json_result = json.loads(response_data)
        return json_result['d']['results']

    # The first page truncates the file ("w"); later pages append ("a").
    # The $top values (1, 50, 50, 5) are preserved from the original code.
    pages = [(1, honeypotconfig.starturl, "w"),
             (50, honeypotconfig.starturl + 50, "a"),
             (50, honeypotconfig.starturl + 100, "a"),
             (5, honeypotconfig.starturl + 150, "a")]
    for top, skip, mode in pages:
        result_list = fetch(top, skip)
        searchresult = open("list/searchresult.txt", mode)
        for i in result_list:
            print i['Url']
            searchresult.write(i['Url'] + "\n")
        searchresult.close()
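The search URL above follows the Azure Datamarket OData pattern, where `$top` sets the page size and `$skip` the offset (note that the Datamarket Bing endpoint has since been retired). A small Python 3 sketch of just the URL construction (the helper name is my own), which makes the paging parameters explicit:

```python
import urllib.parse

ROOT_URL = "https://api.datamarket.azure.com/Bing/Search/"

def build_search_url(keyword, top, skip):
    # The API expects the query wrapped in single quotes, then URL-encoded.
    quoted_query = urllib.parse.quote("'%s'" % keyword)
    return (ROOT_URL + "Web?$format=json&Query=" + quoted_query +
            "&$top=" + str(top) + "&$skip=" + str(skip))

url = build_search_url("google fibre", 50, 100)
```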
# ---- steeldesign/modules/appendix_B.py (mainqueg/steeldesign, MIT) ----
'''Equations from Appendix B of ASCE 8-02
'''
def B_1(FY, E0, offset, n, s):
    '''Secant modulus of elasticity per Eq. B-1

    Parameters
    ----------
    E0 : float
        Initial modulus of elasticity
    FY : float
        Yield stress at a permanent strain of offset
    offset : float
        Permanent strain at which FY was obtained
    n : float
        Ramberg-Osgood exponent
    s : float
        Stress at which Es is to be evaluated

    Returns
    ----------
    float
        Secant modulus Es at the stress s

    Raises
    ------
    none

    Tests
    -----
    >>> round( B_1(344.8, 186200, 0.002, 4.58, 159.3), 2)
    174334.98
    '''
    Es = E0 / (1 + offset*E0* ( s**(n-1)/(FY**n)) )
    return Es
def B_2(FY, E0, offset, n, s):
    '''Tangent modulus of elasticity per Eq. B-2

    Parameters
    ----------
    E0 : float
        Initial modulus of elasticity
    FY : float
        Yield stress at a permanent strain of offset
    offset : float
        Permanent strain at which FY was obtained
    n : float
        Ramberg-Osgood exponent
    s : float
        Stress at which Et is to be evaluated

    Returns
    ----------
    float
        Tangent modulus Et at the stress s

    Raises
    ------
    none

    Tests
    -----
    >>> round( B_2(344.8, 186200, 0.002, 4.58, 159.3), 2)
    141952.2
    '''
    Et = E0*FY/(FY+offset*n*E0* (s/FY)**(n-1))
    return Et
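The two Ramberg-Osgood moduli can be sanity-checked against the doctest values above. A self-contained sketch (the equations are restated here under my own function names so the snippet runs on its own):

```python
# Eq. B-1 (secant modulus) and Eq. B-2 (tangent modulus) from ASCE 8-02,
# evaluated with the doctest inputs: FY=344.8, E0=186200, offset=0.002,
# n=4.58, s=159.3.

def secant_modulus(FY, E0, offset, n, s):
    # Eq. B-1: Es = E0 / (1 + offset*E0 * s**(n-1) / FY**n)
    return E0 / (1 + offset * E0 * (s**(n - 1) / (FY**n)))

def tangent_modulus(FY, E0, offset, n, s):
    # Eq. B-2: Et = E0*FY / (FY + offset*n*E0 * (s/FY)**(n-1))
    return E0 * FY / (FY + offset * n * E0 * (s / FY)**(n - 1))

Es = secant_modulus(344.8, 186200, 0.002, 4.58, 159.3)
Et = tangent_modulus(344.8, 186200, 0.002, 4.58, 159.3)
# Below yield, both moduli are reduced from E0, and Et <= Es.
```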
def TableA12(tau):
    '''Plasticity reduction factor for column buckling or beam LTB, per Eq. B-5.

    Parameters
    ----------
    tau : float
        Shear stress at which eta_shear is evaluated

    Returns
    ----------
    eta_shear : float
        Shear plasticity reduction factor

    Raises
    ------
    NotImplementedError
        The body of this function is missing in the original source.
    '''
    # The original file ends before the implementation of Eq. B-5.
    raise NotImplementedError('TableA12 (Eq. B-5) is not implemented')
# ---- get_dataloaders.py (Barchid/segmenter, MIT) ----
from datasets.nyuv2.pytorch_dataset import NYUv2
from datasets.sunrgbd.pytorch_dataset import SUNRGBD
from preprocessing import get_preprocessor
from torch.utils.data import DataLoader
def get_nyuv2(data_dir: str, config: dict, batch_size: int, num_workers: int, debug: bool, **kwargs):
    train_set = NYUv2(
        data_dir=data_dir,
        n_classes=40,
        split='train',
        depth_mode='refined'
    )
    train_transforms = get_preprocessor(
        depth_mean=train_set.depth_mean,
        depth_std=train_set.depth_std,
        height=config['dataset_kwargs']['image_size'],
        width=config['dataset_kwargs']['image_size'],
        phase='train'
    )
    train_set.preprocessor = train_transforms
    train_loader = DataLoader(train_set, batch_size=batch_size,
                              num_workers=num_workers, drop_last=True, shuffle=not debug)

    val_set = NYUv2(
        data_dir=data_dir,
        n_classes=40,
        split='test',
        depth_mode='refined'
    )
    # Validation uses the training set's depth statistics for normalization.
    val_transforms = get_preprocessor(
        depth_mean=train_set.depth_mean,
        depth_std=train_set.depth_std,
        height=config['dataset_kwargs']['image_size'],
        width=config['dataset_kwargs']['image_size'],
        phase='test'
    )
    val_set.preprocessor = val_transforms
    # Load the validation set here (the original mistakenly passed train_set).
    val_loader = DataLoader(val_set, batch_size=batch_size,
                            num_workers=num_workers, shuffle=False)
    return train_loader, val_loader
def get_sunrgbd(data_dir: str, config: dict, batch_size: int, num_workers: int, debug: bool, **kwargs):
    train_set = SUNRGBD(
        data_dir=data_dir,
        split='train',
        depth_mode='refined'
    )
    train_transforms = get_preprocessor(
        depth_mean=train_set.depth_mean,
        depth_std=train_set.depth_std,
        height=config['dataset_kwargs']['image_size'],
        width=config['dataset_kwargs']['image_size'],
        phase='train'
    )
    train_set.preprocessor = train_transforms
    train_loader = DataLoader(train_set, batch_size=batch_size,
                              num_workers=num_workers, drop_last=True, shuffle=not debug)

    val_set = SUNRGBD(
        data_dir=data_dir,
        split='test',
        depth_mode='refined'
    )
    # Validation uses the training set's depth statistics for normalization.
    val_transforms = get_preprocessor(
        depth_mean=train_set.depth_mean,
        depth_std=train_set.depth_std,
        height=config['dataset_kwargs']['image_size'],
        width=config['dataset_kwargs']['image_size'],
        phase='test'
    )
    val_set.preprocessor = val_transforms
    # Load the validation set here (the original mistakenly passed train_set).
    val_loader = DataLoader(val_set, batch_size=batch_size,
                            num_workers=num_workers, shuffle=False)
    return train_loader, val_loader
# ---- exps/utils/colors.py (beneisner/partnet_seg_exps, MIT) ----
colors = [[0, 0, 1], [1, 0, 0], [0, 1, 0],
          [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5],
          [0.3, 0.6, 0], [0.6, 0, 0.3], [0.3, 0, 0.6],
          [0.6, 0.3, 0], [0.3, 0, 0.6], [0.6, 0, 0.3],
          [0.8, 0.2, 0.5]]
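The palette entries are unit-range float triples, as expected by matplotlib-style plotting. A minimal sketch (the helper name is my own) of converting such entries to 8-bit RGB tuples for libraries that want integer channels:

```python
# A few entries of a unit-range float palette, as in colors.py above.
colors = [[0, 0, 1], [1, 0, 0], [0, 1, 0], [0.5, 0.5, 0]]

def to_rgb8(color):
    # Scale each [0, 1] float channel to an integer in 0..255.
    return tuple(int(round(c * 255)) for c in color)

rgb8 = [to_rgb8(c) for c in colors]
```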
# ---- opensilexClientToolsPython/api/data_api.py (OpenSILEX/opensilexClientToolsPython) ----
# coding: utf-8
"""
OpenSilex API
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen) # noqa: E501
OpenAPI spec version: INSTANCE-SNAPSHOT
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from opensilexClientToolsPython.api_client import ApiClient
class DataApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
    def add_list_data(self, **kwargs):  # noqa: E501
        """Add data  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.add_list_data(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param list[DataCreationDTO] body: Data description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.add_list_data_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.add_list_data_with_http_info(**kwargs)  # noqa: E501
            return data
    def add_list_data_with_http_info(self, **kwargs):  # noqa: E501
        """Add data  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.add_list_data_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param list[DataCreationDTO] body: Data description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['body', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method add_list_data" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization']  # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def count_data(self, **kwargs):  # noqa: E501
        """Count data  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.count_data(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by target uris
        :param list[str] variables: Search by variables uris
        :param list[str] devices: Search by devices uris
        :param float min_confidence: Search by minimal confidence index
        :param float max_confidence: Search by maximal confidence index
        :param list[str] provenances: Search by provenances
        :param str metadata: Search by metadata
        :param str accept_language: Request accepted language
        :return: int
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.count_data_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.count_data_with_http_info(**kwargs)  # noqa: E501
            return data
    def count_data_with_http_info(self, **kwargs):  # noqa: E501
        """Count data  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.count_data_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by target uris
        :param list[str] variables: Search by variables uris
        :param list[str] devices: Search by devices uris
        :param float min_confidence: Search by minimal confidence index
        :param float max_confidence: Search by maximal confidence index
        :param list[str] provenances: Search by provenances
        :param str metadata: Search by metadata
        :param str accept_language: Request accepted language
        :return: int
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['start_date', 'end_date', 'timezone', 'experiments', 'targets', 'variables', 'devices', 'min_confidence', 'max_confidence', 'provenances', 'metadata', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method count_data" % key
                )
            params[key] = val
        del params['kwargs']

        if 'min_confidence' in params and params['min_confidence'] > 1:  # noqa: E501
            raise ValueError("Invalid value for parameter `min_confidence` when calling `count_data`, must be a value less than or equal to `1`")  # noqa: E501
        if 'min_confidence' in params and params['min_confidence'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `min_confidence` when calling `count_data`, must be a value greater than or equal to `0`")  # noqa: E501
        if 'max_confidence' in params and params['max_confidence'] > 1:  # noqa: E501
            raise ValueError("Invalid value for parameter `max_confidence` when calling `count_data`, must be a value less than or equal to `1`")  # noqa: E501
        if 'max_confidence' in params and params['max_confidence'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `max_confidence` when calling `count_data`, must be a value greater than or equal to `0`")  # noqa: E501
        collection_formats = {}

        path_params = {}

        query_params = []
        if 'start_date' in params:
            query_params.append(('start_date', params['start_date']))  # noqa: E501
        if 'end_date' in params:
            query_params.append(('end_date', params['end_date']))  # noqa: E501
        if 'timezone' in params:
            query_params.append(('timezone', params['timezone']))  # noqa: E501
        if 'experiments' in params:
            query_params.append(('experiments', params['experiments']))  # noqa: E501
            collection_formats['experiments'] = 'multi'  # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets']))  # noqa: E501
            collection_formats['targets'] = 'multi'  # noqa: E501
        if 'variables' in params:
            query_params.append(('variables', params['variables']))  # noqa: E501
            collection_formats['variables'] = 'multi'  # noqa: E501
        if 'devices' in params:
            query_params.append(('devices', params['devices']))  # noqa: E501
            collection_formats['devices'] = 'multi'  # noqa: E501
        if 'min_confidence' in params:
            query_params.append(('min_confidence', params['min_confidence']))  # noqa: E501
        if 'max_confidence' in params:
            query_params.append(('max_confidence', params['max_confidence']))  # noqa: E501
        if 'provenances' in params:
            query_params.append(('provenances', params['provenances']))  # noqa: E501
            collection_formats['provenances'] = 'multi'  # noqa: E501
        if 'metadata' in params:
            query_params.append(('metadata', params['metadata']))  # noqa: E501

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization']  # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data/count', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='int',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
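The generated search methods tag list parameters such as `experiments` with the collection format `'multi'`, meaning the query key is repeated once per list element. A minimal stand-alone sketch of that encoding (this snippet is illustrative, not part of the generated client):

```python
from urllib.parse import urlencode

# 'multi' collection format: repeat the key for each value, producing
# experiments=<uri1>&experiments=<uri2>. urlencode(doseq=True) does this.
params = {
    "start_date": "2020-01-01",
    "experiments": ["test:exp1", "test:exp2"],
}
query_string = urlencode(params, doseq=True)
```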
    def create_provenance(self, **kwargs):  # noqa: E501
        """Add a provenance  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_provenance(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param ProvenanceCreationDTO body: Provenance description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_provenance_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.create_provenance_with_http_info(**kwargs)  # noqa: E501
            return data
    def create_provenance_with_http_info(self, **kwargs):  # noqa: E501
        """Add a provenance  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_provenance_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param ProvenanceCreationDTO body: Provenance description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['body', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_provenance" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization']  # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/provenances', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_data(self, uri, **kwargs):  # noqa: E501
        """Delete data  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_data(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Data URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_data_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_data_with_http_info(uri, **kwargs)  # noqa: E501
            return data
    def delete_data_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Delete data  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_data_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Data URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['uri', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_data" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `delete_data`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization']  # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data/{uri}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_data_on_search(self, **kwargs):  # noqa: E501
        """Delete data on criteria  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_data_on_search(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str experiment: Search by experiment uri
        :param str target: Search by target uri
        :param str variable: Search by variable uri
        :param str provenance: Search by provenance uri
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_data_on_search_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.delete_data_on_search_with_http_info(**kwargs)  # noqa: E501
            return data
    def delete_data_on_search_with_http_info(self, **kwargs):  # noqa: E501
        """Delete data on criteria  # noqa: E501

          # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_data_on_search_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str experiment: Search by experiment uri
        :param str target: Search by target uri
        :param str variable: Search by variable uri
        :param str provenance: Search by provenance uri
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['experiment', 'target', 'variable', 'provenance', ]  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_data_on_search" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'experiment' in params:
            query_params.append(('experiment', params['experiment']))  # noqa: E501
        if 'target' in params:
            query_params.append(('target', params['target']))  # noqa: E501
        if 'variable' in params:
            query_params.append(('variable', params['variable']))  # noqa: E501
        if 'provenance' in params:
            query_params.append(('provenance', params['provenance']))  # noqa: E501

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization']  # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_provenance(self, uri, **kwargs):  # noqa: E501
        """Delete a provenance that doesn't describe data  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_provenance(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Provenance URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_provenance_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_provenance_with_http_info(uri, **kwargs)  # noqa: E501
            return data

    def delete_provenance_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Delete a provenance that doesn't describe data  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_provenance_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Provenance URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_provenance" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `delete_provenance`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/provenances/{uri}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def export_data(self, **kwargs):  # noqa: E501
        """Export data  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.export_data(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets
        :param list[str] variables: Search by variables
        :param list[str] devices: Search by devices uris
        :param float min_confidence: Search by minimal confidence index
        :param float max_confidence: Search by maximal confidence index
        :param list[str] provenances: Search by provenances
        :param str metadata: Search by metadata
        :param str mode: Format wide or long
        :param bool with_raw_data: Export also raw_data
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.export_data_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.export_data_with_http_info(**kwargs)  # noqa: E501
            return data

    def export_data_with_http_info(self, **kwargs):  # noqa: E501
        """Export data  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.export_data_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets
        :param list[str] variables: Search by variables
        :param list[str] devices: Search by devices uris
        :param float min_confidence: Search by minimal confidence index
        :param float max_confidence: Search by maximal confidence index
        :param list[str] provenances: Search by provenances
        :param str metadata: Search by metadata
        :param str mode: Format wide or long
        :param bool with_raw_data: Export also raw_data
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['start_date', 'end_date', 'timezone', 'experiments', 'targets', 'variables', 'devices', 'min_confidence', 'max_confidence', 'provenances', 'metadata', 'mode', 'with_raw_data', 'order_by', 'page', 'page_size']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method export_data" % key
                )
            params[key] = val
        del params['kwargs']

        if 'min_confidence' in params and params['min_confidence'] > 1:  # noqa: E501
            raise ValueError("Invalid value for parameter `min_confidence` when calling `export_data`, must be a value less than or equal to `1`")  # noqa: E501
        if 'min_confidence' in params and params['min_confidence'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `min_confidence` when calling `export_data`, must be a value greater than or equal to `0`")  # noqa: E501
        if 'max_confidence' in params and params['max_confidence'] > 1:  # noqa: E501
            raise ValueError("Invalid value for parameter `max_confidence` when calling `export_data`, must be a value less than or equal to `1`")  # noqa: E501
        if 'max_confidence' in params and params['max_confidence'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `max_confidence` when calling `export_data`, must be a value greater than or equal to `0`")  # noqa: E501
        if 'page' in params and params['page'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `page` when calling `export_data`, must be a value greater than or equal to `0`")  # noqa: E501
        if 'page_size' in params and params['page_size'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `page_size` when calling `export_data`, must be a value greater than or equal to `0`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'start_date' in params:
            query_params.append(('start_date', params['start_date']))  # noqa: E501
        if 'end_date' in params:
            query_params.append(('end_date', params['end_date']))  # noqa: E501
        if 'timezone' in params:
            query_params.append(('timezone', params['timezone']))  # noqa: E501
        if 'experiments' in params:
            query_params.append(('experiments', params['experiments']))  # noqa: E501
            collection_formats['experiments'] = 'multi'  # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets']))  # noqa: E501
            collection_formats['targets'] = 'multi'  # noqa: E501
        if 'variables' in params:
            query_params.append(('variables', params['variables']))  # noqa: E501
            collection_formats['variables'] = 'multi'  # noqa: E501
        if 'devices' in params:
            query_params.append(('devices', params['devices']))  # noqa: E501
            collection_formats['devices'] = 'multi'  # noqa: E501
        if 'min_confidence' in params:
            query_params.append(('min_confidence', params['min_confidence']))  # noqa: E501
        if 'max_confidence' in params:
            query_params.append(('max_confidence', params['max_confidence']))  # noqa: E501
        if 'provenances' in params:
            query_params.append(('provenances', params['provenances']))  # noqa: E501
            collection_formats['provenances'] = 'multi'  # noqa: E501
        if 'metadata' in params:
            query_params.append(('metadata', params['metadata']))  # noqa: E501
        if 'mode' in params:
            query_params.append(('mode', params['mode']))  # noqa: E501
        if 'with_raw_data' in params:
            query_params.append(('with_raw_data', params['with_raw_data']))  # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by']))  # noqa: E501
            collection_formats['order_by'] = 'multi'  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'page_size' in params:
            query_params.append(('page_size', params['page_size']))  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['text/plain'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data/export', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_data(self, uri, **kwargs):  # noqa: E501
        """Get data  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Data URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_data_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.get_data_with_http_info(uri, **kwargs)  # noqa: E501
            return data

    def get_data_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Get data  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Data URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_data" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `get_data`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data/{uri}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DataGetDTO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_data_file(self, uri, **kwargs):  # noqa: E501
        """Get a data file  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_file(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Search by fileUri (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_data_file_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.get_data_file_with_http_info(uri, **kwargs)  # noqa: E501
            return data

    def get_data_file_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Get a data file  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_file_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Search by fileUri (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_data_file" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `get_data_file`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/octet-stream'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/datafiles/{uri}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_data_file_description(self, uri, **kwargs):  # noqa: E501
        """Get a data file description  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_file_description(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Search by fileUri (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataFileGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_data_file_description_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.get_data_file_description_with_http_info(uri, **kwargs)  # noqa: E501
            return data

    def get_data_file_description_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Get a data file description  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_file_description_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Search by fileUri (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataFileGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_data_file_description" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `get_data_file_description`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/datafiles/{uri}/description', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DataFileGetDTO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_data_file_descriptions_by_search(self, **kwargs):  # noqa: E501
        """Search data files  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_file_descriptions_by_search(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str rdf_type: Search by rdf type uri
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiments
        :param list[str] targets: Search by targets uris list
        :param list[str] devices: Search by devices uris
        :param list[str] provenances: Search by provenance uris list
        :param str metadata: Search by metadata
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[DataFileGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_data_file_descriptions_by_search_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_data_file_descriptions_by_search_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_data_file_descriptions_by_search_with_http_info(self, **kwargs):  # noqa: E501
        """Search data files  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_data_file_descriptions_by_search_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str rdf_type: Search by rdf type uri
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiments
        :param list[str] targets: Search by targets uris list
        :param list[str] devices: Search by devices uris
        :param list[str] provenances: Search by provenance uris list
        :param str metadata: Search by metadata
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[DataFileGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['rdf_type', 'start_date', 'end_date', 'timezone', 'experiments', 'targets', 'devices', 'provenances', 'metadata', 'order_by', 'page', 'page_size']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_data_file_descriptions_by_search" % key
                )
            params[key] = val
        del params['kwargs']

        if 'page' in params and params['page'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `page` when calling `get_data_file_descriptions_by_search`, must be a value greater than or equal to `0`")  # noqa: E501
        if 'page_size' in params and params['page_size'] < 0:  # noqa: E501
            raise ValueError("Invalid value for parameter `page_size` when calling `get_data_file_descriptions_by_search`, must be a value greater than or equal to `0`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'rdf_type' in params:
            query_params.append(('rdf_type', params['rdf_type']))  # noqa: E501
        if 'start_date' in params:
            query_params.append(('start_date', params['start_date']))  # noqa: E501
        if 'end_date' in params:
            query_params.append(('end_date', params['end_date']))  # noqa: E501
        if 'timezone' in params:
            query_params.append(('timezone', params['timezone']))  # noqa: E501
        if 'experiments' in params:
            query_params.append(('experiments', params['experiments']))  # noqa: E501
            collection_formats['experiments'] = 'multi'  # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets']))  # noqa: E501
            collection_formats['targets'] = 'multi'  # noqa: E501
        if 'devices' in params:
            query_params.append(('devices', params['devices']))  # noqa: E501
            collection_formats['devices'] = 'multi'  # noqa: E501
        if 'provenances' in params:
            query_params.append(('provenances', params['provenances']))  # noqa: E501
            collection_formats['provenances'] = 'multi'  # noqa: E501
        if 'metadata' in params:
            query_params.append(('metadata', params['metadata']))  # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by']))  # noqa: E501
            collection_formats['order_by'] = 'multi'  # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page']))  # noqa: E501
        if 'page_size' in params:
            query_params.append(('page_size', params['page_size']))  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/datafiles', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[DataFileGetDTO]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def get_datafiles_provenances(self, **kwargs):  # noqa: E501
        """Get provenances linked to datafiles  # noqa: E501

        # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_datafiles_provenances(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets uris
        :param list[str] devices: Search by devices uris
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_datafiles_provenances_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_datafiles_provenances_with_http_info(**kwargs)  # noqa: E501
            return data

def get_datafiles_provenances_with_http_info(self, **kwargs): # noqa: E501
"""Get provenances linked to datafiles # noqa: E501
# noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_datafiles_provenances_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param str authorization: Authentication token (required)
:param list[str] experiments: Search by experiment uris
:param list[str] targets: Search by targets uris
:param list[str] devices: Search by devices uris
:param str accept_language: Request accepted language
:return: list[ProvenanceGetDTO]
If the method is called asynchronously,
returns the request thread.
"""
        all_params = ['experiments', 'targets', 'devices']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_datafiles_provenances" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'experiments' in params:
            query_params.append(('experiments', params['experiments']))  # noqa: E501
            collection_formats['experiments'] = 'multi'  # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets']))  # noqa: E501
            collection_formats['targets'] = 'multi'  # noqa: E501
        if 'devices' in params:
            query_params.append(('devices', params['devices']))  # noqa: E501
            collection_formats['devices'] = 'multi'  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/datafiles/provenances', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[ProvenanceGetDTO]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
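    # Usage sketch (not part of the generated client): filtering datafile
    # provenances by experiment. The client instance name and the URI below
    # are hypothetical placeholders for your own deployment.
    #
    #     api = DataApi(api_client)
    #     provenances = api.get_datafiles_provenances(
    #         experiments=['http://example.org/experiment/exp01'])
    #     for prov in provenances:
    #         print(prov.uri)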
    def get_pictures_thumbnails(self, uri, **kwargs):  # noqa: E501
        """Get a picture thumbnail  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_pictures_thumbnails(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Search by fileUri (required)
        :param str authorization: Authentication token (required)
        :param int scaled_width: Thumbnail width
        :param int scaled_height: Thumbnail height
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_pictures_thumbnails_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.get_pictures_thumbnails_with_http_info(uri, **kwargs)  # noqa: E501
            return data
    def get_pictures_thumbnails_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Get a picture thumbnail  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_pictures_thumbnails_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Search by fileUri (required)
        :param str authorization: Authentication token (required)
        :param int scaled_width: Thumbnail width
        :param int scaled_height: Thumbnail height
        :param str accept_language: Request accepted language
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri', 'scaled_width', 'scaled_height']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_pictures_thumbnails" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `get_pictures_thumbnails`")  # noqa: E501

        if 'scaled_width' in params and params['scaled_width'] > 1920:  # noqa: E501
            raise ValueError("Invalid value for parameter `scaled_width` when calling `get_pictures_thumbnails`, must be a value less than or equal to `1920`")  # noqa: E501
        if 'scaled_width' in params and params['scaled_width'] < 256:  # noqa: E501
            raise ValueError("Invalid value for parameter `scaled_width` when calling `get_pictures_thumbnails`, must be a value greater than or equal to `256`")  # noqa: E501
        if 'scaled_height' in params and params['scaled_height'] > 1080:  # noqa: E501
            raise ValueError("Invalid value for parameter `scaled_height` when calling `get_pictures_thumbnails`, must be a value less than or equal to `1080`")  # noqa: E501
        if 'scaled_height' in params and params['scaled_height'] < 144:  # noqa: E501
            raise ValueError("Invalid value for parameter `scaled_height` when calling `get_pictures_thumbnails`, must be a value greater than or equal to `144`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []
        if 'scaled_width' in params:
            query_params.append(('scaled_width', params['scaled_width']))  # noqa: E501
        if 'scaled_height' in params:
            query_params.append(('scaled_height', params['scaled_height']))  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/octet-stream'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/datafiles/{uri}/thumbnail', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
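    # Usage sketch (not part of the generated client): the server only
    # accepts scaled_width in [256, 1920] and scaled_height in [144, 1080];
    # values outside those ranges raise ValueError before any request is
    # sent. The file URI below is a hypothetical placeholder.
    #
    #     api = DataApi(api_client)
    #     api.get_pictures_thumbnails(
    #         'http://example.org/datafile/img01',
    #         scaled_width=640, scaled_height=480)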
    def get_provenance(self, uri, **kwargs):  # noqa: E501
        """Get a provenance  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_provenance(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Provenance URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ProvenanceGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_provenance_with_http_info(uri, **kwargs)  # noqa: E501
        else:
            (data) = self.get_provenance_with_http_info(uri, **kwargs)  # noqa: E501
            return data

    def get_provenance_with_http_info(self, uri, **kwargs):  # noqa: E501
        """Get a provenance  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_provenance_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Provenance URI (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ProvenanceGetDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uri']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_provenance" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `get_provenance`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri']  # noqa: E501

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/provenances/{uri}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ProvenanceGetDTO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
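    # Usage sketch (not part of the generated client): fetching one
    # provenance, synchronously and asynchronously. The URI is a
    # hypothetical placeholder.
    #
    #     api = DataApi(api_client)
    #     # synchronous call, returns a ProvenanceGetDTO
    #     prov = api.get_provenance('http://example.org/provenance/p01')
    #     # asynchronous call, returns a thread; .get() blocks for the result
    #     thread = api.get_provenance('http://example.org/provenance/p01',
    #                                 async_req=True)
    #     prov = thread.get()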
    def get_provenances_by_ur_is(self, uris, **kwargs):  # noqa: E501
        """Get a list of provenances by their URIs  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_provenances_by_ur_is(uris, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param list[str] uris: Provenances URIs (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_provenances_by_ur_is_with_http_info(uris, **kwargs)  # noqa: E501
        else:
            (data) = self.get_provenances_by_ur_is_with_http_info(uris, **kwargs)  # noqa: E501
            return data

    def get_provenances_by_ur_is_with_http_info(self, uris, **kwargs):  # noqa: E501
        """Get a list of provenances by their URIs  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_provenances_by_ur_is_with_http_info(uris, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param list[str] uris: Provenances URIs (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['uris']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_provenances_by_ur_is" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uris' is set
        if ('uris' not in params or
                params['uris'] is None):
            raise ValueError("Missing the required parameter `uris` when calling `get_provenances_by_ur_is`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'uris' in params:
            query_params.append(('uris', params['uris']))  # noqa: E501
            collection_formats['uris'] = 'multi'  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/provenances/by_uris', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[ProvenanceGetDTO]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_used_provenances(self, **kwargs):  # noqa: E501
        """Get provenances linked to data  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_used_provenances(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets uris
        :param list[str] variables: Search by variables uris
        :param list[str] devices: Search by devices uris
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_used_provenances_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_used_provenances_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_used_provenances_with_http_info(self, **kwargs):  # noqa: E501
        """Get provenances linked to data  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_used_provenances_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets uris
        :param list[str] variables: Search by variables uris
        :param list[str] devices: Search by devices uris
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['experiments', 'targets', 'variables', 'devices']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_used_provenances" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'experiments' in params:
            query_params.append(('experiments', params['experiments']))  # noqa: E501
            collection_formats['experiments'] = 'multi'  # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets']))  # noqa: E501
            collection_formats['targets'] = 'multi'  # noqa: E501
        if 'variables' in params:
            query_params.append(('variables', params['variables']))  # noqa: E501
            collection_formats['variables'] = 'multi'  # noqa: E501
        if 'devices' in params:
            query_params.append(('devices', params['devices']))  # noqa: E501
            collection_formats['devices'] = 'multi'  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data/provenances', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[ProvenanceGetDTO]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_used_variables(self, **kwargs):  # noqa: E501
        """Get variables linked to data  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_used_variables(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets uris
        :param list[str] provenances: Search by provenance uris
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_used_variables_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_used_variables_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_used_variables_with_http_info(self, **kwargs):  # noqa: E501
        """Get variables linked to data  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_used_variables_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets uris
        :param list[str] provenances: Search by provenance uris
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['experiments', 'targets', 'provenances']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_used_variables" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'experiments' in params:
            query_params.append(('experiments', params['experiments']))  # noqa: E501
            collection_formats['experiments'] = 'multi'  # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets']))  # noqa: E501
            collection_formats['targets'] = 'multi'  # noqa: E501
        if 'provenances' in params:
            query_params.append(('provenances', params['provenances']))  # noqa: E501
            collection_formats['provenances'] = 'multi'  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data/variables', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[ProvenanceGetDTO]',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def import_csv_data(self, provenance, file, **kwargs):  # noqa: E501
        """Import a CSV file for the given provenanceURI  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.import_csv_data(provenance, file, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str provenance: Provenance URI (required)
        :param file file: File (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataCSVValidationDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.import_csv_data_with_http_info(provenance, file, **kwargs)  # noqa: E501
        else:
            (data) = self.import_csv_data_with_http_info(provenance, file, **kwargs)  # noqa: E501
            return data

    def import_csv_data_with_http_info(self, provenance, file, **kwargs):  # noqa: E501
        """Import a CSV file for the given provenanceURI  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.import_csv_data_with_http_info(provenance, file, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str provenance: Provenance URI (required)
        :param file file: File (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataCSVValidationDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['provenance', 'file']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method import_csv_data" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'provenance' is set
        if ('provenance' not in params or
                params['provenance'] is None):
            raise ValueError("Missing the required parameter `provenance` when calling `import_csv_data`")  # noqa: E501
        # verify the required parameter 'file' is set
        if ('file' not in params or
                params['file'] is None):
            raise ValueError("Missing the required parameter `file` when calling `import_csv_data`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'provenance' in params:
            query_params.append(('provenance', params['provenance']))  # noqa: E501

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}
        if 'file' in params:
            local_var_files['file'] = params['file']  # noqa: E501

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['multipart/form-data'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/data/import', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DataCSVValidationDTO',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
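    # Usage sketch (not part of the generated client): the file is sent as a
    # multipart/form-data upload, so pass a local path; the provenance URI
    # and CSV path below are hypothetical placeholders.
    #
    #     api = DataApi(api_client)
    #     validation = api.import_csv_data(
    #         provenance='http://example.org/provenance/p01',
    #         file='/tmp/measurements.csv')
    #     # `validation` is a DataCSVValidationDTO describing accepted and
    #     # rejected rows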
    def post_data_file(self, description, file, **kwargs):  # noqa: E501
        """Add a data file  # noqa: E501

        {\"rdf_type\":\"http://www.opensilex.org/vocabulary/oeso#Image\", \"date\":\"2020-08-21T00:00:00+01:00\", \"target\":\"http://plot01\", \"provenance\": { \"uri\":\"http://opensilex.dev/provenance/1598001689415\" }, \"metadata\":{ \"LabelView\" : \"side90\", \"paramA\" : \"90\"}}  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_data_file(description, file, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str description: File description with metadata (required)
        :param file file: Data file (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_data_file_with_http_info(description, file, **kwargs)  # noqa: E501
        else:
            (data) = self.post_data_file_with_http_info(description, file, **kwargs)  # noqa: E501
            return data

    def post_data_file_with_http_info(self, description, file, **kwargs):  # noqa: E501
        """Add a data file  # noqa: E501

        {\"rdf_type\":\"http://www.opensilex.org/vocabulary/oeso#Image\", \"date\":\"2020-08-21T00:00:00+01:00\", \"target\":\"http://plot01\", \"provenance\": { \"uri\":\"http://opensilex.dev/provenance/1598001689415\" }, \"metadata\":{ \"LabelView\" : \"side90\", \"paramA\" : \"90\"}}  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_data_file_with_http_info(description, file, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str description: File description with metadata (required)
        :param file file: Data file (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['description', 'file']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_data_file" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'description' is set
        if ('description' not in params or
                params['description'] is None):
            raise ValueError("Missing the required parameter `description` when calling `post_data_file`")  # noqa: E501
        # verify the required parameter 'file' is set
        if ('file' not in params or
                params['file'] is None):
            raise ValueError("Missing the required parameter `file` when calling `post_data_file`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}
        if 'description' in params:
            form_params.append(('description', params['description']))  # noqa: E501
        if 'file' in params:
            local_var_files['file'] = params['file']  # noqa: E501

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['multipart/form-data'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/datafiles', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
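    # Usage sketch (not part of the generated client): `description` is a
    # JSON string following the shape shown in the docstring above; the
    # values and file path here are hypothetical placeholders.
    #
    #     import json
    #     api = DataApi(api_client)
    #     description = json.dumps({
    #         "rdf_type": "http://www.opensilex.org/vocabulary/oeso#Image",
    #         "date": "2020-08-21T00:00:00+01:00",
    #         "target": "http://plot01",
    #         "provenance": {"uri": "http://opensilex.dev/provenance/1598001689415"},
    #         "metadata": {"LabelView": "side90", "paramA": "90"},
    #     })
    #     response = api.post_data_file(description, '/tmp/side90.png')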
    def post_data_file_paths(self, body, **kwargs):  # noqa: E501
        """Describe datafiles already stored in the configured storage system, giving their relative paths  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_data_file_paths(body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param list[DataFilePathCreationDTO] body: Metadata of the file (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.post_data_file_paths_with_http_info(body, **kwargs)  # noqa: E501
        else:
            (data) = self.post_data_file_paths_with_http_info(body, **kwargs)  # noqa: E501
            return data

    def post_data_file_paths_with_http_info(self, body, **kwargs):  # noqa: E501
        """Describe datafiles already stored in the configured storage system, giving their relative paths  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.post_data_file_paths_with_http_info(body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param list[DataFilePathCreationDTO] body: Metadata of the file (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method post_data_file_paths" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `post_data_file_paths`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        # if 'authorization' in params:
        #     header_params['Authorization'] = params['authorization']  # noqa: E501
        # if 'accept_language' in params:
        #     header_params['Accept-Language'] = params['accept_language']  # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = []  # noqa: E501

        return self.api_client.call_api(
            '/core/datafiles/description', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def search_data_list(self, **kwargs): # noqa: E501
        """Search data # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.search_data_list(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets uris
        :param list[str] variables: Search by variables uris
        :param list[str] devices: Search by devices uris
        :param float min_confidence: Search by minimal confidence index
        :param float max_confidence: Search by maximal confidence index
        :param list[str] provenances: Search by provenances
        :param str metadata: Search by metadata
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[DataGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.search_data_list_with_http_info(**kwargs) # noqa: E501
        else:
            (data) = self.search_data_list_with_http_info(**kwargs) # noqa: E501
            return data
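The wrapper above is the standard swagger-codegen sync/async dispatch: with `async_req=True` the `_with_http_info` call returns a thread-like handle whose `.get()` yields the result. A minimal stand-in illustrating the pattern (class and method names hypothetical, no HTTP involved):

```python
from multiprocessing.pool import ThreadPool


class MiniApi:
    """Toy analogue of the generated wrapper; the real client goes through ApiClient."""

    def __init__(self):
        self.pool = ThreadPool(1)

    def search_with_http_info(self, **kwargs):
        if kwargs.get('async_req'):
            # async path: hand back an AsyncResult, caller later calls .get()
            return self.pool.apply_async(lambda: ['row1', 'row2'])
        return ['row1', 'row2']

    def search(self, **kwargs):
        if kwargs.get('async_req'):
            return self.search_with_http_info(**kwargs)
        return self.search_with_http_info(**kwargs)


api = MiniApi()
sync_result = api.search()
thread = api.search(async_req=True)
assert thread.get() == sync_result == ['row1', 'row2']
```

The same calling convention applies to every generated method: identical arguments, and only the return type changes between the sync and async paths.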

    def search_data_list_with_http_info(self, **kwargs): # noqa: E501
        """Search data # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.search_data_list_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str start_date: Search by minimal date
        :param str end_date: Search by maximal date
        :param str timezone: Precise the timezone corresponding to the given dates
        :param list[str] experiments: Search by experiment uris
        :param list[str] targets: Search by targets uris
        :param list[str] variables: Search by variables uris
        :param list[str] devices: Search by devices uris
        :param float min_confidence: Search by minimal confidence index
        :param float max_confidence: Search by maximal confidence index
        :param list[str] provenances: Search by provenances
        :param str metadata: Search by metadata
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[DataGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['start_date', 'end_date', 'timezone', 'experiments', 'targets', 'variables', 'devices', 'min_confidence', 'max_confidence', 'provenances', 'metadata', 'order_by', 'page', 'page_size', ] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method search_data_list" % key
                )
            params[key] = val
        del params['kwargs']

        if 'min_confidence' in params and params['min_confidence'] > 1: # noqa: E501
            raise ValueError("Invalid value for parameter `min_confidence` when calling `search_data_list`, must be a value less than or equal to `1`") # noqa: E501
        if 'min_confidence' in params and params['min_confidence'] < 0: # noqa: E501
            raise ValueError("Invalid value for parameter `min_confidence` when calling `search_data_list`, must be a value greater than or equal to `0`") # noqa: E501
        if 'max_confidence' in params and params['max_confidence'] > 1: # noqa: E501
            raise ValueError("Invalid value for parameter `max_confidence` when calling `search_data_list`, must be a value less than or equal to `1`") # noqa: E501
        if 'max_confidence' in params and params['max_confidence'] < 0: # noqa: E501
            raise ValueError("Invalid value for parameter `max_confidence` when calling `search_data_list`, must be a value greater than or equal to `0`") # noqa: E501
        if 'page' in params and params['page'] < 0: # noqa: E501
            raise ValueError("Invalid value for parameter `page` when calling `search_data_list`, must be a value greater than or equal to `0`") # noqa: E501
        if 'page_size' in params and params['page_size'] < 0: # noqa: E501
            raise ValueError("Invalid value for parameter `page_size` when calling `search_data_list`, must be a value greater than or equal to `0`") # noqa: E501
        collection_formats = {}

        path_params = {}

        query_params = []
        if 'start_date' in params:
            query_params.append(('start_date', params['start_date'])) # noqa: E501
        if 'end_date' in params:
            query_params.append(('end_date', params['end_date'])) # noqa: E501
        if 'timezone' in params:
            query_params.append(('timezone', params['timezone'])) # noqa: E501
        if 'experiments' in params:
            query_params.append(('experiments', params['experiments'])) # noqa: E501
            collection_formats['experiments'] = 'multi' # noqa: E501
        if 'targets' in params:
            query_params.append(('targets', params['targets'])) # noqa: E501
            collection_formats['targets'] = 'multi' # noqa: E501
        if 'variables' in params:
            query_params.append(('variables', params['variables'])) # noqa: E501
            collection_formats['variables'] = 'multi' # noqa: E501
        if 'devices' in params:
            query_params.append(('devices', params['devices'])) # noqa: E501
            collection_formats['devices'] = 'multi' # noqa: E501
        if 'min_confidence' in params:
            query_params.append(('min_confidence', params['min_confidence'])) # noqa: E501
        if 'max_confidence' in params:
            query_params.append(('max_confidence', params['max_confidence'])) # noqa: E501
        if 'provenances' in params:
            query_params.append(('provenances', params['provenances'])) # noqa: E501
            collection_formats['provenances'] = 'multi' # noqa: E501
        if 'metadata' in params:
            query_params.append(('metadata', params['metadata'])) # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by'])) # noqa: E501
            collection_formats['order_by'] = 'multi' # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page'])) # noqa: E501
        if 'page_size' in params:
            query_params.append(('page_size', params['page_size'])) # noqa: E501

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization'] # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language'] # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = [] # noqa: E501

        return self.api_client.call_api(
            '/core/data', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[DataGetDTO]', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
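Setting `collection_formats['experiments'] = 'multi'` tells the client to serialize a list parameter as repeated query keys rather than a single joined value. The effect can be sketched with the standard library alone (the real serialization happens inside the generated `ApiClient`; the expansion loop below is an illustrative stand-in):

```python
from urllib.parse import urlencode

query_params = [
    ('start_date', '2020-01-01'),
    ('experiments', ['exp:1', 'exp:2']),  # declared with collection format 'multi'
]

# 'multi' expansion: emit one key=value pair per list element
expanded = []
for key, value in query_params:
    if isinstance(value, list):
        expanded.extend((key, v) for v in value)
    else:
        expanded.append((key, value))

qs = urlencode(expanded)
assert qs == 'start_date=2020-01-01&experiments=exp%3A1&experiments=exp%3A2'
```

Other swagger collection formats ('csv', 'ssv', 'pipes') would instead join the list into one value with the corresponding separator.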

    def search_provenance(self, **kwargs): # noqa: E501
        """Get provenances # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.search_provenance(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str name: Regex pattern for filtering by name
        :param str description: Search by description
        :param str activity: Search by activity URI
        :param str activity_type: Search by activity type
        :param str agent: Search by agent URI
        :param str agent_type: Search by agent type
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.search_provenance_with_http_info(**kwargs) # noqa: E501
        else:
            (data) = self.search_provenance_with_http_info(**kwargs) # noqa: E501
            return data

    def search_provenance_with_http_info(self, **kwargs): # noqa: E501
        """Get provenances # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.search_provenance_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param str name: Regex pattern for filtering by name
        :param str description: Search by description
        :param str activity: Search by activity URI
        :param str activity_type: Search by activity type
        :param str agent: Search by agent URI
        :param str agent_type: Search by agent type
        :param list[str] order_by: List of fields to sort as an array of fieldName=asc|desc
        :param int page: Page number
        :param int page_size: Page size
        :param str accept_language: Request accepted language
        :return: list[ProvenanceGetDTO]
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['name', 'description', 'activity', 'activity_type', 'agent', 'agent_type', 'order_by', 'page', 'page_size', ] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method search_provenance" % key
                )
            params[key] = val
        del params['kwargs']

        if 'page' in params and params['page'] < 0: # noqa: E501
            raise ValueError("Invalid value for parameter `page` when calling `search_provenance`, must be a value greater than or equal to `0`") # noqa: E501
        if 'page_size' in params and params['page_size'] < 0: # noqa: E501
            raise ValueError("Invalid value for parameter `page_size` when calling `search_provenance`, must be a value greater than or equal to `0`") # noqa: E501
        collection_formats = {}

        path_params = {}

        query_params = []
        if 'name' in params:
            query_params.append(('name', params['name'])) # noqa: E501
        if 'description' in params:
            query_params.append(('description', params['description'])) # noqa: E501
        if 'activity' in params:
            query_params.append(('activity', params['activity'])) # noqa: E501
        if 'activity_type' in params:
            query_params.append(('activity_type', params['activity_type'])) # noqa: E501
        if 'agent' in params:
            query_params.append(('agent', params['agent'])) # noqa: E501
        if 'agent_type' in params:
            query_params.append(('agent_type', params['agent_type'])) # noqa: E501
        if 'order_by' in params:
            query_params.append(('order_by', params['order_by'])) # noqa: E501
            collection_formats['order_by'] = 'multi' # noqa: E501
        if 'page' in params:
            query_params.append(('page', params['page'])) # noqa: E501
        if 'page_size' in params:
            query_params.append(('page_size', params['page_size'])) # noqa: E501

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization'] # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language'] # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = [] # noqa: E501

        return self.api_client.call_api(
            '/core/provenances', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='list[ProvenanceGetDTO]', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
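The generated guards reject negative paging values before any request is made, so a bad `page` or `page_size` fails fast on the client side. The same check as a standalone helper (helper name hypothetical):

```python
def check_paging(params):
    """Mirror of the generated page/page_size guards in the *_with_http_info methods."""
    if 'page' in params and params['page'] < 0:
        raise ValueError(
            "Invalid value for parameter `page`, must be a value greater than or equal to `0`")
    if 'page_size' in params and params['page_size'] < 0:
        raise ValueError(
            "Invalid value for parameter `page_size`, must be a value greater than or equal to `0`")
    return params


# valid values pass through unchanged
assert check_paging({'page': 0, 'page_size': 20}) == {'page': 0, 'page_size': 20}

# a negative page is rejected before any HTTP call would happen
try:
    check_paging({'page': -1})
    err = None
except ValueError as e:
    err = str(e)
assert err is not None and '`page`' in err
```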

    def update(self, **kwargs): # noqa: E501
        """Update data # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param DataUpdateDTO body: Data description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_with_http_info(**kwargs) # noqa: E501
        else:
            (data) = self.update_with_http_info(**kwargs) # noqa: E501
            return data

    def update_with_http_info(self, **kwargs): # noqa: E501
        """Update data # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param DataUpdateDTO body: Data description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['body', ] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization'] # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language'] # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = [] # noqa: E501

        return self.api_client.call_api(
            '/core/data', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
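Because every generated method takes `**kwargs`, a typo in a parameter name would otherwise be dropped silently; the whitelist loop above instead turns it into a `TypeError`. The pattern in isolation (function name hypothetical):

```python
def accept_kwargs(allowed, **kwargs):
    """Reject any keyword argument not in the allowed list, like the generated methods."""
    params = {}
    for key, val in kwargs.items():
        if key not in allowed:
            raise TypeError("Got an unexpected keyword argument '%s'" % key)
        params[key] = val
    return params


# a known parameter is accepted
assert accept_kwargs(['body'], body={'x': 1}) == {'body': {'x': 1}}

# a typo ('bdy' instead of 'body') is caught instead of being ignored
try:
    accept_kwargs(['body'], bdy={'x': 1})
    raised = False
except TypeError:
    raised = True
assert raised
```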

    def update_confidence(self, uri, **kwargs): # noqa: E501
        """Update confidence index # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_confidence(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Data URI (required)
        :param str authorization: Authentication token (required)
        :param DataConfidenceDTO body: Data description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_confidence_with_http_info(uri, **kwargs) # noqa: E501
        else:
            (data) = self.update_confidence_with_http_info(uri, **kwargs) # noqa: E501
            return data

    def update_confidence_with_http_info(self, uri, **kwargs): # noqa: E501
        """Update confidence index # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_confidence_with_http_info(uri, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str uri: Data URI (required)
        :param str authorization: Authentication token (required)
        :param DataConfidenceDTO body: Data description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['uri', 'body', ] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_confidence" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'uri' is set
        if ('uri' not in params or
                params['uri'] is None):
            raise ValueError("Missing the required parameter `uri` when calling `update_confidence`") # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'uri' in params:
            path_params['uri'] = params['uri'] # noqa: E501

        query_params = []

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization'] # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language'] # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = [] # noqa: E501

        return self.api_client.call_api(
            '/core/data/{uri}/confidence', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
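`path_params['uri']` is substituted into the `/core/data/{uri}/confidence` template by the client. Since OpenSILEX URIs contain characters such as `:` and `/` that are meaningful in a URL path, the value has to be percent-encoded when placed in the template. A sketch of that substitution with the standard library (the helper name and the example URI are illustrative, not taken from the source):

```python
from urllib.parse import quote


def build_path(template, path_params):
    # Replace each {name} with its percent-encoded value; safe='' also
    # encodes ':' and '/', so the URI stays a single path segment.
    for name, value in path_params.items():
        template = template.replace('{%s}' % name, quote(str(value), safe=''))
    return template


path = build_path('/core/data/{uri}/confidence', {'uri': 'opensilex:data/1'})
assert path == '/core/data/opensilex%3Adata%2F1/confidence'
```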

    def update_provenance(self, **kwargs): # noqa: E501
        """Update a provenance # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_provenance(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param ProvenanceUpdateDTO body: Provenance description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_provenance_with_http_info(**kwargs) # noqa: E501
        else:
            (data) = self.update_provenance_with_http_info(**kwargs) # noqa: E501
            return data

    def update_provenance_with_http_info(self, **kwargs): # noqa: E501
        """Update a provenance # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_provenance_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str authorization: Authentication token (required)
        :param ProvenanceUpdateDTO body: Provenance description
        :param str accept_language: Request accepted language
        :return: ObjectUriResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['body', ] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_provenance" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization'] # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language'] # noqa: E501

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['application/json']) # noqa: E501

        # Authentication setting
        auth_settings = [] # noqa: E501

        return self.api_client.call_api(
            '/core/provenances', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ObjectUriResponse', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def validate_csv(self, provenance, file, **kwargs): # noqa: E501
        """Import a CSV file for the given provenanceURI. # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.validate_csv(provenance, file, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str provenance: Provenance URI (required)
        :param file file: File (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataCSVValidationDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.validate_csv_with_http_info(provenance, file, **kwargs) # noqa: E501
        else:
            (data) = self.validate_csv_with_http_info(provenance, file, **kwargs) # noqa: E501
            return data

    def validate_csv_with_http_info(self, provenance, file, **kwargs): # noqa: E501
        """Import a CSV file for the given provenanceURI. # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.validate_csv_with_http_info(provenance, file, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str provenance: Provenance URI (required)
        :param file file: File (required)
        :param str authorization: Authentication token (required)
        :param str accept_language: Request accepted language
        :return: DataCSVValidationDTO
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['provenance', 'file', ] # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method validate_csv" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'provenance' is set
        if ('provenance' not in params or
                params['provenance'] is None):
            raise ValueError("Missing the required parameter `provenance` when calling `validate_csv`") # noqa: E501
        # verify the required parameter 'file' is set
        if ('file' not in params or
                params['file'] is None):
            raise ValueError("Missing the required parameter `file` when calling `validate_csv`") # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'provenance' in params:
            query_params.append(('provenance', params['provenance'])) # noqa: E501

        header_params = {}
        #if 'authorization' in params:
        #    header_params['Authorization'] = params['authorization'] # noqa: E501
        #if 'accept_language' in params:
        #    header_params['Accept-Language'] = params['accept_language'] # noqa: E501

        form_params = []
        local_var_files = {}
        if 'file' in params:
            local_var_files['file'] = params['file'] # noqa: E501

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json']) # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
            ['multipart/form-data']) # noqa: E501

        # Authentication setting
        auth_settings = [] # noqa: E501

        return self.api_client.call_api(
            '/core/data/import_validation', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='DataCSVValidationDTO', # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
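Both required arguments of `validate_csv` are checked eagerly, so a missing file fails before any multipart upload is attempted. The guard in isolation (helper name hypothetical):

```python
def require(params, *names):
    """Raise like the generated client when a required parameter is absent or None."""
    for name in names:
        if name not in params or params[name] is None:
            raise ValueError(
                "Missing the required parameter `%s` when calling `validate_csv`" % name)


# both required parameters present: no error
require({'provenance': 'prov:1', 'file': b'col1,col2\n'}, 'provenance', 'file')

# file explicitly None: rejected with the parameter name in the message
try:
    require({'provenance': 'prov:1', 'file': None}, 'provenance', 'file')
    missing = None
except ValueError as e:
    missing = str(e)
assert missing is not None and '`file`' in missing
```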
# blogtrans/readme.py (repo: miaout17/blogtrans, license: MIT)
import base64
html = base64.b64decode(
    "PGgxPkJsb2d0cmFuczwvaDE+Cgo8cD5CbG9ndHJhbnPmmK/kuIDlpZfovYnmj5tibG9n6LOH5paZ5qC85byP55qE6ZaL5pS+5rqQ56K86Luf6auUPC9wPgoKPGgyPuazqOaEj++8muW3suefpeWVj+mhjDwvaDI+Cgo8dWw+CjxsaT7nhKHlkI1YTUzmqpTmnInml6XmnJ/kuI3mraPnorrnmoTllY/poYzvvIjlpoIy5pyIMzHml6XvvIk8YnI+CkJsb2d0cmFuc+acg+iHquWLleippuWcluS/ruato++8iOWmguWwhzLmnIgzMeaXpeaUueeCujLmnIgyOOaXpe+8iTwvbGk+CjxsaT7lpoLljK/lh7rnmoRCbG9nZ2VyIEF0b20gWE1M54Sh5rOV5YWo6YOo5Yyv5YWlQmxvZ2dlcu+8iOWmguWMr+WFpTgwMOevh+WPquWHuuePvjMwMOevh++8iTxicj4K5Y+v6IO95piv55Sx5pa85paH56ug6aGe5Yil5pyJ54m55q6K56ym6Jmf77yM5Y+v6Kmm6JGX5L2/55So44CM5bel5YW377ya5riF6Zmk5omA5pyJ5paH56ug6aGe5Yil44CNPC9saT4KPC91bD4KCgo8cD7ntLDnr4Dlj4rlhbbku5blt7Lnn6XllY/poYzoqbPopos8YSBocmVmPSJodHRwOi8vbWlhb3V0MTcuZ2l0aHViLmlvL2Jsb2d0cmFucy9mYXEuaHRtbCI+RkFRPC9hPu+8jOWmgueEoeazleino+axuuatoei/jjxhIGhyZWY9Imh0dHA6Ly9taWFvdXQxNy5naXRodWIuaW8vYmxvZ3RyYW5zL3JlcG9ydC5odG1sIj7lm57loLHllY/poYw8L2E+44CCCuS5n+WPr+WPg+iAgzxhIGhyZWY9Imh0dHA6Ly9ibG9nLm1pYW91dDE3Lm5ldC8iPuS9nOiAhUJsb2c8L2E+55qE6LOH6KiK44CCPC9wPgoKPGgyPuaUr+aPtOagvOW8jzwvaDI+Cgo8dWw+CjxsaT48YSBocmVmPSJodHRwOi8vd3d3Lm1vdmFibGV0eXBlLm9yZy9kb2N1bWVudGF0aW9uL2FwcGVuZGljZXMvaW1wb3J0LWV4cG9ydC1mb3JtYXQuaHRtbCI+TW92YWJsZVR5cGU8L2E+IOWMr+WFpeWPiuWMr+WHuu+8iOebuOWuueaWvOeXnuWuoumCpu+8iTwvbGk+CjxsaT48YSBocmVmPSJodHRwOi8vd3d3LndyZXRjaC5jYy9ibG9nLyI+54Sh5ZCN5bCP56uZWE1MPC9hPiDljK/lhaU8L2xpPgo8bGk+PGEgaHJlZj0iaHR0cDovL3d3dy5ibG9nZ2VyLmNvbSI+QmxvZ2dlciBBdG9tPC9hPiDljK/lh7o8L2xpPgo8L3VsPgoKCjxoMj7mm7TmlrDoqJjpjIQ8L2gyPgoKPHA+QmxvZ3RyYW5zIDEuMS4wICgyMDEzLzA5LzAzKTwvcD4KCjx1bD4KPGxpPuiHquWLleS/ruato+eEoeWQjVhNTOaqlOaXpeacn+mMr+iqpOeahOWVj+mhjO+8iOWmguiHquWLleWwhzIvMzHkv67mraPngroyLzI477yJPC9saT4KPGxpPuaWsOWinuOAjOW3peWFt++8mua4hemZpOaJgOacieaWh+eroOmhnuWIpeOAjeWKn+iDve+8iOeEoeazleWMr+WFpeaJgOacieaWh+eroOiHs0Jsb2dnZXLmmYLlj6/lmJfoqabkvb/nlKjvvIk8L2xpPgo8bGk+5L+u5q2j55WZ6KiA6ZmE5Yqg5pa86Yyv6Kqk5paH56ug55qEYnVnPC9saT4KPC91bD4KCgo8aDI+UnVubmluZyBCbG9ndHJhbnM8L2gyPgoKPGgzPldpbmRvd3M8L2gzPgoKPHA+VGhlcmUgaXMgb25seSBvZmZ"
    "pY2lhbCBidWlsZCBmb3IgV2luZG93cyBzeXN0ZW0uClRoZSBXaW5kb3dzIGJ1aWxkIGlzIGF2YWlsYWJsZSBvbiBTb3VyY2VGb3JnZS4KVmlzaXQgPGEgaHJlZj0iaHR0cHM6Ly9zb3VyY2Vmb3JnZS5uZXQvcHJvamVjdC9zaG93ZmlsZXMucGhwP2dyb3VwX2lkPTIxMTU0OCI+cHJvamVjdCBkb3dubG9hZCBwYWdlPC9hPiB0byBnZXQgbmV3ZXN0IGJ1aWxkLjwvcD4KCjxoMz5MaW51eCBhbmQgTWFjPC9oMz4KCjxwPkxpbnV4IGFuZCBNYWMgdXNlciBjYW4gZ2V0IHRoZSBzb3VyY2UgY29kZSBhbmQgYnVpbGQgYnkgeW91cnNlbGYuClBsZWFzZSByZWZlciB0byA8c3Ryb25nPkJ1aWxkIGZyb20gU291cmNlPC9zdHJvbmc+IHNlY3Rpb24uPC9wPgoKPGgyPlVzYWdlPC9oMj4KCjxwPlBsZWFzZSByZWZlciB0byB0aGUgPGEgaHJlZj0iaHR0cDovL21pYW91dDE3LmdpdGh1Yi5jb20vYmxvZ3RyYW5zLyI+QmxvZ3RyYW5zIEhvbWVwYWdlPC9hPiAoVHJhZGl0aW9uYWwgQ2hpbmVzZSkuPC9wPgoKPGgyPkJ1aWxkIGZyb20gU291cmNlPC9oMj4KCjxwPlRoZSBjb2RlIGlzIG1haW5seSBob3N0ZWQgb24gPGEgaHJlZj0iaHR0cHM6Ly9naXRodWIuY29tL21pYW91dDE3L2Jsb2d0cmFucyI+R2l0aHViPC9hPgooVGhlcmUgaXMgYWxzbyBhIG1pcnJvciBvbiBTb3VyY2Vmb3JnZSwgYnV0IGl0J3Mgbm90IGFsd2F5cyB1cC10by1kYXRlKS48L3A+Cgo8cD5Zb3UgY2FuIGdldCB0aGUgc291cmNlIGNvZGUgdmlhIGdpdDo8L3A+Cgo8cHJlPjxjb2RlPmdpdCBjbG9uZSBnaXRAZ2l0aHViLmNvbTptaWFvdXQxNy9ibG9ndHJhbnMuZ2l0CjwvY29kZT48L3ByZT4KCjxwPm9yIGRpcmVjdGx5IGRvd25sb2FkIHRoZSA8YSBocmVmPSJodHRwczovL2dpdGh1Yi5jb20vbWlhb3V0MTcvYmxvZ3RyYW5zL3RhcmJhbGwvbWFzdGVyIj50YXJiYWxsPC9hPi48L3A+Cgo8aDM+RGVwZW5kZW5jaWVzPC9oMz4KCjxwPlBsZWFzZSBpbnN0YWxsIGZvbGxvd2luZyBpdGVtczo8L3A+Cgo8dWw+CjxsaT48YSBocmVmPSJodHRwOi8vd3d3LnB5dGhvbi5vcmcvIj5QeXRob248L2E+IDIuNiBvciBhYm92ZSAoYnV0IG5vdCAzLngpPC9saT4KPGxpPjxhIGhyZWY9Imh0dHA6Ly93d3cud3hweXRob24ub3JnLyI+d3hQeXRob248L2E+IDIuOCBvciBhYm92ZTwvbGk+CjxsaT48YSBocmVmPSJodHRwOi8vcHN5Y28uc291cmNlZm9yZ2UubmV0LyI+cHN5Y288L2E+IChvcHRpb25hbCk8L2xpPgo8L3VsPgoKCjxwPkFmdGVyIHRoYXQsIHlvdSBjYW4gZXhlY3V0ZSBCbG9nVHJhbnM6PC9wPgoKPHByZT48Y29kZT5weXRob24gYmxvZ3RyYW5zLnB5CjwvY29kZT48L3ByZT4KCjxoMz5UZXN0aW5nPC9oMz4KCjxwPkRvY3VtZW50IFRCRC4uPC9wPgoKPGgyPkxpZWNuc2U8L2gyPgoKPHA+QmxvZ3RyYW5zIGlzIHJlbGVhc2VkIHdpdGggTUlUIExJQ0VOU0UsIHNlZSBMSUNFTlNFIGZpbGUgZm9yIGRldGFpbHMuPC9wPgo="
).decode("UTF-8")
# main/data/admin.py (repo: mikron22/clothes_finder, license: MIT)
from django.contrib import admin
from .models import Cloth, Person, Keywords, Settings
# Register your models here.
admin.site.register([Cloth, Person, Keywords, Settings])
# src/fanalysis/Download/api.py (repo: mansaluke/fanalysis, license: Apache-2.0)
import requests
from bs4 import BeautifulSoup
standardpath = 'fanalysis\\data\\get_fx_data'
def download_fx_m1_data_year(year='2018', pair='eurgbp', path=standardpath):
    """
    Download 1-Minute FX data for a whole year.
    :param year: Trading year. Format is 2016.
    :param pair: Currency pair. Example: eurgbp.
    :return: ZIP filename.
    """
referer = 'http://www.histdata.com/download-free-forex-historical-data/?/ascii/1-minute-bar-quotes/{}/{}'.format(
pair.lower(), year)
# Referer is the most important thing here.
headers = {'Host': 'www.histdata.com',
'Connection': 'keep-alive',
'Content-Length': '104',
'Cache-Control': 'max-age=0',
'Origin': 'http://www.histdata.com',
'Upgrade-Insecure-Requests': '1',
'Content-Type': 'application/x-www-form-urlencoded',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
'Referer': referer}
r1 = requests.get(referer)
assert r1.status_code == 200, 'Make sure the website www.histdata.com is up.'
soup = BeautifulSoup(r1.content, 'html.parser')
    try:
        token = soup.find('input', {'id': 'tk'}).attrs['value']
        assert len(token) > 0
    except (AttributeError, AssertionError):
        raise AssertionError('There is no token. Please make sure your year/month/pair is correct. '
                             'Example is year=2016, month=7, pair=eurgbp')
data = {'tk': token,
'date': str(year),
'datemonth': str(year),
'platform': 'ASCII',
'timeframe': 'M1',
'fxpair': pair.upper()}
r = requests.post(url='http://www.histdata.com/get.php',
data=data,
headers=headers)
assert len(r.content) > 0, 'No data could be found here.'
print(data)
output_filename = path + '\\DAT_ASCII_{}_M1_{}.zip'.format(pair.upper(), '{}'.format(year))
with open(output_filename, 'wb') as f:
for chunk in r.iter_content(chunk_size=1024):
if chunk:
f.write(chunk)
print('Wrote to {}'.format(output_filename))
return output_filename
def download_fx_m1_data(year='2019', month='1', pair='eurgbp', path=standardpath):
"""
Download 1-Minute FX data per month.
:param year: Trading year. Format is 2016.
:param month: Trading month. Format is 7 or 12.
:param pair: Currency pair. Example: eurgbp.
:return: ZIP Filename.
"""
referer = 'http://www.histdata.com/download-free-forex-historical-data/?/ascii/1-minute-bar-quotes/{}/{}/{}'.format(
pair.lower(),
year, month)
# Referer is the most important thing here.
headers = {'Host': 'www.histdata.com',
'Connection': 'keep-alive',
'Content-Length': '104',
'Cache-Control': 'max-age=0',
'Origin': 'http://www.histdata.com',
'Upgrade-Insecure-Requests': '1',
'Content-Type': 'application/x-www-form-urlencoded',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
'Referer': referer}
r1 = requests.get(referer)
assert r1.status_code == 200, 'Make sure the website www.histdata.com is up.'
soup = BeautifulSoup(r1.content, 'html.parser')
    try:
        token = soup.find('input', {'id': 'tk'}).attrs['value']
        assert len(token) > 0
    except (AttributeError, AssertionError):
        raise AssertionError('There is no token. Please make sure your year/month/pair is correct. '
                             'Example is year=2016, month=7, pair=eurgbp')
data = {'tk': token,
'date': str(year),
'datemonth': '{}{}'.format(year, str(month).zfill(2)),
'platform': 'ASCII',
'timeframe': 'M1',
'fxpair': pair.upper()}
r = requests.post(url='http://www.histdata.com/get.php',
data=data,
headers=headers)
assert len(r.content) > 0, 'No data could be found here.'
print(data)
output_filename = path + '\\DAT_ASCII_{}_M1_{}.zip'.format(pair.upper(), '{}{}'.format(year, str(month).zfill(2)))
with open(output_filename, 'wb') as f:
for chunk in r.iter_content(chunk_size=1024):
if chunk:
f.write(chunk)
print('Wrote to {}'.format(output_filename))
return output_filename
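# A portable variant of the filename construction used above. This helper is
# hypothetical (not part of the original module); it replaces the hard-coded
# '\\' separators with os.path.join so the output path also works on
# non-Windows platforms.
import os

def build_output_filename(path, pair, year, month=None):
    """Return the ZIP filename the downloaders write, for a year or a month."""
    stamp = str(year) if month is None else '{}{}'.format(year, str(month).zfill(2))
    return os.path.join(path, 'DAT_ASCII_{}_M1_{}.zip'.format(pair.upper(), stamp))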
if __name__ == '__main__':
download_fx_m1_data_year(2018)
#download_fx_m1_data(2016, 3, 'eurgbp')
| 38.7 | 120 | 0.575797 | 570 | 4,644 | 4.610526 | 0.261404 | 0.041857 | 0.053272 | 0.041096 | 0.9414 | 0.93379 | 0.901826 | 0.901826 | 0.901826 | 0.901826 | 0 | 0.029152 | 0.268734 | 4,644 | 119 | 121 | 39.02521 | 0.7447 | 0.110465 | 0 | 0.809524 | 0 | 0.047619 | 0.353785 | 0.084117 | 0 | 0 | 0 | 0 | 0.095238 | 1 | 0.02381 | false | 0 | 0.02381 | 0 | 0.071429 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7d7715f08dddc22f803b90345aa3415ce879030f | 193 | py | Python | tests/examples/test_custom_snapshot_name_suffix.py | dtczest/syrupy | c37d6521852c96cf1ae01873c02b94410d38b663 | [
"Apache-2.0"
] | 1 | 2022-02-22T08:16:31.000Z | 2022-02-22T08:16:31.000Z | tests/examples/test_custom_snapshot_name_suffix.py | dtczest/syrupy | c37d6521852c96cf1ae01873c02b94410d38b663 | [
"Apache-2.0"
] | null | null | null | tests/examples/test_custom_snapshot_name_suffix.py | dtczest/syrupy | c37d6521852c96cf1ae01873c02b94410d38b663 | [
"Apache-2.0"
] | null | null | null | def test_snapshot_custom_snapshot_name_suffix(snapshot):
assert "Syrupy is amazing!" == snapshot(name="test_is_amazing")
assert "Syrupy is awesome!" == snapshot(name="test_is_awesome")
| 48.25 | 67 | 0.766839 | 26 | 193 | 5.346154 | 0.423077 | 0.258993 | 0.201439 | 0.258993 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11399 | 193 | 3 | 68 | 64.333333 | 0.812866 | 0 | 0 | 0 | 0 | 0 | 0.341969 | 0 | 0 | 0 | 0 | 0 | 0.666667 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7dc1967074726c2ddb3040c8407ffa5d733fcdbc | 4,246 | py | Python | tests/commands/test_beam_integration.py | o3seespy/o3seespy | 4fdd942370df1ac8d454e361f651405717b8584c | [
"MIT",
"BSD-3-Clause"
] | 16 | 2019-10-24T17:58:46.000Z | 2022-03-01T19:48:06.000Z | tests/commands/test_beam_integration.py | o3seespy/o3seespy | 4fdd942370df1ac8d454e361f651405717b8584c | [
"MIT",
"BSD-3-Clause"
] | 5 | 2020-04-17T01:39:27.000Z | 2020-12-18T05:07:58.000Z | tests/commands/test_beam_integration.py | o3seespy/o3seespy | 4fdd942370df1ac8d454e361f651405717b8584c | [
"MIT",
"BSD-3-Clause"
] | 6 | 2020-02-20T02:13:11.000Z | 2021-11-01T19:08:41.000Z | import o3seespy as o3 # for testing only
def test_lobatto():
osi = o3.OpenSeesInstance(ndm=2)
sec = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.Lobatto(osi, sec=sec, big_n=1)
def test_legendre():
osi = o3.OpenSeesInstance(ndm=2)
sec = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.Legendre(osi, sec=sec, big_n=1)
def test_newton_cotes():
osi = o3.OpenSeesInstance(ndm=2)
sec = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.NewtonCotes(osi, sec=sec, big_n=1)
def test_radau():
osi = o3.OpenSeesInstance(ndm=2)
sec = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.Radau(osi, sec=sec, big_n=1)
def test_trapezoidal():
osi = o3.OpenSeesInstance(ndm=2)
sec = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.Trapezoidal(osi, sec=sec, big_n=1)
def test_composite_simpson():
osi = o3.OpenSeesInstance(ndm=2)
sec = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.CompositeSimpson(osi, sec=sec, big_n=1)
def test_user_defined():
osi = o3.OpenSeesInstance(ndm=2)
secs = [o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0),
o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0)]
o3.beam_integration.UserDefined(osi, big_n=2, secs=secs, locs=[0.2, 0.9], wts=[0.5, 0.5])
def test_fixed_location():
osi = o3.OpenSeesInstance(ndm=2)
secs = [o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0),
o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0)]
o3.beam_integration.FixedLocation(osi, big_n=2, secs=secs, locs=[0.2, 0.9])
def test_low_order():
osi = o3.OpenSeesInstance(ndm=2)
secs = [o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0),
o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0)]
o3.beam_integration.LowOrder(osi, big_n=2, secs=secs, locs=[0.2, 0.9], wts=[0.5, 0.5])
def test_mid_distance():
osi = o3.OpenSeesInstance(ndm=2)
secs = [o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0),
o3.section.Elastic2D(osi, e_mod=1.0, area=1.0, iz=1.0)]
o3.beam_integration.MidDistance(osi, big_n=2, secs=secs, locs=[0.2, 0.9])
def test_user_hinge():
osi = o3.OpenSeesInstance(ndm=2)
sec_e = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
secs_l = [o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)]
secs_r = [o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)]
o3.beam_integration.UserHinge(osi, sec_e=sec_e, np_l=1, secs_ls=secs_l, locs_l=[1], wts_l=[1], np_r=1,
secs_rs=secs_r, locs_r=[1], wts_r=[1])
def test_hinge_midpoint():
osi = o3.OpenSeesInstance(ndm=2)
sec_i = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_j = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_e = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.HingeMidpoint(osi, sec_i=sec_i, lp_i=1.0, sec_j=sec_j, lp_j=1.0, sec_e=sec_e)
def test_hinge_radau():
osi = o3.OpenSeesInstance(ndm=2)
sec_i = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_j = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_e = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.HingeRadau(osi, sec_i=sec_i, lp_i=1.0, sec_j=sec_j, lp_j=1.0, sec_e=sec_e)
def test_hinge_radau_two():
osi = o3.OpenSeesInstance(ndm=2)
sec_i = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_j = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_e = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.HingeRadauTwo(osi, sec_i=sec_i, lp_i=1.0, sec_j=sec_j, lp_j=1.0, sec_e=sec_e)
# @pytest.mark.skip()
# def test_beamhinge_endpoint():
# osi = o3.OpenSeesInstance(ndm=2)
# sec_j = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
# sec_e = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
# o3.beam_integration.BeamhingeEndpoint(osi, lp_i=1.0, sec_j=sec_j, lp_j=1.0, sec_e=sec_e)
def test_hinge_endpoint():
osi = o3.OpenSeesInstance(ndm=2)
sec_i = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_j = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
sec_e = o3.section.Elastic2D(osi, 10.0, 1.0, 1.0)
o3.beam_integration.HingeEndpoint(osi, sec_i=sec_i, lp_i=1.0, sec_j=sec_j, lp_j=1.0, sec_e=sec_e)
| 35.983051 | 106 | 0.657089 | 823 | 4,246 | 3.232078 | 0.092345 | 0.06015 | 0.05188 | 0.244737 | 0.834211 | 0.834211 | 0.824812 | 0.807519 | 0.76015 | 0.76015 | 0 | 0.107718 | 0.166981 | 4,246 | 117 | 107 | 36.290598 | 0.644331 | 0.071832 | 0 | 0.552632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.197368 | false | 0 | 0.013158 | 0 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c4a77fa20f4742401cdb5b6feee0a6ca5fc86b13 | 31,186 | py | Python | ops-tests/component/test_lacpd_ct_lag_fallback.py | ashutoshshanker/ops-lacpd | 4841d5b777691db01b5d971b21407c788c764711 | [
"Apache-2.0"
] | null | null | null | ops-tests/component/test_lacpd_ct_lag_fallback.py | ashutoshshanker/ops-lacpd | 4841d5b777691db01b5d971b21407c788c764711 | [
"Apache-2.0"
] | null | null | null | ops-tests/component/test_lacpd_ct_lag_fallback.py | ashutoshshanker/ops-lacpd | 4841d5b777691db01b5d971b21407c788c764711 | [
"Apache-2.0"
] | 2 | 2019-10-14T10:14:26.000Z | 2021-09-10T08:18:15.000Z | # Copyright (C) 2016 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pytest
from lib_test import (
clear_port_parameter,
print_header,
remove_port_parameter,
sw_set_intf_user_config,
sw_clear_user_config,
set_intf_parameter,
set_port_parameter,
sw_set_intf_pm_info,
sw_create_bond,
sw_wait_until_all_sm_ready,
sw_wait_until_one_sm_ready,
sw_wait_until_ready,
verify_intf_in_bond,
verify_intf_not_in_bond,
verify_intf_status
)
TOPOLOGY = """
#
# +-------+ +-------+
# | sw1 <-----> sw2 |
# +-------+ +-------+
#
# Nodes
[type=openswitch name="switch 1"] sw1
[type=openswitch name="switch 2"] sw2
# Links
# 1 Gig ports
sw1:1 -- sw2:1
sw1:2 -- sw2:2
"""
# Interfaces from 1-2 are 1G ports.
sw_intf_start = 1
sw_intf_end = 3
intf_labels = ['1', '2']
test_lag = 'lag1'
###############################################################################
#
# ACTOR STATE STATE MACHINES VARIABLES
#
###############################################################################
# Everything is working and 'Collecting and Distributing'
active_ready = '"Activ:1,TmOut:\d,Aggr:1,Sync:1,Col:1,Dist:1,Def:0,Exp:0"'
# No fallback enabled, interface is 'dead'
active_no_fallback = '"Activ:1,TmOut:\d,Aggr:1,Sync:0,Col:0,Dist:0,Def:1,Exp:1"'
# Fallback enabled, in 'Defaulted' but 'Collecting and Distributing'
active_fallback = '"Activ:1,TmOut:\d,Aggr:1,Sync:1,Col:1,Dist:1,Def:1,Exp:0"'
# Interface from same LAG with Fallback enabled but not assign to 'listen'
active_other_intf = '"Activ:1,TmOut:\d,Aggr:1,Sync:0,Col:0,Dist:0,Def:1,Exp:0"'
passive_ready = '"Activ:0,TmOut:\d,Aggr:1,Sync:1,Col:1,Dist:1,Def:0,Exp:0"'
# No fallback enabled, interface is 'dead'
passive_no_fallback = '"Activ:0,TmOut:\d,Aggr:1,Sync:0,Col:0,Dist:0,Def:1,Exp:1"'
# Fallback enabled, in 'Defaulted' but 'Collecting and Distributing'
passive_fallback = '"Activ:0,TmOut:\d,Aggr:1,Sync:1,Col:1,Dist:1,Def:1,Exp:0"'
# Interface from same LAG with Fallback enabled but not assign to 'listen'
passive_other_intf = '"Activ:0,TmOut:\d,Aggr:1,Sync:0,Col:0,Dist:0,Def:1,Exp:0"'
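# The state strings above pack eight LACP actor-state bits into one
# 'key:value' list. A small helper (hypothetical, not used by the tests)
# makes the fields explicit; note that 'TmOut' appears as the regex \d in
# the patterns above, so this parser only applies to concrete daemon output.
def parse_lacp_state(state):
    """Parse an '"Activ:1,TmOut:1,..."' state string into a dict of ints."""
    return {key: int(val) for key, val in
            (field.split(':') for field in state.strip('"').split(','))}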
###############################################################################
#
# TEST TOGGLE VARIABLES
#
###############################################################################
disable_admin = ['admin=down']
enable_admin = ['admin=up']
disable_lag = ['lacp=off']
active_lag = ['lacp=active']
passive_lag = ['lacp=passive']
fallback_key = 'lacp-fallback-ab'
other_config_key = 'other_config'
enable_fallback = ['%s:%s="true"' % (other_config_key, fallback_key)]
disable_fallback = ['%s:%s="false"' % (other_config_key, fallback_key)]
@pytest.fixture(scope='module')
def main_setup(request, topology):
"""Common tests configuration.
    This configuration is applied to all Fallback test cases, enabled or
    not, and after each test case the configuration must remain the same so
    that each test stays atomic.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
# Create two dynamic LAG with two ports each.
sw_create_bond(sw1, test_lag, intf_labels, lacp_mode='active')
sw_create_bond(sw2, test_lag, intf_labels, lacp_mode='active')
set_port_parameter(sw1, test_lag, ['other_config:lacp-time=fast'])
set_port_parameter(sw2, test_lag, ['other_config:lacp-time=fast'])
# Enable both the interfaces.
for intf in intf_labels:
set_intf_parameter(sw1, intf, ['user_config:admin=up',
'other_config:lacp-aggregation-key=1'])
set_intf_parameter(sw2, intf, ['user_config:admin=up',
'other_config:lacp-aggregation-key=1'])
sw_wait_until_ready([sw1, sw2], intf_labels)
print('Verify all interfaces SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##############################################################################
#
# FALLBACK DISABLED TESTS
#
##############################################################################
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_nf_toggle_admin_flag(topology, step, main_setup):
"""Toggle 'admin' flag.
To disable LAG 1 on both switches, 'admin' flag will be toggled between
'up' and 'down'.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback disabled, toggle "admin" flag')
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_admin)
print('Verify that all sw2 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw2], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, enable_admin)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_admin)
print('Verify that all sw1 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw1], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, enable_admin)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_nf_toggle_lacp_flag_active(topology, step, main_setup):
"""Toggle 'lacp' flag to 'active'.
To disable LAG 1 on both switches, 'lacp' flag will be toggled between
    'off' and 'active'.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback disabled, toggle "lacp" flag to active')
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_lag)
print('Verify that all sw2 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw2], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, active_lag)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_lag)
print('Verify that all sw1 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw1], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, active_lag)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_nf_toggle_lacp_flag_passive(topology, step, main_setup):
"""Toggle 'lacp' flag to 'passive'.
To disable LAG 1 on both switches, 'lacp' flag will be toggled between
    'off' and 'passive'.
    Note: both switches must not be set to 'passive' at the same time, or
    connectivity will be lost.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback disabled, toggle "lacp" flag to passive')
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Setting sw2 LAG to "passive"')
set_port_parameter(sw2, test_lag, passive_lag)
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_lag)
print('Verify that all sw2 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw2], intf_labels, passive_no_fallback)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, active_lag)
print('Verify all interface SM from sw1 are "active"')
sw_wait_until_all_sm_ready([sw1], intf_labels, active_ready)
print('Verify all interface SM from sw2 are "passive"')
sw_wait_until_all_sm_ready([sw2], intf_labels, passive_ready)
print('Setting sw2 LAG to "active"')
set_port_parameter(sw2, test_lag, active_lag)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Setting sw1 LAG to "passive"')
set_port_parameter(sw1, test_lag, passive_lag)
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_lag)
print('Verify that all sw1 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw1], intf_labels, passive_no_fallback)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, active_lag)
print('Verify all interface SM from sw1 are "passive"')
sw_wait_until_all_sm_ready([sw1], intf_labels, passive_ready)
print('Verify all interface SM from sw2 are "active"')
sw_wait_until_all_sm_ready([sw2], intf_labels, active_ready)
print('Setting sw1 LAG to "active"')
set_port_parameter(sw1, test_lag, active_lag)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_nf_false_flag_toggle_admin_flag(topology, step, main_setup):
"""Toggle 'admin' flag with 'false' flag.
The source code must handle also the possibility to retrieve from OVSDB the
'lacp-fallback-ab' flag with 'false' even when by default the flag is not
present.
To disable LAG 1 on both switches, 'admin' flag will be toggled between
'up' and 'down'.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback disabled with false flag, toggle "admin" flag')
##########################################################################
# Add 'fallback' flag as 'false'
##########################################################################
print('Setting "lacp-fallback-ab" flag into OVSDB')
set_port_parameter(sw1, test_lag, disable_fallback)
set_port_parameter(sw2, test_lag, disable_fallback)
    ##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_admin)
print('Verify that all sw2 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw2], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, enable_admin)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_admin)
print('Verify that all sw1 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw1], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, enable_admin)
##########################################################################
# Remove added flag
##########################################################################
print('Clearing "lacp-fallback-ab" flag from OVSDB')
remove_port_parameter(sw1, test_lag, other_config_key, [fallback_key])
remove_port_parameter(sw2, test_lag, other_config_key, [fallback_key])
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_nf_false_flag_toggle_lacp_active(topology, step, main_setup):
"""Toggle 'lacp' flag to 'active' with 'false' flag.
The source code must handle also the possibility to retrieve from OVSDB the
'lacp-fallback-ab' flag with 'false' even when by default the flag is not
present.
To disable LAG 1 on both switches, 'lacp' flag will be toggled between
    'off' and 'active'.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback disabled with false flag, toggle "lacp" flag to '
'active')
##########################################################################
# Add 'fallback' flag as 'false'
##########################################################################
print('Setting "lacp-fallback-ab" flag into OVSDB')
set_port_parameter(sw1, test_lag, disable_fallback)
set_port_parameter(sw2, test_lag, disable_fallback)
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_lag)
print('Verify that all sw2 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw2], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, active_lag)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_lag)
print('Verify that all sw1 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw1], intf_labels, active_no_fallback)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, active_lag)
##########################################################################
# Remove added flag
##########################################################################
print('Clearing "lacp-fallback-ab" flag from OVSDB')
remove_port_parameter(sw1, test_lag, other_config_key, [fallback_key])
remove_port_parameter(sw2, test_lag, other_config_key, [fallback_key])
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_nf_false_flag_toggle_lacp_passive(topology, step, main_setup):
"""Toggle 'lacp' flag to 'passive' with 'false' flag.
The source code must handle also the possibility to retrieve from OVSDB the
'lacp-fallback-ab' flag with 'false' even when by default the flag is not
present.
To disable LAG 1 on both switches, 'lacp' flag will be toggled between
    'off' and 'passive'.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback disabled with false flag, toggle "lacp" flag to '
'passive')
##########################################################################
# Add 'fallback' flag as 'false'
##########################################################################
print('Setting "lacp-fallback-ab" flag into OVSDB')
set_port_parameter(sw1, test_lag, disable_fallback)
set_port_parameter(sw2, test_lag, disable_fallback)
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Setting sw2 LAG to "passive"')
set_port_parameter(sw2, test_lag, passive_lag)
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_lag)
print('Verify that all sw2 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw2], intf_labels, passive_no_fallback)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, active_lag)
print('Verify all interface SM from sw1 are "active"')
sw_wait_until_all_sm_ready([sw1], intf_labels, active_ready)
print('Verify all interface SM from sw2 are "passive"')
sw_wait_until_all_sm_ready([sw2], intf_labels, passive_ready)
print('Setting sw2 LAG to "active"')
set_port_parameter(sw2, test_lag, active_lag)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Setting sw1 LAG to "passive"')
set_port_parameter(sw1, test_lag, passive_lag)
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_lag)
print('Verify that all sw1 SMs are in "Defaulted and Expired"')
sw_wait_until_all_sm_ready([sw1], intf_labels, passive_no_fallback)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, active_lag)
print('Verify all interface SM from sw1 are "passive"')
sw_wait_until_all_sm_ready([sw1], intf_labels, passive_ready)
print('Verify all interface SM from sw2 are "active"')
sw_wait_until_all_sm_ready([sw2], intf_labels, active_ready)
print('Setting sw1 LAG to "active"')
set_port_parameter(sw1, test_lag, active_lag)
##########################################################################
# Remove added flag
##########################################################################
print('Clearing "lacp-fallback-ab" flag from OVSDB')
remove_port_parameter(sw1, test_lag, other_config_key, [fallback_key])
remove_port_parameter(sw2, test_lag, other_config_key, [fallback_key])
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##############################################################################
#
# FALLBACK ENABLED TESTS
#
##############################################################################
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_fb_toggle_admin_flag(topology, step, main_setup):
"""Toggle 'admin' flag.
To disable LAG 1 on both switches, 'admin' flag will be toggled between
'up' and 'down'.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback enabled, toggle "admin" flag')
##########################################################################
# Add 'fallback' flag
##########################################################################
print('Enabling Fallback on both switches')
set_port_parameter(sw1, test_lag, enable_fallback)
set_port_parameter(sw2, test_lag, enable_fallback)
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_admin)
print('Verify that one interface is "Collecting, Distributing and '
'Defaulted" on sw2')
intf_fallback_enabled = sw_wait_until_one_sm_ready([sw2],
intf_labels,
active_fallback)
print('Found interface: %s' % intf_fallback_enabled)
print('Verify that other interfaces are "Defaulted" on sw2')
tmp = list(intf_labels)
tmp.remove(intf_fallback_enabled)
sw_wait_until_all_sm_ready([sw2], tmp, active_other_intf)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, enable_admin)
print('Verify all interfaces SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_admin)
print('Verify that one interface is "Collecting, Distributing and '
'Defaulted" on sw1')
intf_fallback_enabled = sw_wait_until_one_sm_ready([sw1],
intf_labels,
active_fallback)
print('Found interface: %s' % intf_fallback_enabled)
print('Verify that other interfaces are "Defaulted" on sw1')
tmp = list(intf_labels)
tmp.remove(intf_fallback_enabled)
sw_wait_until_all_sm_ready([sw1], tmp, active_other_intf)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, enable_admin)
##########################################################################
# Remove added flag
##########################################################################
print('Clearing "lacp-fallback-ab" flag from OVSDB')
remove_port_parameter(sw1, test_lag, other_config_key, [fallback_key])
remove_port_parameter(sw2, test_lag, other_config_key, [fallback_key])
print('Verify all interfaces SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_fb_toggle_lacp_flag_active(topology, step, main_setup):
"""Toggle 'lacp' flag to 'active'.
To disable LAG 1 on both switches, 'lacp' flag will be toggled between
'active' and 'off'.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback enabled, toggle "lacp" flag to active')
##########################################################################
# Add 'fallback' flag
##########################################################################
print('Enabling Fallback on both switches')
set_port_parameter(sw1, test_lag, enable_fallback)
set_port_parameter(sw2, test_lag, enable_fallback)
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_lag)
print('Verify that one interface is "Collecting, Distributing and '
'Defaulted" on sw2')
intf_fallback_enabled = sw_wait_until_one_sm_ready([sw2],
intf_labels,
active_fallback)
print('Found interface: %s' % intf_fallback_enabled)
print('Verify that other interfaces are "Defaulted" on sw2')
tmp = list(intf_labels)
tmp.remove(intf_fallback_enabled)
sw_wait_until_all_sm_ready([sw2], tmp, active_other_intf)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, active_lag)
print('Verify all interfaces SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_lag)
print('Verify that one interface is "Collecting, Distributing and '
'Defaulted" on sw1')
intf_fallback_enabled = sw_wait_until_one_sm_ready([sw1],
intf_labels,
active_fallback)
print('Found interface: %s' % intf_fallback_enabled)
print('Verify that other interfaces are "Defaulted" on sw1')
tmp = list(intf_labels)
tmp.remove(intf_fallback_enabled)
sw_wait_until_all_sm_ready([sw1], tmp, active_other_intf)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, active_lag)
##########################################################################
# Remove added flag
##########################################################################
print('Clearing "lacp-fallback-ab" flag from OVSDB')
remove_port_parameter(sw1, test_lag, other_config_key, [fallback_key])
remove_port_parameter(sw2, test_lag, other_config_key, [fallback_key])
print('Verify all interfaces SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
@pytest.mark.skipif(True, reason="Skipping due to unstable environment")
def test_fb_toggle_lacp_flag_passive(topology, step, main_setup):
"""Toggle 'lacp' flag to 'passive'.
To disable LAG 1 on both switches, 'lacp' flag will be toggled between
'[]' and 'passive'.
Note: Both switches cannot be in the same state of 'passive' or
connectivity will be lost.
"""
sw1 = topology.get('sw1')
sw2 = topology.get('sw2')
assert sw1 is not None
assert sw2 is not None
print_header('Fallback enabled, toggle "lacp" flag to passive')
##########################################################################
# Add 'fallback' flag
##########################################################################
print('Enabling Fallback on both switches')
set_port_parameter(sw1, test_lag, enable_fallback)
set_port_parameter(sw2, test_lag, enable_fallback)
##########################################################################
# LAG 1 disabled on sw1
##########################################################################
print('Setting sw2 LAG to "passive"')
set_port_parameter(sw2, test_lag, passive_lag)
print('Shutting down LAG1 on sw1')
set_port_parameter(sw1, test_lag, disable_lag)
print('Verify that one interface is "Collecting, Distributing and '
'Defaulted" on sw2')
intf_fallback_enabled = sw_wait_until_one_sm_ready([sw2],
intf_labels,
passive_fallback)
print('Found interface: %s' % intf_fallback_enabled)
print('Verify that other interfaces are "Defaulted" on sw2')
tmp = list(intf_labels)
tmp.remove(intf_fallback_enabled)
sw_wait_until_all_sm_ready([sw2], tmp, passive_other_intf)
print('Enabling LAG1 on sw1')
set_port_parameter(sw1, test_lag, active_lag)
print('Verify all interface SM from sw1 are "active"')
sw_wait_until_all_sm_ready([sw1], intf_labels, active_ready)
print('Verify all interface SM from sw2 are "passive"')
sw_wait_until_all_sm_ready([sw2], intf_labels, passive_ready)
print('Setting sw2 LAG to "active"')
set_port_parameter(sw2, test_lag, active_lag)
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
##########################################################################
# LAG 1 disabled on sw2
##########################################################################
print('Setting sw1 LAG to "passive"')
set_port_parameter(sw1, test_lag, passive_lag)
print('Shutting down LAG1 on sw2')
set_port_parameter(sw2, test_lag, disable_lag)
print('Verify that one interface is "Collecting, Distributing and '
'Defaulted" on sw1')
intf_fallback_enabled = sw_wait_until_one_sm_ready([sw1],
intf_labels,
passive_fallback)
print('Found interface: %s' % intf_fallback_enabled)
print('Verify that other interfaces are "Defaulted" on sw1')
tmp = list(intf_labels)
tmp.remove(intf_fallback_enabled)
sw_wait_until_all_sm_ready([sw1], tmp, passive_other_intf)
print('Enabling LAG1 on sw2')
set_port_parameter(sw2, test_lag, active_lag)
print('Verify all interface SM from sw1 are "passive"')
sw_wait_until_all_sm_ready([sw1], intf_labels, passive_ready)
print('Verify all interface SM from sw2 are "active"')
sw_wait_until_all_sm_ready([sw2], intf_labels, active_ready)
print('Setting sw1 LAG to "active"')
set_port_parameter(sw1, test_lag, active_lag)
##########################################################################
# Remove added flag
##########################################################################
print('Clearing "lacp-fallback-ab" flag from OVSDB')
remove_port_parameter(sw1, test_lag, other_config_key, [fallback_key])
remove_port_parameter(sw2, test_lag, other_config_key, [fallback_key])
print('Verify all interface SM from both switches are working')
sw_wait_until_all_sm_ready([sw1, sw2], intf_labels, active_ready)
f203f206124ab035ad3c51b2cfac71d4bd2072da | 179 | py | Python | malib/environments/wrappers/__init__.py | wwxFromTju/malib | 7cd2a4af55cf1f56da8854e26ea7a4f3782ceea2 | ["MIT"] | 6 | 2021-05-19T10:25:36.000Z | 2021-12-27T03:30:33.000Z
# Created by yingwen at 2019-03-13
# from malib.environments.wrappers.normalized_env import normalize, NormalizedEnv
from malib.environments.wrappers.basic_wrapper import Wrapper
f2168c5fe29a942ec34f6f6bc84d1376ec77d0a6 | 7,961 | py | Python | xhorizon/diagram_tools/region_factory.py | jcschindler01/xhorizon | 37a9dd4eebb36190f8914f8571f6b327e29d3d1c | ["MIT"] | 1 | 2020-04-01T16:14:00.000Z | 2020-04-01T16:14:00.000Z
"""
"""
import numpy as np
from .curve_class import curve
from .diagram_classes import region, block
from . import curvemakers as cm
def EFreg(func, rparams={}, io=0, lr=0, basepoint=np.array([0.,0.]), boundary=True, uvgrid=False, rlines=True):
"""
Create an EF region with given metfunc.
basepoint gives the (x, y) coordinates of the outermost horizon vertex.
io makes the region ingoing (0) or outgoing (1).
lr makes the region left-directed (0) or right-directed (1).
"""
## get func params
N = len(func.rj)-2
eps0 = func.sgnf(0)
## get region params
rparam = dict(c=0.,s0=10.)
rparam.update(rparams)
## create region
reg = region(func, rparams=rparam)
## get basepoint
x0, y0 = basepoint
udl0, vdl0 = 0.5*(y0-x0), 0.5*(y0+x0)
## add blocks
## ingoing io=0 lr=0
if io==0 and lr==0:
for j in range(N+1):
cdlu = udl0 - 0.5 + (N-j)
cdlv = vdl0 + 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=func.sgnf(j), epsv=1.))
## ingoing io=0 lr=1
if io==0 and lr==1:
for j in range(N+1):
cdlu = udl0 - 0.5
cdlv = vdl0 - 1.5 + (j)
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=1., epsv=func.sgnf(j) ))
## outgoing io=1 lr=0
if io==1 and lr==0:
for j in range(N+1):
cdlu = udl0 - 1.5 + (j)
cdlv = vdl0 - 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-func.sgnf(j), epsv=-1.))
## outgoing io=1 lr=1
if io==1 and lr==1:
for j in range(N+1):
cdlu = udl0 + 0.5
cdlv = vdl0 - 0.5 + (N-j)
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-1., epsv=-func.sgnf(j) ))
## add boundary
if boundary==True:
for b in reg.blocks:
b.add_curves_uv(cm.block_boundary(b))
## add grid
if uvgrid==True:
for b in reg.blocks:
uvvals = np.arange(-30.,30., 0.5)
b.add_curves_uv(cm.uvlines(uvvals, uv='uv', sty=dict(c='0.5')))
## add default rlines
if rlines==True:
for b in reg.blocks:
b.add_curves_tr(cm.default_rlines(reg.metfunc))
## return
return reg
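The block placement above reduces to half-step arithmetic in double-null coordinates. A standalone sketch of just the ingoing (io=0, lr=0) case, for illustration only (it bypasses the region/block classes entirely):

```python
def ingoing_block_corners(N, basepoint=(0.0, 0.0)):
    """Corner (cdlu, cdlv) of each of the N+1 blocks for io=0, lr=0."""
    x0, y0 = basepoint
    # same conversion to double-null coordinates as in EFreg
    udl0, vdl0 = 0.5 * (y0 - x0), 0.5 * (y0 + x0)
    # block j is shifted one whole step in udl per horizon crossed
    return [(udl0 - 0.5 + (N - j), vdl0 + 0.5) for j in range(N + 1)]

# one horizon (N=1) gives two blocks side by side along udl
print(ingoing_block_corners(1))  # [(0.5, 0.5), (-0.5, 0.5)]
```

The other three io/lr branches differ only in which coordinate steps and in the sign conventions on epsu/epsv.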
def MAXreg(func, rparams={}, boundary=True, uvgrid=False, rlines=True):
"""
Create a maximally extended region with given metfunc.
"""
## get N and eps0
N = len(func.rj)-2
eps0 = func.sgnf(0)
## send to relevant subroutine
if N==0:
reg = MAXreg0(func, rparams=rparams, boundary=boundary, uvgrid=uvgrid, rlines=rlines)
elif N==1 and eps0==-1.:
reg = MAXreg1a(func, rparams=rparams, boundary=boundary, uvgrid=uvgrid, rlines=rlines)
elif N==1 and eps0== 1.:
reg = MAXreg1b(func, rparams=rparams, boundary=boundary, uvgrid=uvgrid, rlines=rlines)
elif N == 2 and eps0== 1.:
reg = MAXreg2a(func, rparams=rparams, boundary=boundary, uvgrid=uvgrid, rlines=rlines)
elif N == 2 and eps0==-1.:
reg = MAXreg2b(func, rparams=rparams, boundary=boundary, uvgrid=uvgrid, rlines=rlines)
else:
reg = None
print("\nNo existing subroutine for MAXreg at this value of N and eps0.\n")
## format
## add boundary
if boundary==True:
for b in reg.blocks:
b.add_curves_uv(cm.block_boundary(b))
## add grid
if uvgrid==True:
for b in reg.blocks:
uvvals = np.arange(-30.,30., 0.5)
b.add_curves_uv(cm.uvlines(uvvals, uv='uv', sty=dict(c='0.5')))
## add default rlines
if rlines==True:
for b in reg.blocks:
b.add_curves_tr(cm.default_rlines(reg.metfunc))
## return
return reg
def MAXreg2a(func, rparams={}, boundary=True, uvgrid=False, rlines=True):
"""
Create a maximally extended region with given metfunc with two horizons.
"""
## get func params
N = len(func.rj)-2
eps0 = func.sgnf(0)
## make sure there are exactly two horizons
if N != 2:
return None
## get region params
rparam = dict(c=0.,s0=10.)
rparam.update(rparams)
## create region
reg = region(func, rparams=rparam)
## add blocks
## set basepoint
udl0, vdl0 = 0., 0.
## ingoing io=0 lr=0
for j in [0,1,2]:
cdlu = udl0 - 0.5 + (N-j)
cdlv = vdl0 + 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=func.sgnf(j), epsv=1.))
## ingoing io=0 lr=1
for j in [0]:
cdlu = udl0 - 0.5
cdlv = vdl0 - 1.5 + (j)
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=1., epsv=func.sgnf(j) ))
## outgoing io=1 lr=0
for j in [0,1,2]:
cdlu = udl0 - 1.5 + (j)
cdlv = vdl0 - 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-func.sgnf(j), epsv=-1.))
## outgoing io=1 lr=1
for j in [0]:
cdlu = udl0 + 0.5
cdlv = vdl0 - 0.5 + (N-j)
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-1., epsv=-func.sgnf(j) ))
## return
return reg
def MAXreg2b(func, rparams={}, boundary=True, uvgrid=False, rlines=True):
"""
Create a maximally extended region with given metfunc with two horizons.
"""
## get func params
N = len(func.rj)-2
eps0 = func.sgnf(0)
## make sure there are exactly two horizons
if N != 2:
return None
## get region params
rparam = dict(c=0.,s0=10.)
rparam.update(rparams)
## create region
reg = region(func, rparams=rparam)
## add blocks
## set basepoint
udl0, vdl0 = 0., 0.
## ingoing io=0 lr=0
for j in [0,1,2]:
cdlu = udl0 - 0.5 + (N-j)
cdlv = vdl0 - 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=func.sgnf(j), epsv=1.))
## ingoing io=0 lr=1
for j in [0]:
cdlu = udl0 + 0.5
cdlv = vdl0 - 1.5 + (j)
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=1., epsv=func.sgnf(j) ))
## outgoing io=1 lr=0
for j in [0,1,2]:
cdlu = udl0 - 1.5 + (j)
cdlv = vdl0 + 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-func.sgnf(j), epsv=-1.))
## outgoing io=1 lr=1
for j in [0]:
cdlu = udl0 - 0.5
cdlv = vdl0 - 0.5 + (N-j)
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-1., epsv=-func.sgnf(j) ))
## return
return reg
def MAXreg1a(func, rparams={}, boundary=True, uvgrid=False, rlines=True):
"""
Create a maximally extended region with given metfunc with one horizon.
"""
## get func params
N = len(func.rj)-2
eps0 = func.sgnf(0)
## make sure there is exactly one horizon
if N != 1:
return None
## get region params
rparam = dict(c=0.,s0=10.)
rparam.update(rparams)
## create region
reg = region(func, rparams=rparam)
## add blocks
## set basepoint
udl0, vdl0 = 0., 0.
## ingoing io=0 lr=0
for j in [0,1]:
cdlu = udl0 - 0.5 + (N-j)
cdlv = vdl0 + 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=func.sgnf(j), epsv=1.))
## outgoing io=1 lr=0
for j in [0,1]:
cdlu = udl0 - 0.5 + (j)
cdlv = vdl0 - 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-func.sgnf(j), epsv=-1.))
## return
return reg
def MAXreg1b(func, rparams={}, boundary=True, uvgrid=False, rlines=True):
"""
Create a maximally extended region with given metfunc with one horizon.
"""
## get func params
N = len(func.rj)-2
eps0 = func.sgnf(0)
## make sure there is exactly one horizon
if N != 1:
return None
## get region params
rparam = dict(c=0.,s0=10.)
rparam.update(rparams)
## create region
reg = region(func, rparams=rparam)
## add blocks
## set basepoint
udl0, vdl0 = 0., 0.
## ingoing io=0 lr=0
for j in [0,1]:
cdlu = udl0 - 0.5 + (N-j)
cdlv = vdl0 - 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=func.sgnf(j), epsv=1.))
## outgoing io=1 lr=0
for j in [0,1]:
cdlu = udl0 - 0.5 + (j)
cdlv = vdl0 + 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=-func.sgnf(j), epsv=-1.))
## return
return reg
def MAXreg0(func, rparams={}, boundary=True, uvgrid=False, rlines=True):
"""
Create a maximally extended region with given metfunc with zero horizons.
"""
## get func params
N = len(func.rj)-2
eps0 = func.sgnf(0)
## make sure there are exactly zero horizons
if N != 0:
return None
## get region params
rparam = dict(c=0.,s0=10.)
rparam.update(rparams)
## create region
reg = region(func, rparams=rparam)
## add blocks
## set basepoint
udl0, vdl0 = 0., 0.
## ingoing io=0 lr=0
for j in [0]:
cdlu = udl0 - 0.5 + (N-j)
cdlv = vdl0 + 0.5
reg.add_block(j, bparams=dict(cdlu=cdlu, cdlv=cdlv, epsu=func.sgnf(j), epsv=1.))
## return
return reg
1ef16dc9e4658e60c62cdf03c4d652f8f978221f | 275 | py | Python | performer_pytorch/__init__.py | arijitthegame/performer-pytorch | fc8b78441b1e27eb5d9b01fc738a8772cee07127 | ["MIT"] | 829 | 2020-10-03T15:38:41.000Z | 2022-03-28T15:22:16.000Z
from performer_pytorch.performer_pytorch import PerformerLM, Performer, FastAttention, SelfAttention, CrossAttention, ProjectionUpdater
from performer_pytorch.autoregressive_wrapper import AutoregressiveWrapper
from performer_pytorch.performer_enc_dec import PerformerEncDec
1ef8c3f94a5656fe7d34157e0482e95456493c6e | 1,397 | py | Python | prompts/wizard_of_internet.py | andreamad8/FSB | a81593590189fa5ad1cc37c5857f974effd9750a | ["MIT"] | 53 | 2021-10-11T03:24:14.000Z | 2022-03-30T15:17:23.000Z
def convert_sample_to_shot_wit(sample, with_knowledge=True):
prefix = "Assistant Information:\n"
for s in sample["meta"]:
prefix += s+"\n"
prefix += "Dialogue:\n"
assert len(sample["dialogue"]) == len(sample["KB"])
for turn, meta in zip(sample["dialogue"],sample["KB"]):
prefix += f"User: {turn[0]}" +"\n"
if with_knowledge:
if len(meta)>0:
prefix += f"KB: {meta[0]}" +"\n"
else:
prefix += "KB: None" +"\n"
if turn[1] == "":
prefix += f"Assistant:"
return prefix
else:
prefix += f"Assistant: {turn[1]}" +"\n"
return prefix
def convert_sample_to_shot_wit_interact(sample, with_knowledge=True):
prefix = "Assistant Information:\n"
for s in sample["meta"]:
prefix += s+"\n"
prefix += "Dialogue:\n"
assert len(sample["dialogue"]) == len(sample["KB_internet"])
for turn, meta in zip(sample["dialogue"],sample["KB_internet"]):
prefix += f"User: {turn[0]}" +"\n"
if with_knowledge:
if len(meta)>0:
prefix += f"KB: {meta[0]}" +"\n"
else:
prefix += "KB: None" +"\n"
if turn[1] == "":
prefix += f"Assistant:"
return prefix
else:
prefix += f"Assistant: {turn[1]}" +"\n"
return prefix
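On a toy sample the prompt layout these builders produce looks like this. The snippet below is a self-contained sketch that mirrors the loop in convert_sample_to_shot_wit (the sample dict is invented for the demo):

```python
def shot_prefix(sample, with_knowledge=True):
    # mirrors the logic of convert_sample_to_shot_wit above
    prefix = "Assistant Information:\n"
    for s in sample["meta"]:
        prefix += s + "\n"
    prefix += "Dialogue:\n"
    for turn, meta in zip(sample["dialogue"], sample["KB"]):
        prefix += f"User: {turn[0]}\n"
        if with_knowledge:
            prefix += f"KB: {meta[0]}\n" if meta else "KB: None\n"
        if turn[1] == "":
            # last turn has no reply yet: leave the prompt open for generation
            return prefix + "Assistant:"
        prefix += f"Assistant: {turn[1]}\n"
    return prefix

sample = {"meta": ["I enjoy hiking."],
          "dialogue": [["Any good trails nearby?", ""]],
          "KB": [[]]}
print(shot_prefix(sample))
```

This prints the persona block, the dialogue so far with a `KB:` line per turn, and a trailing `Assistant:` cue for the model to complete.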
4849c22573b9335d4b3331da009ecf49fb09290b | 5,631 | py | Python | supplai_client/api/notification.py | rhenter/supplai-client | dbae2b7d1ced14137b0eac362177df850fb111e4 | ["MIT"] | null | null | null
"""
Notification endpoints
"""
class Notification:
"""Class holding notification functions.
Notification
Docs: https://api.supplai.com.br/doc/notification/
"""
endpoint_base = 'notification'
def __init__(self, api):
self._api = api
def get_notifications(self, **kwargs):
"""Get a list of all Notifications.
Args:
This function takes no arguments.
Returns:
dict: All notifications.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notifications/'
return self._api.search(endpoint, **kwargs)
def create_notification(self, data):
"""Create a single Notification.
Args:
data (dict, list of tuples): Data to send in the body of the request.
Returns:
dict: Data retrieved for specified endpoint.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notifications/'
return self._api.create(endpoint, data)
def get_notification(self, code):
"""Get the full details for a single Notification.
Args:
code (str): Notification Code.
Returns:
dict: Full notification details.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notifications/{code}/'
return self._api.search(endpoint)
def update_notification(self, code, data, partial=False):
"""Update a single Notification.
Args:
code (str): Notification Code.
data (dict, list of tuples): Data to send in the body of the request.
partial (bool): To specify whether the update will change everything
or just a few attributes. Default is False
Returns:
dict: Data retrieved for specified endpoint.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notifications/{code}/'
return self._api.update(endpoint, data, partial)
def get_notification_templates(self, **kwargs):
"""Get a list of all Notification Templates.
Args:
This function takes no arguments.
Returns:
dict: All Notification Templates.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notification-templates/'
return self._api.search(endpoint, **kwargs)
def create_notification_template(self, data):
"""Create a single Notification Template.
Args:
data (dict, list of tuples): Data to send in the body of the request.
Returns:
dict: Data retrieved for specified endpoint.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notification-templates/'
return self._api.create(endpoint, data)
def get_notification_template(self, code):
"""Get the full details for a single Notification Template.
Args:
code (str): Notification Template Code.
Returns:
dict: Full notification template details.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notification-templates/{code}/'
return self._api.search(endpoint)
def update_notification_template(self, code, data, partial=False):
"""Update a single Notification Template.
Args:
code (str): Notification Template Code.
data (dict, list of tuples): Data to send in the body of the request.
partial (bool): To specify whether the update will change everything
or just a few attributes. Default is False
Returns:
dict: Data retrieved for specified endpoint.
Raises:
RequestException: An error thrown by Requests library.
ValueError: An error thrown by json parser, if JSON decoding fails.
SupplaiError: An error occurred while requesting the Supplai API.
"""
endpoint = f'{self.endpoint_base}/notification-templates/{code}/'
return self._api.update(endpoint, data, partial)
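Every method in this class follows the same shape: interpolate endpoint_base and any path parameters into an f-string, then delegate to the injected api client. A minimal sketch of that wiring with a stub client (the stub is invented for the demo; the real _api is the Supplai HTTP client):

```python
class StubAPI:
    """Stand-in for the real client: records the call instead of hitting HTTP."""
    def search(self, endpoint, **kwargs):
        return {"endpoint": endpoint, "params": kwargs}

class MiniNotification:
    endpoint_base = "notification"

    def __init__(self, api):
        self._api = api

    def get_notification(self, code):
        # same endpoint-construction pattern as get_notification above
        endpoint = f"{self.endpoint_base}/notifications/{code}/"
        return self._api.search(endpoint)

client = MiniNotification(StubAPI())
print(client.get_notification("abc123")["endpoint"])
# notification/notifications/abc123/
```

Injecting the client this way keeps the endpoint logic testable without any network access.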
| 34.127273 | 81 | 0.627597 | 631 | 5,631 | 5.546751 | 0.141046 | 0.048 | 0.059429 | 0.068571 | 0.908286 | 0.884 | 0.872286 | 0.860571 | 0.846286 | 0.754286 | 0 | 0 | 0.3019 | 5,631 | 164 | 82 | 34.335366 | 0.890359 | 0.603445 | 0 | 0.571429 | 0 | 0 | 0.232832 | 0.224984 | 0 | 0 | 0 | 0 | 0 | 1 | 0.321429 | false | 0 | 0 | 0 | 0.678571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
486c79f1f417a5c06a04b88fb2a123815237a350 | 223 | py | Python | nsd1802/python/day21/mysite/polls/views.py | MrWangwf/nsd1806 | 069e993b0bb64cb21adc2a25aa56f6da674453bc | [
"Apache-2.0"
] | null | null | null | nsd1802/python/day21/mysite/polls/views.py | MrWangwf/nsd1806 | 069e993b0bb64cb21adc2a25aa56f6da674453bc | [
"Apache-2.0"
] | null | null | null | nsd1802/python/day21/mysite/polls/views.py | MrWangwf/nsd1806 | 069e993b0bb64cb21adc2a25aa56f6da674453bc | [
"Apache-2.0"
] | null | null | null | from django.shortcuts import render
def index(request):
# Send the polls/index.html page to the user; the template lives under polls/templates
return render(request, 'polls/index.html')
def hello(request):
return render(request, 'polls/hello.html')
6fbcb62c9aaeb70e80cc761df44e409893e6fcdb | 4,429 | py | Python | tests/test_dir.py | listuser/jc | 3ac8d0362b4fb9999fc55a60a9cb20ac80d114f7 | ["MIT"] | 3,215 | 2019-10-24T15:25:56.000Z | 2022-03-31T15:43:01.000Z
import os
import sys
import time
import json
import unittest
import jc.parsers.dir
THIS_DIR = os.path.dirname(os.path.abspath(__file__))
# Set the timezone on POSIX systems. Need to manually set for Windows tests
if not sys.platform.startswith('win32'):
os.environ['TZ'] = 'America/Los_Angeles'
time.tzset()
class MyTests(unittest.TestCase):
def setUp(self):
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir.out'),
'r', encoding='utf-8') as f:
self.windows_10_dir = f.read()
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir.json'),
'r', encoding='utf-8') as f:
self.windows_10_dir_json = json.loads(f.read())
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-ODTC.out'),
'r', encoding='utf-8') as f:
self.windows_10_dir_ODTC = f.read()
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-ODTC.json'),
'r', encoding='utf-8') as f:
self.windows_10_dir_ODTC_json = json.loads(f.read())
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-C.out'),
'r', encoding='utf-8') as f:
self.windows_10_dir_C = f.read()
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-mix.out'),
'r', encoding='utf-8') as f:
self.windows_10_dir_mix = f.read()
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-mix.json'),
'r', encoding='utf-8') as f:
self.windows_10_dir_mix_json = json.loads(f.read())
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-files.out'),
'r', encoding='utf-8') as f:
self.windows_10_dir_files = f.read()
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-files.json'),
'r', encoding='utf-8') as f:
self.windows_10_dir_files_json = json.loads(f.read())
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-dirs.out'),
'r', encoding='utf-8') as f:
self.windows_10_dir_dirs = f.read()
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-dirs.json'),
'r', encoding='utf-8') as f:
self.windows_10_dir_dirs_json = json.loads(f.read())
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-S.out'),
'r', encoding='utf-8') as f:
self.windows_10_dir_S = f.read()
with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows-10/dir-S.json'),
'r', encoding='utf-8') as f:
self.windows_10_dir_S_json = json.loads(f.read())
def test_dir_error(self):
self.assertEqual(jc.parsers.dir.parse("Access is denied.", quiet=True), [])
def test_dir_empty(self):
self.assertEqual(jc.parsers.dir.parse("", quiet=True), [])
def test_dir_windows_10(self):
self.assertEqual(jc.parsers.dir.parse(self.windows_10_dir, quiet=True),
self.windows_10_dir_json)
def test_dir_windows_10_ODTC(self):
self.assertEqual(jc.parsers.dir.parse(self.windows_10_dir_ODTC, quiet=True),
self.windows_10_dir_ODTC_json)
def test_dir_windows_10_C(self):
self.assertEqual(jc.parsers.dir.parse(self.windows_10_dir_C, quiet=True),
self.windows_10_dir_json)
def test_dir_windows_10_mix(self):
self.assertEqual(jc.parsers.dir.parse(self.windows_10_dir_mix, quiet=True),
self.windows_10_dir_mix_json)
def test_dir_windows_10_files(self):
self.assertEqual(jc.parsers.dir.parse(self.windows_10_dir_files, quiet=True),
self.windows_10_dir_files_json)
def test_dir_windows_10_dirs(self):
self.assertEqual(jc.parsers.dir.parse(self.windows_10_dir_dirs, quiet=True),
self.windows_10_dir_dirs_json)
def test_dir_windows_10_S(self):
self.assertEqual(jc.parsers.dir.parse(self.windows_10_dir_S, quiet=True),
self.windows_10_dir_S_json)
if __name__ == '__main__':
unittest.main()
# File: myuw/test/dao/test_affiliation.py
# Repo: timtim17/myuw (license: Apache-2.0)
# Copyright 2022 UW-IT, University of Washington
# SPDX-License-Identifier: Apache-2.0
from django.test import TransactionTestCase
from myuw.dao.affiliation import get_all_affiliations, get_is_hxt_viewer
from myuw.test import fdao_sws_override, fdao_pws_override,\
get_request, get_request_with_user
@fdao_pws_override
@fdao_sws_override
class TestAffiliationDao(TransactionTestCase):
def setUp(self):
get_request()
"""
MUWM-4830
def test_fyp(self):
now_request = get_request_with_user('jnew')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations['fyp'])
self.assertFalse(affiliations['aut_transfer'])
self.assertFalse(affiliations['win_transfer'])
def test_aut_transfer(self):
now_request = get_request_with_user('javg001')
affiliations = get_all_affiliations(now_request)
self.assertFalse(affiliations['fyp'])
self.assertTrue(affiliations['aut_transfer'])
self.assertFalse(affiliations['win_transfer'])
def test_win_transfer(self):
now_request = get_request_with_user('javg002')
affiliations = get_all_affiliations(now_request)
self.assertFalse(affiliations['fyp'])
self.assertFalse(affiliations['aut_transfer'])
self.assertTrue(affiliations['win_transfer'])
"""
def test_get_is_hxt_viewer(self):
request = get_request_with_user('staff')
self.assertTrue(get_is_hxt_viewer(request)[2])
request = get_request_with_user('javg001')
self.assertTrue(get_is_hxt_viewer(request)[2])
request = get_request_with_user('jbothell')
self.assertFalse(get_is_hxt_viewer(request)[2])
request = get_request_with_user('seagrad')
self.assertFalse(get_is_hxt_viewer(request)[2])
def test_is_instructor(self):
now_request = get_request_with_user('bill')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations['instructor'])
self.assertTrue(affiliations['clinician'])
self.assertTrue(affiliations['employee'])
self.assertTrue(affiliations['all_employee'])
def test_is_faculty(self):
now_request = get_request_with_user('billtac')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations['faculty'])
self.assertTrue(affiliations.get("official_tacoma"))
def test_is_alumni(self):
now_request = get_request_with_user('jalum')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations["alumni"])
self.assertTrue(affiliations["no_1st_class_affi"])
self.assertTrue(affiliations["past_stud"])
self.assertTrue(affiliations["past_employee"])
self.assertTrue(affiliations["alum_asso"])
self.assertFalse(affiliations["registered_stud"])
def test_is_retiree(self):
now_request = get_request_with_user('retirestaff')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations["retiree"])
self.assertTrue(affiliations["past_stud"])
self.assertTrue(affiliations["no_1st_class_affi"])
def test_is_pce_stud(self):
now_request = get_request_with_user('jpce')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get('pce'))
self.assertTrue(affiliations.get('undergrad_c2'))
self.assertFalse(affiliations.get('grad_c2'))
self.assertTrue(affiliations.get("undergrad"))
self.assertTrue(affiliations.get("student"))
self.assertTrue(affiliations.get("seattle"))
self.assertFalse(affiliations.get("official_pce"))
self.assertTrue(affiliations.get('J1'))
self.assertTrue(affiliations.get("intl_stud"))
def test_jinter(self):
now_request = get_request_with_user('jinter')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get('F1'))
self.assertTrue(affiliations.get("intl_stud"))
self.assertTrue(affiliations.get('pce'))
self.assertFalse(affiliations.get('undergrad_c2'))
self.assertTrue(affiliations.get('grad_c2'))
self.assertFalse(affiliations.get("hxt_viewer"))
def test_error_case(self):
now_request = get_request_with_user('jerror')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get('student'))
self.assertFalse(affiliations.get('intl_stud'))
self.assertTrue(affiliations.get('instructor'))
def test_is_2fa_permitted(self):
now_request = get_request_with_user('javerage')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get('2fa_permitted'))
self.assertFalse(affiliations.get('F1'))
self.assertFalse(affiliations.get("intl_stud"))
def test_student_campus(self):
now_request = get_request_with_user('javerage')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("seattle"))
self.assertTrue(affiliations.get("undergrad"))
self.assertTrue(affiliations.get("registered_stud"))
self.assertTrue(affiliations.get("hxt_viewer"))
now_request = get_request_with_user('jbothell')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("bothell"))
self.assertFalse(affiliations.get("hxt_viewer"))
now_request = get_request_with_user('eight')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("tacoma"))
self.assertFalse(affiliations.get("hxt_viewer"))
self.assertFalse(affiliations.get("employee"))
self.assertTrue(affiliations.get("all_employee"))
def test_is_grad_stud_employee(self):
now_request = get_request_with_user('billseata')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("grad"))
self.assertTrue(affiliations.get("student"))
self.assertTrue(affiliations.get("seattle"))
self.assertTrue(affiliations.get("instructor"))
self.assertTrue(affiliations.get("stud_employee"))
self.assertTrue(affiliations.get("all_employee"))
self.assertTrue(affiliations.get("official_seattle"))
self.assertFalse(affiliations.get("hxt_viewer"))
def test_botgrad(self):
now_request = get_request_with_user('botgrad')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("grad"))
self.assertTrue(affiliations.get("bothell"))
self.assertTrue(affiliations.get("official_bothell"))
self.assertTrue(affiliations.get('J1'))
self.assertTrue(affiliations.get("intl_stud"))
def test_tacgrad(self):
now_request = get_request_with_user('tacgrad')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("grad"))
self.assertTrue(affiliations.get("tacoma"))
self.assertTrue(affiliations.get("official_tacoma"))
self.assertTrue(affiliations.get('F1'))
self.assertTrue(affiliations.get("intl_stud"))
def test_employee(self):
now_request = get_request_with_user('staff')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("official_seattle"))
self.assertTrue(affiliations['employee'])
self.assertTrue(affiliations['all_employee'])
self.assertTrue(affiliations['clinician'])
def test_eos_enrollment(self):
now_request = get_request_with_user('jeos')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("seattle"))
self.assertTrue(affiliations.get("undergrad_c2"))
def test_stud_empl_campuses(self):
now_request = get_request_with_user('seagrad')
affiliations = get_all_affiliations(now_request)
self.assertTrue(affiliations.get("seattle"))
self.assertTrue(affiliations.get("official_bothell"))
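The assertions above all probe a single mapping of boolean affiliation flags returned by `get_all_affiliations`. A minimal, self-contained stand-in illustrating that shape (the stub and its static table are hypothetical; the real dao derives flags from institutional data behind the `fdao_*` overrides):

```python
# Hypothetical, simplified stand-in for the mapping these tests exercise:
# get_all_affiliations(request) returns a dict of boolean flags.
def get_all_affiliations_stub(user):
    # Static table for illustration only; the real dao computes these
    # from SWS/PWS data sources.
    table = {
        'javerage': {'seattle': True, 'undergrad': True,
                     'registered_stud': True, 'hxt_viewer': True,
                     'intl_stud': False},
        'jbothell': {'bothell': True, 'hxt_viewer': False},
    }
    return table.get(user, {})

affiliations = get_all_affiliations_stub('javerage')
assert affiliations.get('seattle')
assert not affiliations.get('intl_stud')
```

Using `dict.get` on the flags, as the tests do, keeps a missing key equivalent to a False flag.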
#!/usr/bin/python
# File: tests/path/sqlite_blob_path_spec.py
# Repo: Defense-Cyber-Crime-Center/dfvfs (license: Apache-2.0)
# -*- coding: utf-8 -*-
"""Tests for the SQLite blob path specification implementation."""
import unittest
from dfvfs.path import sqlite_blob_path_spec
from tests.path import test_lib
class SQLiteBlobPathSpecTest(test_lib.PathSpecTestCase):
"""Tests for the SQLite blob path specification implementation."""
def testInitialize(self):
"""Tests the path specification initialization."""
path_spec = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=u'test_column',
row_condition=(u'identifier', u'==', 0), parent=self._path_spec)
self.assertNotEqual(path_spec, None)
path_spec = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=u'test_column', row_index=0,
parent=self._path_spec)
self.assertNotEqual(path_spec, None)
with self.assertRaises(ValueError):
_ = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=u'test_column', row_index=0,
parent=None)
with self.assertRaises(ValueError):
_ = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=None, column_name=u'test_column', row_index=0,
parent=self._path_spec)
with self.assertRaises(ValueError):
_ = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=None, row_index=0,
parent=self._path_spec)
with self.assertRaises(ValueError):
_ = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=u'test_column',
row_condition=u'identifier == 0', parent=self._path_spec)
with self.assertRaises(ValueError):
_ = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=u'test_column', row_index=0,
parent=self._path_spec, bogus=u'BOGUS')
def testComparable(self):
"""Tests the path specification comparable property."""
path_spec = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=u'test_column',
row_condition=(u'identifier', u'==', 0), parent=self._path_spec)
self.assertNotEqual(path_spec, None)
expected_comparable = u'\n'.join([
u'type: TEST',
(u'type: SQLITE_BLOB, table name: test_table, '
u'column name: test_column, row condition: "identifier == 0"'),
u''])
self.assertEqual(path_spec.comparable, expected_comparable)
path_spec = sqlite_blob_path_spec.SQLiteBlobPathSpec(
table_name=u'test_table', column_name=u'test_column', row_index=0,
parent=self._path_spec)
self.assertNotEqual(path_spec, None)
expected_comparable = u'\n'.join([
u'type: TEST',
(u'type: SQLITE_BLOB, table name: test_table, '
u'column name: test_column, row index: 0'),
u''])
self.assertEqual(path_spec.comparable, expected_comparable)
if __name__ == '__main__':
unittest.main()
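The expected `comparable` strings above follow one convention: one description line per path-spec level, joined with newlines and terminated by a trailing newline. A hedged sketch of that convention (the helper is illustrative, not the dfvfs implementation):

```python
def build_comparable(levels):
    """Join one description per path-spec level, newline-separated,
    with a trailing newline (matching the expected strings above)."""
    return '\n'.join(levels + [''])

comparable = build_comparable([
    'type: TEST',
    ('type: SQLITE_BLOB, table name: test_table, '
     'column name: test_column, row index: 0'),
])
```

The trailing empty element is what produces the final newline the tests encode as `u''` at the end of their `join` lists.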
# File: fire_emblem_data_scraper/spiders/characters/tests/test_characters.py
# Repo: raymondtang310/fire-emblem-data-scraper (license: MIT)
import unittest
from unittest.mock import patch
from scrapy.http import HtmlResponse
from fire_emblem_data_scraper.constants import MAX_NUM_OTHER_IMAGES
from fire_emblem_data_scraper.spiders.characters.characters import CharactersSpider
class TestCharactersSpider(unittest.TestCase):
"""
TestCharactersSpider is a class for unit testing CharactersSpider.
"""
def setUp(self):
"""
Method that executes before each test method.
:return: None
"""
self.spider = CharactersSpider()
@patch('fire_emblem_data_scraper.spiders.characters.characters.scrapy.Request')
def test_when_parsing_response_then_request_is_made_for_each_character_link(self, request_mock):
"""
Tests that a request is made for each link to a Fire Emblem character web page that is found in the given
response when parsing the given response.
:param request_mock: A mock of scrapy.Request
:type request_mock: MagicMock
:return: None
"""
character_links = ['/Byleth', '/Edelgard']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Fire Emblem Characters</title>
</head>
<body>
<div id="mw-pages">
<div class="mw-category-group">
<ul>
<li>
<a href="{character_links[0]}"></a>
</li>
<li>
<a href="{character_links[1]}"></a>
</li>
</ul>
</div>
</div>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
requests = self.spider.parse(response)
for character_link, request in zip(character_links, requests):
character_url = self.spider.BASE_URL + character_link
request_mock.assert_called_with(character_url, callback=self.spider.parse_character)
@patch('fire_emblem_data_scraper.spiders.characters.characters.scrapy.Request')
def test_when_parsing_response_given_next_page_link_is_found_then_request_is_made_for_next_page(self, request_mock):
"""
Tests that a request is made for the next page when parsing the given response, given that a link for the next
page is found in the given response.
:param request_mock: A mock of scrapy.Request
:type request_mock: MagicMock
:return: None
"""
next_page_link = '/next-page'
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Fire Emblem Characters</title>
</head>
<body>
<div id="mw-pages">
<a href="{next_page_link}">next page</a>
</div>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
next_page_url = self.spider.BASE_URL + next_page_link
requests = self.spider.parse(response)
for _ in requests:
request_mock.assert_called_with(next_page_url, callback=self.spider.parse)
@patch('fire_emblem_data_scraper.spiders.characters.characters.scrapy.Request')
def test_when_parsing_response_given_next_page_link_is_not_found_then_request_is_not_made_for_next_page(
self, request_mock):
"""
Tests that a request is not made for the next page when parsing the given response, given that a link for the
next page is not found in the given response.
:param request_mock: A mock of scrapy.Request
:type request_mock: MagicMock
:return: None
"""
html = '''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Fire Emblem Characters</title>
</head>
<body>
<div id="mw-pages"></div>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
requests = self.spider.parse(response)
for _ in requests:
request_mock.assert_not_called()
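The tests above exercise link extraction from the `mw-pages` container. A dependency-free approximation of that extraction using only the standard library (assumption: the real spider uses Scrapy selectors and its own `BASE_URL`; the parser class and base URL below are illustrative):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href attributes from anchor tags, in document order."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href':
                    self.links.append(value)

extractor = LinkExtractor()
extractor.feed('<div id="mw-pages"><ul>'
               '<li><a href="/Byleth"></a></li>'
               '<li><a href="/Edelgard"></a></li>'
               '</ul></div>')

# Mirror the spider's BASE_URL + link join (base URL is a placeholder here).
base_url = 'https://example.invalid'
character_urls = [base_url + link for link in extractor.links]
```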
def test_when_parsing_character_given_name_is_found_then_name_is_scraped(self):
"""
Tests that the name of the Fire Emblem character is scraped when parsing the given response of the character's
web page, given that the name of the character is found in the given response.
:return: None
"""
name = 'Lucina'
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Lucina</title>
</head>
<body>
<h1 id="firstHeading">{name}</h1>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['name'], name, 'Name was not scraped correctly')
def test_when_parsing_character_given_name_is_found_then_name_is_stripped(self):
"""
Tests that the scraped name of the Fire Emblem character is stripped of leading and trailing whitespace when
parsing the given response of the character's web page, given that the name of the character is found in the
given response.
:return: None
"""
name = 'Lucina'
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Lucina</title>
</head>
<body>
<h1 id="firstHeading"> {name}\n</h1>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['name'], name, 'Name was not scraped correctly')
def test_when_parsing_character_given_name_is_not_found_then_character_is_not_scraped(self):
"""
Tests that a Fire Emblem character item is not scraped when parsing the given response of the character's web
page, given that the name of the Fire Emblem character is not found in the given response.
:return: None
"""
html = '''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title></title>
</head>
<body>
<h1 id="firstHeading"></h1>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
result = self.spider.parse_character(response)
self.assertIsNone(result, 'An item was unexpectedly scraped')
def test_when_parsing_character_given_primary_image_is_found_then_images_are_scraped(self):
"""
Tests that images of the Fire Emblem character are scraped correctly when parsing the given response of the
character's web page, given that a primary image is found in the given response. In this scenario, images are
scraped correctly if the primary image found is scraped as the primary image and other images found are scraped
as other images.
:return: None
"""
primary_image_link = '/path-of-radiance-ike.png'
other_image_links = ['/radiant-dawn-ike.jpg', '/fire-emblem-heroes-ike.jpg']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Ike</title>
</head>
<body>
<h1 id="firstHeading">Ike</h1>
<div class="tab_content" style="display:block;">
<a class="image">
<img src="{primary_image_link}">
</a>
</div>
<div class="tab_content" style="display:none;">
<a class="image">
<img src="{other_image_links[0]}">
</a>
</div>
<div class="tab_content" style="display:none;">
<a class="image">
<img src="{other_image_links[1]}">
</a>
</div>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
primary_image_url = self.spider.BASE_URL + primary_image_link
other_image_urls = [self.spider.BASE_URL + other_image_link for other_image_link in other_image_links]
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['primaryImage'], primary_image_url, 'Primary image was not scraped correctly')
self.assertEqual(character_item['otherImages'], other_image_urls, 'Other images were not scraped correctly')
def test_when_parsing_character_given_primary_image_is_not_found_and_other_images_are_found_then_images_are_scraped_with_first_image_found_as_primary_image(
self):
"""
Tests that images of the Fire Emblem character are scraped correctly when parsing the given response of the
character's web page, given that a primary image cannot be found in the given response but other images are
found. In this scenario, images are scraped correctly if the first image found is scraped as the primary image
and other images found are scraped as other images.
:return: None
"""
image_links = ['/thracia776-reinhardt.jpg', '/fire-emblem-heroes-reinhardt.jpg']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Reinhardt</title>
</head>
<body>
<h1 id="firstHeading">Reinhardt</h1>
<div>
<a class="image">
<img src="{image_links[0]}">
</a>
</div>
<div>
<a class="image">
<img src="{image_links[1]}">
</a>
</div>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
primary_image_url = self.spider.BASE_URL + image_links[0]
other_image_urls = [self.spider.BASE_URL + image_links[1]]
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['primaryImage'], primary_image_url, 'Primary image was not scraped correctly')
self.assertEqual(character_item['otherImages'], other_image_urls, 'Other images were not scraped correctly')
def test_when_parsing_character_given_number_of_images_found_is_greater_than_threshold_then_number_of_images_scraped_is_limited_to_threshold(
self):
"""
Tests that the number of images of the Fire Emblem character scraped is limited to the maximum threshold when
parsing the given response of the character's web page, given that the number of images of the character found
in the given response exceeds the maximum threshold.
:return: None
"""
primary_image_link = '/ike.png'
other_image_links = ['/another-ike-1.png', '/another-ike-2.png', '/another-ike-3.png', '/another-ike-4.png',
'/another-ike-5.png', '/another-ike-6.png', '/another-ike-7.png', '/another-ike-8.png',
'/another-ike-9.png', '/another-ike-10.png', '/another-ike-11.png']
other_images_html = ''.join([f'''
<div class="tab_content" style="display:none;">
<a class="image">
<img src="{other_image_link}">
</a>
</div>
''' for other_image_link in other_image_links])
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Ike</title>
</head>
<body>
<h1 id="firstHeading">Ike</h1>
<div class="tab_content" style="display:block;">
<a class="image">
<img src="{primary_image_link}">
</a>
</div>
{other_images_html}
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
primary_image_url = self.spider.BASE_URL + primary_image_link
other_image_urls = [self.spider.BASE_URL + other_image_link for other_image_link in
other_image_links[:MAX_NUM_OTHER_IMAGES]]
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['primaryImage'], primary_image_url, 'Primary image was not scraped correctly')
self.assertEqual(character_item['otherImages'], other_image_urls, 'Other images were not scraped correctly')
def test_when_parsing_character_given_duplicate_images_are_found_then_duplicate_images_are_not_scraped(self):
"""
Tests that duplicate images of the Fire Emblem character are not scraped when parsing the given response of the
character's web page, given that duplicate images of the character are found in the given response.
:return: None
"""
primary_image_link = '/ike-1.png'
other_image_links = ['/ike-2.png', '/ike-1.png', '/ike-2.png']
other_images_html = ''.join([f'''
<div class="tab_content" style="display:none;">
<a class="image">
<img src="{other_image_link}">
</a>
</div>
''' for other_image_link in other_image_links])
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Ike</title>
</head>
<body>
<h1 id="firstHeading">Ike</h1>
<div class="tab_content" style="display:block;">
<a class="image">
<img src="{primary_image_link}">
</a>
</div>
{other_images_html}
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
primary_image_url = self.spider.BASE_URL + primary_image_link
filtered_other_image_links = set(other_image_links)
filtered_other_image_links.remove(primary_image_link)
other_image_urls = [self.spider.BASE_URL + other_image_link for other_image_link in
filtered_other_image_links]
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['primaryImage'], primary_image_url, 'Primary image was not scraped correctly')
self.assertEqual(character_item['otherImages'], other_image_urls, 'Other images were not scraped correctly')
def test_when_parsing_character_given_images_are_not_found_then_images_are_not_scraped(self):
"""
Tests that images of the Fire Emblem character are not scraped when parsing the given response of the
character's web page, given that images of the Fire Emblem character are not found in the given response.
:return: None
"""
html = '''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Altina</title>
</head>
<body>
<h1 id="firstHeading">Altina</h1>
<div>No images of Altina!</div>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertNotIn('primaryImage', character_item, 'Primary image was unexpectedly scraped')
self.assertNotIn('otherImages', character_item, 'Other images were unexpectedly scraped')
def test_when_parsing_character_given_appearances_are_found_then_appearances_are_scraped(self):
"""
Tests that appearances of the Fire Emblem character are scraped when parsing the given response of the
character's web page, given that the character's appearances are found in the given response.
:return: None
"""
appearances = ['Fire Emblem: Three Houses', 'Fire Emblem: Heroes', 'Super Smash Bros. Ultimate']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Byleth</title>
</head>
<body>
<h1 id="firstHeading">Byleth</h1>
<table>
<tr>
<th>Appearances</th>
<td>
<ul>
<li>
<a href="/wiki/Fire_Emblem:_Three_Houses" title="{appearances[0]}">
Three Houses
</a>
</li>
<li>
<a href="/wiki/Fire_Emblem:_Heroes" title="{appearances[1]}">Heroes</a>
</li>
<li>
<a href="/wiki/Super_Smash_Bros._Ultimate" title="{appearances[2]}">
Super Smash Bros. Ultimate
</a>
</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['appearances'], appearances, 'Appearances were not scraped correctly')
def test_when_parsing_character_given_appearances_are_found_then_appearances_are_stripped(self):
"""
Tests that each scraped appearance of the Fire Emblem character is stripped of leading and trailing whitespace
when parsing the given response of the character's web page, given that the character's appearances are found in
the given response.
:return: None
"""
appearances = ['Fire Emblem: Three Houses', 'Fire Emblem: Heroes', 'Super Smash Bros. Ultimate']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Byleth</title>
</head>
<body>
<h1 id="firstHeading">Byleth</h1>
<table>
<tr>
<th>Appearances</th>
<td>
<ul>
<li>
<a href="/wiki/Fire_Emblem:_Three_Houses" title=" {appearances[0]}">
Three Houses
</a>
</li>
<li>
<a href="/wiki/Fire_Emblem:_Heroes" title="{appearances[1]}\n">Heroes</a>
</li>
<li>
<a href="/wiki/Super_Smash_Bros._Ultimate" title="\t{appearances[2]} \n \t ">
Super Smash Bros. Ultimate
</a>
</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['appearances'], appearances, 'Appearances were not scraped correctly')
def test_when_parsing_character_given_appearances_are_not_found_then_appearances_are_not_scraped(self):
"""
Tests that appearances of the Fire Emblem character are not scraped when parsing the given response of the
character's web page, given that the character's appearances are not found in the given response.
:return: None
"""
html = '''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Byleth</title>
</head>
<body>
<h1 id="firstHeading">Byleth</h1>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertNotIn('appearances', character_item, 'Appearances were unexpectedly scraped')
def test_when_parsing_character_given_multiple_titles_are_found_then_titles_are_scraped(self):
"""
Tests that titles of the Fire Emblem character are scraped when parsing the given response of the character's
web page, given that multiple titles for the character are found in the given response.
:return: None
"""
titles = ['Prince of Light', 'Hero-King']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Marth</title>
</head>
<body>
<h1 id="firstHeading">Marth</h1>
<table>
<tr>
<th>Title(s)</th>
<td>
<ul>
<li>{titles[0]}</li>
<li>{titles[1]}</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['titles'], titles, 'Titles were not scraped correctly')
def test_when_parsing_character_given_title_text_is_split_amongst_multiple_elements_then_titles_are_scraped(self):
"""
Tests that titles of the Fire Emblem character are scraped when parsing the given response of the character's
web page, given that the text of a title of the character is split amongst multiple elements in the given
response.
:return: None
"""
first_title_partition = 'Prince of '
second_title_partition = 'Altea'
titles = [first_title_partition + second_title_partition]
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Ike</title>
</head>
<body>
<h1 id="firstHeading">Ike</h1>
<table>
<tr>
<th>Title(s)</th>
<td>
<ul>
<li>{first_title_partition}<a href="/wiki/Altea">{second_title_partition}</a></li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['titles'], titles, 'Titles were not scraped correctly')
def test_when_parsing_character_given_one_title_is_found_then_title_is_scraped(self):
"""
Tests that the title of the Fire Emblem character is scraped when parsing the given response of the character's
web page, given that only one title for the character is found in the given response.
:return: None
"""
titles = ['Radiant Hero']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Ike</title>
</head>
<body>
<h1 id="firstHeading">Ike</h1>
<table>
<tr>
<th>Title(s)</th>
<td>
<p>{titles[0]}</p>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['titles'], titles, 'Titles were not scraped correctly')
def test_when_parsing_character_given_titles_are_found_then_titles_are_stripped(self):
"""
Tests that each scraped title of the Fire Emblem character is stripped of leading and trailing whitespace when
parsing the given response of the character's web page, given that titles of the character are found in the
given response.
:return: None
"""
titles = ['Prince of Light', 'Hero-King']
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Marth</title>
</head>
<body>
<h1 id="firstHeading">Marth</h1>
<table>
<tr>
<th>Title(s)</th>
<td>
<ul>
<li>\t{titles[0]} </li>
<li> {titles[1]}\n</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['titles'], titles, 'Titles were not scraped correctly')
def test_when_parsing_character_given_titles_are_not_found_then_titles_are_not_scraped(self):
"""
Tests that titles of the Fire Emblem character are not scraped when parsing the given response of the
character's web page, given that titles of the character are not found in the given response.
:return: None
"""
html = '''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Ike</title>
</head>
<body>
<h1 id="firstHeading">Ike</h1>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertNotIn('titles', character_item, 'Titles were unexpectedly scraped')
def test_when_parsing_character_given_only_english_voice_actors_are_found_then_only_english_voice_actors_are_scraped(
self):
"""
Tests that only English voice actors of the Fire Emblem character are scraped when parsing the given response of
the character's web page, given that only English voice actors of the character are found in the given response.
:return: None
"""
voice_actors = {
'english': ['David Lodge']
}
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Jeralt</title>
</head>
<body>
<h1 id="firstHeading">Jeralt</h1>
<table>
<tr>
<th>Voiced by</th>
<td>
<ul>
<li>
<a href="https://en.wikipedia.org/wiki/David">{voice_actors['english'][0]}</a>
<small>(English, Three Houses)</small>
</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['voiceActors'], voice_actors, 'Voice actors were not scraped correctly')
def test_when_parsing_character_given_only_japanese_voice_actors_are_found_then_only_japanese_voice_actors_are_scraped(
self):
"""
Tests that only Japanese voice actors of the Fire Emblem character are scraped when parsing the given response
of the character's web page, given that only Japanese voice actors of the character are found in the given
response.
:return: None
"""
voice_actors = {
'japanese': ['Akio Ōtsuka']
}
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Jeralt</title>
</head>
<body>
<h1 id="firstHeading">Jeralt</h1>
<table>
<tr>
<th>Voiced by</th>
<td>
<ul>
<li>
<a href="https://en.wikipedia.org/wiki/Akio">{voice_actors['japanese'][0]}</a>
<small>(Japanese, Three Houses)</small>
</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['voiceActors'], voice_actors, 'Voice actors were not scraped correctly')
def test_when_parsing_character_given_english_and_japanese_voice_actors_are_found_then_english_and_japanese_voice_actors_are_scraped(
self):
"""
Tests that both English and Japanese voice actors of the Fire Emblem character are scraped when parsing the
given response of the character's web page, given that both English and Japanese voice actors of the character
are found in the given response.
:return: None
"""
voice_actors = {
'english': ['David Lodge'],
'japanese': ['Akio Ōtsuka']
}
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Jeralt</title>
</head>
<body>
<h1 id="firstHeading">Jeralt</h1>
<table>
<tr>
<th>Voiced by</th>
<td>
<ul>
<li>
<a href="https://en.wikipedia.org/wiki/Akio">{voice_actors['japanese'][0]}</a>
<small>(Japanese, Three Houses)</small>
</li>
<li>
<a href="https://en.wikipedia.org/wiki/David">{voice_actors['english'][0]}</a>
<small>(English, Three Houses)</small>
</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['voiceActors'], voice_actors, 'Voice actors were not scraped correctly')
def test_when_parsing_character_given_voice_actors_are_found_then_voice_actors_are_stripped(
self):
"""
Tests that each scraped voice actor of the Fire Emblem character is stripped when parsing the given response of
the character's web page, given that voice actors of the character are found in the given response.
:return: None
"""
voice_actors = {
'english': ['David Lodge'],
'japanese': ['Akio Ōtsuka']
}
html = f'''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Jeralt</title>
</head>
<body>
<h1 id="firstHeading">Jeralt</h1>
<table>
<tr>
<th>Voiced by</th>
<td>
<ul>
<li>
<a href="https://en.wikipedia.org/wiki/Akio">\t{voice_actors['japanese'][0]} </a>
<small>(Japanese, Three Houses)</small>
</li>
<li>
<a href="https://en.wikipedia.org/wiki/David"> {voice_actors['english'][0]}\n</a>
<small>(English, Three Houses)</small>
</li>
</ul>
</td>
</tr>
</table>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertEqual(character_item['voiceActors'], voice_actors, 'Voice actors were not scraped correctly')
def test_when_parsing_character_given_voice_actors_are_not_found_then_voice_actors_are_not_scraped(self):
"""
Tests that voice actors are not scraped when parsing the given response of the character's web page, given that
voice actors of the character are not found in the given response.
:return: None
"""
html = '''
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Jeralt</title>
</head>
<body>
<h1 id="firstHeading">Jeralt</h1>
</body>
</html>
'''
response = HtmlResponse(url='', body=html.encode('utf-8'))
character_item = self.spider.parse_character(response)
self.assertNotIn('voiceActors', character_item, 'Voice actors were unexpectedly scraped')
if __name__ == '__main__':
unittest.main()
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: contract_api.proto
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name="contract_api.proto",
package="aea.fetchai.contract_api.v1_0_0",
syntax="proto3",
serialized_options=None,
serialized_pb=b'\n\x12\x63ontract_api.proto\x12\x1f\x61\x65\x61.fetchai.contract_api.v1_0_0"\x93\x11\n\x12\x43ontractApiMessage\x12W\n\x05\x65rror\x18\x05 \x01(\x0b\x32\x46.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_PerformativeH\x00\x12y\n\x16get_deploy_transaction\x18\x06 \x01(\x0b\x32W.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Deploy_Transaction_PerformativeH\x00\x12k\n\x0fget_raw_message\x18\x07 \x01(\x0b\x32P.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_PerformativeH\x00\x12s\n\x13get_raw_transaction\x18\x08 \x01(\x0b\x32T.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_PerformativeH\x00\x12_\n\tget_state\x18\t \x01(\x0b\x32J.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_PerformativeH\x00\x12\x63\n\x0braw_message\x18\n \x01(\x0b\x32L.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Message_PerformativeH\x00\x12k\n\x0fraw_transaction\x18\x0b \x01(\x0b\x32P.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Transaction_PerformativeH\x00\x12W\n\x05state\x18\x0c \x01(\x0b\x32\x46.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State_PerformativeH\x00\x1a\x18\n\x06Kwargs\x12\x0e\n\x06kwargs\x18\x01 \x01(\x0c\x1a!\n\nRawMessage\x12\x13\n\x0braw_message\x18\x01 \x01(\x0c\x1a)\n\x0eRawTransaction\x12\x17\n\x0fraw_transaction\x18\x01 \x01(\x0c\x1a\x16\n\x05State\x12\r\n\x05state\x18\x01 \x01(\x0c\x1a\xab\x01\n#Get_Deploy_Transaction_Performative\x12\x11\n\tledger_id\x18\x01 \x01(\t\x12\x13\n\x0b\x63ontract_id\x18\x02 \x01(\t\x12\x10\n\x08\x63\x61llable\x18\x03 \x01(\t\x12J\n\x06kwargs\x18\x04 \x01(\x0b\x32:.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Kwargs\x1a\xc2\x01\n Get_Raw_Transaction_Performative\x12\x11\n\tledger_id\x18\x01 \x01(\t\x12\x13\n\x0b\x63ontract_id\x18\x02 \x01(\t\x12\x18\n\x10\x63ontract_address\x18\x03 \x01(\t\x12\x10\n\x08\x63\x61llable\x18\x04 \x01(\t\x12J\n\x06kwargs\x18\x05 \x01(\x0b\x32:.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Kwargs\x1a\xbe\x01\n\x1cGet_Raw_Message_Performative\x12\x11\n\tledger_id\x18\x01 \x01(\t\x12\x13\n\x0b\x63ontract_id\x18\x02 \x01(\t\x12\x18\n\x10\x63ontract_address\x18\x03 \x01(\t\x12\x10\n\x08\x63\x61llable\x18\x04 \x01(\t\x12J\n\x06kwargs\x18\x05 \x01(\x0b\x32:.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Kwargs\x1a\xb8\x01\n\x16Get_State_Performative\x12\x11\n\tledger_id\x18\x01 \x01(\t\x12\x13\n\x0b\x63ontract_id\x18\x02 \x01(\t\x12\x18\n\x10\x63ontract_address\x18\x03 \x01(\t\x12\x10\n\x08\x63\x61llable\x18\x04 \x01(\t\x12J\n\x06kwargs\x18\x05 \x01(\x0b\x32:.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Kwargs\x1a^\n\x12State_Performative\x12H\n\x05state\x18\x01 \x01(\x0b\x32\x39.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State\x1a{\n\x1cRaw_Transaction_Performative\x12[\n\x0fraw_transaction\x18\x01 \x01(\x0b\x32\x42.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawTransaction\x1ao\n\x18Raw_Message_Performative\x12S\n\x0braw_message\x18\x01 \x01(\x0b\x32>.aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawMessage\x1an\n\x12\x45rror_Performative\x12\x0c\n\x04\x63ode\x18\x01 \x01(\x05\x12\x13\n\x0b\x63ode_is_set\x18\x02 \x01(\x08\x12\x0f\n\x07message\x18\x03 \x01(\t\x12\x16\n\x0emessage_is_set\x18\x04 \x01(\x08\x12\x0c\n\x04\x64\x61ta\x18\x05 \x01(\x0c\x42\x0e\n\x0cperformativeb\x06proto3',
)
_CONTRACTAPIMESSAGE_KWARGS = _descriptor.Descriptor(
name="Kwargs",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Kwargs",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="kwargs",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Kwargs.kwargs",
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=912,
serialized_end=936,
)
_CONTRACTAPIMESSAGE_RAWMESSAGE = _descriptor.Descriptor(
name="RawMessage",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawMessage",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="raw_message",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawMessage.raw_message",
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=938,
serialized_end=971,
)
_CONTRACTAPIMESSAGE_RAWTRANSACTION = _descriptor.Descriptor(
name="RawTransaction",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawTransaction",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="raw_transaction",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawTransaction.raw_transaction",
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=973,
serialized_end=1014,
)
_CONTRACTAPIMESSAGE_STATE = _descriptor.Descriptor(
name="State",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="state",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State.state",
index=0,
number=1,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1016,
serialized_end=1038,
)
_CONTRACTAPIMESSAGE_GET_DEPLOY_TRANSACTION_PERFORMATIVE = _descriptor.Descriptor(
name="Get_Deploy_Transaction_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Deploy_Transaction_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="ledger_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Deploy_Transaction_Performative.ledger_id",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="contract_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Deploy_Transaction_Performative.contract_id",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="callable",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Deploy_Transaction_Performative.callable",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="kwargs",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Deploy_Transaction_Performative.kwargs",
index=3,
number=4,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1041,
serialized_end=1212,
)
_CONTRACTAPIMESSAGE_GET_RAW_TRANSACTION_PERFORMATIVE = _descriptor.Descriptor(
name="Get_Raw_Transaction_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="ledger_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_Performative.ledger_id",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="contract_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_Performative.contract_id",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="contract_address",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_Performative.contract_address",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="callable",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_Performative.callable",
index=3,
number=4,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="kwargs",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_Performative.kwargs",
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1215,
serialized_end=1409,
)
_CONTRACTAPIMESSAGE_GET_RAW_MESSAGE_PERFORMATIVE = _descriptor.Descriptor(
name="Get_Raw_Message_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="ledger_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_Performative.ledger_id",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="contract_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_Performative.contract_id",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="contract_address",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_Performative.contract_address",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="callable",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_Performative.callable",
index=3,
number=4,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="kwargs",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_Performative.kwargs",
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1412,
serialized_end=1602,
)
_CONTRACTAPIMESSAGE_GET_STATE_PERFORMATIVE = _descriptor.Descriptor(
name="Get_State_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="ledger_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_Performative.ledger_id",
index=0,
number=1,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="contract_id",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_Performative.contract_id",
index=1,
number=2,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="contract_address",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_Performative.contract_address",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="callable",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_Performative.callable",
index=3,
number=4,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="kwargs",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_Performative.kwargs",
index=4,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1605,
serialized_end=1789,
)
_CONTRACTAPIMESSAGE_STATE_PERFORMATIVE = _descriptor.Descriptor(
name="State_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="state",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State_Performative.state",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1791,
serialized_end=1885,
)
_CONTRACTAPIMESSAGE_RAW_TRANSACTION_PERFORMATIVE = _descriptor.Descriptor(
name="Raw_Transaction_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Transaction_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="raw_transaction",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Transaction_Performative.raw_transaction",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=1887,
serialized_end=2010,
)
_CONTRACTAPIMESSAGE_RAW_MESSAGE_PERFORMATIVE = _descriptor.Descriptor(
name="Raw_Message_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Message_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="raw_message",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Message_Performative.raw_message",
index=0,
number=1,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2012,
serialized_end=2123,
)
_CONTRACTAPIMESSAGE_ERROR_PERFORMATIVE = _descriptor.Descriptor(
name="Error_Performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_Performative",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="code",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_Performative.code",
index=0,
number=1,
type=5,
cpp_type=1,
label=1,
has_default_value=False,
default_value=0,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="code_is_set",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_Performative.code_is_set",
index=1,
number=2,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="message",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_Performative.message",
index=2,
number=3,
type=9,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"".decode("utf-8"),
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="message_is_set",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_Performative.message_is_set",
index=3,
number=4,
type=8,
cpp_type=7,
label=1,
has_default_value=False,
default_value=False,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="data",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_Performative.data",
index=4,
number=5,
type=12,
cpp_type=9,
label=1,
has_default_value=False,
default_value=b"",
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[],
serialized_start=2125,
serialized_end=2235,
)
_CONTRACTAPIMESSAGE = _descriptor.Descriptor(
name="ContractApiMessage",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage",
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name="error",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.error",
index=0,
number=5,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="get_deploy_transaction",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.get_deploy_transaction",
index=1,
number=6,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="get_raw_message",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.get_raw_message",
index=2,
number=7,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="get_raw_transaction",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.get_raw_transaction",
index=3,
number=8,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="get_state",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.get_state",
index=4,
number=9,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="raw_message",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.raw_message",
index=5,
number=10,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="raw_transaction",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.raw_transaction",
index=6,
number=11,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
_descriptor.FieldDescriptor(
name="state",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.state",
index=7,
number=12,
type=11,
cpp_type=10,
label=1,
has_default_value=False,
default_value=None,
message_type=None,
enum_type=None,
containing_type=None,
is_extension=False,
extension_scope=None,
serialized_options=None,
file=DESCRIPTOR,
),
],
extensions=[],
nested_types=[
_CONTRACTAPIMESSAGE_KWARGS,
_CONTRACTAPIMESSAGE_RAWMESSAGE,
_CONTRACTAPIMESSAGE_RAWTRANSACTION,
_CONTRACTAPIMESSAGE_STATE,
_CONTRACTAPIMESSAGE_GET_DEPLOY_TRANSACTION_PERFORMATIVE,
_CONTRACTAPIMESSAGE_GET_RAW_TRANSACTION_PERFORMATIVE,
_CONTRACTAPIMESSAGE_GET_RAW_MESSAGE_PERFORMATIVE,
_CONTRACTAPIMESSAGE_GET_STATE_PERFORMATIVE,
_CONTRACTAPIMESSAGE_STATE_PERFORMATIVE,
_CONTRACTAPIMESSAGE_RAW_TRANSACTION_PERFORMATIVE,
_CONTRACTAPIMESSAGE_RAW_MESSAGE_PERFORMATIVE,
_CONTRACTAPIMESSAGE_ERROR_PERFORMATIVE,
],
enum_types=[],
serialized_options=None,
is_extendable=False,
syntax="proto3",
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name="performative",
full_name="aea.fetchai.contract_api.v1_0_0.ContractApiMessage.performative",
index=0,
containing_type=None,
fields=[],
),
],
serialized_start=56,
serialized_end=2251,
)
_CONTRACTAPIMESSAGE_KWARGS.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_RAWMESSAGE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_RAWTRANSACTION.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_STATE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_GET_DEPLOY_TRANSACTION_PERFORMATIVE.fields_by_name[
"kwargs"
].message_type = _CONTRACTAPIMESSAGE_KWARGS
_CONTRACTAPIMESSAGE_GET_DEPLOY_TRANSACTION_PERFORMATIVE.containing_type = (
_CONTRACTAPIMESSAGE
)
_CONTRACTAPIMESSAGE_GET_RAW_TRANSACTION_PERFORMATIVE.fields_by_name[
"kwargs"
].message_type = _CONTRACTAPIMESSAGE_KWARGS
_CONTRACTAPIMESSAGE_GET_RAW_TRANSACTION_PERFORMATIVE.containing_type = (
_CONTRACTAPIMESSAGE
)
_CONTRACTAPIMESSAGE_GET_RAW_MESSAGE_PERFORMATIVE.fields_by_name[
"kwargs"
].message_type = _CONTRACTAPIMESSAGE_KWARGS
_CONTRACTAPIMESSAGE_GET_RAW_MESSAGE_PERFORMATIVE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_GET_STATE_PERFORMATIVE.fields_by_name[
"kwargs"
].message_type = _CONTRACTAPIMESSAGE_KWARGS
_CONTRACTAPIMESSAGE_GET_STATE_PERFORMATIVE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_STATE_PERFORMATIVE.fields_by_name[
"state"
].message_type = _CONTRACTAPIMESSAGE_STATE
_CONTRACTAPIMESSAGE_STATE_PERFORMATIVE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_RAW_TRANSACTION_PERFORMATIVE.fields_by_name[
"raw_transaction"
].message_type = _CONTRACTAPIMESSAGE_RAWTRANSACTION
_CONTRACTAPIMESSAGE_RAW_TRANSACTION_PERFORMATIVE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_RAW_MESSAGE_PERFORMATIVE.fields_by_name[
"raw_message"
].message_type = _CONTRACTAPIMESSAGE_RAWMESSAGE
_CONTRACTAPIMESSAGE_RAW_MESSAGE_PERFORMATIVE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE_ERROR_PERFORMATIVE.containing_type = _CONTRACTAPIMESSAGE
_CONTRACTAPIMESSAGE.fields_by_name[
"error"
].message_type = _CONTRACTAPIMESSAGE_ERROR_PERFORMATIVE
_CONTRACTAPIMESSAGE.fields_by_name[
"get_deploy_transaction"
].message_type = _CONTRACTAPIMESSAGE_GET_DEPLOY_TRANSACTION_PERFORMATIVE
_CONTRACTAPIMESSAGE.fields_by_name[
"get_raw_message"
].message_type = _CONTRACTAPIMESSAGE_GET_RAW_MESSAGE_PERFORMATIVE
_CONTRACTAPIMESSAGE.fields_by_name[
"get_raw_transaction"
].message_type = _CONTRACTAPIMESSAGE_GET_RAW_TRANSACTION_PERFORMATIVE
_CONTRACTAPIMESSAGE.fields_by_name[
"get_state"
].message_type = _CONTRACTAPIMESSAGE_GET_STATE_PERFORMATIVE
_CONTRACTAPIMESSAGE.fields_by_name[
"raw_message"
].message_type = _CONTRACTAPIMESSAGE_RAW_MESSAGE_PERFORMATIVE
_CONTRACTAPIMESSAGE.fields_by_name[
"raw_transaction"
].message_type = _CONTRACTAPIMESSAGE_RAW_TRANSACTION_PERFORMATIVE
_CONTRACTAPIMESSAGE.fields_by_name[
"state"
].message_type = _CONTRACTAPIMESSAGE_STATE_PERFORMATIVE
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["error"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"error"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["get_deploy_transaction"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"get_deploy_transaction"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["get_raw_message"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"get_raw_message"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["get_raw_transaction"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"get_raw_transaction"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["get_state"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"get_state"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["raw_message"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"raw_message"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["raw_transaction"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"raw_transaction"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
_CONTRACTAPIMESSAGE.oneofs_by_name["performative"].fields.append(
_CONTRACTAPIMESSAGE.fields_by_name["state"]
)
_CONTRACTAPIMESSAGE.fields_by_name[
"state"
].containing_oneof = _CONTRACTAPIMESSAGE.oneofs_by_name["performative"]
DESCRIPTOR.message_types_by_name["ContractApiMessage"] = _CONTRACTAPIMESSAGE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
ContractApiMessage = _reflection.GeneratedProtocolMessageType(
"ContractApiMessage",
(_message.Message,),
{
"Kwargs": _reflection.GeneratedProtocolMessageType(
"Kwargs",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_KWARGS,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Kwargs)
},
),
"RawMessage": _reflection.GeneratedProtocolMessageType(
"RawMessage",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_RAWMESSAGE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawMessage)
},
),
"RawTransaction": _reflection.GeneratedProtocolMessageType(
"RawTransaction",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_RAWTRANSACTION,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.RawTransaction)
},
),
"State": _reflection.GeneratedProtocolMessageType(
"State",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_STATE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State)
},
),
"Get_Deploy_Transaction_Performative": _reflection.GeneratedProtocolMessageType(
"Get_Deploy_Transaction_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_GET_DEPLOY_TRANSACTION_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Deploy_Transaction_Performative)
},
),
"Get_Raw_Transaction_Performative": _reflection.GeneratedProtocolMessageType(
"Get_Raw_Transaction_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_GET_RAW_TRANSACTION_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Transaction_Performative)
},
),
"Get_Raw_Message_Performative": _reflection.GeneratedProtocolMessageType(
"Get_Raw_Message_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_GET_RAW_MESSAGE_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_Raw_Message_Performative)
},
),
"Get_State_Performative": _reflection.GeneratedProtocolMessageType(
"Get_State_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_GET_STATE_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Get_State_Performative)
},
),
"State_Performative": _reflection.GeneratedProtocolMessageType(
"State_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_STATE_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.State_Performative)
},
),
"Raw_Transaction_Performative": _reflection.GeneratedProtocolMessageType(
"Raw_Transaction_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_RAW_TRANSACTION_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Transaction_Performative)
},
),
"Raw_Message_Performative": _reflection.GeneratedProtocolMessageType(
"Raw_Message_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_RAW_MESSAGE_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Raw_Message_Performative)
},
),
"Error_Performative": _reflection.GeneratedProtocolMessageType(
"Error_Performative",
(_message.Message,),
{
"DESCRIPTOR": _CONTRACTAPIMESSAGE_ERROR_PERFORMATIVE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage.Error_Performative)
},
),
"DESCRIPTOR": _CONTRACTAPIMESSAGE,
"__module__": "contract_api_pb2"
# @@protoc_insertion_point(class_scope:aea.fetchai.contract_api.v1_0_0.ContractApiMessage)
},
)
_sym_db.RegisterMessage(ContractApiMessage)
_sym_db.RegisterMessage(ContractApiMessage.Kwargs)
_sym_db.RegisterMessage(ContractApiMessage.RawMessage)
_sym_db.RegisterMessage(ContractApiMessage.RawTransaction)
_sym_db.RegisterMessage(ContractApiMessage.State)
_sym_db.RegisterMessage(ContractApiMessage.Get_Deploy_Transaction_Performative)
_sym_db.RegisterMessage(ContractApiMessage.Get_Raw_Transaction_Performative)
_sym_db.RegisterMessage(ContractApiMessage.Get_Raw_Message_Performative)
_sym_db.RegisterMessage(ContractApiMessage.Get_State_Performative)
_sym_db.RegisterMessage(ContractApiMessage.State_Performative)
_sym_db.RegisterMessage(ContractApiMessage.Raw_Transaction_Performative)
_sym_db.RegisterMessage(ContractApiMessage.Raw_Message_Performative)
_sym_db.RegisterMessage(ContractApiMessage.Error_Performative)
# @@protoc_insertion_point(module_scope)
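The descriptor code above wires every performative field of `ContractApiMessage` into a single `performative` oneof, which means at most one of those fields can be set on a message at a time and setting a new one clears the previous. The following dependency-free sketch (our own `OneofSketch` class and `PERFORMATIVE_FIELDS` tuple, not part of the generated code) illustrates that semantics without requiring the protobuf runtime:

```python
# Minimal sketch (no protobuf dependency) of the "performative" oneof
# semantics set up by the descriptor above: at most one performative field
# may be set at a time, and setting a new one replaces the previous one.
# Field names mirror the generated ContractApiMessage; everything else
# here is illustrative.

PERFORMATIVE_FIELDS = (
    "error", "get_deploy_transaction", "get_raw_message",
    "get_raw_transaction", "get_state", "raw_message",
    "raw_transaction", "state",
)


class OneofSketch:
    def __init__(self):
        self._which = None   # name of the currently-set oneof field
        self._value = None

    def set_field(self, name, value):
        if name not in PERFORMATIVE_FIELDS:
            raise ValueError(f"unknown performative field: {name}")
        # setting a oneof member replaces any previously set member
        self._which = name
        self._value = value

    def which_oneof(self):
        # analogous to protobuf's WhichOneof("performative")
        return self._which


msg = OneofSketch()
msg.set_field("get_state", {"contract_id": "some_contract"})
assert msg.which_oneof() == "get_state"
msg.set_field("state", {"body": "..."})
assert msg.which_oneof() == "state"  # previous field cleared
```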
# File: isiscb/isisdata/migrations/0094_auto_20210717_1853.py
# (repo: crispzips/IsisCB, license: MIT)

# Generated by Django 3.0.7 on 2021-07-17 18:53
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('isisdata', '0093_merge_20210717_1853'),
]
operations = [
migrations.AlterField(
model_name='authority',
name='type_controlled',
field=models.CharField(blank=True, choices=[('PE', 'Person'), ('IN', 'Institution'), ('TI', 'Time Period'), ('GE', 'Geographic Term'), ('SE', 'Serial Publication'), ('CT', 'Classification Term'), ('CO', 'Concept'), ('CW', 'Creative Work'), ('EV', 'Event'), ('CR', 'Cross-reference'), ('BL', 'Bibliographic List')], db_index=True, help_text='Specifies authority type. Each authority thema has its own list of controlled type vocabulary.', max_length=2, null=True, verbose_name='type'),
),
migrations.AlterField(
model_name='historicalauthority',
name='type_controlled',
field=models.CharField(blank=True, choices=[('PE', 'Person'), ('IN', 'Institution'), ('TI', 'Time Period'), ('GE', 'Geographic Term'), ('SE', 'Serial Publication'), ('CT', 'Classification Term'), ('CO', 'Concept'), ('CW', 'Creative Work'), ('EV', 'Event'), ('CR', 'Cross-reference'), ('BL', 'Bibliographic List')], db_index=True, help_text='Specifies authority type. Each authority thema has its own list of controlled type vocabulary.', max_length=2, null=True, verbose_name='type'),
),
migrations.AlterField(
model_name='historicalperson',
name='type_controlled',
field=models.CharField(blank=True, choices=[('PE', 'Person'), ('IN', 'Institution'), ('TI', 'Time Period'), ('GE', 'Geographic Term'), ('SE', 'Serial Publication'), ('CT', 'Classification Term'), ('CO', 'Concept'), ('CW', 'Creative Work'), ('EV', 'Event'), ('CR', 'Cross-reference'), ('BL', 'Bibliographic List')], db_index=True, help_text='Specifies authority type. Each authority thema has its own list of controlled type vocabulary.', max_length=2, null=True, verbose_name='type'),
),
]
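The migration above repeats the same `choices` list on three fields. In Django, a `choices` list is a sequence of `(stored_value, human_label)` pairs: the two-character code is what is written to the database column (hence `max_length=2`), while the label is what display helpers return. This dependency-free sketch shows both directions of the mapping; the `AUTHORITY_TYPES` name is ours, not part of the migration:

```python
# The 2-character codes are the values stored in the database; the labels
# are what Django's get_<field>_display() would return for each code.
AUTHORITY_TYPES = [
    ('PE', 'Person'), ('IN', 'Institution'), ('TI', 'Time Period'),
    ('GE', 'Geographic Term'), ('SE', 'Serial Publication'),
    ('CT', 'Classification Term'), ('CO', 'Concept'),
    ('CW', 'Creative Work'), ('EV', 'Event'),
    ('CR', 'Cross-reference'), ('BL', 'Bibliographic List'),
]

# code -> label lookup, analogous to get_type_controlled_display()
code_to_label = dict(AUTHORITY_TYPES)

assert code_to_label['PE'] == 'Person'
# every stored code fits the field's max_length=2 constraint
assert all(len(code) <= 2 for code, _ in AUTHORITY_TYPES)
```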
# File: tests/shared_network_tests.py
# (repo: Repast/repast4py, license: BSD-3-Clause)

import sys
import os
from mpi4py import MPI
import unittest
from collections import OrderedDict
import networkx as nx
import re
try:
from repast4py.network import UndirectedSharedNetwork, DirectedSharedNetwork, read_network, write_network
except ModuleNotFoundError:
sys.path.append("{}/../src".format(os.path.dirname(os.path.abspath(__file__))))
from repast4py.network import UndirectedSharedNetwork, DirectedSharedNetwork, read_network, write_network
from repast4py import core, space, random
from repast4py import context as ctx
from repast4py.space import ContinuousPoint as cpt
from repast4py.space import DiscretePoint as dpt
from repast4py.space import BorderType, OccupancyType
class EAgent(core.Agent):
def __init__(self, id, agent_type, rank, energy):
super().__init__(id=id, type=agent_type, rank=rank)
self.energy = energy
self.restored = False
def save(self):
return (self.uid, self.energy)
def update(self, data):
self.restored = True
self.energy = data
def restore_agent(agent_data):
# agent_data: [aid_tuple, energy]
uid = agent_data[0]
return EAgent(uid[0], uid[1], uid[2], agent_data[1])
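The `save()`/`restore_agent` pair above is the serialization contract that repast4py's synchronization relies on: `save()` produces a picklable tuple that is shipped between ranks, and the restore callback reconstructs an equivalent agent on the receiving rank. This dependency-free sketch (a stand-in `FakeAgent` class, so it runs without mpi4py or repast4py installed) shows the round trip:

```python
# Sketch of the save()/restore round trip performed for ghost agents:
# save() yields a picklable payload; the restore callback rebuilds an
# agent from it on the receiving rank. FakeAgent stands in for EAgent.

class FakeAgent:
    def __init__(self, id, agent_type, rank, energy):
        self.uid = (id, agent_type, rank)
        self.energy = energy

    def save(self):
        # (uid_tuple, state) -- mirrors EAgent.save() above
        return (self.uid, self.energy)


def fake_restore(agent_data):
    # agent_data: (uid_tuple, energy), same shape restore_agent expects
    uid = agent_data[0]
    return FakeAgent(uid[0], uid[1], uid[2], agent_data[1])


original = FakeAgent(3, 0, 1, 42)
payload = original.save()       # what would be sent over MPI
ghost = fake_restore(payload)   # what the receiving rank reconstructs
assert ghost.uid == (3, 0, 1) and ghost.energy == 42
```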
class SharedDirectedNetworkTests(unittest.TestCase):
longMessage = True
def test_add_remove(self):
# make 1 rank comm for basic add remove tests
new_group = MPI.COMM_WORLD.Get_group().Incl([0])
comm = MPI.COMM_WORLD.Create_group(new_group)
if comm != MPI.COMM_NULL:
g = DirectedSharedNetwork('network', comm)
self.assertEqual('network', g.name)
self.assertTrue(g.is_directed)
self.assertEqual(0, g.node_count)
self.assertEqual(0, g.edge_count)
agents = [EAgent(x, 0, comm.Get_rank(), x) for x in range(10)]
g.add(agents[0])
g.add_nodes(agents[1:4])
self.assertEqual(4, g.node_count)
nodes = [x for x in g.graph.nodes]
self.assertEqual(nodes, [x for x in agents[0:4]])
g.add_edge(agents[0], agents[1])
g.add_edge(agents[0], agents[3])
g.add_edge(agents[5], agents[6], weight=12)
# 2 nodes added via edge
self.assertEqual(6, g.node_count)
self.assertEqual(3, g.edge_count)
edges = [x for x in g.graph.edges(agents[5])]
self.assertEqual(edges, [(agents[5], agents[6])])
edges = [x for x in g.graph.edges(agents[0])]
self.assertEqual(edges, [(agents[0], agents[1]), (agents[0], agents[3])])
edges = [x for x in g.graph.edges(agents[6])]
self.assertEqual(edges, [])
edge = g.graph.edges[agents[5], agents[6]]
self.assertEqual(12, edge['weight'])
self.assertTrue(g.contains_edge(agents[0], agents[1]))
self.assertFalse(g.contains_edge(agents[1], agents[0]))
self.assertTrue(not g.contains_edge(agents[7], agents[6]))
g.remove(agents[0])
self.assertEqual(5, g.node_count)
self.assertEqual(1, g.edge_count)
g.add_edge(agents[4], agents[5])
self.assertEqual(2, g.num_edges(agents[5]))
self.assertEqual(1, g.num_edges(agents[4]))
exp = {(agents[5], agents[6]), (agents[4], agents[5])}
for edge in g._edges(agents[5]):
exp.remove(edge)
self.assertEqual(0, len(exp))
def test_sync_1(self):
# Tests add, update and remove edge
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
context = ctx.SharedContext(comm)
g = DirectedSharedNetwork('network', comm)
context.add_projection(g)
self.assertEqual(0, g.node_count)
agents = [EAgent(x, 0, rank, x) for x in range(10)]
for a in agents:
context.add(a)
self.assertEqual(rank, a.local_rank)
self.assertEqual(10, g.node_count)
requests = []
if rank == 0:
requests.append(((1, 0, 1), 1))
requests.append(((1, 0, 2), 2))
requests.append(((2, 0, 1), 1))
elif rank == 3:
requests.append(((1, 0, 0), 0))
requests.append(((4, 0, 2), 2))
requests.append(((2, 0, 1), 1))
context.request_agents(requests, restore_agent)
if rank == 0 or rank == 3:
self.assertEqual(13, g.node_count)
# Edges: 0: (0, 0, 0) -> (1, 0, 1)
# 0: (2, 0, 1) -> (0, 0, 0)
# 1: (0, 0, 1) -> (1, 0, 1)
# 3: (0, 0, 3) -> (2, 0, 1)
# 3: (1, 0, 0) -> (1, 0, 3)
if rank == 0:
other = context.ghost_agent((1, 0, 1))
g.add_edge(agents[0], other, weight=2)
self.assertEqual(1, g.edge_count)
edges = [x for x in g.graph.edges(agents[0], data=True)]
self.assertEqual(edges, [(agents[0], other, {'weight': 2})])
other = context.ghost_agent((2, 0, 1))
g.add_edge(other, agents[0], rate=2)
self.assertEqual(2, g.edge_count)
edges = [x for x in g.graph.in_edges(agents[0], data=True)]
self.assertEqual(edges[0], (other, agents[0], {'rate': 2}))
elif rank == 1:
g.add_edge(agents[0], agents[1], weight=3)
self.assertEqual(1, g.edge_count)
elif rank == 3:
other = context.ghost_agent((2, 0, 1))
g.add_edge(agents[0], other, weight=10)
self.assertEqual(1, g.edge_count)
edges = [x for x in g.graph.edges(agents[0], data=True)]
self.assertEqual(edges, [(agents[0], other, {'weight': 10})])
other = context.ghost_agent((1, 0, 0))
g.add_edge(other, agents[1], rate=2.1)
self.assertEqual(2, g.edge_count)
edges = [x for x in g.graph.in_edges(agents[1], data=True)]
self.assertEqual(edges[0], (other, agents[1], OrderedDict({'rate': 2.1})))
context.synchronize(restore_agent)
# TEST vertices and edges created on ghost ranks
if rank == 0:
self.assertEqual(3, g.edge_count)
agent = context.agent((0, 0, 0))
# get out and in, in that order
edges = [x for x in g._edges(agent, data=True)]
self.assertEqual(2, len(edges))
exp = [(agent, context.ghost_agent((1, 0, 1)), {'weight': 2}),
(context.ghost_agent((2, 0, 1)), agent, {'rate': 2})]
for e in edges:
self.assertTrue(e in exp)
elif rank == 1:
self.assertEqual(4, g.edge_count)
agent = context.agent((1, 0, 1))
edges = [x for x in g._edges(agent, data=True)]
self.assertEqual(2, len(edges))
# Note: when edges are requested via edges(k) on an undirected graph,
# the edge is returned as (k, other) regardless of how the
# edge was originally inserted
exp = [(agents[0], agent, {'weight': 3}),
(context.ghost_agent((0, 0, 0)), agent, {'weight': 2})]
self.assertEqual(edges, exp)
agent = context.agent((2, 0, 1))
edges = [x for x in g._edges(agent, data=True)]
self.assertEqual(2, len(edges))
exp = [(agent, context.ghost_agent((0, 0, 0)), {'rate': 2}),
(context.ghost_agent((0, 0, 3)), agent, {'weight': 10})]
self.assertEqual(edges, exp)
elif rank == 3:
self.assertEqual(2, g.edge_count)
agent = agents[0]
edges = [x for x in g.graph.edges(agent, data=True)]
self.assertEqual(1, len(edges))
self.assertEqual(edges[0], (agent, context.ghost_agent((2, 0, 1)), {'weight': 10}))
agent = agents[1]
edges = [x for x in g.graph.in_edges(agent, data=True)]
self.assertEqual(1, len(edges))
self.assertEqual(edges[0], (context.ghost_agent((1, 0, 0)), agent, {'rate': 2.1}))
# TEST: update 2,0,1 agent and see if updates on 0 and 3
if rank == 1:
agents[2].energy = 134324
# TEST: update edge with 2,0,1 and see if updates 1 and 0
elif rank == 3:
other = context.ghost_agent((2, 0, 1))
g.update_edge(agents[0], other, weight=14.2)
context.synchronize(restore_agent)
if rank == 0:
self.assertEqual(134324, context.ghost_agent((2, 0, 1)).energy)
elif rank == 1:
self.assertEqual(14.2, g.graph.edges[context.ghost_agent((0, 0, 3)), agents[2]]['weight'])
elif rank == 3:
self.assertEqual(134324, context.ghost_agent((2, 0, 1)).energy)
# TEST: remove edge 0: (2, 0, 1) -> (0, 0, 0)
# 3: (0, 0, 3) -> (2, 0, 1)
if rank == 0:
other = context.ghost_agent((2, 0, 1))
g.remove_edge(other, agents[0])
self.assertEqual(2, g.edge_count)
elif rank == 1:
self.assertIsNotNone(context.ghost_agent((0, 0, 3)))
elif rank == 3:
other = context.ghost_agent((2, 0, 1))
g.remove_edge(agents[0], other)
context.synchronize(restore_agent)
# TEST: 3 removed (0, 0, 3) -> (2, 0, 1), so
# that edge should be removed, and (0, 0, 3) no
# longer ghosted to 1
if rank == 1:
self.assertEqual(2, g.edge_count)
self.assertTrue((agents[0], agents[1]) in g.graph.edges())
self.assertTrue((context.ghost_agent((0, 0, 0)), agents[1]) in g.graph.edges())
self.assertIsNone(context.ghost_agent((0, 0, 3)))
def test_sync_2(self):
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
context = ctx.SharedContext(comm)
g = DirectedSharedNetwork('network', comm)
context.add_projection(g)
self.assertEqual(0, g.node_count)
agents = [EAgent(x, 0, rank, x) for x in range(10)]
for a in agents:
context.add(a)
self.assertEqual(rank, a.local_rank)
self.assertEqual(10, g.node_count)
requests = []
if rank == 0:
requests.append(((1, 0, 1), 1))
requests.append(((1, 0, 2), 2))
requests.append(((2, 0, 1), 1))
elif rank == 3:
requests.append(((1, 0, 0), 0))
requests.append(((4, 0, 2), 2))
requests.append(((2, 0, 1), 1))
context.request_agents(requests, restore_agent)
if rank == 0 or rank == 3:
self.assertEqual(13, g.node_count)
# Create edges with requested vertices
# Edges: 0: (0, 0, 0) -> (1, 0, 1)
# 0: (2, 0, 1) -> (0, 0, 0)
# 1: (0, 0, 1) -> (1, 0, 1)
# 3: (0, 0, 3) -> (2, 0, 1)
# 3: (1, 0, 0) -> (1, 0, 3)
if rank == 0:
other = context.ghost_agent((1, 0, 1))
g.add_edge(agents[0], other, weight=2)
self.assertEqual(1, g.edge_count)
edges = [x for x in g.graph.edges(agents[0], data=True)]
self.assertEqual(edges, [(agents[0], other, {'weight': 2})])
other = context.ghost_agent((2, 0, 1))
g.add_edge(other, agents[0], rate=2)
self.assertEqual(2, g.edge_count)
edges = [x for x in g.graph.in_edges(agents[0], data=True)]
self.assertEqual(edges[0], (other, agents[0], {'rate': 2}))
elif rank == 1:
g.add_edge(agents[0], agents[1], weight=3)
self.assertEqual(1, g.edge_count)
elif rank == 3:
other = context.ghost_agent((2, 0, 1))
g.add_edge(agents[0], other, weight=10)
self.assertEqual(1, g.edge_count)
edges = [x for x in g.graph.edges(agents[0], data=True)]
self.assertEqual(edges, [(agents[0], other, {'weight': 10})])
other = context.ghost_agent((1, 0, 0))
g.add_edge(other, agents[1], rate=2.1)
self.assertEqual(2, g.edge_count)
edges = [x for x in g.graph.in_edges(agents[1], data=True)]
self.assertEqual(edges[0], (other, agents[1], {'rate': 2.1}))
context.synchronize(restore_agent)
# TEST: remove (1, 0, 1) from 1
# edges should be removed across processes
if rank == 1:
context.remove(agents[1])
self.assertEqual(2, g.edge_count)
self.assertTrue((agents[2], context.ghost_agent((0, 0, 0))) in g.graph.edges())
self.assertTrue((context.ghost_agent((0, 0, 3)), agents[2]) in g.graph.edges())
self.assertTrue((1, 0, 1) not in context._agent_manager._ghosted_agents)
context.synchronize(restore_agent)
# (0, 0, 0) -> (1, 0, 1) removed
if rank == 0:
# 1, 0, 1 no longer ghosted to 0
self.assertTrue((1, 0, 1) not in context._agent_manager._ghost_agents)
self.assertEqual(2, len(g.graph.edges()))
self.assertTrue((context.ghost_agent((2, 0, 1)), agents[0]) in g.graph.edges())
self.assertTrue((agents[1], context.ghost_agent((1, 0, 3))) in g.graph.edges())
# Add edge on 1, then move one of its nodes
if rank == 1:
g.add_edge(agents[2], agents[4], weight=42)
# print(g.graph.nodes())
# print(g.graph.edges())
# print(rank, g.graph.edges(), flush=True)
# TEST: Move (2, 0, 1) to 0
# * local to 0
# * ghosted from 0 to 3 and 1
moved = []
if rank == 1:
# move (2, 0, 1) to rank 0
moved.append(((2, 0, 1), 0))
context.move_agents(moved, restore_agent)
# 2,0,1 is on 0 with local_rank of 0
if rank == 0:
self.assertIsNotNone(context.agent((2, 0, 1)))
self.assertEqual(0, context.agent((2, 0, 1)).local_rank)
self.assertTrue(g.graph.has_edge(context.agent((2, 0, 1)), context.agent((0, 0, 0))))
self.assertTrue((2, 0, 1) in context._agent_manager._ghosted_agents)
ghosted_to_ranks = context._agent_manager._ghosted_agents[(2, 0, 1)].ghost_ranks
self.assertTrue(3 in ghosted_to_ranks)
self.assertTrue(g.graph.has_edge(context.ghost_agent((0, 0, 3)), context.agent((2, 0, 1))))
self.assertTrue(1 in ghosted_to_ranks)
self.assertTrue(g.graph.has_edge(context.agent((2, 0, 1)), context.ghost_agent((4, 0, 1))))
self.assertEqual(42, g.graph.edges[context.agent((2, 0, 1)),
context.ghost_agent((4, 0, 1))]['weight'])
# original 10 + moved 2,0,1 + requested 1,0,2 + 0,0,3 and 0,0,0 and 1,0,3 through
# synchronized edges
self.assertEqual(15, g.node_count)
elif rank == 1:
# print(g.graph.edges())
# 2,0,1 moved
self.assertIsNone(context.agent((2, 0, 1)))
self.assertIsNotNone(context.ghost_agent((2, 0, 1)))
# these were ghost agents through edges with 2,0,1
# but should now be removed
self.assertIsNone(context.ghost_agent((0, 0, 3)))
self.assertIsNone(context.ghost_agent((0, 0, 0)))
self.assertFalse(g.graph.has_edge(context.ghost_agent((4, 0, 1)), context.agent((0, 0, 3))))
self.assertFalse(g.graph.has_edge(context.ghost_agent((4, 0, 1)), context.agent((0, 0, 0))))
# ghosted from 0 to 1 because of 2,0,1 -> 4,0,1 edge
self.assertIsNotNone(context.ghost_agent((2, 0, 1)))
self.assertTrue(g.graph.has_edge(context.ghost_agent((2, 0, 1)), context.agent((4, 0, 1))))
elif rank == 3:
self.assertIsNone(context.agent((2, 0, 1)))
self.assertIsNotNone(context.ghost_agent((2, 0, 1)))
# Test: update edge(0,0,3 - 2,0,1) with new data
# ghost edge on 0 reflects change
if rank == 3:
g.update_edge(context.agent((0, 0, 3)), context.ghost_agent((2, 0, 1)), weight=12)
self.assertEqual(12, g.graph.edges[context.agent((0, 0, 3)),
context.ghost_agent((2, 0, 1))]['weight'])
context.synchronize(restore_agent)
if rank == 0:
self.assertEqual(12, g.graph.edges[context.ghost_agent((0, 0, 3)),
context.agent((2, 0, 1))]['weight'])
def test_with_oob(self):
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
context = ctx.SharedContext(comm)
agents = []
for i in range(20):
a = EAgent(i, 0, rank, 1)
agents.append(a)
context.add(a)
box = space.BoundingBox(xmin=0, xextent=90, ymin=0, yextent=120, zmin=0, zextent=0)
cspace = space.SharedCSpace("shared_space", bounds=box, borders=BorderType.Sticky,
occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm, tree_threshold=100)
grid = space.SharedGrid("shared_grid", bounds=box, borders=BorderType.Sticky,
occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm)
net = DirectedSharedNetwork('network', comm)
context.add_projection(cspace)
context.add_projection(grid)
context.add_projection(net)
random.init(42)
bounds = grid.get_local_bounds()
xs = random.default_rng.integers(low=bounds.xmin, high=bounds.xmin + bounds.xextent, size=20)
ys = random.default_rng.integers(low=bounds.ymin, high=bounds.ymin + bounds.yextent, size=20)
for i, agent in enumerate(agents):
grid.move(agent, dpt(xs[i], ys[i]))
cspace.move(agent, cpt(xs[i], ys[i]))
# TEST:
# 1. request agents from neighboring ranks
# 2. make edges between local agent and ghosts
# 3. move agents oob, such that
# a. former ghost is now local
# b. ghost is now on different rank
# 4. do tests
requests = []
if rank == 0:
requests.append(((1, 0, 1), 1))
requests.append(((1, 0, 2), 2))
requests.append(((2, 0, 1), 1))
elif rank == 3:
requests.append(((1, 0, 0), 0))
requests.append(((4, 0, 2), 2))
requests.append(((2, 0, 1), 1))
context.request_agents(requests, restore_agent)
if rank == 0:
net.add_edge(agents[1], context.ghost_agent((2, 0, 1)), color='red')
net.add_edge(agents[10], context.ghost_agent((1, 0, 1)))
elif rank == 1:
net.add_edge(agents[2], agents[1])
elif rank == 3:
net.add_edge(agents[1], context.ghost_agent((1, 0, 0)))
net.add_edge(agents[5], context.ghost_agent((2, 0, 1)))
context.synchronize(restore_agent)
# TESTS edges
if rank == 0:
self.assertEqual(3, net.edge_count)
self.assertTrue(net.graph.has_edge(agents[10], context.ghost_agent((1, 0, 1))))
self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
self.assertTrue(net.graph.has_edge(context.ghost_agent((1, 0, 3)), agents[1]))
elif rank == 1:
self.assertEqual(4, net.edge_count)
self.assertTrue(net.graph.has_edge(context.ghost_agent((10, 0, 0)), agents[1], ))
self.assertTrue(net.graph.has_edge(agents[2], agents[1]))
self.assertTrue(net.graph.has_edge(context.ghost_agent((1, 0, 0)), agents[2]))
self.assertTrue(net.graph.has_edge(context.ghost_agent((5, 0, 3)), agents[2]))
self.assertEqual('red', net.graph.edges[context.ghost_agent((1, 0, 0)), agents[2]]['color'])
elif rank == 2:
self.assertEqual(0, net.edge_count)
elif rank == 3:
self.assertEqual(2, net.edge_count)
self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((1, 0, 0))))
self.assertTrue(net.graph.has_edge(agents[5], context.ghost_agent((2, 0, 1))))
# Bounds:
# print('{}: bounds: {}'.format(rank, grid.get_local_bounds()), flush=True)
        # 0: bounds: BoundingBox(xmin=0, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 1: bounds: BoundingBox(xmin=0, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # 2: bounds: BoundingBox(xmin=45, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 3: bounds: BoundingBox(xmin=45, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # Move (2, 0, 1) to 2's bounds
        if rank == 1:
            grid.move(agents[2], dpt(46, 35))
            cspace.move(agents[2], cpt(46.2, 35.1))
        context.synchronize(restore_agent)
        if rank == 0:
            self.assertEqual(3, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[10], context.ghost_agent((1, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
            self.assertTrue(net.graph.has_edge(context.ghost_agent((1, 0, 3)), agents[1]))
        elif rank == 1:
            self.assertEqual(2, net.edge_count)
            self.assertTrue(net.graph.has_edge(context.ghost_agent((10, 0, 0)), agents[1]))
            self.assertTrue(net.graph.has_edge(context.ghost_agent((2, 0, 1)), agents[1]))
        elif rank == 2:
            agent_201 = context.agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
            self.assertEqual(3, net.edge_count)
            self.assertTrue(net.graph.has_edge(context.ghost_agent((1, 0, 0)), agent_201))
            self.assertTrue(net.graph.has_edge(context.ghost_agent((5, 0, 3)), agent_201))
            self.assertTrue(net.graph.has_edge(agent_201, context.ghost_agent((1, 0, 1))))
            self.assertEqual('red', net.graph.edges[context.ghost_agent((1, 0, 0)), agent_201]['color'])
        elif rank == 3:
            self.assertEqual(2, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((1, 0, 0))))
            self.assertTrue(net.graph.has_edge(agents[5], context.ghost_agent((2, 0, 1))))
        if rank == 0:
            agents[1].energy = 101
        elif rank == 2:
            # 201 to 3
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(46, 80))
            cspace.move(agent_201, cpt(46.2, 80.1))
        context.synchronize(restore_agent)
        # print(f'{rank}: {net.graph.edges()}', flush=True)
        if rank == 0:
            self.assertEqual(3, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[10], context.ghost_agent((1, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
            self.assertTrue(net.graph.has_edge(context.ghost_agent((1, 0, 3)), agents[1]))
        elif rank == 1:
            self.assertEqual(2, net.edge_count)
            self.assertTrue(net.graph.has_edge(context.ghost_agent((10, 0, 0)), agents[1]))
            self.assertTrue(net.graph.has_edge(context.ghost_agent((2, 0, 1)), agents[1]))
        elif rank == 2:
            self.assertEqual(0, net.edge_count)
        elif rank == 3:
            self.assertEqual(4, net.edge_count)
            agent_201 = context.agent((2, 0, 1))
            agent_100 = context.ghost_agent((1, 0, 0))
            self.assertIsNotNone(agent_201)
            self.assertTrue(net.graph.has_edge(agents[1], agent_100))
            self.assertTrue(net.graph.has_edge(agents[5], agent_201))
            self.assertTrue(net.graph.has_edge(agent_201, context.ghost_agent((1, 0, 1))))
            self.assertTrue(net.graph.has_edge(agent_100, agent_201))
            self.assertEqual(101, context.ghost_agent((1, 0, 0)).energy)

    def test_in_buffer(self):
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        agents = []
        for i in range(20):
            a = EAgent(i, 0, rank, 1)
            agents.append(a)
            context.add(a)
        box = space.BoundingBox(xmin=0, xextent=90, ymin=0, yextent=120, zmin=0, zextent=0)
        cspace = space.SharedCSpace("shared_space", bounds=box, borders=BorderType.Sticky,
                                    occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm,
                                    tree_threshold=100)
        grid = space.SharedGrid("shared_grid", bounds=box, borders=BorderType.Sticky,
                                occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm)
        net = DirectedSharedNetwork('network', comm)
        context.add_projection(net)
        context.add_projection(cspace)
        context.add_projection(grid)
        random.init(42)
        bounds = grid.get_local_bounds()
        xs = random.default_rng.integers(low=bounds.xmin, high=bounds.xmin + bounds.xextent, size=20)
        ys = random.default_rng.integers(low=bounds.ymin, high=bounds.ymin + bounds.yextent, size=20)
        for i, agent in enumerate(agents):
            grid.move(agent, dpt(xs[i], ys[i]))
            cspace.move(agent, cpt(xs[i], ys[i]))
        # Bounds:
        # print('{}: bounds: {}'.format(rank, grid.get_local_bounds()))
        # 0: bounds: BoundingBox(xmin=0, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 1: bounds: BoundingBox(xmin=0, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # 2: bounds: BoundingBox(xmin=45, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 3: bounds: BoundingBox(xmin=45, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # TEST:
        # Request an agent that's in the buffer, then move it off of the buffer.
        # Is it still properly ghosted?
        if rank == 1:
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(10, 60))
            cspace.move(agent_201, cpt(10.2, 60.1))
        context.synchronize(restore_agent)
        if rank == 0:
            agent_201 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
        requests = []
        if rank == 0:
            requests.append(((2, 0, 1), 1))
        context.request_agents(requests, restore_agent)
        if rank == 0:
            agent_201 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
        if rank == 1:
            # move off of the buffer
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(10, 66))
            cspace.move(agent_201, cpt(10.2, 66.1))
        context.synchronize(restore_agent)
        if rank == 0:
            self.assertIsNotNone(context.ghost_agent((2, 0, 1)))


class SharedUndirectedNetworkTests(unittest.TestCase):

    long_message = True

    def test_add_remove(self):
        # make a 1 rank comm for basic add / remove tests
        new_group = MPI.COMM_WORLD.Get_group().Incl([0])
        comm = MPI.COMM_WORLD.Create_group(new_group)
        if comm != MPI.COMM_NULL:
            g = UndirectedSharedNetwork('network', comm)
            self.assertEqual('network', g.name)
            self.assertFalse(g.is_directed)
            self.assertEqual(0, g.node_count)
            self.assertEqual(0, g.edge_count)
            agents = [EAgent(x, 0, comm.Get_rank(), x) for x in range(10)]
            g.add(agents[0])
            g.add_nodes(agents[1:4])
            self.assertEqual(4, g.node_count)
            nodes = [x for x in g.graph.nodes]
            self.assertEqual(nodes, [x for x in agents[0:4]])
            g.add_edge(agents[0], agents[1])
            g.add_edge(agents[0], agents[3])
            g.add_edge(agents[5], agents[6], weight=12)
            # 2 nodes added via the edge
            self.assertEqual(6, g.node_count)
            self.assertEqual(3, g.edge_count)
            edges = [x for x in g.graph.edges(agents[5])]
            self.assertEqual(edges, [(agents[5], agents[6])])
            edges = [x for x in g.graph.edges(agents[0])]
            self.assertEqual(edges, [(agents[0], agents[1]), (agents[0], agents[3])])
            edges = [x for x in g.graph.edges(agents[6])]
            self.assertEqual(edges, [(agents[6], agents[5])])
            edge = g.graph.edges[agents[5], agents[6]]
            self.assertEqual(12, edge['weight'])
            self.assertTrue(g.contains_edge(agents[0], agents[1]))
            self.assertTrue(g.contains_edge(agents[1], agents[0]))
            self.assertFalse(g.contains_edge(agents[7], agents[6]))
            g.remove(agents[0])
            self.assertEqual(5, g.node_count)
            self.assertEqual(1, g.edge_count)
            g.add_edge(agents[4], agents[5])
            self.assertEqual(2, g.num_edges(agents[5]))
            self.assertEqual(1, g.num_edges(agents[4]))
            # Note (5, 4) rather than (4, 5): when getting edges by node from an
            # undirected network, the asked-for node is returned first.
            exp = {(agents[5], agents[6]), (agents[5], agents[4])}
            for edge in g._edges(agents[5]):
                exp.remove(edge)
            self.assertEqual(0, len(exp))
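The ordering noted in the comment above matches stock NetworkX behavior for undirected graphs; a minimal sketch of just that behavior (assuming only the `networkx` package, not the shared-network wrapper):

```python
import networkx as nx

# In an undirected NetworkX graph, G.edges(n) reports each incident
# edge with the queried node n first, regardless of insertion order.
G = nx.Graph()
G.add_edge('a', 'b')
G.add_edge('c', 'a')  # inserted as (c, a) ...
assert list(G.edges('a')) == [('a', 'b'), ('a', 'c')]  # ... reported as (a, c)
```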

    def test_sync_1(self):
        # Tests add, update and remove edge
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        g = UndirectedSharedNetwork('network', comm)
        context.add_projection(g)
        self.assertEqual(0, g.node_count)
        agents = [EAgent(x, 0, rank, x) for x in range(10)]
        for a in agents:
            context.add(a)
            self.assertEqual(rank, a.local_rank)
        self.assertEqual(10, g.node_count)
        requests = []
        if rank == 0:
            requests.append(((1, 0, 1), 1))
            requests.append(((1, 0, 2), 2))
            requests.append(((2, 0, 1), 1))
        elif rank == 3:
            requests.append(((1, 0, 0), 0))
            requests.append(((4, 0, 2), 2))
            requests.append(((2, 0, 1), 1))
        context.request_agents(requests, restore_agent)
        if rank == 0 or rank == 3:
            self.assertEqual(13, g.node_count)
        # Edges: 0: (0, 0, 0) -> (1, 0, 1)
        #        0: (2, 0, 1) -> (0, 0, 0)
        #        1: (0, 0, 1) -> (1, 0, 1)
        #        3: (0, 0, 3) -> (2, 0, 1)
        #        3: (1, 0, 0) -> (1, 0, 3)
        if rank == 0:
            other = context.ghost_agent((1, 0, 1))
            g.add_edge(agents[0], other, weight=2)
            self.assertEqual(1, g.edge_count)
            edges = [x for x in g.graph.edges(agents[0], data=True)]
            self.assertEqual(edges, [(agents[0], other, {'weight': 2})])
            other = context.ghost_agent((2, 0, 1))
            g.add_edge(other, agents[0], rate=2)
            self.assertEqual(2, g.edge_count)
            edges = [x for x in g.graph.edges(agents[0], data=True)]
            self.assertEqual(edges[1], (agents[0], other, {'rate': 2}))
        elif rank == 1:
            g.add_edge(agents[0], agents[1], weight=3)
            self.assertEqual(1, g.edge_count)
        elif rank == 3:
            other = context.ghost_agent((2, 0, 1))
            g.add_edge(agents[0], other, weight=10)
            self.assertEqual(1, g.edge_count)
            edges = [x for x in g.graph.edges(agents[0], data=True)]
            self.assertEqual(edges, [(agents[0], other, {'weight': 10})])
            other = context.ghost_agent((1, 0, 0))
            g.add_edge(other, agents[1], rate=2.1)
            self.assertEqual(2, g.edge_count)
            edges = [x for x in g.graph.edges(agents[1], data=True)]
            self.assertEqual(edges[0], (agents[1], other, {'rate': 2.1}))
        context.synchronize(restore_agent)
        # TEST: vertices and edges created on ghost ranks
        if rank == 0:
            self.assertEqual(3, g.edge_count)
            agent = context.agent((0, 0, 0))
            edges = [x for x in g.graph.edges(agent, data=True)]
            self.assertEqual(2, len(edges))
            exp = [(agent, context.ghost_agent((2, 0, 1)), {'rate': 2}),
                   (agent, context.ghost_agent((1, 0, 1)), {'weight': 2})]
            for e in edges:
                self.assertTrue(e in exp)
        elif rank == 1:
            self.assertEqual(4, g.edge_count)
            agent = context.agent((1, 0, 1))
            edges = [x for x in g.graph.edges(agent, data=True)]
            self.assertEqual(2, len(edges))
            # In an undirected graph, when requesting via edges(k),
            # the edge is returned as (k, other) regardless of how the
            # edge was originally inserted.
            exp = [(agent, agents[0], {'weight': 3}), (agent, context.ghost_agent((0, 0, 0)), {'weight': 2})]
            self.assertEqual(edges, exp)
            agent = context.agent((2, 0, 1))
            edges = [x for x in g.graph.edges(agent, data=True)]
            self.assertEqual(2, len(edges))
            exp = [(agent, context.ghost_agent((0, 0, 0)), {'rate': 2}),
                   (agent, context.ghost_agent((0, 0, 3)), {'weight': 10})]
            self.assertEqual(edges, exp)
        elif rank == 3:
            self.assertEqual(2, g.edge_count)
            agent = agents[0]
            edges = [x for x in g.graph.edges(agent, data=True)]
            self.assertEqual(1, len(edges))
            self.assertEqual(edges[0], (agent, context.ghost_agent((2, 0, 1)), {'weight': 10}))
            agent = agents[1]
            edges = [x for x in g.graph.edges(agent, data=True)]
            self.assertEqual(1, len(edges))
            self.assertEqual(edges[0], (agent, context.ghost_agent((1, 0, 0)), {'rate': 2.1}))
        # TEST: update the (2, 0, 1) agent and see if it updates on 0 and 3
        if rank == 1:
            agents[2].energy = 134324
        # TEST: update an edge with (2, 0, 1) and see if it updates on 1 and 0
        elif rank == 3:
            other = context.ghost_agent((2, 0, 1))
            g.update_edge(agents[0], other, weight=14.2)
        context.synchronize(restore_agent)
        if rank == 0:
            self.assertEqual(134324, context.ghost_agent((2, 0, 1)).energy)
        elif rank == 1:
            self.assertEqual(14.2, g.graph.edges[agents[2], context.ghost_agent((0, 0, 3))]['weight'])
        elif rank == 3:
            self.assertEqual(134324, context.ghost_agent((2, 0, 1)).energy)
        # TEST: remove edge 0: (2, 0, 1) -> (0, 0, 0)
        #                   3: (0, 0, 3) -> (2, 0, 1)
        if rank == 0:
            other = context.ghost_agent((2, 0, 1))
            g.remove_edge(agents[0], other)
            self.assertEqual(2, g.edge_count)
        elif rank == 1:
            self.assertIsNotNone(context.ghost_agent((0, 0, 3)))
        elif rank == 3:
            other = context.ghost_agent((2, 0, 1))
            g.remove_edge(other, agents[0])
        context.synchronize(restore_agent)
        # TEST: 3 removed (0, 0, 3) -> (2, 0, 1), so
        # that edge should be removed, and (0, 0, 3) is no
        # longer ghosted to 1
        if rank == 1:
            self.assertEqual(2, g.edge_count)
            self.assertTrue((agents[0], agents[1]) in g.graph.edges())
            self.assertTrue((agents[1], context.ghost_agent((0, 0, 0))) in g.graph.edges())
            self.assertIsNone(context.ghost_agent((0, 0, 3)))

    def test_sync_2(self):
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        g = UndirectedSharedNetwork('network', comm)
        context.add_projection(g)
        self.assertEqual(0, g.node_count)
        agents = [EAgent(x, 0, rank, x) for x in range(10)]
        for a in agents:
            context.add(a)
            self.assertEqual(rank, a.local_rank)
        self.assertEqual(10, g.node_count)
        requests = []
        if rank == 0:
            requests.append(((1, 0, 1), 1))
            requests.append(((1, 0, 2), 2))
            requests.append(((2, 0, 1), 1))
        elif rank == 3:
            requests.append(((1, 0, 0), 0))
            requests.append(((4, 0, 2), 2))
            requests.append(((2, 0, 1), 1))
        context.request_agents(requests, restore_agent)
        if rank == 0 or rank == 3:
            self.assertEqual(13, g.node_count)
        # Create edges with the requested vertices
        # Edges: 0: (0, 0, 0) -> (1, 0, 1)
        #        0: (2, 0, 1) -> (0, 0, 0)
        #        1: (0, 0, 1) -> (1, 0, 1)
        #        3: (0, 0, 3) -> (2, 0, 1)
        #        3: (1, 0, 0) -> (1, 0, 3)
        if rank == 0:
            other = context.ghost_agent((1, 0, 1))
            g.add_edge(agents[0], other, weight=2)
            self.assertEqual(1, g.edge_count)
            edges = [x for x in g.graph.edges(agents[0], data=True)]
            self.assertEqual(edges, [(agents[0], other, {'weight': 2})])
            other = context.ghost_agent((2, 0, 1))
            g.add_edge(other, agents[0], rate=2)
            self.assertEqual(2, g.edge_count)
            edges = [x for x in g.graph.edges(agents[0], data=True)]
            self.assertEqual(edges[1], (agents[0], other, {'rate': 2}))
        elif rank == 1:
            g.add_edge(agents[0], agents[1], weight=3)
            self.assertEqual(1, g.edge_count)
        elif rank == 3:
            other = context.ghost_agent((2, 0, 1))
            g.add_edge(agents[0], other, weight=10)
            self.assertEqual(1, g.edge_count)
            edges = [x for x in g.graph.edges(agents[0], data=True)]
            self.assertEqual(edges, [(agents[0], other, {'weight': 10})])
            other = context.ghost_agent((1, 0, 0))
            g.add_edge(other, agents[1], rate=2.1)
            self.assertEqual(2, g.edge_count)
            edges = [x for x in g.graph.edges(agents[1], data=True)]
            self.assertEqual(edges[0], (agents[1], other, {'rate': 2.1}))
        context.synchronize(restore_agent)
        # TEST: remove (1, 0, 1) from 1;
        # its edges should be removed across processes
        if rank == 1:
            context.remove(agents[1])
            self.assertEqual(2, g.edge_count)
            self.assertTrue((agents[2], context.ghost_agent((0, 0, 0))) in g.graph.edges())
            self.assertTrue((agents[2], context.ghost_agent((0, 0, 3))) in g.graph.edges())
            self.assertTrue((1, 0, 1) not in context._agent_manager._ghosted_agents)
        context.synchronize(restore_agent)
        # (0, 0, 0) -> (1, 0, 1) removed
        if rank == 0:
            # (1, 0, 1) no longer ghosted to 0
            self.assertTrue((1, 0, 1) not in context._agent_manager._ghost_agents)
            self.assertEqual(2, len(g.graph.edges()))
            self.assertTrue((agents[0], context.ghost_agent((2, 0, 1))) in g.graph.edges())
            self.assertTrue((agents[1], context.ghost_agent((1, 0, 3))) in g.graph.edges())
        # Add an edge on 1, then move one of its endpoints
        if rank == 1:
            g.add_edge(agents[2], agents[4], weight=42)
        # print(g.graph.nodes())
        # print(g.graph.edges())
        # print(rank, g.graph.edges(), flush=True)
        # TEST: Move (2, 0, 1) to 0
        # * local to 0
        # * ghosted from 0 to 3 and 1
        moved = []
        if rank == 1:
            # move (2, 0, 1) to 0
            moved.append(((2, 0, 1), 0))
        context.move_agents(moved, restore_agent)
        # TEST: (2, 0, 1) is on 0 with a local_rank of 0
        if rank == 0:
            self.assertIsNotNone(context.agent((2, 0, 1)))
            self.assertEqual(0, context.agent((2, 0, 1)).local_rank)
            self.assertTrue(g.graph.has_edge(context.agent((2, 0, 1)), context.agent((0, 0, 0))))
            self.assertTrue((2, 0, 1) in context._agent_manager._ghosted_agents)
            ghosted_to_ranks = context._agent_manager._ghosted_agents[(2, 0, 1)].ghost_ranks
            self.assertTrue(3 in ghosted_to_ranks)
            self.assertTrue(g.graph.has_edge(context.agent((2, 0, 1)), context.ghost_agent((0, 0, 3))))
            self.assertTrue(1 in ghosted_to_ranks)
            self.assertTrue(g.graph.has_edge(context.agent((2, 0, 1)), context.ghost_agent((4, 0, 1))))
            self.assertEqual(42, g.graph.edges[context.agent((2, 0, 1)),
                                               context.ghost_agent((4, 0, 1))]['weight'])
            # original 10 + moved 2,0,1 + requested 1,0,2 + 0,0,3 and 0,0,0 and 1,0,3 through
            # synchronized edges
            self.assertEqual(15, g.node_count)
        elif rank == 1:
            # print(g.graph.edges())
            # (2, 0, 1) moved away
            self.assertIsNone(context.agent((2, 0, 1)))
            self.assertIsNotNone(context.ghost_agent((2, 0, 1)))
            # these were ghost agents through edges with (2, 0, 1),
            # but should now be removed
            self.assertIsNone(context.ghost_agent((0, 0, 3)))
            self.assertIsNone(context.ghost_agent((0, 0, 0)))
            self.assertFalse(g.graph.has_edge(context.ghost_agent((4, 0, 1)), context.agent((0, 0, 3))))
            self.assertFalse(g.graph.has_edge(context.ghost_agent((4, 0, 1)), context.agent((0, 0, 0))))
            # ghosted from 0 to 1 because of the (2, 0, 1) -> (4, 0, 1) edge
            self.assertIsNotNone(context.ghost_agent((2, 0, 1)))
            self.assertTrue(g.graph.has_edge(context.ghost_agent((2, 0, 1)), context.agent((4, 0, 1))))
        elif rank == 3:
            self.assertIsNone(context.agent((2, 0, 1)))
            self.assertIsNotNone(context.ghost_agent((2, 0, 1)))
        # TEST: update edge (0,0,3 - 2,0,1) with new data;
        # the ghost edge on 0 reflects the change
        if rank == 3:
            g.update_edge(context.agent((0, 0, 3)), context.ghost_agent((2, 0, 1)), weight=12)
            self.assertEqual(12, g.graph.edges[context.agent((0, 0, 3)),
                                               context.ghost_agent((2, 0, 1))]['weight'])
        context.synchronize(restore_agent)
        if rank == 0:
            self.assertEqual(12, g.graph.edges[context.ghost_agent((0, 0, 3)),
                                               context.agent((2, 0, 1))]['weight'])

    def test_with_oob(self):
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        agents = []
        for i in range(20):
            a = EAgent(i, 0, rank, 1)
            agents.append(a)
            context.add(a)
        box = space.BoundingBox(xmin=0, xextent=90, ymin=0, yextent=120, zmin=0, zextent=0)
        cspace = space.SharedCSpace("shared_space", bounds=box, borders=BorderType.Sticky,
                                    occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm,
                                    tree_threshold=100)
        grid = space.SharedGrid("shared_grid", bounds=box, borders=BorderType.Sticky,
                                occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm)
        net = UndirectedSharedNetwork('network', comm)
        context.add_projection(cspace)
        context.add_projection(grid)
        context.add_projection(net)
        random.init(42)
        bounds = grid.get_local_bounds()
        xs = random.default_rng.integers(low=bounds.xmin, high=bounds.xmin + bounds.xextent, size=20)
        ys = random.default_rng.integers(low=bounds.ymin, high=bounds.ymin + bounds.yextent, size=20)
        for i, agent in enumerate(agents):
            grid.move(agent, dpt(xs[i], ys[i]))
            cspace.move(agent, cpt(xs[i], ys[i]))
        # TEST:
        # 1. request agents from neighboring ranks
        # 2. make edges between local agent and ghosts
        # 3. move agents oob, such that
        #    a. former ghost is now local
        #    b. ghost is now on different rank
        # 4. do tests
        requests = []
        if rank == 0:
            requests.append(((1, 0, 1), 1))
            requests.append(((1, 0, 2), 2))
            requests.append(((2, 0, 1), 1))
        elif rank == 3:
            requests.append(((1, 0, 0), 0))
            requests.append(((4, 0, 2), 2))
            requests.append(((2, 0, 1), 1))
        context.request_agents(requests, restore_agent)
        if rank == 0:
            net.add_edge(agents[1], context.ghost_agent((2, 0, 1)), color='red')
            net.add_edge(agents[10], context.ghost_agent((1, 0, 1)))
        elif rank == 1:
            net.add_edge(agents[2], agents[1])
        elif rank == 3:
            net.add_edge(agents[1], context.ghost_agent((1, 0, 0)))
            net.add_edge(agents[5], context.ghost_agent((2, 0, 1)))
        context.synchronize(restore_agent)
        # TESTS edges
        if rank == 0:
            self.assertEqual(3, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[10], context.ghost_agent((1, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((1, 0, 3))))
        elif rank == 1:
            self.assertEqual(4, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((10, 0, 0))))
            self.assertTrue(net.graph.has_edge(agents[1], agents[2]))
            self.assertTrue(net.graph.has_edge(agents[2], context.ghost_agent((1, 0, 0))))
            self.assertTrue(net.graph.has_edge(agents[2], context.ghost_agent((5, 0, 3))))
            self.assertEqual('red', net.graph.edges[agents[2], context.ghost_agent((1, 0, 0))]['color'])
        elif rank == 2:
            self.assertEqual(0, net.edge_count)
        elif rank == 3:
            self.assertEqual(2, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((1, 0, 0))))
            self.assertTrue(net.graph.has_edge(agents[5], context.ghost_agent((2, 0, 1))))
        # Bounds:
        # print('{}: bounds: {}'.format(rank, grid.get_local_bounds()), flush=True)
        # 0: bounds: BoundingBox(xmin=0, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 1: bounds: BoundingBox(xmin=0, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # 2: bounds: BoundingBox(xmin=45, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 3: bounds: BoundingBox(xmin=45, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # Move (2, 0, 1) to 2's bounds
        if rank == 1:
            grid.move(agents[2], dpt(46, 35))
            cspace.move(agents[2], cpt(46.2, 35.1))
        context.synchronize(restore_agent)
        if rank == 0:
            self.assertEqual(3, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[10], context.ghost_agent((1, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((1, 0, 3))))
        elif rank == 1:
            self.assertEqual(2, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((10, 0, 0))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
        elif rank == 2:
            agent_201 = context.agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
            self.assertEqual(3, net.edge_count)
            self.assertTrue(net.graph.has_edge(agent_201, context.ghost_agent((1, 0, 0))))
            self.assertTrue(net.graph.has_edge(agent_201, context.ghost_agent((5, 0, 3))))
            self.assertTrue(net.graph.has_edge(agent_201, context.ghost_agent((1, 0, 1))))
            self.assertEqual('red', net.graph.edges[agent_201, context.ghost_agent((1, 0, 0))]['color'])
        elif rank == 3:
            self.assertEqual(2, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((1, 0, 0))))
            self.assertTrue(net.graph.has_edge(agents[5], context.ghost_agent((2, 0, 1))))
        if rank == 0:
            agents[1].energy = 101
        elif rank == 2:
            # 201 to 3
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(46, 80))
            cspace.move(agent_201, cpt(46.2, 80.1))
        context.synchronize(restore_agent)
        # print(f'{rank}: {net.graph.edges()}', flush=True)
        if rank == 0:
            self.assertEqual(3, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[10], context.ghost_agent((1, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((1, 0, 3))))
        elif rank == 1:
            self.assertEqual(2, net.edge_count)
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((10, 0, 0))))
            self.assertTrue(net.graph.has_edge(agents[1], context.ghost_agent((2, 0, 1))))
        elif rank == 2:
            self.assertEqual(0, net.edge_count)
        elif rank == 3:
            self.assertEqual(4, net.edge_count)
            agent_201 = context.agent((2, 0, 1))
            agent_100 = context.ghost_agent((1, 0, 0))
            self.assertIsNotNone(agent_201)
            self.assertTrue(net.graph.has_edge(agents[1], agent_100))
            self.assertTrue(net.graph.has_edge(agents[5], agent_201))
            self.assertTrue(net.graph.has_edge(context.ghost_agent((1, 0, 1)), agent_201))
            self.assertTrue(net.graph.has_edge(agent_201, agent_100))
            self.assertEqual(101, context.ghost_agent((1, 0, 0)).energy)

    def test_in_buffer(self):
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        agents = []
        for i in range(20):
            a = EAgent(i, 0, rank, 1)
            agents.append(a)
            context.add(a)
        box = space.BoundingBox(xmin=0, xextent=90, ymin=0, yextent=120, zmin=0, zextent=0)
        cspace = space.SharedCSpace("shared_space", bounds=box, borders=BorderType.Sticky,
                                    occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm,
                                    tree_threshold=100)
        grid = space.SharedGrid("shared_grid", bounds=box, borders=BorderType.Sticky,
                                occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm)
        net = UndirectedSharedNetwork('network', comm)
        context.add_projection(net)
        context.add_projection(cspace)
        context.add_projection(grid)
        random.init(42)
        bounds = grid.get_local_bounds()
        xs = random.default_rng.integers(low=bounds.xmin, high=bounds.xmin + bounds.xextent, size=20)
        ys = random.default_rng.integers(low=bounds.ymin, high=bounds.ymin + bounds.yextent, size=20)
        for i, agent in enumerate(agents):
            grid.move(agent, dpt(xs[i], ys[i]))
            cspace.move(agent, cpt(xs[i], ys[i]))
        # Bounds:
        # print('{}: bounds: {}'.format(rank, grid.get_local_bounds()))
        # 0: bounds: BoundingBox(xmin=0, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 1: bounds: BoundingBox(xmin=0, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # 2: bounds: BoundingBox(xmin=45, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 3: bounds: BoundingBox(xmin=45, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # TEST:
        # Request an agent that's in the buffer, then move it off of the buffer.
        # Is it still properly ghosted?
        if rank == 1:
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(10, 60))
            cspace.move(agent_201, cpt(10.2, 60.1))
        context.synchronize(restore_agent)
        if rank == 0:
            agent_201 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
        requests = []
        if rank == 0:
            requests.append(((2, 0, 1), 1))
        context.request_agents(requests, restore_agent)
        if rank == 0:
            agent_201 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
        if rank == 1:
            # move off of the buffer
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(10, 66))
            cspace.move(agent_201, cpt(10.2, 66.1))
        context.synchronize(restore_agent)
        if rank == 0:
            self.assertIsNotNone(context.ghost_agent((2, 0, 1)))

    def test_edge_in_buffer(self):
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        agents = []
        for i in range(20):
            a = EAgent(i, 0, rank, 1)
            agents.append(a)
            context.add(a)
        box = space.BoundingBox(xmin=0, xextent=90, ymin=0, yextent=120, zmin=0, zextent=0)
        cspace = space.SharedCSpace("shared_space", bounds=box, borders=BorderType.Sticky,
                                    occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm,
                                    tree_threshold=100)
        grid = space.SharedGrid("shared_grid", bounds=box, borders=BorderType.Sticky,
                                occupancy=OccupancyType.Multiple, buffer_size=2, comm=comm)
        net = UndirectedSharedNetwork('network', comm)
        context.add_projection(net)
        context.add_projection(cspace)
        context.add_projection(grid)
        random.init(42)
        bounds = grid.get_local_bounds()
        xs = random.default_rng.integers(low=bounds.xmin, high=bounds.xmin + bounds.xextent, size=20)
        ys = random.default_rng.integers(low=bounds.ymin, high=bounds.ymin + bounds.yextent, size=20)
        for i, agent in enumerate(agents):
            grid.move(agent, dpt(xs[i], ys[i]))
            cspace.move(agent, cpt(xs[i], ys[i]))
        # Bounds:
        # print('{}: bounds: {}'.format(rank, grid.get_local_bounds()))
        # 0: bounds: BoundingBox(xmin=0, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 1: bounds: BoundingBox(xmin=0, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # 2: bounds: BoundingBox(xmin=45, xextent=45, ymin=0, yextent=60, zmin=0, zextent=0)
        # 3: bounds: BoundingBox(xmin=45, xextent=45, ymin=60, yextent=60, zmin=0, zextent=0)
        # TEST:
        # Make an edge with an agent that's in the buffer, then move it off of
        # the buffer. Is it still properly ghosted?
        if rank == 1:
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(10, 60))
            cspace.move(agent_201, cpt(10.2, 60.1))
        context.synchronize(restore_agent)
        if rank == 0:
            agent_201 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
            net.add_edge(agent_201, agents[0])
        if rank == 1:
            # move off of the buffer
            agent_201 = context.agent((2, 0, 1))
            grid.move(agent_201, dpt(10, 66))
            cspace.move(agent_201, cpt(10.2, 66.1))
        context.synchronize(restore_agent)
        if rank == 0:
            agent_201 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
            self.assertTrue(net.contains_edge(agent_201, agents[0]))
        # TEST:
        # 1. Update the agent that was in the buffer and gained an edge on 0
        # 2. Test that the change is reflected on 0
        if rank == 1:
            self.assertTrue(net.contains_edge(agent_201, context.ghost_agent((0, 0, 0))))
            agent_201.energy = 20
        context.synchronize(restore_agent)
        if rank == 0:
            agent_201 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(agent_201)
            self.assertEqual(agent_201.energy, 20)
            self.assertTrue(net.contains_edge(agent_201, agents[0]))


def construct_agent(nid, agent_type, rank, **kwargs):
    energy = kwargs.get('energy', -1)
    return EAgent(nid, agent_type, rank, energy)


class InitNetworkTests(unittest.TestCase):

    long_message = True

    def testInitWithAttributes(self):
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        fpath = './test_data/net_with_attribs.txt'
        read_network(fpath, context, construct_agent, restore_agent)
        g = context.get_projection("friend_network")
        self.assertFalse(g.is_directed)
        if rank == 0:
            self.assertEqual(1, context.size()[-1])
            a1 = context.agent((3, 0, 0))
            self.assertEqual(30, a1.energy)
            self.assertEqual(1, len(context._agent_manager._ghost_agents))
            self.assertIsNotNone(context.ghost_agent((1, 0, 1)))
            g1 = context.ghost_agent((1, 0, 1))
            self.assertEqual(23, g1.energy)
            edges = [x for x in g.graph.edges(a1, data=True)]
            self.assertEqual(edges, [(a1, g1, {'weight': 0.75})])
            # self.assertTrue((a1, g1) in g.graph.edges())
            # self.assertTrue((context.ghost_agent((0, 0, 0)), agents[1]) in g.graph.edges())
        elif rank == 1:
            self.assertEqual(2, context.size()[-1])
            a1 = context.agent((1, 0, 1))
            self.assertIsNotNone(a1)
            a2 = context.agent((2, 0, 1))
            self.assertIsNotNone(a2)
            self.assertEqual(3, len(context._agent_manager._ghost_agents))
            g3 = context.ghost_agent((3, 0, 0))
            self.assertIsNotNone(g3)
            g5 = context.ghost_agent((5, 0, 3))
            self.assertIsNotNone(g5)
            self.assertEqual(32, g5.energy)
            g4 = context.ghost_agent((4, 0, 2))
            self.assertIsNotNone(g4)
            self.assertEqual(5, g.edge_count)
            self.assertTrue(g.contains_edge(a1, a2))
            self.assertTrue(g.contains_edge(a1, g3))
            self.assertTrue(g.contains_edge(a1, g5))
            self.assertTrue(g.contains_edge(a2, g5))
            self.assertTrue(g.contains_edge(g4, a2))
        elif rank == 2:
            self.assertEqual(2, context.size()[-1])
            a4 = context.agent((4, 0, 2))
            self.assertIsNotNone(a4)
            a6 = context.agent((6, 0, 2))
            self.assertIsNotNone(a6)
            self.assertEqual(2, len(context._agent_manager._ghost_agents))
            g2 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(g2)
            g5 = context.ghost_agent((5, 0, 3))
            self.assertIsNotNone(g5)
            self.assertEqual(2, g.edge_count)
            self.assertTrue(g.contains_edge(a6, g5))
            self.assertTrue(g.contains_edge(a4, g2))
        elif rank == 3:
            self.assertEqual(1, context.size()[-1])
            a5 = context.agent((5, 0, 3))
            self.assertIsNotNone(a5)
            self.assertEqual(3, len(context._agent_manager._ghost_agents))
            g6 = context.ghost_agent((6, 0, 2))
            self.assertIsNotNone(g6)
            g1 = context.ghost_agent((1, 0, 1))
            self.assertIsNotNone(g1)
            g2 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(g2)
            self.assertEqual(3, g.edge_count)
            self.assertTrue(g.contains_edge(a5, g6))
            self.assertTrue(g.contains_edge(a5, g1))
            self.assertTrue(g.contains_edge(g2, a5))

    def testInitWithDiNoAttrib(self):
        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        context = ctx.SharedContext(comm)
        fpath = './test_data/net_with_no_attribs.txt'
        read_network(fpath, context, construct_agent, restore_agent)
        g = context.get_projection("friend_network")
        self.assertTrue(g.is_directed)
        if rank == 0:
            self.assertEqual(1, context.size()[-1])
            a1 = context.agent((3, 0, 0))
            self.assertEqual(-1, a1.energy)
            self.assertEqual(1, len(context._agent_manager._ghost_agents))
            self.assertIsNotNone(context.ghost_agent((1, 0, 1)))
            g1 = context.ghost_agent((1, 0, 1))
            self.assertEqual(-1, g1.energy)
            edges = [x for x in g.graph.edges(a1, data=True)]
            self.assertEqual(edges, [(a1, g1, {})])
            # self.assertTrue((a1, g1) in g.graph.edges())
            # self.assertTrue((context.ghost_agent((0, 0, 0)), agents[1]) in g.graph.edges())
        elif rank == 1:
            self.assertEqual(2, context.size()[-1])
            a1 = context.agent((1, 0, 1))
            self.assertIsNotNone(a1)
            a2 = context.agent((2, 0, 1))
            self.assertIsNotNone(a2)
            self.assertEqual(3, len(context._agent_manager._ghost_agents))
            g3 = context.ghost_agent((3, 0, 0))
            self.assertIsNotNone(g3)
            g5 = context.ghost_agent((5, 0, 3))
            self.assertIsNotNone(g5)
            self.assertEqual(-1, g5.energy)
            g4 = context.ghost_agent((4, 0, 2))
            self.assertIsNotNone(g4)
            self.assertEqual(5, g.edge_count)
            self.assertTrue(g.contains_edge(a1, a2))
            self.assertTrue(g.contains_edge(g3, a1))
            self.assertTrue(g.contains_edge(a1, g5))
            self.assertTrue(g.contains_edge(a2, g5))
            self.assertTrue(g.contains_edge(g4, a2))
        elif rank == 2:
            self.assertEqual(2, context.size()[-1])
            a4 = context.agent((4, 0, 2))
            self.assertIsNotNone(a4)
            a6 = context.agent((6, 0, 2))
            self.assertIsNotNone(a6)
            self.assertEqual(2, len(context._agent_manager._ghost_agents))
            g2 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(g2)
            g5 = context.ghost_agent((5, 0, 3))
            self.assertIsNotNone(g5)
            self.assertEqual(2, g.edge_count)
            self.assertTrue(g.contains_edge(a6, g5))
            self.assertTrue(g.contains_edge(a4, g2))
        elif rank == 3:
            self.assertEqual(1, context.size()[-1])
            a5 = context.agent((5, 0, 3))
            self.assertIsNotNone(a5)
            self.assertEqual(3, len(context._agent_manager._ghost_agents))
            g6 = context.ghost_agent((6, 0, 2))
            self.assertIsNotNone(g6)
            g1 = context.ghost_agent((1, 0, 1))
            self.assertIsNotNone(g1)
            g2 = context.ghost_agent((2, 0, 1))
            self.assertIsNotNone(g2)
            self.assertEqual(3, g.edge_count)
            self.assertTrue(g.contains_edge(g6, a5))
            self.assertTrue(g.contains_edge(g1, a5))
            self.assertTrue(g.contains_edge(g2, a5))

    def test_generation1(self):
        # make 1 rank comm for basic add remove tests
        new_group = MPI.COMM_WORLD.Get_group().Incl([0])
        comm = MPI.COMM_WORLD.Create_group(new_group)
        if comm != MPI.COMM_NULL:
            g = nx.generators.watts_strogatz_graph(30, 2, 0.25)
            fname = './test_data/gen_net_test.txt'
            write_network(g, "test", fname, 3)

            with open(fname, 'r') as f_in:
                line = f_in.readline().strip()
                vals = line.split(' ')
                self.assertEqual('test', vals[0])
                self.assertEqual('0', vals[1])

                nid_count = 0
                r_counts = [0, 0, 0]
                line = f_in.readline().strip()
                while line != 'EDGES':
                    nid, n_type, rank = [int(x) for x in line.split(' ')]
                    self.assertTrue(nid in g)
                    self.assertEqual(0, n_type)
                    r_counts[rank] += 1
                    nid_count += 1
                    line = f_in.readline().strip()
                self.assertEqual(30, nid_count)
                self.assertEqual(g.number_of_nodes(), nid_count)
                self.assertEqual(r_counts, [10, 10, 10])

                edge_count = 0
                for line in f_in:
                    line = line.strip()
                    u, v = [int(x) for x in line.split(' ')]
                    edge_count += 1
                    self.assertTrue(g.has_edge(u, v))
                self.assertEqual(g.number_of_edges(), edge_count)
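The loop above checks the text layout that `write_network` emits: a header line with the network name and a directedness flag, one `node_id node_type rank` line per node up to an `EDGES` sentinel, then one `u v` line per edge. A minimal standalone parser for that layout (the function name `parse_network_text`, and reading from a string rather than a file, are illustrative, not part of repast4py):

```python
def parse_network_text(text):
    """Parse 'name directed' header, node lines until EDGES, then edge pairs."""
    lines = iter(text.strip().splitlines())
    name, directed = next(lines).split(' ')
    nodes, edges = {}, []
    for line in lines:
        line = line.strip()
        if line == 'EDGES':
            break
        nid, n_type, rank = [int(x) for x in line.split(' ')]
        nodes[nid] = (n_type, rank)
    for line in lines:
        u, v = [int(x) for x in line.strip().split(' ')]
        edges.append((u, v))
    return name, directed == '1', nodes, edges


sample = """test 0
0 0 0
1 0 1
2 0 2
EDGES
0 1
1 2"""
name, directed, nodes, edges = parse_network_text(sample)
```

Because the same `iter` object feeds both loops, the edge loop resumes right after the `EDGES` line, mirroring how the test switches from the `while` loop to iterating the open file handle.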

    def test_generation2(self):
        # make 1 rank comm for basic add remove tests
        new_group = MPI.COMM_WORLD.Get_group().Incl([0])
        comm = MPI.COMM_WORLD.Create_group(new_group)
        if comm != MPI.COMM_NULL:
            g = nx.generators.dual_barabasi_albert_graph(60, 2, 1, 0.25)
            attr = {}
            for i in range(30):
                attr[i] = {'agent_type': 0, 'a': 3}
            for i in range(30, 60):
                attr[i] = {'agent_type': 1, 'a': 4}
            nx.set_node_attributes(g, attr)

            fname = './test_data/gen_net_test.txt'
            write_network(g, "test", fname, 3, partition_method='random')

            p = re.compile(r'\{[^}]+\}|\S+')
            with open(fname, 'r') as f_in:
                line = f_in.readline().strip()
                vals = line.split(' ')
                self.assertEqual('test', vals[0])
                self.assertEqual('0', vals[1])

                nid_count = 0
                r_counts = [0, 0, 0]
                line = f_in.readline().strip()
                while line != 'EDGES':
                    nid, n_type, rank, attr = p.findall(line.strip())
                    nid = int(nid)
                    n_type = int(n_type)
                    rank = int(rank)
                    self.assertTrue(nid in g)
                    if nid < 30:
                        self.assertEqual(0, n_type)
                        self.assertEqual('{"a": 3}', attr)
                    else:
                        self.assertEqual(1, n_type)
                        self.assertEqual('{"a": 4}', attr)
                    r_counts[rank] += 1
                    nid_count += 1
                    line = f_in.readline().strip()
                self.assertEqual(60, nid_count)
                self.assertEqual(g.number_of_nodes(), nid_count)
                self.assertEqual(r_counts, [20, 20, 20])

                edge_count = 0
                for line in f_in:
                    line = line.strip()
                    u, v = [int(x) for x in line.split(' ')]
                    edge_count += 1
                    self.assertTrue(g.has_edge(u, v))
                self.assertEqual(g.number_of_edges(), edge_count)
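The pattern `\{[^}]+\}|\S+` used above tokenizes a node line into whitespace-separated fields while keeping a JSON attribute object, which itself contains a space, as a single token: the brace alternative is tried first, so the blob is never split. A small self-contained illustration (the sample line mimics the file format rather than being read from repast4py output):

```python
import re

# Raw string avoids the invalid-escape warning for '\{' in a plain literal.
p = re.compile(r'\{[^}]+\}|\S+')

# Plain fields match \S+; the attribute object matches the brace alternative.
tokens = p.findall('17 1 2 {"a": 4}')
```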

    def test_generation3(self):
        # make 1 rank comm for basic add remove tests
        new_group = MPI.COMM_WORLD.Get_group().Incl([0])
        comm = MPI.COMM_WORLD.Create_group(new_group)
        if comm != MPI.COMM_NULL:
            g = nx.complete_graph(60)
            fname = './test_data/gen_net_test.txt'
            try:
                import nxmetis
                options = nxmetis.types.MetisOptions(seed=1)
                write_network(g, "metis_test", fname, 3, partition_method='metis', options=options)

                p = re.compile(r'\{[^}]+\}|\S+')
                ranks = {}
                with open(fname, 'r') as f_in:
                    line = f_in.readline().strip()
                    vals = line.split(' ')
                    self.assertEqual('metis_test', vals[0])
                    self.assertEqual('0', vals[1])

                    line = f_in.readline().strip()
                    while line != 'EDGES':
                        nid, n_type, rank = p.findall(line.strip())
                        nid = int(nid)
                        n_type = int(n_type)
                        rank = int(rank)
                        self.assertTrue(nid in g)
                        self.assertEqual(0, n_type)
                        ranks[nid] = rank
                        line = f_in.readline().strip()
                self.assertEqual(60, len(ranks))

                _, partitions = nxmetis.partition(g, 3, options=options)
                for i, partition in enumerate(partitions):
                    for nid in partition:
                        self.assertEqual(i, ranks[nid])
            except ModuleNotFoundError:
                print("Ignoring nxmetis test")
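test_generation3 guards an optional dependency by importing `nxmetis` inside the `try` and silently skipping the whole test on `ModuleNotFoundError`. The same idea can be expressed as a small reusable guard; `run_if_installed` below is an illustrative name, not an API of repast4py or nxmetis:

```python
import importlib.util


def run_if_installed(module_name, fn, fallback=None):
    """Call fn() only when module_name is importable, else return fallback."""
    if importlib.util.find_spec(module_name) is None:
        return fallback
    return fn()


result = run_if_installed('math', lambda: 'ran')
skipped = run_if_installed('definitely_not_a_real_module_xyz', lambda: 'ran', 'skipped')
```

Checking `find_spec` avoids actually importing the module until it is needed; in a pytest suite the same effect is usually achieved with `pytest.importorskip`.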
| 42.551378 | 115 | 0.552494 | 9,142 | 67,912 | 4.002844 | 0.036425 | 0.013445 | 0.08362 | 0.022299 | 0.948489 | 0.943078 | 0.936492 | 0.929333 | 0.920233 | 0.916708 | 0 | 0.061158 | 0.303452 | 67,912 | 1,595 | 116 | 42.578056 | 0.712434 | 0.102883 | 0 | 0.87897 | 0 | 0 | 0.013711 | 0.002486 | 0 | 0 | 0 | 0.000627 | 0.348498 | 1 | 0.018026 | false | 0 | 0.012876 | 0.000858 | 0.039485 | 0.000858 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
965863c8d420e2073fd710c8b9a3410adc778bcf | 3,195 | py | Python | tests/fire_groups/test_cone_of_fire.py | spascou/ps2-analysis | 00f99b009d15d4c401a3338ddd0408ac7eedcc0b | [
"MIT"
] | 2 | 2020-06-25T17:19:05.000Z | 2020-10-13T06:08:39.000Z | tests/fire_groups/test_cone_of_fire.py | spascou/ps2-analysis | 00f99b009d15d4c401a3338ddd0408ac7eedcc0b | [
"MIT"
] | null | null | null | tests/fire_groups/test_cone_of_fire.py | spascou/ps2-analysis | 00f99b009d15d4c401a3338ddd0408ac7eedcc0b | [
"MIT"
] | null | null | null | from ps2_analysis.fire_groups.cone_of_fire import ConeOfFire


def test_min_cof_angle():
    cof: ConeOfFire = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=10.0,
        recovery_delay=100,
        multiplier=2.0,
        moving_multiplier=2.0,
        pellet_spread=0.0,
    )

    assert cof.min_cof_angle(moving=False) == 2.0
    assert cof.min_cof_angle(moving=True) == 4.0


def test_max_cof_angle():
    cof: ConeOfFire = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=10.0,
        recovery_delay=100,
        multiplier=2.0,
        moving_multiplier=2.0,
        pellet_spread=0.0,
    )

    assert cof.max_cof_angle(moving=False) == 4.0
    assert cof.max_cof_angle(moving=True) == 8.0


def test_apply_bloom():
    cof: ConeOfFire = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=10.0,
        recovery_delay=100,
        multiplier=2.0,
        moving_multiplier=2.0,
        pellet_spread=0.0,
    )

    assert cof.apply_bloom(current=1.0, moving=False) == 1.1
    assert cof.apply_bloom(current=2.0, moving=False) == 2.1
    assert cof.apply_bloom(current=3.9, moving=False) == 4.0
    assert cof.apply_bloom(current=4.0, moving=False) == 4.0
    assert cof.apply_bloom(current=4.1, moving=False) == 4.0


def test_recover():
    cof: ConeOfFire = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=10.0,
        recovery_delay=100,
        multiplier=1.0,
        moving_multiplier=1.0,
        pellet_spread=0.0,
    )

    assert cof.recover(current=2.0, time=10) == 1.9
    assert cof.recover(current=2.0, time=50) == 1.5
    assert cof.recover(current=2.0, time=100) == 1.0
    assert cof.recover(current=2.0, time=200) == 1.0
    assert cof.recover(current=2.0, time=300) == 1.0

    cof = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=0.0,
        recovery_delay=100,
        multiplier=1.0,
        moving_multiplier=1.0,
        pellet_spread=0.0,
    )

    assert cof.recover(current=2.0, time=1000) == 2.0


def test_recover_time():
    cof: ConeOfFire = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=10.0,
        recovery_delay=100,
        multiplier=1.0,
        moving_multiplier=2.0,
        pellet_spread=0.0,
    )

    assert cof.recover_time(current=2.0) == 200

    cof = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=0.0,
        recovery_delay=100,
        multiplier=1.0,
        moving_multiplier=2.0,
        pellet_spread=0.0,
    )

    assert cof.recover_time(current=2.0) == -1


def test_max_recover_time():
    cof: ConeOfFire = ConeOfFire(
        max_angle=2.0,
        min_angle=1.0,
        bloom=0.1,
        recovery_rate=10.0,
        recovery_delay=100,
        multiplier=2.0,
        moving_multiplier=2.0,
        pellet_spread=0.0,
    )

    assert cof.max_recover_time(moving=False) == 400
    assert cof.max_recover_time(moving=True) == 800
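The expected values above pin down simple arithmetic: the effective minimum and maximum angles are `min_angle`/`max_angle` scaled by `multiplier` (and additionally by `moving_multiplier` when moving), bloom adds a fixed increment clamped to the maximum, `recover` shrinks the cone at `recovery_rate` degrees per second over a time given in milliseconds (floored at the minimum), and `recover_time` adds `recovery_delay` on top of the recovery duration. A standalone sketch re-derived from those assertions, not the ps2-analysis implementation itself:

```python
def min_cof_angle(min_angle, multiplier, moving_multiplier, moving):
    angle = min_angle * multiplier
    return angle * moving_multiplier if moving else angle


def apply_bloom(current, bloom, max_angle, multiplier, moving_multiplier, moving):
    cap = max_angle * multiplier * (moving_multiplier if moving else 1.0)
    return min(current + bloom, cap)


def recover(current, time_ms, recovery_rate, floor):
    # recovery_rate is degrees per second, time is milliseconds
    return max(current - recovery_rate * time_ms / 1000.0, floor)


def recover_time(current, floor, recovery_rate, recovery_delay):
    # milliseconds until fully recovered; -1 when recovery never happens
    if recovery_rate == 0.0:
        return -1
    return (current - floor) / recovery_rate * 1000.0 + recovery_delay
```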
| 24.767442 | 60 | 0.598748 | 475 | 3,195 | 3.848421 | 0.096842 | 0.031729 | 0.076586 | 0.113786 | 0.876368 | 0.876368 | 0.809081 | 0.730853 | 0.730853 | 0.696937 | 0 | 0.097666 | 0.275743 | 3,195 | 128 | 61 | 24.960938 | 0.692308 | 0 | 0 | 0.679245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179245 | 1 | 0.056604 | true | 0 | 0.009434 | 0 | 0.066038 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
736bc612085508f646456446b53670656070767e | 580 | py | Python | tests/unit/test_color.py | egret85/echovr-api | e135f25fb5b188e2931133d04c47c5e66e83a6c5 | [
"MIT"
] | 7 | 2018-11-02T18:12:18.000Z | 2021-03-08T10:47:59.000Z | tests/unit/test_color.py | egret85/echovr-api | e135f25fb5b188e2931133d04c47c5e66e83a6c5 | [
"MIT"
] | null | null | null | tests/unit/test_color.py | egret85/echovr-api | e135f25fb5b188e2931133d04c47c5e66e83a6c5 | [
"MIT"
] | 4 | 2018-11-02T18:12:08.000Z | 2020-06-19T19:42:39.000Z | from echovr_api.team import Team

def test_color_by_name():
    assert Team.Color.by_name("blue") == Team.Color.BLUE
    assert Team.Color.by_name("Blue") == Team.Color.BLUE
    assert Team.Color.by_name("BLUE") == Team.Color.BLUE
    assert Team.Color.by_name("BlUe") == Team.Color.BLUE

    assert Team.Color.by_name("orange") == Team.Color.ORANGE
    assert Team.Color.by_name("Orange") == Team.Color.ORANGE
    assert Team.Color.by_name("ORANGE") == Team.Color.ORANGE
    assert Team.Color.by_name("OrAnGe") == Team.Color.ORANGE

    assert Team.Color.by_name("asdf") is None
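`Team.Color.by_name` as exercised above resolves a color case-insensitively and yields None for unknown names. A minimal standalone enum with the same lookup behavior (this class is a sketch, not the echovr-api implementation):

```python
from enum import Enum


class Color(Enum):
    BLUE = 0
    ORANGE = 1

    @classmethod
    def by_name(cls, name):
        """Case-insensitive lookup; None when the name is unknown."""
        # Enum member names are uppercase, so normalizing the input
        # and probing __members__ covers every casing in one step.
        return cls.__members__.get(name.upper())
```

Using `__members__.get` avoids the `KeyError`/`ValueError` handling that `Color[name]` or `Color(name)` would require.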
| 38.666667 | 60 | 0.701724 | 90 | 580 | 4.377778 | 0.166667 | 0.388325 | 0.279188 | 0.388325 | 0.865482 | 0.865482 | 0.865482 | 0.865482 | 0.865482 | 0.865482 | 0 | 0 | 0.139655 | 580 | 14 | 61 | 41.428571 | 0.789579 | 0 | 0 | 0 | 0 | 0 | 0.075862 | 0 | 0 | 0 | 0 | 0 | 0.818182 | 1 | 0.090909 | true | 0 | 0.090909 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
737065012d5aac97b8448d43b153307da2d06e8e | 4,228 | py | Python | pyslock/event.py | snower/pyslock | 9f81cc890056496878ad4f19adb58754ee3f2401 | [
"MIT"
] | null | null | null | pyslock/event.py | snower/pyslock | 9f81cc890056496878ad4f19adb58754ee3f2401 | [
"MIT"
] | null | null | null | pyslock/event.py | snower/pyslock | 9f81cc890056496878ad4f19adb58754ee3f2401 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# 18/7/6
# create by: snower

from .utils import ensure_bytes
from .lock import Lock, LockLockedError, LockUnlockedError, LockTimeoutError, LockNotOwnError
from .protocol.exceptions import EventWaitTimeoutError


class Event(object):
    WaitTimeoutError = EventWaitTimeoutError

    def __init__(self, db, event_name, timeout=0, expried=60, default_seted=True):
        self._db = db
        self._db_id = db.id
        self._event_name = ensure_bytes(event_name)
        self._event_id = ensure_bytes(event_name)
        self._event_lock = None
        self._check_lock = None
        self._wait_lock = None
        self._timeout = timeout
        self._expried = expried
        self.default_seted = default_seted

    def clear(self):
        if self.default_seted:
            if self._event_lock is None:
                self._event_lock = Lock(self._db, self._event_name, self._timeout, self._expried, self._event_id, 0, 0)
            try:
                self._event_lock.acquire(0x02)
            except LockLockedError:
                pass
            return None

        if self._event_lock is None:
            self._event_lock = Lock(self._db, self._event_name, self._timeout, self._expried, self._event_id, 2, 0)
        try:
            self._event_lock.release()
        except LockUnlockedError:
            pass
        return None

    def set(self):
        if self.default_seted:
            if self._event_lock is None:
                self._event_lock = Lock(self._db, self._event_name, self._timeout, self._expried, self._event_id, 0, 0)
            try:
                self._event_lock.release()
            except LockUnlockedError:
                pass
            return None

        if self._event_lock is None:
            self._event_lock = Lock(self._db, self._event_name, self._timeout, self._expried, self._event_id, 2, 0)
        try:
            self._event_lock.acquire(0x02)
        except LockLockedError:
            pass
        return None

    def is_set(self):
        if self.default_seted:
            self._check_lock = Lock(self._db, self._event_name, 0, 0, 0, 0)
            try:
                self._check_lock.acquire()
            except LockTimeoutError:
                return False
            return True

        self._check_lock = Lock(self._db, self._event_name, 0x02000000, 0, 2, 0)
        try:
            self._check_lock.acquire()
        except LockTimeoutError:
            return False
        except LockNotOwnError:
            return False
        return True

    def wait(self, timeout=60):
        if self.default_seted:
            self._wait_lock = Lock(self._db, self._event_name, timeout, 0, 0, 0)
            try:
                self._wait_lock.acquire()
            except LockTimeoutError:
                raise self.WaitTimeoutError()
            return True

        self._wait_lock = Lock(self._db, self._event_name, timeout | 0x02000000, 0, 2, 0)
        try:
            self._wait_lock.acquire()
        except LockTimeoutError:
            raise self.WaitTimeoutError()
        return True

    def wait_and_timeout_retry_clear(self, timeout=60):
        if self.default_seted:
            self._wait_lock = Lock(self._db, self._event_name, timeout, 0, 0, 0)
            try:
                self._wait_lock.acquire()
            except LockTimeoutError:
                self._event_lock = Lock(self._db, self._event_name, self._timeout, self._expried, self._event_id, 0, 0)
                try:
                    self._event_lock.acquire(0x02)
                except LockLockedError:
                    raise self.WaitTimeoutError()
                try:
                    self._event_lock.release()
                except:
                    pass
            return True

        self._wait_lock = Lock(self._db, self._event_name, timeout | 0x02000000, 0, 2, 0)
        try:
            self._wait_lock.acquire()
        except LockTimeoutError:
            raise self.WaitTimeoutError()
        self._event_lock = Lock(self._db, self._event_name, self._timeout, self._expried, self._event_id, 2, 0)
        try:
            self._event_lock.release()
        except:
            pass
return True | 34.942149 | 119 | 0.583491 | 484 | 4,228 | 4.774793 | 0.121901 | 0.147988 | 0.101255 | 0.072696 | 0.757681 | 0.756815 | 0.714409 | 0.714409 | 0.714409 | 0.662916 | 0 | 0.028064 | 0.334201 | 4,228 | 121 | 120 | 34.942149 | 0.792895 | 0.01088 | 0 | 0.771429 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01005 | 0 | 0 | 1 | 0.057143 | false | 0.057143 | 0.028571 | 0 | 0.228571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
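The class above builds event semantics out of lock operations: in the default-set case, `clear` acquires the event lock so waiters block, `set` releases it so a `wait` (which briefly acquires) succeeds, and `is_set` probes with a zero timeout. A toy in-memory lock makes that mapping visible; this stub only models held/not-held state, while the real pyslock `Lock` talks to a slock server with timeouts and expiry:

```python
class ToyLock:
    def __init__(self):
        self.held = False

    def acquire(self):
        self.held = True

    def release(self):
        self.held = False


class ToyEvent:
    """Default-set event: 'set' maps to lock released, 'clear' to lock held."""

    def __init__(self):
        self._lock = ToyLock()  # starts released, i.e. the event starts set

    def clear(self):
        self._lock.acquire()

    def set(self):
        self._lock.release()

    def is_set(self):
        return not self._lock.held


ev = ToyEvent()
initially_set = ev.is_set()
ev.clear()
after_clear = ev.is_set()
ev.set()
after_set = ev.is_set()
```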
737f3247bfcce28af6ca11980c853a0e786daa60 | 3,593 | py | Python | tests/test_forced_deps.py | Tinche/incant | db99cfe0c4db974bae45ef2395f2e3caa054872f | [
"Apache-2.0"
] | 20 | 2021-12-28T01:40:09.000Z | 2022-02-06T15:59:32.000Z | tests/test_forced_deps.py | Tinche/incant | db99cfe0c4db974bae45ef2395f2e3caa054872f | [
"Apache-2.0"
] | 2 | 2021-12-29T23:51:58.000Z | 2022-03-09T11:45:49.000Z | tests/test_forced_deps.py | Tinche/incant | db99cfe0c4db974bae45ef2395f2e3caa054872f | [
"Apache-2.0"
] | 1 | 2021-12-29T05:52:03.000Z | 2021-12-29T05:52:03.000Z | from contextlib import asynccontextmanager
import pytest

from incant import Incanter


def test_noarg_forced_dep(incanter: Incanter):
    def forced_dep():
        raise ValueError()

    def fn():
        return 1

    with pytest.raises(ValueError):
        prep = incanter.prepare(fn, forced_deps=(forced_dep,))
        prep()


def test_simple_arg_forced_dep(incanter: Incanter):
    val = 5

    def forced_dep(i: int):
        assert i == val
        raise ValueError()

    def fn():
        return 1

    with pytest.raises(ValueError):
        prep = incanter.prepare(fn, forced_deps=(forced_dep,))
        prep(val)


def test_complex_arg_forced_dep(incanter: Incanter):
    val = 5
    val_f = 10.0

    def forced_dep(i: int) -> None:
        assert i == val

    def fn(f: float):
        assert f == val_f
        return 1

    prep = incanter.prepare(fn, forced_deps=(forced_dep,))
    assert prep(val, val_f) == 1


def test_shared_args(incanter: Incanter):
    val = 5

    def forced_dep(i: int) -> None:
        assert i == val

    def fn(i: int):
        assert i == val
        return 1

    prep = incanter.prepare(fn, forced_deps=(forced_dep,))
    assert prep(val) == 1


def test_shared_dep(incanter: Incanter):
    val = 5

    def forced_dep(i: int) -> None:
        assert i == val

    def fn(i: int):
        assert i == val
        return 1

    @incanter.register_by_type
    def dep() -> int:
        return val

    prep = incanter.prepare(fn, forced_deps=(forced_dep,))
    assert prep() == 1


async def test_async_ctx_mgr_dep(incanter: Incanter):
    before = False
    after = False

    def fn(i: int):
        nonlocal before, after
        assert before
        assert not after
        assert i == 5
        return 1

    @asynccontextmanager
    async def my_async_context_mgr():
        nonlocal before, after
        assert not before
        assert not after
        before = True
        yield None
        assert before
        assert not after
        after = True

    prep = incanter.prepare(fn, is_async=True, forced_deps=(my_async_context_mgr,))
    assert await prep(5) == 1


async def test_async_ctx_mgr_with_param(incanter: Incanter):
    before = False
    after = False

    def fn(i: int):
        nonlocal before, after
        assert before
        assert not after
        assert i == 5
        return 1

    @asynccontextmanager
    async def my_async_context_mgr(f: float):
        nonlocal before, after
        assert f == 10.0
        assert not before
        assert not after
        before = True
        yield None
        assert before
        assert not after
        after = True

    prep = incanter.prepare(fn, is_async=True, forced_deps=(my_async_context_mgr,))
    assert await prep(10.0, 5) == 1


async def test_async_ctx_mgr_with_shared_param_and_dep(incanter: Incanter):
    before = False
    after = False

    @incanter.register_by_type
    def dep1() -> int:
        return 5

    def fn(i: int, f: float):
        nonlocal before, after
        assert before
        assert not after
        assert i == 5
        assert f == 10.0
        return 1

    @asynccontextmanager
    async def my_async_context_mgr(my_int: int, f: float):
        nonlocal before, after
        assert my_int == 5
        assert f == 10.0
        assert not before
        assert not after
        before = True
        yield None
        assert before
        assert not after
        after = True

    prep = incanter.prepare(fn, is_async=True, forced_deps=(my_async_context_mgr,))
    assert await prep(10.0) == 1
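The async tests above verify entry/exit ordering of an `asynccontextmanager` dependency through `nonlocal` flags flipped before and after the `yield`. The same instrumentation pattern in isolation, runnable without incant (names like `tracked` are illustrative):

```python
import asyncio
from contextlib import asynccontextmanager

events = []


@asynccontextmanager
async def tracked():
    # Code before the yield runs on entry, code after it runs on exit,
    # so the recorded order proves the body ran between the two.
    events.append('enter')
    yield 42
    events.append('exit')


async def main():
    async with tracked() as value:
        events.append('body:%d' % value)


asyncio.run(main())
```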
| 20.649425 | 83 | 0.603952 | 475 | 3,593 | 4.4 | 0.12 | 0.055981 | 0.064593 | 0.086124 | 0.857416 | 0.823445 | 0.811483 | 0.739234 | 0.739234 | 0.687081 | 0 | 0.017843 | 0.313665 | 3,593 | 173 | 84 | 20.768786 | 0.829684 | 0 | 0 | 0.729508 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.311475 | 1 | 0.163934 | false | 0 | 0.02459 | 0.032787 | 0.270492 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
738965b273c0ea1cfa452e130f0ebf9e55cf83fb | 7,864 | py | Python | InventorySystem/order/migrations/0001_initial.py | guyueming/PythonWeb | e8a38fc26c06ec78e1de61d65055dcfc480ef8f1 | [
"MIT"
] | null | null | null | InventorySystem/order/migrations/0001_initial.py | guyueming/PythonWeb | e8a38fc26c06ec78e1de61d65055dcfc480ef8f1 | [
"MIT"
] | null | null | null | InventorySystem/order/migrations/0001_initial.py | guyueming/PythonWeb | e8a38fc26c06ec78e1de61d65055dcfc480ef8f1 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.3 on 2021-06-13 17:32
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('wood', '0001_initial'),
        ('customer', '0001_initial'),
        ('process', '0001_initial'),
        ('salesman', '0001_initial'),
        ('skin', '0001_initial'),
        ('paper', '0001_initial'),
    ]

    operations = [
        migrations.CreateModel(
            name='OrderModel',
            fields=[
                ('id', models.AutoField(primary_key=True, serialize=False)),
                ('order_time', models.DateField(default=django.utils.timezone.now, verbose_name='下单时间')),
                ('delivery_time', models.DateField(default=django.utils.timezone.now, verbose_name='交货时间')),
                ('count', models.IntegerField(default=0, verbose_name='数量')),
                ('woodCount', models.IntegerField(default=0, verbose_name='木料数量')),
                ('skinCount', models.IntegerField(default=0, verbose_name='桉木皮数量')),
                ('paperCount', models.IntegerField(default=0, verbose_name='纸张数量')),
                ('color', models.TextField(max_length=64, verbose_name='花色')),
                ('packaging', models.TextField(default='', max_length=64, verbose_name='包装')),
                ('thickness', models.TextField(default='', max_length=64, verbose_name='厚度')),
                ('trademark', models.TextField(default='', max_length=64, verbose_name='商标')),
                ('word', models.TextField(default='', max_length=64, verbose_name='打字')),
                ('is_grooving', models.BooleanField(default=False, verbose_name='是否开槽')),
                ('is_drying', models.BooleanField(default=False, verbose_name='是否烘干')),
                ('sure', models.BooleanField(default=False, verbose_name='是否确认')),
                ('complete', models.BooleanField(default=False, verbose_name='是否完成')),
                ('note', models.TextField(default='', max_length=256, verbose_name='备注')),
                ('created_time', models.DateTimeField(default=django.utils.timezone.now, verbose_name='创建时间')),
                ('last_mod_time', models.DateTimeField(auto_now=True, verbose_name='修改时间')),
                ('customer', models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to='customer.customermodel', verbose_name='厂家')),
                ('paper', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='paper.papermodel', verbose_name='纸张')),
                ('salesman', models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to='salesman.salesmanmodel', verbose_name='销售员')),
                ('skin', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='skin.skinmodel', verbose_name='桉木皮')),
                ('specifications', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='process.specificationmodel', verbose_name='规格')),
                ('technology', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='process.technologymodel', verbose_name='钢板工艺')),
                ('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL, verbose_name='作者')),
                ('wood', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='wood.woodmodel', verbose_name='木料')),
            ],
        ),
        migrations.CreateModel(
            name='WoodFormModel',
            fields=[
                ('id', models.AutoField(primary_key=True, serialize=False)),
                ('arrive_time', models.DateField(default=django.utils.timezone.now, verbose_name='到货时间')),
                ('count', models.IntegerField(default=0, verbose_name='数量')),
                ('type', models.CharField(choices=[('1', '入库'), ('2', '出库'), ('3', '损耗')], default='1', max_length=4, verbose_name='类型')),
                ('sure', models.BooleanField(default=False, verbose_name='是否确认')),
                ('complete', models.BooleanField(default=False, verbose_name='是否完成')),
                ('note', models.TextField(default='', max_length=256, verbose_name='备注')),
                ('created_time', models.DateTimeField(default=django.utils.timezone.now, verbose_name='创建时间')),
                ('last_mod_time', models.DateTimeField(auto_now=True, verbose_name='修改时间')),
                ('name', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='wood.woodmodel', verbose_name='木料')),
                ('order', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='order.ordermodel', verbose_name='关联订单')),
                ('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL, verbose_name='作者')),
            ],
        ),
        migrations.CreateModel(
            name='SkinFormModel',
            fields=[
                ('id', models.AutoField(primary_key=True, serialize=False)),
                ('arrive_date', models.DateField(default=django.utils.timezone.now, verbose_name='到货时间')),
                ('count', models.IntegerField(default=0, verbose_name='数量')),
                ('type', models.CharField(choices=[('1', '入库'), ('2', '出库'), ('3', '损耗')], default='1', max_length=4, verbose_name='类型')),
                ('sure', models.BooleanField(default=False, verbose_name='是否确认')),
                ('complete', models.BooleanField(default=False, verbose_name='是否完成')),
                ('note', models.TextField(default='', max_length=256, verbose_name='备注')),
                ('created_time', models.DateTimeField(default=django.utils.timezone.now, verbose_name='创建时间')),
                ('last_mod_time', models.DateTimeField(auto_now=True, verbose_name='修改时间')),
                ('name', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='skin.skinmodel', verbose_name='桉木皮')),
                ('order', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='order.ordermodel', verbose_name='关联订单')),
                ('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL, verbose_name='作者')),
            ],
        ),
        migrations.CreateModel(
            name='PaperFormModel',
            fields=[
                ('id', models.AutoField(primary_key=True, serialize=False)),
                ('count', models.IntegerField(default=0, verbose_name='数量')),
                ('factory', models.TextField(max_length=64, verbose_name='厂家')),
                ('arrive_time', models.TextField(default=django.utils.timezone.now, verbose_name='到货时间')),
                ('type', models.CharField(choices=[('1', '入库'), ('2', '出库'), ('3', '损耗')], default='1', max_length=4, verbose_name='类型')),
                ('sure', models.BooleanField(default=False, verbose_name='是否确认')),
                ('complete', models.BooleanField(default=False, verbose_name='是否完成')),
                ('note', models.TextField(default='', max_length=256, verbose_name='备注')),
                ('created_time', models.DateTimeField(default=django.utils.timezone.now, verbose_name='创建时间')),
                ('last_mod_time', models.DateTimeField(auto_now=True, verbose_name='修改时间')),
                ('name', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='paper.papermodel', verbose_name='纸张')),
                ('order', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='order.ordermodel', verbose_name='关联订单')),
                ('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL, verbose_name='作者')),
            ],
        ),
    ]
| 72.146789 | 153 | 0.623983 | 857 | 7,864 | 5.569428 | 0.165694 | 0.138278 | 0.052797 | 0.082967 | 0.827781 | 0.827781 | 0.787345 | 0.771004 | 0.706474 | 0.685104 | 0 | 0.013631 | 0.207019 | 7,864 | 108 | 154 | 72.814815 | 0.751764 | 0.005722 | 0 | 0.524752 | 1 | 0 | 0.134067 | 0.011897 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.039604 | 0 | 0.079208 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
738c43e6609d74d30bb69901125295e7c4370d15 | 6,048 | py | Python | tests/utils/test_concepts.py | Feng37/tfsnippet | 70c7dc5c8c8f6314f9d9e44697f90068417db5cd | [
"MIT"
] | null | null | null | tests/utils/test_concepts.py | Feng37/tfsnippet | 70c7dc5c8c8f6314f9d9e44697f90068417db5cd | [
"MIT"
] | null | null | null | tests/utils/test_concepts.py | Feng37/tfsnippet | 70c7dc5c8c8f6314f9d9e44697f90068417db5cd | [
"MIT"
] | 1 | 2020-02-08T15:33:41.000Z | 2020-02-08T15:33:41.000Z | import unittest
import pytest
from mock import MagicMock, Mock

from tfsnippet.utils import (AutoInitAndCloseable, Disposable,
                             NoReentrantContext, DisposableContext)


class AutoInitAndCloseableTestCase(unittest.TestCase):

    def test_init_close(self):
        lazy_init = AutoInitAndCloseable()
        lazy_init._init = Mock()
        lazy_init._close = Mock()
        self.assertEqual(0, lazy_init._init.call_count)
        self.assertEqual(0, lazy_init._close.call_count)

        lazy_init.init()
        self.assertEqual(1, lazy_init._init.call_count)
        self.assertEqual(0, lazy_init._close.call_count)
        lazy_init.init()
        self.assertEqual(1, lazy_init._init.call_count)
        self.assertEqual(0, lazy_init._close.call_count)

        lazy_init.close()
        self.assertEqual(1, lazy_init._init.call_count)
        self.assertEqual(1, lazy_init._close.call_count)
        lazy_init.close()
        self.assertEqual(1, lazy_init._init.call_count)
        self.assertEqual(1, lazy_init._close.call_count)

        lazy_init.init()
        self.assertEqual(2, lazy_init._init.call_count)
        self.assertEqual(1, lazy_init._close.call_count)

    def test_context(self):
        lazy_init = AutoInitAndCloseable()
        lazy_init._init = Mock()
        lazy_init._close = Mock()
        self.assertEqual(0, lazy_init._init.call_count)
        self.assertEqual(0, lazy_init._close.call_count)

        with lazy_init:
            self.assertEqual(1, lazy_init._init.call_count)
            self.assertEqual(0, lazy_init._close.call_count)

            with lazy_init:
                self.assertEqual(1, lazy_init._init.call_count)
                self.assertEqual(0, lazy_init._close.call_count)
            self.assertEqual(1, lazy_init._init.call_count)
            self.assertEqual(1, lazy_init._close.call_count)
        self.assertEqual(1, lazy_init._init.call_count)
        self.assertEqual(1, lazy_init._close.call_count)

        with lazy_init:
            self.assertEqual(2, lazy_init._init.call_count)
            self.assertEqual(1, lazy_init._close.call_count)
        self.assertEqual(2, lazy_init._init.call_count)
        self.assertEqual(2, lazy_init._close.call_count)
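The call counts above pin down the contract: `init()` is idempotent until `close()`, `close()` is idempotent until the next `init()`, and the context-manager form maps enter to `init` and exit to `close`, so the first exit of a nested `with` already closes and the outer exit becomes a no-op. A compact standalone class with the same behavior (a sketch of the tested contract, not tfsnippet's implementation):

```python
class LazyResource:
    """Idempotent init/close; __enter__ -> init, __exit__ -> close."""

    def __init__(self):
        self.init_count = 0
        self.close_count = 0
        self._initialized = False

    def init(self):
        if not self._initialized:
            self.init_count += 1
            self._initialized = True

    def close(self):
        if self._initialized:
            self.close_count += 1
            self._initialized = False

    def __enter__(self):
        self.init()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.close()


res = LazyResource()
res.init()
res.init()    # no-op: already initialized
res.close()
res.close()   # no-op: already closed
with res:     # re-init
    with res:     # nested entry does not re-init
        pass
    # inner exit already closed; the outer exit below is a no-op
    inner_close_count = res.close_count
```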


class DisposableTestCase(unittest.TestCase):

    def test_everything(self):
        disposable = Disposable()
        disposable._check_usage_and_set_used()
        with pytest.raises(
                RuntimeError, match='Disposable object cannot be used twice'):
            disposable._check_usage_and_set_used()


class _ContextA(NoReentrantContext):

    def __init__(self):
        self._enter = MagicMock(return_value=123)
        self._exit = MagicMock()


class _ContextB(DisposableContext):

    def __init__(self):
        self._enter = MagicMock(return_value=456)
        self._exit = MagicMock()


class NoReentrantContextTestCase(unittest.TestCase):

    def test_everything(self):
        ctx = _ContextA()
        self.assertFalse(ctx._is_entered)
        self.assertEquals(0, ctx._enter.call_count)
        self.assertEquals(0, ctx._exit.call_count)
        with pytest.raises(
                RuntimeError, match='Context is required be entered'):
            ctx._require_entered()

        with ctx as x:
            self.assertEqual(123, x)
            self.assertTrue(ctx._is_entered)
            self.assertEquals(1, ctx._enter.call_count)
            self.assertEquals(0, ctx._exit.call_count)
            _ = ctx._require_entered()

            with pytest.raises(
                    RuntimeError, match='Context is not reentrant'):
                with ctx:
                    pass
            self.assertTrue(ctx._is_entered)
            self.assertEquals(1, ctx._enter.call_count)
            self.assertEquals(0, ctx._exit.call_count)

        self.assertFalse(ctx._is_entered)
        self.assertEquals(1, ctx._enter.call_count)
        self.assertEquals(1, ctx._exit.call_count)
        with pytest.raises(
                RuntimeError, match='Context is required be entered'):
            ctx._require_entered()

        with ctx as x:
            self.assertEqual(123, x)
            self.assertTrue(ctx._is_entered)
            self.assertEquals(2, ctx._enter.call_count)
            self.assertEquals(1, ctx._exit.call_count)
            _ = ctx._require_entered()

        self.assertFalse(ctx._is_entered)
        self.assertEquals(2, ctx._enter.call_count)
        self.assertEquals(2, ctx._exit.call_count)
        with pytest.raises(
                RuntimeError, match='Context is required be entered'):
            ctx._require_entered()


class DisposableContextTestCase(unittest.TestCase):

    def test_everything(self):
        ctx = _ContextB()
        self.assertFalse(ctx._is_entered)
        self.assertEquals(0, ctx._enter.call_count)
        self.assertEquals(0, ctx._exit.call_count)
        with pytest.raises(
                RuntimeError, match='Context is required be entered'):
            ctx._require_entered()

        with ctx as x:
            _ = ctx._require_entered()
            self.assertEqual(456, x)
            self.assertTrue(ctx._is_entered)
            self.assertEquals(1, ctx._enter.call_count)
            self.assertEquals(0, ctx._exit.call_count)

            with pytest.raises(
                    RuntimeError, match='Context is not reentrant'):
                with ctx:
                    pass
            self.assertTrue(ctx._is_entered)
            self.assertEquals(1, ctx._enter.call_count)
            self.assertEquals(0, ctx._exit.call_count)

        self.assertFalse(ctx._is_entered)
        self.assertEquals(1, ctx._enter.call_count)
        self.assertEquals(1, ctx._exit.call_count)
        with pytest.raises(
                RuntimeError, match='Context is required be entered'):
            ctx._require_entered()
        with pytest.raises(
                RuntimeError,
                match='A disposable context cannot be entered twice'):
            with ctx:
                pass
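`NoReentrantContext` as exercised above refuses nested entry and requires being entered before certain calls. A compact standalone equivalent of that contract (a sketch matching the error messages the tests check, not tfsnippet's code):

```python
class NoReentrant:
    def __init__(self):
        self._is_entered = False

    def _require_entered(self):
        if not self._is_entered:
            raise RuntimeError('Context is required be entered')

    def __enter__(self):
        # Refuse nested entry instead of silently re-entering.
        if self._is_entered:
            raise RuntimeError('Context is not reentrant')
        self._is_entered = True
        return self

    def __exit__(self, *exc):
        self._is_entered = False


ctx = NoReentrant()
with ctx:
    try:
        with ctx:
            pass
    except RuntimeError as e:
        nested_error = str(e)
after_exit_entered = ctx._is_entered
```

Because the inner `__enter__` raises before the inner block runs, the outer context stays entered and still exits cleanly afterwards.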
| 34.363636 | 78 | 0.635251 | 694 | 6,048 | 5.23487 | 0.096542 | 0.113955 | 0.100193 | 0.105698 | 0.850812 | 0.837875 | 0.811175 | 0.77787 | 0.743187 | 0.743187 | 0 | 0.01387 | 0.272817 | 6,048 | 175 | 79 | 34.56 | 0.812187 | 0 | 0 | 0.830882 | 0 | 0 | 0.046296 | 0 | 0 | 0 | 0 | 0 | 0.433824 | 1 | 0.051471 | false | 0.022059 | 0.029412 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7394e3339c36884471dad01b3f4de87c94d9a7cd | 54,332 | py | Python | _aux/decaffeinate_ed25519_too.sage.py | LucasPenido/libdecaf | 881353373d14a01864cf0e43c1623394beef2edf | [
"MIT"
] | 1 | 2021-09-08T10:53:23.000Z | 2021-09-08T10:53:23.000Z | _aux/decaffeinate_ed25519_too.sage.py | LucasPenido/libdecaf | 881353373d14a01864cf0e43c1623394beef2edf | [
"MIT"
] | null | null | null | _aux/decaffeinate_ed25519_too.sage.py | LucasPenido/libdecaf | 881353373d14a01864cf0e43c1623394beef2edf | [
"MIT"
] | 1 | 2021-08-08T23:13:41.000Z | 2021-08-08T23:13:41.000Z | # This file was *autogenerated* from the file decaffeinate_ed25519_too.sage
from sage.all_cmdline import * # import sage library
_sage_const_0 = Integer(0); _sage_const_1 = Integer(1); _sage_const_2 = Integer(2)
_sage_const_4 = Integer(4); _sage_const_19 = Integer(19); _sage_const_255 = Integer(255)
_sage_const_121665 = Integer(121665); _sage_const_121666 = Integer(121666)
F = GF(2**255 - 19)
dM = F(-121665)
d = F(-121665/121666)
ii = sqrt(F(-1))
def lobit(x): return int(x) & 1
def hibit(x): return lobit(2*x)
magic = sqrt(F(-121666))
if lobit(magic): magic = -magic
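The setup above leans on Sage's `GF` and `sqrt`. For readers without Sage, here is a pure-Python sketch of the same field arithmetic (helper names are illustrative, not libdecaf API): since p = 2^255 - 19 ≡ 5 (mod 8), a square root of a residue a is either a^((p+3)/8) or that value times sqrt(-1) = 2^((p-1)/4).

```python
# Pure-Python model of the field setup above (illustrative, not libdecaf API).
p = 2**255 - 19                      # p ≡ 5 (mod 8)

def field_sqrt(a):
    """Square root mod p for p ≡ 5 (mod 8); raises if a is not a square."""
    a %= p
    r = pow(a, (p + 3) // 8, p)      # candidate root
    if r * r % p != a:
        r = r * pow(2, (p - 1) // 4, p) % p   # multiply by sqrt(-1)
    if r * r % p != a:
        raise ValueError("not a square")
    return r

ii = field_sqrt(p - 1)               # sqrt(-1), as in the Sage code
magic = field_sqrt(-121666 % p)      # sqrt(-121666)
if magic & 1:                        # the lobit(magic) normalization above
    magic = p - magic

assert ii * ii % p == p - 1
assert magic * magic % p == -121666 % p
```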
def eddsa_decode(y):
    hi = int(y) & 2**255
    y = F(y - hi)
    x = sqrt((y**2 - 1)/(d*y**2 + 1))
    if int(x) & 1: x = -x
    if hi: x = -x
    assert y**2 - x**2 == 1 + d*x**2*y**2
    return (x, y)
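A pure-Python rendering of the decoder above, assuming the p ≡ 5 (mod 8) square-root method (names illustrative). As a smoke test it decodes the Ed25519 base point, whose y-coordinate is 4/5 in the field:

```python
# Pure-Python rendering of eddsa_decode (illustrative, not libdecaf API).
p = 2**255 - 19
d = -121665 * pow(121666, p - 2, p) % p   # Ed25519's d = -121665/121666

def field_sqrt(a):
    a %= p
    r = pow(a, (p + 3) // 8, p)
    if r * r % p != a:
        r = r * pow(2, (p - 1) // 4, p) % p
    if r * r % p != a:
        raise ValueError("not a square")
    return r

def eddsa_decode(y):
    hi = y & 2**255                   # sign-of-x bit packed into bit 255
    y = (y - hi) % p
    x = field_sqrt((y * y - 1) * pow(d * y * y + 1, p - 2, p) % p)
    if x & 1: x = p - x               # canonicalize: take the even root
    if hi:    x = p - x
    assert (y * y - x * x) % p == (1 + d * x * x * y * y) % p
    return x, y

# Smoke test: the Ed25519 base point has y = 4/5.
x, y = eddsa_decode(4 * pow(5, p - 2, p) % p)
```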
def eddsa_to_decaf(x, y):
    """
    Convert an EdDSA point to a Decaf representation, in a manner compatible
    with libdecaf.
    The input point must be even.
    Note well! Decaf does not represent the cofactor information of a point.
    So e2d(d2e(s)) == s, but d2e(e2d(x,y)) might not be (x,y).
    """
    if x*y == 0: return 0 # This will happen anyway with the straightforward square-root trick
    if not is_square((1-y)/(1+y)): raise Exception("Unimplemented: odd point in eddsa_to_decaf")
    if hibit(magic/(x*y)): (x, y) = (ii*y, ii*x)
    if hibit(2*magic/x): y = -y
    s = sqrt((1-y)/(1+y))
    if hibit(s): s = -s
    return s
def isqrt_trick(to_isr, to_inv):
    to_sqrt = to_isr*to_inv**2
    if to_sqrt == 0:
        return 0, 0, 0  # happens automatically in C; just avoids problems in Sage
    if not is_square(to_sqrt):
        raise Exception("Not square in isqrt_trick!")
    isr_times_inv = 1/sqrt(to_sqrt)
    isr = isr_times_inv * to_inv
    inv = isr_times_inv * isr * to_isr
    assert isr**2 == 1/to_isr
    assert inv == 1/to_inv
    return isr, inv, isr_times_inv
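The trick above recovers both an inverse square root and an ordinary inverse from a single inverse-square-root computation. The same identity in plain Python over a toy prime p ≡ 3 (mod 4) (small illustrative parameters, not the 2**255 - 19 field; the `_toy` names are hypothetical):

```python
p = 1019  # toy prime with p % 4 == 3, so sqrt(x) = x**((p + 1)//4) mod p

def inv_sqrt(x):
    # 1/sqrt(x) mod p, valid when x is a nonzero square
    r = pow(x, (p + 1) // 4, p)   # a square root of x
    return pow(r, p - 2, p)       # Fermat inverse

def isqrt_trick_toy(a, b):
    # One inverse square root of a*b**2 yields both 1/sqrt(a) and 1/b.
    s = inv_sqrt(a * b * b % p)   # s = 1/(sqrt(a)*b), up to the sign of the root
    isr = s * b % p               # 1/sqrt(a), up to sign
    inv = s * isr * a % p         # 1/b (the root's signs cancel)
    return isr, inv, s

a, b = 4, 7  # a must be a nonzero square mod p
isr, inv, _ = isqrt_trick_toy(a, b)
assert isr * isr * a % p == 1    # isr is an inverse square root of a
assert inv * b % p == 1          # inv is the inverse of b
```

This is why the optimized encode/decode paths below get away with one `isqrt_trick` call where the straightforward versions need a square root and an inversion.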
def eddsa_to_decaf_opt(x, y, z=None):
    """
    Optimized version of eddsa_to_decaf; uses only one isqrt.
    There is probably some way to optimize further given a T-coordinate.
    """
    if z is None:
        # Pretend that we're in projective coordinates
        z = F.random_element()
        x *= z
        y *= z
    isr, inv, isr_times_inv = isqrt_trick(z**2 - y**2, x*y)
    minv = inv*magic*z
    rotate = hibit(minv*z)
    if rotate:
        isr = isr_times_inv*(z**2 - y**2)*magic
        y = ii*x
    if hibit(2*minv*y) != rotate: y = -y
    s = (z - y) * isr
    if hibit(s): s = -s
    return s
def decaf_to_eddsa(s):
    """
    Convert a Decaf representation to an EdDSA point, in a manner compatible
    with libdecaf.

    Note well! Decaf does not represent the cofactor information of a point.
    So e2d(d2e(s)) == s, but d2e(e2d(x,y)) might not be (x,y).
    """
    if s == 0: return (0, 1)
    if s < 0 or s >= F.modulus(): raise Exception("out of field!")
    s = F(s)
    if hibit(s): raise Exception("invalid: s has high bit")
    if not is_square(s**4 + (2 - 4*dM)*s**2 + 1):
        raise Exception("invalid: not on curve")
    t = sqrt(s**4 + (2 - 4*dM)*s**2 + 1)/s
    if hibit(t): t = -t
    y = (1 - s**2)/(1 + s**2)
    x = 2*magic/t
    if y == 0 or lobit(t/y): raise Exception("invalid: t/y is negative")
    assert y**2 - x**2 == 1 + d*x**2*y**2
    return (x, y)
def decaf_to_eddsa_opt(s):
    """
    Convert a Decaf representation to an EdDSA point, in a manner compatible
    with libdecaf.
    This version is slightly less horrible if we don't need to decode to affine.
    """
    if s < 0 or s >= F.modulus(): raise Exception("out of field!")
    s = F(s)
    if hibit(s): raise Exception("invalid: s has high bit")
    if s == 0: return (0, 1)
    curve_eqn = s**4 + (2 - 4*dM)*s**2 + 1
    isr, inv, isr_times_inv = isqrt_trick(curve_eqn, s*(1 - s**2)*(1 + s**2))
    if isr == 0: raise Exception("invalid: nonstandard encoding of zero")
    t = isr_times_inv * curve_eqn * (1 - s**2) * (1 + s**2)
    x = 2 * magic * s * isr
    y = (1 - s**2)**2 * s * inv
    hibit_t = hibit(t)
    if hibit_t: x = -x
    if lobit(t * (1 + s**2)**2 * s * inv) != hibit_t:
        raise Exception("invalid: t/y is negative")
    assert y**2 - x**2 == 1 + d*x**2*y**2
    return (x, y)
# Tests, but they require some pregenerated points
points = [
    (0x2cafede3353135c821d79fa40e25cf3699be47f926ab6e53d0e0adfd014f19a0, 0x70d0fdefa7d4b83d5a56347b1489aaaaa13c91c711f533e91ec3e420e731fd98),
    (0x28471815cf3c4d9b8dcd7684dcdd676cb3117c5255b03f32238356a4acae433f, 0x306740203a6d0b4d414b19b907967c3c7ccb5c68bb30c39addd914c47284dfeb),
    (0x2c0c4eb0ae23d2728b13e9af3ab5b4ba6a4727dcf2cd6b269847e36ba91dbe45, 0x7789d470a4b21a14ded681de924eaf4af17833197a0c9dedd223b6588e529e1b),
    (0x0a9a0497e538bc1856892cc61b859dea42156fcdc62b66e1cc341c7a6d44affb, 0x3bc531dcd67c252f0d4578ce80f06650339d5a6a7bac6fb220d994d8cff03c67),
    (0x040876b83d9a59788d5a8b308e1c74255603d603353c0748dea9e46e6297f460, 0x44caffd4eb82c01ea9ab40875f951c3a86605535d25a3ca40b09dfa47c3f0079),
    (0x08e2924ad3c0a7b1e1a8095d7105d152eb3734e9b046153946a6e1ffb81ee042, 0x4831ef0a61c4f14cb88a4f05dd05206c21734ece774fcabb10ccdffbf68e4d31),
    (0x105d54f7395076044be05e2a1f92b5d35aa309bd3b51ffa13930c79aea9086a0, 0x5d8f2bf40b593f40fec4bcd1937e2e3c42221ab5a53fb9ccd6f7b40361f500f5),
    (0x0607a3605a63a801ecbc78f0c662891b13c32eb088e4010509520f341ff405ab, 0x1fb01b611b929b0ae76d5d9fcb3230eae78e4a5b883bf063ac8294e7faa0a1ac),
    (0x25a6dbb50f51e49cd2f1ca4ef2f611a0f2bbee5fde9bb16ab2bd2fd544524d08, 0x3723a079cd53bb0f66900c9520c482783ede56c04577cfa1b7ec180eaeb96535),
    (0x21ace5501adc79b894479c8b1859975daf5896e5cee056d3f36d3604df414151, 0x1581fc4e7ba5815300e7d4229b0a33b3270f9c14ba00a272cdf657ed4ab40147),
    (0x2afd4db18777a9d4be03613982984496c8d2ebd6437611381e275e1b1a489356, 0x7714135c85cfc9980523d6da7eeecee940240e8060be8261bbde607d229beb5c),
    (0x2c55dcc8fa715d32fcf51bce0f0a31e44c3f670fedb77b11ae1fbcc7266a95d0, 0x274c10042bf6d54c20631ddf5ac118d0f27db35b9f471643e309564cf0406ac9),
    (0x3771e9a7f4176a9e0c86d6794985ea7a21377d0c3e07160135ef0bbd33193c88, 0x5c053b543c6410dc46c9beaca721332f618ea6f8ee612e7a01c90f492d34884d),
    (0x12a902fe4628586de88003082c8b409049f403d9d8467251967eb01a8d0d46bc, 0x57126efc10bab3b42919389f5adabe3852612d4f5431380172e114f5e3f9478e),
    (0x1925479ebacce955c4c6ca85236702201b28b7acbee64aa7e13b263af156d5f7, 0x1012c8da84330dd7e57599d7230a798ad8d782e53951002d8ce2d6dd61ad22ab),
    (0x3fb563cb76965388d2b7cd310e8883168a44b54f51e87e85450d8082cda27c32, 0x5cc8f5a6d9c932f3c5bb73d8aaad712be8a612f3afafa968dfdfe4bf89de9cec),
    (0x39a7f4702b154a76a412f600a07faec89a08aebea657c60b95009ba7459ef89f, 0x600c2f535231b50626e8f82c941c65083aa2b7f0b05abe837a3f8a561eb4c111),
    (0x04bb82dbce90079b8e900971137cc6b250361e904e2dd3f493d5de030a64da18, 0x3131af645b0328fbc1ba1041940f792892f330801ae4cbae720d0eed272852ea),
    (0x33816ae7bcdd2e844fd2317e6ed2b7a9064ade4c550c34683107550a102a5844, 0x1f75ef4620c6b9fc1df0550cebe7e77a312d52ebcec44531fc8fd579cd73a6f7),
    (0x243612efda2f2520e619461d64e56650f5a2926d6d0e7e0f5fb2bde0bc5bdd02, 0x19bab77d15fbc92056f85d17e47ae57530aacc3ebf17209772aba8cf221aadbf),
    (0x2440bf712fdeef73e6d55c6ea69a61596df81419d1921882d1f09d7f0fc21e7c, 0x1d3ceb3cc153bd08590cb470dde69dbb9abae762f5fabf30079d9aa188072575),
    (0x015c63020852ff73044ecea7037602ee6f9fede01a7ce27489bad570d6946dbf, 0x4ec777e530371db59c116842447e2289ab66a1291c99b195d2665b6fc8254988),
    (0x0412cb12cc650ab9d7a78e98c02a254d2efb27525850c362c7e25f7a140aa261, 0x499d7f57650ecd1fe14ca5091a891f6706eec2323f479fdd3608305e24a5474d),
    (0x03bfd7b5ce73d5e328aa268828228ef133bd656a13f6bf5a85c42f7b81c98b3c, 0x22f309299d1bbf9f9e0d7005d6bb9e3c9063bafa5460e3910f4f11dc333daea2),
    (0x047e543079617fca27a7128d03935f4c2ef763dcf12fdf56435c7a637d6c3e94, 0x0c478ede8b604da9a9916fdd90da33cf2a7fe7a41b4649e22daab5010b0e414c),
    (0x04952979f1593cdb8789d50749d4293113b6d9ea0cc883dbab06254d9ffac408, 0x365d7e1057f73d720e4a0be812fce840a5e2f26364a056b5449c2680d834567c),
    (0x17668b514606ff6272cf0c2d5ab0be82556ac53ce7158dc6dcd0b1b197063ebf, 0x0bf36715a34fcadaf9b14d7ace129042cb0b2d93d3b8832bc7e44c801cb0c0b2),
    (0x064bd96210b06010d3aa5a3b60630634bbc398c03fe13c6d3ac06590604eec28, 0x60496e9d6d158e82b5e4436eb250be5ac907b619f8a53eb9f615b80462d05a7e),
    (0x16c94534e51a349a2578283ef836c966fee6d76a9402c2dfb1797817c553e7ad, 0x5cd65e6ea0651537d545ea9d2aaf7a0ab3d75e85b807aafb353b92e8782165d7),
    (0x3a2f74ac2e759c2a84c8a15ccd8a330e4cbbcdd4f0f78f8c05a18b5e5237cb57, 0x19571437d279b1be994a05bc4de7a24f664eea8960ba8128eb15be6ed6bc78b3),
    (0x15e1c51b351ca71c5d6fa8aa91a568d61821df00c97672540d21169dcbb39ec7, 0x6806c3d0918d189ac2d744fa2ebe968d397b4b3028978af910b91ce1d10d9430),
    (0x1dd8871f80931ec7d408dd7f1c45d056429d8b975f0fd1b3e5b17be43380f360, 0x48d984bc9b5e16fa12274cab9b3fdeb5753eb9d2195e1edde15710b1ff4a78b7),
    (0x013dfd1f90e3df0b9ab3bc0a15955b4df218798cf65e50123167570d7e7d5435, 0x373878bbd07997f352534a4e237d6cabdcdfc38a558b1e6c4f02cdd495a99a1d),
    (0x3b191f28f07028b2d4737ee9c9a8b38b9cb1546fa69dbc1f889d36476f7970ae, 0x13dc6f5094bc37892298d4b25d37b7241f443f49011cf89fa93a2a7c83639405),
    (0x04013f892451bfae60cc2016f4488f8557a476c4fba67a37bbb7755ea6421229, 0x4e7c1ffb43a208f6c63a1599477f4912b23ff764c096d58ef6121dbc5260edb9),
    (0x3df16cb8ba0d99c2a83623080968e0a0e737b9791db5518b33f774c1b1ca1066, 0x106d22b8426ea5d6c02790ffc8499bcb4fff4ace5815b52e59198fda1522d22b),
    (0x126a70cdd2ea42fad54cdad0bbc6d43b4035e8796608b8a71fca7a64b4c866b5, 0x35b20471c614035813ecb721ab724c9936670be4dd9083af30c565859afb451d),
    (0x27003d61ae8bae9abe323a2417326fc9605373842133569e68db07766fdf13b4, 0x5134cfa6c9092ca7b70c97a2cd4f2f920a3d3f84a813479c001599ec6e0a762c),
    (0x35d0a38a36d9a0f9f86d5e57ae4476a21a66c15b38dc69ede62a0e7aa19177e0, 0x4827367a0fbbbf849d1728d247d75ba027afa74278ee781c521b9bd2387622b7),
    (0x20f482e14c42db88844c1aeae82c8e5223508232b1f17f23da39c3ef728e0a01, 0x519b487006b05dd35de8457092fe3ed666591f4ebe1cffcf699198ef28f0bac1),
    (0x0455543fbe2bb34477f589423af9e780a09b5bdc70a8e7c9e883391177ce3a41, 0x4f7954a5999c33ee181676b0f94060bbcf9d9fe5c6ba8bf88da84b1de02a3371),
    (0x3dfc6e4032693471f402fb3b20ae6ff0cac0dcb6335e4791b9afec311c489de1, 0x05821acf4d3968aad6978fd5790edb5f5df303190ec2811aab0954e5ec178427),
    (0x196f7fdf033a6057bc85bab561eb75bb7c1fbdfe01dd4647c53faa18acb55552, 0x2e0290811c4c96b696e5a4f4f51f44e9281933759926e993e32d7a5a203a83c9),
    (0x1e1b753c869f8f455e9ca31cac2b9750ed0db980b64899bfbf6ed4057a9d18ef, 0x661eb553df91b11ee336b0cd803a62b3cd5096b9b7ffe7df5447fd17c6eead95),
    (0x0fffada295bda0429748932b87cbdef25c25dcd600764a0ca7a523175ee1aaf8, 0x2260e201c9447f7f5fc96dbf94ecfd84025e32bdb6527e5f6ea7b491ec0868ac),
    (0x01207be02e598e827d104537a9f6374e05db179868bc5d1f5b4465327af53144, 0x7f9e45d36a1a06cbe82c4a5b158a5c8d4052a4ce3b243c97d732f4f96d51ad69),
    (0x2abd34573b48783a1de5f46331a812251a044a541b4b22cd9c5511b4970cc7e9, 0x16f9615d8cfe906b7d011dd3028c8c1259d3197263d4f2c8104c1047c70cd1b5),
    (0x0eedecfb37efd92ac7521f6671a93e9254d7bc14a1a983c88b9b87988af470e7, 0x74ce5bac7f261a5defc289d7821e7919b640b96d37db5e2286c2bf2511286d71),
    (0x260a03d8414634a8b6369a20aca5a3e2eebe47cff1e00a5b15c1041af0101402, 0x3d17189e0f1de4c53e0415c6e7d7e4fd947ade06ab462ccce78c9947d6d8a32c),
    (0x2884875ff2049d26104a2d850e93bb553dc11580374f9662cda1291b91fbc7be, 0x362c024afc7e10fc16fc10e3390851ce1639cf40dffa8e40ff1e6f564093d075),
    (0x03e7b8bc5eda3558c334fc7425dd008828f34fdc7df6e61f275d941cedd9f51b, 0x10bb3cdb00abe58aaa55dbbf14666f2f837443f65f104b56c80c0aa20e8f7045),
    (0x0c41e8d17434987c3fe1f83ca28c4c7a29275477c1829e33d272e3c8634c6ccf, 0x48078d11dee44befe72a119cd3fba0cec3f2807f1c585d52f39d46cf63691236),
    (0x3f0559e744ece5cf0f556a91f747f574946f2d305a61e6d4ad0699d056fcacca, 0x6a28ebe3a77282e19b91b30b9ff316dc84f4bc84de5146b2f0dc0a4043a2c26c),
    (0x041304aefe8d8e19d8c301a40e633268eaaca71679cb928eb3144707d830eec1, 0x565323812efe5aa85d62ae86ef0094f096fd3cd02fe0fde621dcac47932b9cd5),
    (0x2a17f7f61f2704e03a14ba008c3159ff0d1ef4733fe1f5f2b2b90e452976bc8b, 0x26bab8aa41bdf0f45350b6185389d50417a95a3b951cc1a5095fd9ed2c8c9bb3),
    (0x04bc19c92dc1b8fc6fb66631c2e7134dde56a9f185680e3ceaa438e2a2cfb2ec, 0x304318c6b602176cd8920fe37da7a62ab71084a2cd4f6a529fc6d5fa2bed7761),
    (0x0004962f43c64c0a1f25e03666b017af771050417f5eda0ad217cf3659e8854f, 0x7a8f0501d26438df711067b0ae3131351c44905d5ba5c0c21da21c455f127b50),
    (0x3f4cd4a37036d932dab0234139707d3176c77c692f8c1a12e114526f4b1e903a, 0x49e8ec31649d4d0c13c18bc39c205f89bce5469c1b9ba435c5e32dc6c60c2412),
    (0x3fea5f5dd191dab2222ddb15ad45ad96db58326f453dcc179cba27e71e169e3e, 0x67b4e3c3834e8a8ece8f7e9abe535deb51155d276acf30e0d79832115b35814d),
    (0x0744e0bbd51d38f635827f8dda3d019e61a0337c274cfa3aa040971aeafea861, 0x4d14417bed8e4bb606f83676b3fb11dc95622d2788825a8a92addf381b062ac7),
    (0x2ef66265e7e66ac9fbaa225c1ddc30a5f4d14c50bc04f1f7b5a4f9431fc5f3b0, 0x741d4d2574942720802252deb7f0a0db2d12d2ea8baa4e70e72d4c202940870d),
    (0x316d162ecfa06096e1d42664e627d41954bee5aede57959a88fec86872f1633b, 0x589b2fd4a8febb704a1fbfb92db961f641f93d4fc78122d9877e1ca6da5c8b5f),
    (0x37ccc28e4680291c3df089c6fe4e0b69ecd5a560f43304c614d7dced6ee08eb1, 0x4e80c89db679fae351891aa36921ab5f48b637a9de0b58cc37af904c1d5c7e07),
    (0x16374bf8ab4a7a0df026b79aaeb413f6b1b3a1cc817751c8e5c7c8a00eb3d276, 0x490c4b81e9e4db79f978f95958827a9d03d2dead4fdcba50815cb7ef74f37d1a),
    (0x05fb64b41c47522bef7e9c7b06f98abdf67be15cad4e04546a14526e64a55ebc, 0x5d6049184ad917bec99aa55dc97c8edb585e973f0d07ccfa50e46518db28275b),
    (0x0d5ec71f95d3b0b7ed6155dc1c482db9e4bd97a0c7ddd8dddd72a5cf4702e49a, 0x09c6eb0e7d84be6b3373d4337add32a79623898dd6a5ea57f8216bf4c6ec4882),
    (0x2af93d9cb12db49561066d66aac851945562dc9e5917a7e4aea385b96a8b0100, 0x78b661cb135ce2751f924f7a62a951c7c24b2f436fdc90be48fb0c4b1997dd85),
    (0x2c325cee3dd9b9a68403a0be9a26d20aa057804417e3c1d7caf174f407e5be05, 0x6bff133e8bdd62ad16971723b400aaa021d2fded7b79a9362e103380c28ecaff),
    (0x35d770965a1dcc6b0cbd3bc3858244b5c7d249624494839bc9f9cca32b5ad096, 0x08f10f3cd2d4b6fda5bb3fa097f5b608d9f2b8965b148478376b45b8af179a02),
    (0x1179e35b72b63758f8c7e2647c6ca81752ea789305cf38d34422e9676f550217, 0x55c7869968a944116c631dd70f5d8f892915e1ddfbae57b747afe8e257025b0d),
    (0x2af1b78b30a30057c0d40ec76a8a1328983fa474bee0788156a8a202ea894868, 0x694a5dfe341e94eab7026a4639bea425e90bd23a759b18c6a5310b57275c9da0),
    (0x256f0a2bd740800aa7ebad9803215a8f75a3117e18a3f97a86e1a9a697471a65, 0x3a08084aa98a06ce5f2d3c4d75f3a2f8f1aea4989ac54bb33abfeed8cb93be20),
    (0x2283172d056278a24e11024ed27d3f9c27ad91bb3269230202e649ed08507bd2, 0x57ff3e743dc4633cce8cb2365a1294082c8d1b0399360a2f6e3e1b5a47a99a9c),
    (0x0f501eb466656489a2f5c1c359bc719257a2b96aa289750f9e7e8fd9592fcb79, 0x098725a529e09592e80c948f98e4c24d26c2929b4dd64ecb82fb17ce6fb3e62e),
    (0x155e9a1ec1efa4723b03652aae3bde573fa4f88e12ee3dce9260b5d70a34420f, 0x57c790966cf4733d05907d98024f3a6edda98e4205b905a270001e9b55d1dca5),
    (0x0fd200084e28fb73f308b72d1f4817c8e85402b64055fd4470d93320d3785d1a, 0x761607f0fc020df5268674ca1da73bbe86a532b5d8641995aabddfff4e12ce57),
    (0x2e2ef1956c0f7c0d670e2aaf746f862dd357827a1f11011fc668a65540768da1, 0x68438813134e5a4a79adbef436ac4b346c27f91d77e82df89b2a6aeb4890abfa),
    (0x2c8ccf4b35c3798eaa92b7baeecae9977c5f0a0185be5712ae4da969a6eedfde, 0x3b1f8c06a7ef8bae09cdaef430c04be537887a89fc667bdc1a34c4fac1595148),
    (0x2a7ef5d233bcf7dac30fd26cb9bf97e488f0ba67c0efe9c77fed2e93944d8ec5, 0x4295b408841af3d360c848b4b453e73c1abf3eb644ee5161686db5a8909dc2a1),
    (0x07a6beb071c915cf0d0ebfe70b35b0ff0bef1fe20b86f4b7d1e876d48b31acb3, 0x1fca672d02dff4440676693da4a4b0297b6486753111ff3863ba45d32913ade6),
    (0x0578987fe8e5d51c14adec2e1f6db305845e8c352c334db308c7d9f7d311e055, 0x0917e1b0f269a7fd11db658bd0b061fb14d2a4d42e0965036046a7ddea8f71b4),
    (0x0a03bde685d1a307cc534124e8105e627f338e3269dca9a47b6d28992df18eef, 0x61eb41b04e2adcc2667484a34110abeebe096e45c41c8f3605ee4fa1c89ca24f),
    (0x27769a082445fd9ed0395001b05e5681e6486882c10e1e267512708a046e7305, 0x45907da5f53a6039c250821c0f8cdc5d3b74a080674596ad961abe45a1414241),
    (0x1789c532f5787319639987f2e3d04be21ef57d9e1cd5151642e590336a1c0f9e, 0x0d0a4c3011f99055ea1b2fd72ed4feb9610b3619976669a1bf0920d37c2a81c7),
    (0x121f8c18143c861112f6aae5e3b67b7a80b1d5a8d0068114cce8f7743d222f78, 0x5c93f690d3fbf300654279bcfa3924109a75e66965d932629162e9f0d11fc172),
    (0x38fe960a91cb517d9366446ce6b038b91961cf08526a2fa3c999ec29fc4a3ba7, 0x4113f20d6fbbaa11b9dcc35548d2df58da22de5141b0582f8a0f55e85b040efc),
    (0x1fa7119b52433e5d81e645f132a2c908ff42c27555e0698160252d2d515a43fc, 0x7dd8f143d237b554a8fcb2154bbdf445de37fb2ce4656f2369db17ed3c993a52),
    (0x33daa340989ac98ff07765a7bac4db9a37fad1cd0711ee3cb04401b04d3f2284, 0x33ffa6e4db20db65069e87d2be7bf28a38bbdf2f71aa5379b38cbfee2d293d6b),
    (0x1ebd2835579b603aa79c38e56982f436c17ddd354c2bd03b02cc8e2be36ac5ee, 0x06de6550ca5d285ae183b33a2c504b96f842f97a4c6e7c99a00f51f00b15a525),
    (0x20c8ca7fcf284a0b11afe6b735d34ff8c5a4b6e30d7c73193c0359eed19474c0, 0x62c5cc2ac62fd562fcb633fcac2c16c04dde3eba86005ed9f6721308f1d4c06e),
    (0x10b6b313cd4bbddd943dafe69c54a97c94cddca1f55b259cf2356e0b1c8334a2, 0x3e13285c9356f3435852c36e0a28efacab11440e6de04b11487e75f064ee43c8),
    (0x1bb9a4c13f3b8ee114963f7540f70e5e2b057023126a512ed237779a4edf5308, 0x269dfda804d7f866b0ba69c04cee25bb5e3c8b36ab93af335ce666345b7bd4c6),
    (0x07c1fa5c3b6dc3e3761ab5dcbad794beeeefd74bb2358eaa4f184a179c8df0a4, 0x27ff30d28901d110cf50f9deffa0d1b480b1345fec801a2dde95e7f6649b03c4),
    (0x1404397546c70250c71bdd77aac118c723a9a6707f4d2937e315d62577d00913, 0x7a6d3cfcc08238236a8212a6a15575be8f8c1c669a7d6f333bbf3d6cdc1c52fa),
    (0x092fc5f2ffe64b2ee96fc185c9b08b6cead1d1726ab6c2fb986a71140dacce99, 0x673f1b27ed4c827abb4521e17c8a9114d4a774a48b904189d0e5bff9bf29714a),
    (0x39d8ae7edb0e30073a0e109892716e36add92fa0c0ce99797f55e182bee15245, 0x0d624b1171bbb7f54396a3c276769f06f13a4c5ba663a50e788534a52474408f),
    (0x1b9b1daaa7ea6b89f43ccc4ee456f9af174bc95ecf401349e0a4461cd1d18871, 0x419f6832e80aa836f5fee537593318316aa93f83b3c3869163025d93d860bd63),
    (0x1c2bb67d85d72d79c0f8d7f902d3e65b3f1220d0d896db35a73f50072a9fb61a, 0x0dcf37b880e6d34b47865b30c3accd9312e18c7a138e8b68d75fb45de2cc2875),
    (0x1377aa06b3dd07c58e0370706ebd0563c155a888c3d9d33d1cc9a423ba5740b9, 0x6725250788de0f6fa3c8db5f0f7e6d479e86791445a8d18e611c959bddf7f3ae),
    (0x2e778e955a007f2d336781bf08319397eb01b3bc922d6dd7a47d0db4cd523bfe, 0x2824d9c49abb270c56dc53f99101b97d5b85d76239d226817b72709eeb5c3f16)
]
base = (0x2697d8373aeb763fdc1dbf5eefffd1b3e8d90bc1b61f92c9a5aad68079eeef2c, 0x47d0e827cb1595e1470eb88580d5716c4cf22832ea2f0ff0df38ab61ca32112f)
print([eddsa_to_decaf_opt(x, y) == eddsa_to_decaf(x, y)
       for _, enc in points for x, y in [eddsa_decode(enc)]])
print([s == eddsa_to_decaf(*decaf_to_eddsa(s)) for s, _ in points])
print([s == eddsa_to_decaf_opt(*decaf_to_eddsa_opt(s)) for s, _ in points])
__________________________________________________________________________________________________
sample 144 ms submission
class Solution:
    def minTaps(self, n: int, ranges: List[int]) -> int:
        pool, ans, cur, new = sorted([max(0, i - r), min(n, i + r)]
                                     for i, r in enumerate(ranges)), 0, 0, 0
        for i, j in pool:
            if i > cur:
                if new < i:
                    return -1
                cur, new = new, j
                ans += 1
            else:
                new = max(new, j)
            if new == n:
                return ans if cur == n else ans + 1
        return -1
__________________________________________________________________________________________________
sample 148 ms submission
class Solution:
    def minTaps(self, n: int, ranges: List[int]) -> int:
        pool, ans, cur, new = sorted([max(0, i - r), min(n, i + r)]
                                     for i, r in enumerate(ranges)), 0, 0, 0
        for i, j in pool:
            if i > cur:
                if new < i:
                    return -1
                cur, new = new, j
                ans += 1
            else:
                new = max(new, j)
            if new == n:
                return ans + 1
        return -1
__________________________________________________________________________________________________
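Both submissions above implement the same greedy interval scan: convert each tap to the interval it waters, sort, then repeatedly extend the covered prefix. A standalone sketch of that logic in plain-function form (the `min_taps` name is illustrative; it mirrors the first submission), exercised on LeetCode 1326's published examples:

```python
def min_taps(n, ranges):
    # Tap i waters [i - ranges[i], i + ranges[i]]; greedily cover [0, n].
    pool = sorted([max(0, i - r), min(n, i + r)] for i, r in enumerate(ranges))
    ans = cur = new = 0            # cur: covered so far; new: best reachable edge
    for i, j in pool:
        if i > cur:
            if new < i:
                return -1          # a gap that no tap reaches
            cur, new = new, j      # commit the best extension, open a new window
            ans += 1
        else:
            new = max(new, j)
        if new == n:
            return ans if cur == n else ans + 1
    return -1

assert min_taps(5, [3, 4, 1, 1, 0, 0]) == 1   # one tap (i=1, r=4) covers [0, 5]
assert min_taps(3, [0, 0, 0, 0]) == -1        # no tap waters anything
```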
# Copyright 2013 Openstack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import contextlib
import json
import mock
import netaddr
from neutron.api.v2 import attributes as neutron_attrs
from neutron.common import exceptions
from oslo_config import cfg
from quark.db import models
from quark import exceptions as q_exc
from quark import network_strategy
from quark.plugin_modules import ports as quark_ports
from quark import tags
from quark.tests import test_quark_plugin
class TestQuarkGetPorts(test_quark_plugin.TestQuarkPlugin):
    @contextlib.contextmanager
    def _stubs(self, ports=None, addrs=None):
        port_models = []
        addr_models = None
        if addrs:
            addr_models = []
            for address in addrs:
                a = models.IPAddress(**address)
                addr_models.append(a)
        if isinstance(ports, list):
            for port in ports:
                port_model = models.Port(**port)
                if addr_models:
                    port_model.ip_addresses = addr_models
                port_models.append(port_model)
        elif ports is None:
            port_models = None
        else:
            port_model = models.Port(**ports)
            if addr_models:
                port_model.ip_addresses = addr_models
            port_models = port_model

        with contextlib.nested(
            mock.patch("quark.db.api.port_find")
        ) as (port_find,):
            port_find.return_value = port_models
            yield

    def test_port_list_no_ports(self):
        with self._stubs(ports=[]):
            ports = self.plugin.get_ports(self.context, filters=None,
                                          fields=None)
            self.assertEqual(ports, [])

    def test_port_list_with_device_owner_dhcp(self):
        ip = dict(id=1, address=netaddr.IPAddress("192.168.1.100").value,
                  address_readable="192.168.1.100", subnet_id=1, network_id=2,
                  version=4)
        filters = {'network_id': ip['network_id'],
                   'device_owner': 'network:dhcp'}
        port = dict(mac_address="AA:BB:CC:DD:EE:FF", network_id=1,
                    tenant_id=self.context.tenant_id, device_id=2,
                    bridge="xenbr0", device_owner='network:dhcp')
        with self._stubs(ports=[port], addrs=[ip]):
            ports = self.plugin.get_ports(self.context, filters=filters,
                                          fields=None)
            self.assertEqual(len(ports), 1)
            self.assertEqual(ports[0]["device_owner"], "network:dhcp")

    def test_port_list_with_ports(self):
        ip = dict(id=1, address=netaddr.IPAddress("192.168.1.100").value,
                  address_readable="192.168.1.100", subnet_id=1, network_id=2,
                  version=4)
        port = dict(mac_address="AA:BB:CC:DD:EE:FF", network_id=1,
                    tenant_id=self.context.tenant_id, device_id=2,
                    bridge="xenbr0")
        expected = {'status': "ACTIVE",
                    'device_owner': None,
                    'mac_address': 'AA:BB:CC:DD:EE:FF',
                    'network_id': 1,
                    'bridge': "xenbr0",
                    'tenant_id': self.context.tenant_id,
                    'admin_state_up': None,
                    'device_id': 2}
        with self._stubs(ports=[port], addrs=[ip]):
            ports = self.plugin.get_ports(self.context, filters=None,
                                          fields=None)
            self.assertEqual(len(ports), 1)
            fixed_ips = ports[0].pop("fixed_ips")
            for key in expected.keys():
                self.assertEqual(ports[0][key], expected[key])
            self.assertEqual(fixed_ips[0]["subnet_id"], ip["subnet_id"])
            self.assertEqual(fixed_ips[0]["ip_address"],
                             ip["address_readable"])

    def test_port_show_with_int_mac(self):
        port = dict(mac_address=int('AABBCCDDEEFF', 16), network_id=1,
                    tenant_id=self.context.tenant_id, device_id=2)
        expected = {'status': "ACTIVE",
                    'device_owner': None,
                    'mac_address': 'AA:BB:CC:DD:EE:FF',
                    'network_id': 1,
                    'tenant_id': self.context.tenant_id,
                    'admin_state_up': None,
                    'fixed_ips': [],
                    'device_id': 2}
        with self._stubs(ports=port):
            result = self.plugin.get_port(self.context, 1)
            for key in expected.keys():
                self.assertEqual(result[key], expected[key])

    def test_port_show_not_found(self):
        with self._stubs(ports=None):
            with self.assertRaises(exceptions.PortNotFound):
                self.plugin.get_port(self.context, 1)

    def test_port_show_vlan_id(self):
        """Prove VLAN IDs are included in port information when available."""
        port_tags = [tags.VlanTag().serialize(5)]
        port = dict(mac_address=int('AABBCCDDEEFF', 16), network_id=1,
                    tenant_id=self.context.tenant_id, device_id=2,
                    tags=port_tags)
        expected = {'status': "ACTIVE",
                    'device_owner': None,
                    'mac_address': 'AA:BB:CC:DD:EE:FF',
                    'network_id': 1,
                    'tenant_id': self.context.tenant_id,
                    'admin_state_up': None,
                    'fixed_ips': [],
                    'device_id': 2,
                    'vlan_id': '5'}
        with self._stubs(ports=port):
            result = self.plugin.get_port(self.context, 1)
            for key in expected.keys():
                self.assertEqual(result[key], expected[key])

    def test_port_show_invalid_vlan_id(self):
        """Prove invalid VLAN IDs are omitted from port information."""
        port_tags = [tags.VlanTag().serialize('invalid')]
        port = dict(mac_address=int('AABBCCDDEEFF', 16), network_id=1,
                    tenant_id=self.context.tenant_id, device_id=2,
                    tags=port_tags)
        expected = {'status': "ACTIVE",
                    'device_owner': None,
                    'mac_address': 'AA:BB:CC:DD:EE:FF',
                    'network_id': 1,
                    'tenant_id': self.context.tenant_id,
                    'admin_state_up': None,
                    'fixed_ips': [],
                    'device_id': 2}
        with self._stubs(ports=port):
            result = self.plugin.get_port(self.context, 1)
            for key in expected.keys():
                self.assertEqual(result[key], expected[key])
class TestQuarkGetPortsByIPAddress(test_quark_plugin.TestQuarkPlugin):
    @contextlib.contextmanager
    def _stubs(self, ports=None, addr=None):
        addr_models = []
        for port in ports:
            ip_mod = models.IPAddress()
            ip_mod.update(addr)
            port_model = models.Port()
            port_model.update(port)
            ip_mod.ports = [port_model]
            addr_models.append(ip_mod)

        with contextlib.nested(
            mock.patch("quark.db.api.port_find_by_ip_address")
        ) as (port_find_by_addr,):
            port_find_by_addr.return_value = addr_models
            yield

    def test_port_list_by_ip_address(self):
        ip = dict(id=1, address=netaddr.IPAddress("192.168.1.100").value,
                  address_readable="192.168.1.100", subnet_id=1, network_id=2,
                  version=4)
        port = dict(mac_address="AA:BB:CC:DD:EE:FF", network_id=1,
                    tenant_id=self.context.tenant_id, device_id=2,
                    bridge="xenbr0", device_owner='network:dhcp')
        with self._stubs(ports=[port], addr=ip):
            admin_ctx = self.context.elevated()
            filters = {"ip_address": ["192.168.0.1"]}
            ports = self.plugin.get_ports(admin_ctx, filters=filters,
                                          fields=None)
            self.assertEqual(len(ports), 1)
            self.assertEqual(ports[0]["device_owner"], "network:dhcp")

    def test_port_list_by_ip_not_admin_raises(self):
        with self._stubs(ports=[]):
            filters = {"ip_address": ["192.168.0.1"]}
            with self.assertRaises(exceptions.NotAuthorized):
                self.plugin.get_ports(self.context, filters=filters,
                                      fields=None)

    def test_port_list_malformed_address_bad_request(self):
        with self._stubs(ports=[]):
            filters = {"ip_address": ["malformed-address-here"]}
            admin_ctx = self.context.elevated()
            with self.assertRaises(exceptions.BadRequest):
                self.plugin.get_ports(admin_ctx, filters=filters, fields=None)
class TestQuarkCreatePortFailure(test_quark_plugin.TestQuarkPlugin):
    @contextlib.contextmanager
    def _stubs(self, port=None, network=None, addr=None, mac=None):
        if network:
            network["network_plugin"] = "BASE"
            network["ipam_strategy"] = "ANY"
        port_model = models.Port()
        port_model.update(port)
        port_models = port_model

        with contextlib.nested(
            mock.patch("quark.db.api.port_create"),
            mock.patch("quark.db.api.network_find"),
            mock.patch("quark.db.api.port_find"),
            mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
            mock.patch("quark.ipam.QuarkIpam.allocate_mac_address"),
            mock.patch("quark.db.api.port_count_all"),
        ) as (port_create, net_find, port_find, alloc_ip, alloc_mac,
              port_count):
            port_create.return_value = port_models
            net_find.return_value = network
            port_find.return_value = models.Port()
            alloc_ip.return_value = addr
            alloc_mac.return_value = mac
            port_count.return_value = 0
            yield port_create

    def test_create_multiple_ports_on_same_net_and_device_id_bad_request(self):
        network = dict(id=1, tenant_id=self.context.tenant_id)
        ip = dict()
        mac = dict(address="AA:BB:CC:DD:EE:FF")
        port_1 = dict(port=dict(mac_address="AA:BB:CC:DD:EE:00", network_id=1,
                                tenant_id=self.context.tenant_id, device_id=1,
                                name="Fake"))
        port_2 = dict(port=dict(mac_address="AA:BB:CC:DD:EE:11", network_id=1,
                                tenant_id=self.context.tenant_id, device_id=1,
                                name="Faker"))
        with self._stubs(port=port_1, network=network, addr=ip, mac=mac):
            with self.assertRaises(exceptions.BadRequest):
                self.plugin.create_port(self.context, port_1)
                self.plugin.create_port(self.context, port_2)
class TestQuarkCreatePortRM9305(test_quark_plugin.TestQuarkPlugin):
    def setUp(self):
        super(TestQuarkCreatePortRM9305, self).setUp()
        strategy = {"00000000-0000-0000-0000-000000000000":
                    {"bridge": "publicnet",
                     "subnets": {"4": "public_v4",
                                 "6": "public_v6"}},
                    "11111111-1111-1111-1111-111111111111":
                    {"bridge": "servicenet",
                     "subnets": {"4": "private_v4",
                                 "6": "private_v6"}}}
        strategy_json = json.dumps(strategy)
        quark_ports.STRATEGY = network_strategy.JSONStrategy(strategy_json)

    @contextlib.contextmanager
    def _stubs(self, port=None, network=None, addr=None, mac=None):
        if network:
            network["network_plugin"] = "BASE"
            network["ipam_strategy"] = "ANY"
        port_model = models.Port()
        port_model.update(port)
        port_models = port_model
        db_mod = "quark.db.api"
        ipam = "quark.ipam.QuarkIpam"

        with contextlib.nested(
            mock.patch("%s.port_create" % db_mod),
            mock.patch("%s.network_find" % db_mod),
            mock.patch("%s.port_find" % db_mod),
            mock.patch("%s.allocate_ip_address" % ipam),
            mock.patch("%s.allocate_mac_address" % ipam),
            mock.patch("%s.port_count_all" % db_mod),
        ) as (port_create, net_find, port_find, alloc_ip, alloc_mac,
              port_count):
            port_create.return_value = port_models
            net_find.return_value = network
            port_find.return_value = None
            alloc_ip.return_value = addr
            alloc_mac.return_value = mac
            port_count.return_value = 0
            yield port_create

    def test_RM9305_tenant_create_servicenet_port(self):
        network_id = "11111111-1111-1111-1111-111111111111"
        network = dict(id=network_id,
                       tenant_id="rackspace")
        ip = dict()
        mac = dict(address="AA:BB:CC:DD:EE:FF")
        port_1 = dict(port=dict(mac_address="AA:BB:CC:DD:EE:00",
                                network_id=network_id,
                                tenant_id=self.context.tenant_id, device_id=2,
                                segment_id="bar",
                                name="Fake"))
        with self._stubs(port=port_1, network=network, addr=ip, mac=mac):
            self.plugin.create_port(self.context, port_1)

    def test_RM9305_tenant_create_publicnet_port(self):
        network_id = "00000000-0000-0000-0000-000000000000"
        network = dict(id=network_id,
                       tenant_id="rackspace")
        ip = dict()
        mac = dict(address="AA:BB:CC:DD:EE:FF")
        port_1 = dict(port=dict(mac_address="AA:BB:CC:DD:EE:00",
                                network_id=network_id,
                                tenant_id=self.context.tenant_id, device_id=3,
                                segment_id="bar",
                                name="Fake"))
        with self._stubs(port=port_1, network=network, addr=ip, mac=mac):
            self.plugin.create_port(self.context, port_1)

    def test_RM9305_tenant_create_tenants_port(self):
        network_id = "foobar"
        network = dict(id=network_id,
                       tenant_id=self.context.tenant_id)
        ip = dict()
        mac = dict(address="AA:BB:CC:DD:EE:FF")
        port_1 = dict(port=dict(mac_address="AA:BB:CC:DD:EE:00",
                                network_id=network_id,
                                tenant_id=self.context.tenant_id, device_id=4,
                                name="Fake"))
        with self._stubs(port=port_1, network=network, addr=ip, mac=mac):
            self.plugin.create_port(self.context, port_1)

    def test_RM9305_tenant_create_other_tenants_port(self):
        network_id = "foobar"
        network = dict(id=network_id,
                       tenant_id="other_tenant")
        ip = dict()
        mac = dict(address="AA:BB:CC:DD:EE:FF")
port_1 = dict(port=dict(mac_address="AA:BB:CC:DD:EE:00",
network_id=network_id,
tenant_id=self.context.tenant_id, device_id=5,
name="Fake"))
with self._stubs(port=port_1, network=network, addr=ip, mac=mac):
with self.assertRaises(exceptions.NotAuthorized):
self.plugin.create_port(self.context, port_1)
class TestQuarkCreatePortsSameDevBadRequest(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port=None, network=None, addr=None, mac=None,
limit_checks=None, subnet=None):
subnet_model = None
if subnet:
subnet_model = models.Subnet()
subnet_model.update(subnet)
if network:
network["network_plugin"] = "BASE"
network["ipam_strategy"] = "ANY"
def _create_db_port(context, **kwargs):
port_model = models.Port()
port_model.update(kwargs)
return port_model
def _alloc_ip(context, new_ips, *args, **kwargs):
ip_mod = models.IPAddress()
ip_mod.update(addr)
ip_mod.enabled_for_port = lambda x: True
new_ips.extend([ip_mod])
return mock.DEFAULT
with contextlib.nested(
mock.patch("quark.db.api.port_create"),
mock.patch("quark.db.api.network_find"),
mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
mock.patch("quark.ipam.QuarkIpam.allocate_mac_address"),
mock.patch("quark.db.api.port_count_all"),
mock.patch("neutron.quota.QuotaEngine.limit_check"),
mock.patch("quark.db.api.subnet_find"),
) as (port_create, net_find, alloc_ip, alloc_mac, port_count,
limit_check, subnet_find):
port_create.side_effect = _create_db_port
net_find.return_value = network
alloc_ip.side_effect = _alloc_ip
alloc_mac.return_value = mac
if subnet:
subnet_find.return_value = [subnet_model]
port_count.return_value = 0
if limit_checks:
limit_check.side_effect = limit_checks
yield port_create
def test_create_port(self):
network = dict(id=1, tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name))
expected = {'status': "ACTIVE",
'name': port_name,
'device_owner': None,
'mac_address': mac["address"],
'network_id': network["id"],
'tenant_id': self.context.tenant_id,
'admin_state_up': None,
'fixed_ips': [],
'device_id': 2}
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac) as port_create:
result = self.plugin.create_port(self.context, port)
self.assertTrue(port_create.called)
for key in expected.keys():
self.assertEqual(result[key], expected[key])
def test_create_port_segment_id_on_unshared_net_ignored(self):
network = dict(id=1, tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
segment_id="cell01", name=port_name))
expected = {'status': "ACTIVE",
'name': port_name,
'device_owner': None,
'mac_address': mac["address"],
'network_id': network["id"],
'tenant_id': self.context.tenant_id,
'admin_state_up': None,
'fixed_ips': [],
'device_id': 2}
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac) as port_create:
result = self.plugin.create_port(self.context, port)
self.assertTrue(port_create.called)
for key in expected.keys():
self.assertEqual(result[key], expected[key])
def test_create_port_mac_address_not_specified(self):
network = dict(id=1, tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2))
expected = {'status': "ACTIVE",
'device_owner': None,
'mac_address': mac["address"],
'network_id': network["id"],
'tenant_id': self.context.tenant_id,
'admin_state_up': None,
'fixed_ips': [],
'device_id': 2}
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac) as port_create:
port["port"]["mac_address"] = neutron_attrs.ATTR_NOT_SPECIFIED
result = self.plugin.create_port(self.context, port)
self.assertTrue(port_create.called)
for key in expected.keys():
self.assertEqual(result[key], expected[key])
@mock.patch("quark.network_strategy.JSONStrategy.is_provider_network")
def test_create_providernet_port_fixed_ip_not_authorized(self, is_parent):
is_parent.return_value = True
network = dict(id='1', tenant_id=self.context.tenant_id)
subnet = dict(id=1, network_id=network["id"])
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = mock.MagicMock()
ip.get = lambda x, *y: 1 if x == "subnet_id" else None
ip.formatted = lambda: "192.168.10.45"
ip.enabled_for_port = lambda x: True
fixed_ips = [dict(subnet_id=1, enabled=True,
ip_address="192.168.10.45")]
port = dict(port=dict(mac_address=mac["address"], network_id='1',
tenant_id=self.context.tenant_id, device_id=2,
fixed_ips=fixed_ips, ip_addresses=[ip],
segment_id="provider_segment"))
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac, subnet=subnet):
with self.assertRaises(exceptions.NotAuthorized):
self.plugin.create_port(self.context, port)
@mock.patch("quark.network_strategy.JSONStrategy.is_provider_network")
def test_create_providernet_port_fixed_ip_wrong_segment(self, is_parent):
is_parent.return_value = True
network = dict(id='1', tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
subnet = dict(id=1, network_id=network["id"])
ip = mock.MagicMock()
ip.get = lambda x, *y: 1 if x == "subnet_id" else None
ip.formatted = lambda: "192.168.10.45"
ip.enabled_for_port = lambda x: True
fixed_ips = [dict(subnet_id=1, enabled=True,
ip_address="192.168.10.45")]
port = dict(port=dict(mac_address=mac["address"], network_id='1',
tenant_id=self.context.tenant_id, device_id=2,
fixed_ips=fixed_ips, ip_addresses=[ip],
segment_id="provider_segment"))
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac, subnet=subnet):
with self.assertRaises(q_exc.AmbiguousNetworkId):
self.plugin.create_port(self.context.elevated(), port)
def test_create_port_fixed_ip_subnet_not_found(self):
network = dict(id='1', tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = mock.MagicMock()
ip.get = lambda x, *y: 1 if x == "subnet_id" else None
ip.formatted = lambda: "192.168.10.45"
ip.enabled_for_port = lambda x: True
fixed_ips = [dict(subnet_id=1, enabled=True,
ip_address="192.168.10.45")]
port = dict(port=dict(mac_address=mac["address"], network_id='1',
tenant_id=self.context.tenant_id, device_id=2,
fixed_ips=fixed_ips, ip_addresses=[ip],
segment_id="provider_segment"))
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac):
with self.assertRaises(exceptions.NotFound):
self.plugin.create_port(self.context.elevated(), port)
def test_create_port_fixed_ip_subnet_not_in_network(self):
network = dict(id='1', tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
subnet = dict(id=1, network_id='2')
ip = mock.MagicMock()
ip.get = lambda x, *y: 1 if x == "subnet_id" else None
ip.formatted = lambda: "192.168.10.45"
ip.enabled_for_port = lambda x: True
fixed_ips = [dict(subnet_id=1, enabled=True,
ip_address="192.168.10.45")]
port = dict(port=dict(mac_address=mac["address"], network_id='1',
tenant_id=self.context.tenant_id, device_id=2,
fixed_ips=fixed_ips, ip_addresses=[ip],
segment_id="provider_segment"))
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac, subnet=subnet):
with self.assertRaises(exceptions.InvalidInput):
self.plugin.create_port(self.context.elevated(), port)
def test_create_port_fixed_ips_bad_request(self):
network = dict(id=1, tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = mock.MagicMock()
ip.get = lambda x, *y: 1 if x == "subnet_id" else None
ip.formatted = lambda: "192.168.10.45"
fixed_ips = [dict()]
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
fixed_ips=fixed_ips, ip_addresses=[ip]))
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac):
with self.assertRaises(exceptions.BadRequest):
self.plugin.create_port(self.context, port)
def test_create_port_no_network_found(self):
port = dict(port=dict(network_id=1, tenant_id=self.context.tenant_id,
device_id=2))
with self._stubs(network=None, port=port["port"]):
with self.assertRaises(exceptions.NetworkNotFound):
self.plugin.create_port(self.context, port)
def test_create_port_security_groups_raises(self):
network = dict(id=1, tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
group = models.SecurityGroup()
group.update({'id': 1, 'tenant_id': self.context.tenant_id,
'name': 'foo', 'description': 'bar'})
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, security_groups=[group]))
with self._stubs(port=port["port"], network=network, addr=ip,
mac=mac):
with mock.patch("quark.db.api.security_group_find"):
with self.assertRaises(q_exc.SecurityGroupsNotImplemented):
self.plugin.create_port(self.context, port)
class TestQuarkPortCreateQuota(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port=None, network=None, addr=None, mac=None):
if network:
network["network_plugin"] = "BASE"
network["ipam_strategy"] = "ANY"
port_model = models.Port()
port_model.update(port)
port_models = port_model
with contextlib.nested(
mock.patch("quark.db.api.port_create"),
mock.patch("quark.db.api.network_find"),
mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
mock.patch("quark.ipam.QuarkIpam.allocate_mac_address"),
mock.patch("quark.db.api.port_count_all"),
mock.patch("neutron.quota.QuotaEngine.limit_check")
) as (port_create, net_find, alloc_ip, alloc_mac, port_count,
limit_check):
port_create.return_value = port_models
net_find.return_value = network
alloc_ip.return_value = addr
alloc_mac.return_value = mac
port_count.return_value = len(network["ports"])
limit_check.side_effect = exceptions.OverQuota
yield port_create
def test_create_port_net_at_max(self):
network = dict(id=1, ports=[models.Port()],
tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name))
with self._stubs(port=port["port"], network=network, addr=ip, mac=mac):
with self.assertRaises(exceptions.OverQuota):
self.plugin.create_port(self.context, port)
class TestQuarkPortCreateFixedIpsQuota(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, network):
network["network_plugin"] = "BASE"
network["ipam_strategy"] = "ANY"
with mock.patch("quark.db.api.network_find") as net_find:
net_find.return_value = network
yield
def test_create_port_fixed_ips_over_quota(self):
network = {"id": 1, "tenant_id": self.context.tenant_id}
fixed_ips = [{"subnet_id": 1}, {"subnet_id": 1}, {"subnet_id": 1},
{"subnet_id": 1}, {"subnet_id": 1}, {"subnet_id": 1},
{"subnet_id": 1}]
port = {"port": {"network_id": 1, "tenant_id": self.context.tenant_id,
"device_id": 2, "fixed_ips": fixed_ips}}
with self._stubs(network=network):
with self.assertRaises(exceptions.OverQuota):
self.plugin.create_port(self.context, port)
class TestQuarkUpdatePort(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port, new_ips=None, parent_net=False):
port_model = None
if port:
net_model = models.Network()
net_model["network_plugin"] = "BASE"
port_model = models.Port()
port_model.network = net_model
port_model.update(port)
with contextlib.nested(
mock.patch("quark.db.api.port_find"),
mock.patch("quark.db.api.port_update"),
mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
mock.patch("quark.ipam.QuarkIpam.deallocate_ips_by_port"),
) as (port_find, port_update, alloc_ip, dealloc_ip):
port_find.return_value = port_model
port_update.return_value = port_model
if new_ips:
alloc_ip.return_value = new_ips
yield port_find, port_update, alloc_ip, dealloc_ip
def test_update_port_not_found(self):
with self._stubs(port=None):
with self.assertRaises(exceptions.PortNotFound):
self.plugin.update_port(self.context, 1, {})
def test_update_port(self):
with self._stubs(
port=dict(id=1, name="myport")
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(name="ourport"))
self.plugin.update_port(self.context, 1, new_port)
self.assertEqual(port_find.call_count, 2)
port_update.assert_called_once_with(
self.context,
port_find(),
name="ourport",
security_groups=[])
def test_update_port_fixed_ip_bad_request(self):
with self._stubs(
port=dict(id=1, name="myport")
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(
fixed_ips=[dict(subnet_id=None,
ip_address=None)]))
with self.assertRaises(exceptions.BadRequest):
self.plugin.update_port(self.context, 1, new_port)
def test_update_port_fixed_ip_bad_request_malformed_address(self):
with self._stubs(
port=dict(id=1, name="myport", mac_address="0:0:0:0:0:1")
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(
fixed_ips=[dict(subnet_id=1,
ip_address="malformed-address-here")]))
with self.assertRaises(exceptions.BadRequest):
self.plugin.update_port(self.context, 1, new_port)
def test_update_port_fixed_ip(self):
with self._stubs(
port=dict(id=1, name="myport", mac_address="0:0:0:0:0:1")
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(
fixed_ips=[dict(subnet_id=1,
ip_address="1.1.1.1")]))
self.plugin.update_port(self.context, 1, new_port)
self.assertEqual(alloc_ip.call_count, 1)
def test_update_port_fixed_ip_no_subnet_raises(self):
with self._stubs(
port=dict(id=1, name="myport", mac_address="0:0:0:0:0:1")
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(
fixed_ips=[dict(ip_address="1.1.1.1")]))
with self.assertRaises(exceptions.BadRequest):
self.plugin.update_port(self.context, 1, new_port)
def test_update_port_fixed_ip_subnet_only_allocates_ip(self):
with self._stubs(
port=dict(id=1, name="myport", mac_address="0:0:0:0:0:1")
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(
fixed_ips=[dict(subnet_id=1)]))
self.plugin.update_port(self.context, 1, new_port)
self.assertEqual(alloc_ip.call_count, 1)
def test_update_port_fixed_ip_allocs_new_deallocs_existing(self):
addr_dict = {"address": 0, "address_readable": "0.0.0.0"}
addr = models.IPAddress()
addr.update(addr_dict)
new_addr_dict = {"address": netaddr.IPAddress("1.1.1.1"),
"address_readable": "1.1.1.1"}
new_addr = models.IPAddress()
new_addr.update(new_addr_dict)
with self._stubs(
port=dict(id=1, name="myport", mac_address="0:0:0:0:0:1",
ip_addresses=[addr]),
new_ips=[new_addr]
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(
fixed_ips=[dict(subnet_id=1,
ip_address=new_addr["address_readable"])]))
self.plugin.update_port(self.context, 1, new_port)
self.assertEqual(alloc_ip.call_count, 1)
def test_update_port_goes_over_quota(self):
fixed_ips = {"fixed_ips": [{"subnet_id": 1},
{"subnet_id": 1},
{"subnet_id": 1},
{"subnet_id": 1},
{"subnet_id": 1},
{"subnet_id": 1},
{"subnet_id": 1}]}
with self._stubs(
port=dict(id=1, name="myport", mac_address="0:0:0:0:0:1")
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = {"port": fixed_ips}
with self.assertRaises(exceptions.OverQuota):
self.plugin.update_port(self.context, 1, new_port)
class TestQuarkUpdatePortSecurityGroups(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port, new_ips=None, parent_net=False):
port_model = None
sg_mod = models.SecurityGroup()
if port:
net_model = models.Network()
net_model["network_plugin"] = "BASE"
port_model = models.Port()
port_model.network = net_model
port_model.update(port)
port_model["security_groups"].append(sg_mod)
with contextlib.nested(
mock.patch("quark.db.api.port_find"),
mock.patch("quark.db.api.port_update"),
mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
mock.patch("quark.ipam.QuarkIpam.deallocate_ips_by_port"),
mock.patch("neutron.quota.QuotaEngine.limit_check"),
mock.patch("quark.plugin_modules.ports.STRATEGY"
".is_provider_network"),
mock.patch("quark.db.api.security_group_find"),
mock.patch("quark.drivers.base.BaseDriver.update_port")
) as (port_find, port_update, alloc_ip, dealloc_ip, limit_check,
net_strat, sg_find, driver_port_update):
port_find.return_value = port_model
def _port_update(context, port_db, **kwargs):
return port_db.update(kwargs)
port_update.side_effect = _port_update
if new_ips:
alloc_ip.return_value = new_ips
net_strat.return_value = parent_net
sg_find.return_value = sg_mod
yield (port_find, port_update, alloc_ip, dealloc_ip, sg_find,
driver_port_update)
def test_update_port_security_groups(self):
with self._stubs(
port=dict(id=1, device_id="device"), parent_net=True
) as (port_find, port_update, alloc_ip, dealloc_ip, sg_find,
driver_port_update):
new_port = dict(port=dict(name="ourport",
security_groups=[1]))
port = self.plugin.update_port(self.context, 1, new_port)
port_update.assert_called_once_with(
self.context,
port_find(),
name="ourport",
security_groups=[sg_find()])
self.assertEqual(sg_find()["id"], port["security_groups"][0])
def test_update_port_empty_list_security_groups(self):
port_dict = {"id": 1, "mac_address": "AA:BB:CC:DD:EE:FF",
"device_id": 2, "backend_key": 3}
with self._stubs(
port=port_dict, parent_net=True
) as (port_find, port_update, alloc_ip, dealloc_ip, sg_find,
driver_port_update):
new_port = dict(port=dict(name="ourport",
security_groups=[]))
port = self.plugin.update_port(self.context, 1, new_port)
self.assertEqual(port["security_groups"], [])
port_update.assert_called_once_with(
self.context,
port_find(),
name="ourport",
security_groups=[])
driver_port_update.assert_called_once_with(
self.context, port_id=port_dict["backend_key"],
mac_address=port_dict["mac_address"],
device_id=port_dict["device_id"],
security_groups=[])
def test_update_port_no_security_groups(self):
port_dict = {"id": 1, "mac_address": "AA:BB:CC:DD:EE:FF",
"device_id": 2, "backend_key": 3}
with self._stubs(
port=port_dict, parent_net=True
) as (port_find, port_update, alloc_ip, dealloc_ip, sg_find,
driver_port_update):
new_port = dict(port=dict(name="ourport"))
self.plugin.update_port(self.context, 1, new_port)
driver_port_update.assert_called_once_with(
self.context, port_id=port_dict["backend_key"],
mac_address=port_dict["mac_address"],
device_id=port_dict["device_id"])
def test_update_port_security_groups_no_device_id_raises(self):
with self._stubs(
port=dict(id=1), parent_net=True
) as (port_find, port_update, alloc_ip, dealloc_ip, sg_find,
driver_port_update):
new_port = dict(port=dict(name="ourport",
security_groups=[1]))
with self.assertRaises(q_exc.SecurityGroupsRequireDevice):
self.plugin.update_port(self.context, 1, new_port)
class TestQuarkUpdatePortSetsIps(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port, new_ips=None):
def alloc_mock(kls, context, addresses, *args, **kwargs):
addresses.extend(new_ips)
self.called = True
port_model = None
if port:
net_model = models.Network()
net_model["network_plugin"] = "BASE"
port_model = models.Port()
port_model['network'] = net_model
port_model.update(port)
with contextlib.nested(
mock.patch("quark.db.api.port_find"),
mock.patch("quark.db.api.port_update"),
mock.patch("quark.ipam.QuarkIpam.deallocate_ips_by_port"),
mock.patch("neutron.quota.QuotaEngine.limit_check")
) as (port_find, port_update, dealloc_ip, limit_check):
port_find.return_value = port_model
port_update.return_value = port_model
alloc_ip = mock.patch("quark.ipam.QuarkIpam.allocate_ip_address",
new=alloc_mock)
alloc_ip.start()
yield port_find, port_update, alloc_ip, dealloc_ip
alloc_ip.stop()
def test_update_port_fixed_ip_subnet_only_allocates_ip(self):
self.called = False
new_addr_dict = {"address": netaddr.IPAddress('1.1.1.1'),
"address_readable": "1.1.1.1"}
new_addr = models.IPAddress()
new_addr.update(new_addr_dict)
with self._stubs(
port=dict(id=1, name="myport", mac_address="0:0:0:0:0:1"),
new_ips=[new_addr]
) as (port_find, port_update, alloc_ip, dealloc_ip):
new_port = dict(port=dict(
fixed_ips=[dict(subnet_id=1)]))
self.plugin.update_port(self.context, 1, new_port)
self.assertTrue(self.called)
class TestQuarkCreatePortOnSharedNetworks(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port=None, network=None, addr=None, mac=None):
self.strategy = {"public_network":
{"bridge": "xenbr0",
"subnets": {"4": "public_v4",
"6": "public_v6"}}}
strategy_json = json.dumps(self.strategy)
quark_ports.STRATEGY = network_strategy.JSONStrategy(strategy_json)
if network:
network["network_plugin"] = "BASE"
network["ipam_strategy"] = "ANY"
port_model = models.Port()
port_model.update(port)
port_models = port_model
with contextlib.nested(
mock.patch("quark.db.api.port_create"),
mock.patch("quark.db.api.network_find"),
mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
mock.patch("quark.ipam.QuarkIpam.allocate_mac_address"),
mock.patch("neutron.quota.QuotaEngine.limit_check")
) as (port_create, net_find, alloc_ip, alloc_mac, limit_check):
port_create.return_value = port_models
net_find.return_value = network
alloc_ip.return_value = addr
alloc_mac.return_value = mac
yield port_create
def test_create_port_shared_net_no_quota_check(self):
network = dict(id=1, ports=[models.Port()],
tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"],
network_id="public_network",
tenant_id=self.context.tenant_id, device_id=2,
segment_id="cell01",
name=port_name))
with self._stubs(port=port["port"], network=network, addr=ip, mac=mac):
try:
self.plugin.create_port(self.context, port)
except exceptions.OverQuota:
self.fail("create_port raised OverQuota")
def test_create_port_shared_net_no_segment_id_fails(self):
network = dict(id=1, ports=[models.Port()],
tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"],
network_id="public_network",
tenant_id=self.context.tenant_id, device_id=2,
name=port_name))
with self._stubs(port=port["port"], network=network, addr=ip, mac=mac):
with self.assertRaises(q_exc.AmbiguousNetworkId):
self.plugin.create_port(self.context, port)
class TestQuarkGetPortCount(test_quark_plugin.TestQuarkPlugin):
def test_get_port_count(self):
"""This isn't really testable."""
with mock.patch("quark.db.api.port_count_all"):
self.plugin.get_ports_count(self.context, {})
class TestQuarkDeletePort(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port=None, addr=None, mac=None):
port_models = None
if port:
net_model = models.Network()
net_model["network_plugin"] = "BASE"
net_model["ipam_strategy"] = "ANY"
port_model = models.Port()
port_model.update(port)
port_model.network = net_model
port_models = port_model
with contextlib.nested(
mock.patch("quark.db.api.port_find"),
mock.patch("quark.ipam.QuarkIpam.deallocate_ips_by_port"),
mock.patch("quark.ipam.QuarkIpam.deallocate_mac_address"),
mock.patch("quark.db.api.port_delete"),
mock.patch("quark.drivers.base.BaseDriver.delete_port")
) as (port_find, dealloc_ip, dealloc_mac, db_port_del,
driver_port_del):
port_find.return_value = port_models
dealloc_ip.return_value = addr
dealloc_mac.return_value = mac
yield db_port_del, driver_port_del
def test_port_delete(self):
port = dict(port=dict(network_id=1, tenant_id=self.context.tenant_id,
device_id=2, mac_address="AA:BB:CC:DD:EE:FF",
backend_key="foo"))
with self._stubs(port=port["port"]) as (db_port_del, driver_port_del):
self.plugin.delete_port(self.context, 1)
self.assertTrue(db_port_del.called)
driver_port_del.assert_called_with(
self.context, "foo", mac_address=port["port"]["mac_address"],
device_id=port["port"]["device_id"])
def test_port_delete_port_not_found_fails(self):
with self._stubs(port=None) as (db_port_del, driver_port_del):
with self.assertRaises(exceptions.PortNotFound):
self.plugin.delete_port(self.context, 1)
class TestPortDiagnose(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, port, list_format=False):
port_res = None
if port:
network_mod = models.Network()
port_mod = models.Port()
port_mod.update(port)
network_mod["network_plugin"] = "UNMANAGED"
port_mod.network = network_mod
port_res = port_mod
if list_format:
ports = mock.MagicMock()
ports.all.return_value = [port_mod]
port_res = ports
with mock.patch("quark.db.api.port_find") as port_find:
port_find.return_value = port_res
yield
def test_port_diagnose(self):
ip = dict(id=1, address=netaddr.IPAddress("192.168.1.100"),
address_readable="192.168.1.100", subnet_id=1, network_id=2,
version=4)
fixed_ips = [{"subnet_id": ip["subnet_id"],
"ip_address": ip["address_readable"]}]
port = dict(port=dict(network_id=1, tenant_id=self.context.tenant_id,
device_id=2, mac_address="AA:BB:CC:DD:EE:FF",
backend_key="foo", fixed_ips=fixed_ips,
network_plugin="UNMANAGED"))
with self._stubs(port=port):
diag = self.plugin.diagnose_port(self.context.elevated(), 1, [])
ports = diag["ports"]
# Most fields are None because we're using the unmanaged
# driver, which doesn't do anything with them
self.assertEqual(ports["status"], "ACTIVE")
self.assertEqual(ports["device_owner"], None)
self.assertEqual(ports["fixed_ips"], [])
self.assertEqual(ports["security_groups"], [])
self.assertEqual(ports["device_id"], None)
self.assertEqual(ports["admin_state_up"], None)
self.assertEqual(ports["network_id"], None)
self.assertEqual(ports["tenant_id"], None)
self.assertEqual(ports["mac_address"], None)
def test_port_diagnose_with_wildcard(self):
ip = dict(id=1, address=netaddr.IPAddress("192.168.1.100"),
address_readable="192.168.1.100", subnet_id=1, network_id=2,
version=4)
fixed_ips = [{"subnet_id": ip["subnet_id"],
"ip_address": ip["address_readable"]}]
port = dict(port=dict(network_id=1, tenant_id=self.context.tenant_id,
device_id=2, mac_address="AA:BB:CC:DD:EE:FF",
backend_key="foo", fixed_ips=fixed_ips,
network_plugin="UNMANAGED"))
with self._stubs(port=port, list_format=True):
diag = self.plugin.diagnose_port(self.context.elevated(), '*', [])
ports = diag["ports"]
# Most fields are None because we're using the unmanaged
# driver, which doesn't do anything with them
self.assertEqual(ports[0]["status"], "ACTIVE")
self.assertEqual(ports[0]["device_owner"], None)
self.assertEqual(ports[0]["fixed_ips"], [])
self.assertEqual(ports[0]["security_groups"], [])
self.assertEqual(ports[0]["device_id"], None)
self.assertEqual(ports[0]["admin_state_up"], None)
self.assertEqual(ports[0]["network_id"], None)
self.assertEqual(ports[0]["tenant_id"], None)
self.assertEqual(ports[0]["mac_address"], None)
def test_port_diagnose_with_config_field(self):
ip = dict(id=1, address=netaddr.IPAddress("192.168.1.100"),
address_readable="192.168.1.100", subnet_id=1, network_id=2,
version=4)
fixed_ips = [{"subnet_id": ip["subnet_id"],
"ip_address": ip["address_readable"]}]
port = dict(port=dict(network_id=1, tenant_id=self.context.tenant_id,
device_id=2, mac_address="AA:BB:CC:DD:EE:FF",
backend_key="foo", fixed_ips=fixed_ips,
network_plugin="UNMANAGED"))
with self._stubs(port=port, list_format=True):
diag = self.plugin.diagnose_port(self.context.elevated(), '*',
["config"])
ports = diag["ports"]
# Most fields are None because we're using the unmanaged
# driver, which doesn't do anything with them
self.assertEqual(ports[0]["status"], "ACTIVE")
self.assertEqual(ports[0]["device_owner"], None)
self.assertEqual(ports[0]["fixed_ips"], [])
self.assertEqual(ports[0]["security_groups"], [])
self.assertEqual(ports[0]["device_id"], None)
self.assertEqual(ports[0]["admin_state_up"], None)
self.assertEqual(ports[0]["network_id"], None)
self.assertEqual(ports[0]["tenant_id"], None)
self.assertEqual(ports[0]["mac_address"], None)
def test_port_diagnose_no_port_raises(self):
with self._stubs(port=None):
with self.assertRaises(exceptions.PortNotFound):
self.plugin.diagnose_port(self.context.elevated(), 1, [])
def test_port_diagnose_not_authorized(self):
with self._stubs(port=None):
with self.assertRaises(exceptions.NotAuthorized):
self.plugin.diagnose_port(self.context, 1, [])
class TestPortNetworkPlugin(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, network=None, addr=None, mac=None,
compat_map=None, driver_res=None):
network["ipam_strategy"] = "ANY"
# Response from the backend driver
self.expected_bridge = "backend-drivers-bridge"
if driver_res is None:
driver_res = {"uuid": 1, "bridge": self.expected_bridge}
# Mock out the driver registry
foo_driver = mock.Mock()
foo_driver.create_port.return_value = driver_res
bar_driver = mock.Mock()
bar_driver.create_port.return_value = driver_res
drivers = {"FOO": foo_driver,
"BAR": bar_driver}
compat_map = compat_map or {}
with contextlib.nested(
mock.patch("quark.db.api.port_create"),
mock.patch("quark.db.api.network_find"),
mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
mock.patch("quark.ipam.QuarkIpam.allocate_mac_address"),
mock.patch("oslo_utils.uuidutils.generate_uuid"),
mock.patch("quark.plugin_views._make_port_dict"),
mock.patch("quark.db.api.port_count_all"),
mock.patch("neutron.quota.QuotaEngine.limit_check"),
mock.patch("quark.plugin_modules.ports.registry."
"DRIVER_REGISTRY.drivers",
new_callable=mock.PropertyMock(return_value=drivers)),
mock.patch("quark.plugin_modules.ports.registry."
"DRIVER_REGISTRY.port_driver_compat_map",
new_callable=mock.PropertyMock(return_value=compat_map))
) as (port_create, net_find, alloc_ip, alloc_mac, gen_uuid, make_port,
port_count, limit_check, _, _):
net_find.return_value = network
alloc_ip.return_value = addr
alloc_mac.return_value = mac
gen_uuid.return_value = 1
port_count.return_value = 0
yield port_create, alloc_mac, net_find
def test_create_port_with_bad_network_plugin_fails(self):
network_dict = dict(id=1, tenant_id=self.context.tenant_id)
port_name = "foobar"
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name))
network = models.Network()
network.update(network_dict)
network["network_plugin"] = "FAIL"
port_model = models.Port()
port_model.update(port)
port_models = port_model
with self._stubs(network=network, addr=ip,
mac=mac) as (port_create, alloc_mac, net_find):
port_create.return_value = port_models
exc = "Driver FAIL is not registered."
with self.assertRaisesRegexp(exceptions.BadRequest, exc):
self.plugin.create_port(self.context, port)
def test_create_port_with_bad_port_network_plugin_fails(self):
network_dict = dict(id=1, tenant_id=self.context.tenant_id)
port_name = "foobar"
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, network_plugin="FAIL"))
network = models.Network()
network.update(network_dict)
network["network_plugin"] = "FOO"
port_model = models.Port()
port_model.update(port)
port_models = port_model
with self._stubs(network=network, addr=ip,
mac=mac) as (port_create, alloc_mac, net_find):
port_create.return_value = port_models
exc = "Driver FAIL is not registered."
admin_ctx = self.context.elevated()
with self.assertRaisesRegexp(exceptions.BadRequest, exc):
self.plugin.create_port(admin_ctx, port)
def test_create_port_with_incompatible_port_network_plugin_fails(self):
network_dict = dict(id=1, tenant_id=self.context.tenant_id)
port_name = "foobar"
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, network_plugin="BAR"))
network = models.Network()
network.update(network_dict)
network["network_plugin"] = "FOO"
port_model = models.Port()
port_model.update(port)
port_models = port_model
with self._stubs(network=network, addr=ip,
mac=mac) as (port_create, alloc_mac, net_find):
port_create.return_value = port_models
exc = ("Port driver BAR not allowed for underlying network "
"driver FOO.")
admin_ctx = self.context.elevated()
with self.assertRaisesRegexp(exceptions.BadRequest, exc):
self.plugin.create_port(admin_ctx, port)
def test_create_port_with_port_network_plugin(self):
network = dict(id=1, tenant_id=self.context.tenant_id,
network_plugin="FOO")
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, device_owner="quark_tests",
bridge="quark_bridge", admin_state_up=False))
expected_mac = "DE:AD:BE:EF:00:00"
expected_bridge = "new_bridge"
expected_device_owner = "new_device_owner"
expected_admin_state = "new_state"
expected_network_plugin = "FOO"
port_create_dict = {}
port_create_dict["port"] = port["port"].copy()
port_create_dict["port"]["mac_address"] = expected_mac
port_create_dict["port"]["device_owner"] = expected_device_owner
port_create_dict["port"]["bridge"] = expected_bridge
port_create_dict["port"]["admin_state_up"] = expected_admin_state
port_create_dict["port"]["network_plugin"] = expected_network_plugin
admin_ctx = self.context.elevated()
with self._stubs(network=network, addr=ip,
mac=mac) as (port_create, alloc_mac, net_find):
self.plugin.create_port(admin_ctx, port_create_dict)
alloc_mac.assert_called_once_with(
admin_ctx, network["id"], 1,
cfg.CONF.QUARK.ipam_reuse_after,
mac_address=expected_mac, use_forbidden_mac_range=False)
port_create.assert_called_once_with(
admin_ctx, bridge=self.expected_bridge, uuid=1, name="foobar",
admin_state_up=expected_admin_state, network_id=1,
tenant_id="fake", id=1, device_owner=expected_device_owner,
mac_address=mac["address"], device_id=2, backend_key=1,
security_groups=[], addresses=[],
network_plugin=expected_network_plugin)
def test_create_port_with_compatible_port_network_plugin(self):
network = dict(id=1, tenant_id=self.context.tenant_id,
network_plugin="FOO")
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, device_owner="quark_tests",
bridge="quark_bridge", admin_state_up=False))
expected_mac = "DE:AD:BE:EF:00:00"
expected_bridge = "new_bridge"
expected_device_owner = "new_device_owner"
expected_admin_state = "new_state"
expected_network_plugin = "BAR"
port_create_dict = {}
port_create_dict["port"] = port["port"].copy()
port_create_dict["port"]["mac_address"] = expected_mac
port_create_dict["port"]["device_owner"] = expected_device_owner
port_create_dict["port"]["bridge"] = expected_bridge
port_create_dict["port"]["admin_state_up"] = expected_admin_state
port_create_dict["port"]["network_plugin"] = expected_network_plugin
compat_map = {"BAR": ["FOO"]}
admin_ctx = self.context.elevated()
with self._stubs(network=network, addr=ip, mac=mac,
compat_map=compat_map) as (port_create, alloc_mac,
net_find):
self.plugin.create_port(admin_ctx, port_create_dict)
alloc_mac.assert_called_once_with(
admin_ctx, network["id"], 1,
cfg.CONF.QUARK.ipam_reuse_after,
mac_address=expected_mac, use_forbidden_mac_range=False)
port_create.assert_called_once_with(
admin_ctx, bridge=self.expected_bridge, uuid=1, name="foobar",
admin_state_up=expected_admin_state, network_id=1,
tenant_id="fake", id=1, device_owner=expected_device_owner,
mac_address=mac["address"], device_id=2, backend_key=1,
security_groups=[], addresses=[],
network_plugin=expected_network_plugin)
def test_create_port_network_plugin_response_no_uuid_raises(self):
network_dict = dict(id=1, tenant_id=self.context.tenant_id)
port_name = "foobar"
mac = dict(address="AA:BB:CC:DD:EE:FF")
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name))
network = models.Network()
network.update(network_dict)
network["network_plugin"] = "FOO"
port_model = models.Port()
port_model.update(port)
port_models = port_model
with self._stubs(network=network, addr=ip,
mac=mac, driver_res={}) as (port_create,
alloc_mac,
net_find):
port_create.return_value = port_models
exc = "uuid"
with self.assertRaisesRegexp(KeyError, exc):
self.plugin.create_port(self.context, port)
def test_create_port_network_plugin_response_is_filtered(self):
network = dict(id=1, tenant_id=self.context.tenant_id,
network_plugin="FOO")
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, device_owner="quark_tests",
bridge="quark_bridge", admin_state_up=False))
expected_mac = "DE:AD:BE:EF:00:00"
expected_bridge = "new_bridge"
expected_device_owner = "new_device_owner"
expected_admin_state = "new_state"
port_create_dict = {}
port_create_dict["port"] = port["port"].copy()
port_create_dict["port"]["mac_address"] = expected_mac
port_create_dict["port"]["device_owner"] = expected_device_owner
port_create_dict["port"]["bridge"] = expected_bridge
port_create_dict["port"]["admin_state_up"] = expected_admin_state
driver_res = {
"uuid": 5,
"vlan_id": 50,
"tags": [123, {"foo": "bar"}],
"id": "fail",
"randomkey": None
}
admin_ctx = self.context.elevated()
with self._stubs(network=network, addr=ip,
mac=mac, driver_res=driver_res) as (port_create,
alloc_mac,
net_find):
self.plugin.create_port(admin_ctx, port_create_dict)
alloc_mac.assert_called_once_with(
admin_ctx, network["id"], 1,
cfg.CONF.QUARK.ipam_reuse_after,
mac_address=expected_mac, use_forbidden_mac_range=False)
port_create.assert_called_once_with(
admin_ctx, bridge=expected_bridge, uuid=5, name="foobar",
admin_state_up=expected_admin_state, network_id=1,
tenant_id="fake", id=1, device_owner=expected_device_owner,
mac_address=mac["address"], device_id=2, backend_key=5,
security_groups=[], addresses=[], vlan_id=50)
class TestQuarkPortCreateFiltering(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self, network=None, addr=None, mac=None):
network["network_plugin"] = "BASE"
network["ipam_strategy"] = "ANY"
with contextlib.nested(
mock.patch("quark.db.api.port_create"),
mock.patch("quark.db.api.network_find"),
mock.patch("quark.ipam.QuarkIpam.allocate_ip_address"),
mock.patch("quark.ipam.QuarkIpam.allocate_mac_address"),
mock.patch("oslo_utils.uuidutils.generate_uuid"),
mock.patch("quark.plugin_views._make_port_dict"),
mock.patch("quark.db.api.port_count_all"),
mock.patch("neutron.quota.QuotaEngine.limit_check")
) as (port_create, net_find, alloc_ip, alloc_mac, gen_uuid, make_port,
port_count, limit_check):
net_find.return_value = network
alloc_ip.return_value = addr
alloc_mac.return_value = mac
gen_uuid.return_value = 1
port_count.return_value = 0
yield port_create, alloc_mac, net_find
def test_create_port_attribute_filtering(self):
network = dict(id=1, tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, device_owner="quark_tests",
bridge="quark_bridge", admin_state_up=False))
port_create_dict = {}
port_create_dict["port"] = port["port"].copy()
port_create_dict["port"]["mac_address"] = "DE:AD:BE:EF:00:00"
port_create_dict["port"]["device_owner"] = "ignored"
port_create_dict["port"]["bridge"] = "ignored"
port_create_dict["port"]["admin_state_up"] = "ignored"
port_create_dict["port"]["network_plugin"] = "ignored"
with self._stubs(network=network, addr=ip,
mac=mac) as (port_create, alloc_mac, net_find):
self.plugin.create_port(self.context, port_create_dict)
alloc_mac.assert_called_once_with(
self.context, network["id"], 1,
cfg.CONF.QUARK.ipam_reuse_after,
mac_address=None, use_forbidden_mac_range=False)
port_create.assert_called_once_with(
self.context, addresses=[], network_id=network["id"],
tenant_id="fake", uuid=1, name="foobar",
mac_address=alloc_mac()["address"], backend_key=1, id=1,
security_groups=[], device_id=2)
def test_create_port_attribute_filtering_admin(self):
network = dict(id=1, tenant_id=self.context.tenant_id)
mac = dict(address="AA:BB:CC:DD:EE:FF")
port_name = "foobar"
ip = dict()
port = dict(port=dict(mac_address=mac["address"], network_id=1,
tenant_id=self.context.tenant_id, device_id=2,
name=port_name, device_owner="quark_tests",
bridge="quark_bridge", admin_state_up=False))
expected_mac = "DE:AD:BE:EF:00:00"
expected_bridge = "new_bridge"
expected_device_owner = "new_device_owner"
expected_admin_state = "new_state"
expected_network_plugin = "BASE"
port_create_dict = {}
port_create_dict["port"] = port["port"].copy()
port_create_dict["port"]["mac_address"] = expected_mac
port_create_dict["port"]["device_owner"] = expected_device_owner
port_create_dict["port"]["bridge"] = expected_bridge
port_create_dict["port"]["admin_state_up"] = expected_admin_state
port_create_dict["port"]["network_plugin"] = expected_network_plugin
admin_ctx = self.context.elevated()
with self._stubs(network=network, addr=ip,
mac=mac) as (port_create, alloc_mac, net_find):
self.plugin.create_port(admin_ctx, port_create_dict)
alloc_mac.assert_called_once_with(
admin_ctx, network["id"], 1,
cfg.CONF.QUARK.ipam_reuse_after,
mac_address=expected_mac, use_forbidden_mac_range=False)
port_create.assert_called_once_with(
admin_ctx, bridge=expected_bridge, uuid=1, name="foobar",
admin_state_up=expected_admin_state, network_id=1,
tenant_id="fake", id=1, device_owner=expected_device_owner,
mac_address=mac["address"], device_id=2, backend_key=1,
security_groups=[], addresses=[],
network_plugin=expected_network_plugin)
class TestQuarkPortUpdateFiltering(test_quark_plugin.TestQuarkPlugin):
@contextlib.contextmanager
def _stubs(self):
with contextlib.nested(
mock.patch("quark.db.api.port_find"),
mock.patch("quark.db.api.port_update"),
mock.patch("quark.drivers.registry.DriverRegistry.get_driver"),
mock.patch("quark.plugin_views._make_port_dict"),
mock.patch("neutron.quota.QuotaEngine.limit_check")
) as (port_find, port_update, get_driver, make_port, limit_check):
yield port_find, port_update
def test_update_port_attribute_filtering(self):
new_port = {}
new_port["port"] = {
"mac_address": "DD:EE:FF:00:00:00", "device_owner": "new_owner",
"bridge": "new_bridge", "admin_state_up": False, "device_id": 3,
"network_id": 10, "backend_key": 1234, "name": "new_name",
"network_plugin": "BASE"}
with self._stubs() as (port_find, port_update):
self.plugin.update_port(self.context, 1, new_port)
port_update.assert_called_once_with(
self.context,
port_find(),
name="new_name",
security_groups=[])
def test_update_port_attribute_filtering_admin(self):
new_port = {}
new_port["port"] = {
"mac_address": "DD:EE:FF:00:00:00", "device_owner": "new_owner",
"bridge": "new_bridge", "admin_state_up": False, "device_id": 3,
"network_id": 10, "backend_key": 1234, "name": "new_name",
"network_plugin": "BASE"}
admin_ctx = self.context.elevated()
with self._stubs() as (port_find, port_update):
self.plugin.update_port(admin_ctx, 1, new_port)
port_update.assert_called_once_with(
admin_ctx,
port_find(),
name="new_name",
bridge=new_port["port"]["bridge"],
admin_state_up=new_port["port"]["admin_state_up"],
device_owner=new_port["port"]["device_owner"],
mac_address=new_port["port"]["mac_address"],
device_id=new_port["port"]["device_id"],
security_groups=[])
| 45.961611 | 79 | 0.584763 | 8,793 | 73,033 | 4.578301 | 0.044353 | 0.030405 | 0.021164 | 0.03351 | 0.863949 | 0.838239 | 0.809126 | 0.783963 | 0.764786 | 0.74787 | 0 | 0.016593 | 0.301042 | 73,033 | 1,588 | 80 | 45.990554 | 0.772034 | 0.014541 | 0 | 0.716751 | 0 | 0 | 0.122265 | 0.039634 | 0 | 0 | 0 | 0 | 0.071172 | 1 | 0.059669 | false | 0 | 0.009346 | 0.000719 | 0.083393 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
73b92b6830cdc2e4029e454db92eae4b3e05903a | 1,815 | py | Python | tests/geographic/utils/test_is_outlier.py | PEM-Humboldt/geographic-validation | e8584fdbccd40847e94fef2481bc8f420e7fddb3 | [
"MIT"
] | null | null | null | tests/geographic/utils/test_is_outlier.py | PEM-Humboldt/geographic-validation | e8584fdbccd40847e94fef2481bc8f420e7fddb3 | [
"MIT"
] | 24 | 2021-04-30T18:02:30.000Z | 2021-10-13T13:56:24.000Z | tests/geographic/utils/test_is_outlier.py | PEM-Humboldt/geographic-validation | e8584fdbccd40847e94fef2481bc8f420e7fddb3 | [
"MIT"
] | 1 | 2021-06-16T16:52:52.000Z | 2021-06-16T16:52:52.000Z | """
Test cases for the regi0.geographic.utils.is_outlier function.
"""
import numpy as np
import pytest
from regi0.geographic.utils import is_outlier
@pytest.fixture()
def values():
return np.array([52, 56, 53, 57, 51, 59, 1, 99])
def test_iqr(values):
result = is_outlier(values, method="iqr")
expected = np.array([False, False, False, False, False, False, True, True])
np.testing.assert_array_equal(result, expected)
def test_std(values):
result = is_outlier(values, method="std")
expected = np.array([False, False, False, False, False, False, True, False])
np.testing.assert_array_equal(result, expected)
def test_std_smaller_threshold(values):
result = is_outlier(values, method="std", threshold=1.0)
expected = np.array([False, False, False, False, False, False, True, True])
np.testing.assert_array_equal(result, expected)
def test_std_greater_threshold(values):
result = is_outlier(values, method="std", threshold=3.0)
expected = np.array([False, False, False, False, False, False, False, False])
np.testing.assert_array_equal(result, expected)
def test_zscore(values):
result = is_outlier(values, method="zscore")
expected = np.array([False, False, False, False, False, False, True, False])
np.testing.assert_array_equal(result, expected)
def test_zscore_smaller_threshold(values):
result = is_outlier(values, method="zscore", threshold=1.0)
expected = np.array([False, False, False, False, False, False, True, True])
np.testing.assert_array_equal(result, expected)
def test_zscore_greater_threshold(values):
result = is_outlier(values, method="zscore", threshold=3.0)
expected = np.array([False, False, False, False, False, False, False, False])
np.testing.assert_array_equal(result, expected)
| 33 | 82 | 0.718457 | 254 | 1,815 | 4.984252 | 0.177165 | 0.308057 | 0.379147 | 0.394945 | 0.85703 | 0.85703 | 0.830964 | 0.771722 | 0.749605 | 0.602686 | 0 | 0.016202 | 0.149862 | 1,815 | 54 | 83 | 33.611111 | 0.804277 | 0.03416 | 0 | 0.411765 | 0 | 0 | 0.017192 | 0 | 0 | 0 | 0 | 0 | 0.205882 | 1 | 0.235294 | false | 0 | 0.088235 | 0.029412 | 0.352941 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
fb657c7e86e39d40f78cec72985a12c8b582982c | 210 | py | Python | chainercv/utils/iterator/__init__.py | iory/chainercv | ecb1953f78c526dfd38308d68a4094c9f4df3a8d | [
"MIT"
] | 1,600 | 2017-06-01T15:37:52.000Z | 2022-03-09T08:39:09.000Z | chainercv/utils/iterator/__init__.py | iory/chainercv | ecb1953f78c526dfd38308d68a4094c9f4df3a8d | [
"MIT"
] | 547 | 2017-06-01T06:43:16.000Z | 2021-05-28T17:14:05.000Z | chainercv/utils/iterator/__init__.py | iory/chainercv | ecb1953f78c526dfd38308d68a4094c9f4df3a8d | [
"MIT"
] | 376 | 2017-06-02T01:29:10.000Z | 2022-03-13T11:19:59.000Z | from chainercv.utils.iterator.apply_to_iterator import apply_to_iterator # NOQA
from chainercv.utils.iterator.progress_hook import ProgressHook # NOQA
from chainercv.utils.iterator.unzip import unzip # NOQA
| 52.5 | 80 | 0.842857 | 29 | 210 | 5.931034 | 0.413793 | 0.226744 | 0.313953 | 0.453488 | 0.348837 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 210 | 3 | 81 | 70 | 0.910053 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
fb6654ce7d939db7ba6d4d2c2e4bd04528f16d42 | 9,755 | py | Python | tests/components/mold_indicator/test_sensor.py | zalke/home-assistant | a31e49c857722c0723dc5297cd83cbce0f8716f6 | [
"Apache-2.0"
] | 4 | 2019-07-03T22:36:57.000Z | 2019-08-10T15:33:25.000Z | tests/components/mold_indicator/test_sensor.py | zalke/home-assistant | a31e49c857722c0723dc5297cd83cbce0f8716f6 | [
"Apache-2.0"
] | 7 | 2019-08-23T05:26:02.000Z | 2022-03-11T23:57:18.000Z | tests/components/mold_indicator/test_sensor.py | zalke/home-assistant | a31e49c857722c0723dc5297cd83cbce0f8716f6 | [
"Apache-2.0"
] | 3 | 2016-08-26T12:32:49.000Z | 2020-02-26T21:01:35.000Z | """The tests for the MoldIndicator sensor."""
import unittest
from homeassistant.setup import setup_component
import homeassistant.components.sensor as sensor
from homeassistant.components.mold_indicator.sensor import (ATTR_DEWPOINT,
ATTR_CRITICAL_TEMP)
from homeassistant.const import (
ATTR_UNIT_OF_MEASUREMENT, STATE_UNKNOWN, TEMP_CELSIUS)
from tests.common import get_test_home_assistant
class TestSensorMoldIndicator(unittest.TestCase):
"""Test the MoldIndicator sensor."""
def setUp(self):
"""Set up things to be run when tests are started."""
self.hass = get_test_home_assistant()
self.hass.states.set('test.indoortemp', '20',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.outdoortemp', '10',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.indoorhumidity', '50',
{ATTR_UNIT_OF_MEASUREMENT: '%'})
def tearDown(self):
"""Stop down everything that was started."""
self.hass.stop()
def test_setup(self):
"""Test the mold indicator sensor setup."""
assert setup_component(self.hass, sensor.DOMAIN, {
'sensor': {
'platform': 'mold_indicator',
'indoor_temp_sensor': 'test.indoortemp',
'outdoor_temp_sensor': 'test.outdoortemp',
'indoor_humidity_sensor': 'test.indoorhumidity',
'calibration_factor': 2.0
}
})
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert '%' == moldind.attributes.get('unit_of_measurement')
def test_invalidcalib(self):
"""Test invalid sensor values."""
self.hass.states.set('test.indoortemp', '10',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.outdoortemp', '10',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.indoorhumidity', '0',
{ATTR_UNIT_OF_MEASUREMENT: '%'})
assert setup_component(self.hass, sensor.DOMAIN, {
'sensor': {
'platform': 'mold_indicator',
'indoor_temp_sensor': 'test.indoortemp',
'outdoor_temp_sensor': 'test.outdoortemp',
'indoor_humidity_sensor': 'test.indoorhumidity',
'calibration_factor': 0
}
})
self.hass.start()
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == 'unavailable'
assert moldind.attributes.get(ATTR_DEWPOINT) is None
assert moldind.attributes.get(ATTR_CRITICAL_TEMP) is None
def test_invalidhum(self):
"""Test invalid sensor values."""
self.hass.states.set('test.indoortemp', '10',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.outdoortemp', '10',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.indoorhumidity', '-1',
{ATTR_UNIT_OF_MEASUREMENT: '%'})
assert setup_component(self.hass, sensor.DOMAIN, {
'sensor': {
'platform': 'mold_indicator',
'indoor_temp_sensor': 'test.indoortemp',
'outdoor_temp_sensor': 'test.outdoortemp',
'indoor_humidity_sensor': 'test.indoorhumidity',
'calibration_factor': 2.0
}
})
self.hass.start()
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == 'unavailable'
assert moldind.attributes.get(ATTR_DEWPOINT) is None
assert moldind.attributes.get(ATTR_CRITICAL_TEMP) is None
self.hass.states.set('test.indoorhumidity', 'A',
{ATTR_UNIT_OF_MEASUREMENT: '%'})
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == 'unavailable'
assert moldind.attributes.get(ATTR_DEWPOINT) is None
assert moldind.attributes.get(ATTR_CRITICAL_TEMP) is None
self.hass.states.set('test.indoorhumidity', '10',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == 'unavailable'
assert moldind.attributes.get(ATTR_DEWPOINT) is None
assert moldind.attributes.get(ATTR_CRITICAL_TEMP) is None
def test_calculation(self):
"""Test the mold indicator internal calculations."""
assert setup_component(self.hass, sensor.DOMAIN, {
'sensor': {
'platform': 'mold_indicator',
'indoor_temp_sensor': 'test.indoortemp',
'outdoor_temp_sensor': 'test.outdoortemp',
'indoor_humidity_sensor': 'test.indoorhumidity',
'calibration_factor': 2.0
}
})
self.hass.start()
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
# assert dewpoint
dewpoint = moldind.attributes.get(ATTR_DEWPOINT)
assert dewpoint
assert dewpoint > 9.25
assert dewpoint < 9.26
# assert temperature estimation
esttemp = moldind.attributes.get(ATTR_CRITICAL_TEMP)
assert esttemp
assert esttemp > 14.9
assert esttemp < 15.1
# assert mold indicator value
state = moldind.state
assert state
assert state == '68'
def test_unknown_sensor(self):
"""Test the sensor_changed function."""
assert setup_component(self.hass, sensor.DOMAIN, {
'sensor': {
'platform': 'mold_indicator',
'indoor_temp_sensor': 'test.indoortemp',
'outdoor_temp_sensor': 'test.outdoortemp',
'indoor_humidity_sensor': 'test.indoorhumidity',
'calibration_factor': 2.0
}
})
self.hass.start()
self.hass.states.set('test.indoortemp', STATE_UNKNOWN,
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == 'unavailable'
assert moldind.attributes.get(ATTR_DEWPOINT) is None
assert moldind.attributes.get(ATTR_CRITICAL_TEMP) is None
self.hass.states.set('test.indoortemp', '30',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.outdoortemp', STATE_UNKNOWN,
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == 'unavailable'
assert moldind.attributes.get(ATTR_DEWPOINT) is None
assert moldind.attributes.get(ATTR_CRITICAL_TEMP) is None
self.hass.states.set('test.outdoortemp', '25',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.states.set('test.indoorhumidity', STATE_UNKNOWN,
{ATTR_UNIT_OF_MEASUREMENT: '%'})
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == 'unavailable'
assert moldind.attributes.get(ATTR_DEWPOINT) is None
assert moldind.attributes.get(ATTR_CRITICAL_TEMP) is None
self.hass.states.set('test.indoorhumidity', '20',
{ATTR_UNIT_OF_MEASUREMENT: '%'})
self.hass.block_till_done()
moldind = self.hass.states.get('sensor.mold_indicator')
assert moldind
assert moldind.state == '23'
dewpoint = moldind.attributes.get(ATTR_DEWPOINT)
assert dewpoint
assert dewpoint > 4.58
assert dewpoint < 4.59
esttemp = moldind.attributes.get(ATTR_CRITICAL_TEMP)
assert esttemp
assert esttemp == 27.5
def test_sensor_changed(self):
"""Test the sensor_changed function."""
assert setup_component(self.hass, sensor.DOMAIN, {
'sensor': {
'platform': 'mold_indicator',
'indoor_temp_sensor': 'test.indoortemp',
'outdoor_temp_sensor': 'test.outdoortemp',
'indoor_humidity_sensor': 'test.indoorhumidity',
'calibration_factor': 2.0
}
})
self.hass.start()
self.hass.states.set('test.indoortemp', '30',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.block_till_done()
assert self.hass.states.get('sensor.mold_indicator').state == '90'
self.hass.states.set('test.outdoortemp', '25',
{ATTR_UNIT_OF_MEASUREMENT: TEMP_CELSIUS})
self.hass.block_till_done()
assert self.hass.states.get('sensor.mold_indicator').state == '57'
self.hass.states.set('test.indoorhumidity', '20',
{ATTR_UNIT_OF_MEASUREMENT: '%'})
self.hass.block_till_done()
assert self.hass.states.get('sensor.mold_indicator').state == '23'
| 40.987395 | 79 | 0.598155 | 1,017 | 9,755 | 5.519174 | 0.106195 | 0.082665 | 0.082309 | 0.078568 | 0.843221 | 0.834313 | 0.826652 | 0.826652 | 0.826652 | 0.826296 | 0 | 0.010461 | 0.294413 | 9,755 | 237 | 80 | 41.160338 | 0.805027 | 0.045208 | 0 | 0.742268 | 0 | 0 | 0.182162 | 0.043732 | 0 | 0 | 0 | 0 | 0.283505 | 1 | 0.041237 | false | 0 | 0.030928 | 0 | 0.07732 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fb7464f6c262a2584fa2cfa12dc5f55c05616df4 | 166 | py | Python | mqdq/__init__.py | bnagy/mqdq-parser | c477a925ce1f1fa08ff5a152d1b186a58f557c17 | [
"BSD-3-Clause"
] | 2 | 2019-10-11T13:34:05.000Z | 2022-01-17T09:30:29.000Z | mqdq/__init__.py | bnagy/mqdq-parser | c477a925ce1f1fa08ff5a152d1b186a58f557c17 | [
"BSD-3-Clause"
] | 1 | 2021-01-19T13:24:21.000Z | 2021-01-20T00:57:23.000Z | mqdq/__init__.py | bnagy/mqdq-parser | c477a925ce1f1fa08ff5a152d1b186a58f557c17 | [
"BSD-3-Clause"
] | null | null | null | from . import line_analyzer
from . import mahalanobis
from . import utils
from . import rhyme
from . import rhyme_classes
from . import cltk_hax
from . import babble
| 20.75 | 27 | 0.789157 | 24 | 166 | 5.333333 | 0.458333 | 0.546875 | 0.234375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168675 | 166 | 7 | 28 | 23.714286 | 0.927536 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
fbcd900a762cd4612bb7e5e35d9ace14621a4cd0 | 51,082 | py | Python | sdk/lusid/api/corporate_action_sources_api.py | mneedham/lusid-sdk-python-preview | f4494009d1a2f3431d931c813cab679bdbd92c84 | [
"MIT"
] | null | null | null | sdk/lusid/api/corporate_action_sources_api.py | mneedham/lusid-sdk-python-preview | f4494009d1a2f3431d931c813cab679bdbd92c84 | [
"MIT"
] | null | null | null | sdk/lusid/api/corporate_action_sources_api.py | mneedham/lusid-sdk-python-preview | f4494009d1a2f3431d931c813cab679bdbd92c84 | [
"MIT"
] | null | null | null | # coding: utf-8
"""
LUSID API
FINBOURNE Technology # noqa: E501
The version of the OpenAPI document: 0.11.3192
Contact: info@finbourne.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from lusid.api_client import ApiClient
from lusid.exceptions import (
ApiTypeError,
ApiValueError
)
class CorporateActionSourcesApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def batch_upsert_corporate_actions(self, scope, code, **kwargs): # noqa: E501
"""[BETA] Upsert corporate actions # noqa: E501
Create or update one or more corporate actions in a particular corporate action source. Failures are identified in the body of the response. If a corporate action is upserted at exactly the same effective datetime as a transaction for the same instrument, the corporate action takes precedence. Depending on the nature of the corporate action, this may mean it affects the transaction. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.batch_upsert_corporate_actions(scope, code, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of corporate action source (required)
:param str code: The code of the corporate action source (required)
:param list[UpsertCorporateActionRequest] upsert_corporate_action_request: The corporate action definitions
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: UpsertCorporateActionsResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.batch_upsert_corporate_actions_with_http_info(scope, code, **kwargs) # noqa: E501
def batch_upsert_corporate_actions_with_http_info(self, scope, code, **kwargs): # noqa: E501
"""[BETA] Upsert corporate actions # noqa: E501
Create or update one or more corporate actions in a particular corporate action source. Failures are identified in the body of the response. If a corporate action is upserted at exactly the same effective datetime as a transaction for the same instrument, the corporate action takes precedence. Depending on the nature of the corporate action, this may mean it affects the transaction. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.batch_upsert_corporate_actions_with_http_info(scope, code, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of corporate action source (required)
:param str code: The code of the corporate action source (required)
:param list[UpsertCorporateActionRequest] upsert_corporate_action_request: The corporate action definitions
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(UpsertCorporateActionsResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['scope', 'code', 'upsert_corporate_action_request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method batch_upsert_corporate_actions" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
if ('scope' in local_var_params and
len(local_var_params['scope']) > 64):
raise ApiValueError("Invalid value for parameter `scope` when calling `batch_upsert_corporate_actions`, length must be less than or equal to `64`") # noqa: E501
if ('scope' in local_var_params and
len(local_var_params['scope']) < 1):
raise ApiValueError("Invalid value for parameter `scope` when calling `batch_upsert_corporate_actions`, length must be greater than or equal to `1`") # noqa: E501
if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']): # noqa: E501
raise ApiValueError("Invalid value for parameter `scope` when calling `batch_upsert_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) > 64):
raise ApiValueError("Invalid value for parameter `code` when calling `batch_upsert_corporate_actions`, length must be less than or equal to `64`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) < 1):
raise ApiValueError("Invalid value for parameter `code` when calling `batch_upsert_corporate_actions`, length must be greater than or equal to `1`") # noqa: E501
if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']): # noqa: E501
raise ApiValueError("Invalid value for parameter `code` when calling `batch_upsert_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'scope' in local_var_params:
path_params['scope'] = local_var_params['scope'] # noqa: E501
if 'code' in local_var_params:
path_params['code'] = local_var_params['code'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'upsert_corporate_action_request' in local_var_params:
body_params = local_var_params['upsert_corporate_action_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain', 'application/json', 'text/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json-patch+json', 'application/json', 'text/json', 'application/*+json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
# set the LUSID header
header_params['X-LUSID-SDK-Language'] = 'Python'
header_params['X-LUSID-SDK-Version'] = '0.11.3192'
return self.api_client.call_api(
'/api/corporateactionsources/{scope}/{code}/corporateactions', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='UpsertCorporateActionsResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def create_corporate_action_source(self, create_corporate_action_source_request, **kwargs): # noqa: E501
"""[BETA] Create corporate action source # noqa: E501
Create a corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_corporate_action_source(create_corporate_action_source_request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param CreateCorporateActionSourceRequest create_corporate_action_source_request: The corporate action source definition (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: CorporateActionSource
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.create_corporate_action_source_with_http_info(create_corporate_action_source_request, **kwargs) # noqa: E501
def create_corporate_action_source_with_http_info(self, create_corporate_action_source_request, **kwargs): # noqa: E501
"""[BETA] Create corporate action source # noqa: E501
Create a corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_corporate_action_source_with_http_info(create_corporate_action_source_request, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param CreateCorporateActionSourceRequest create_corporate_action_source_request: The corporate action source definition (required)
:param _return_http_data_only: response data without the HTTP status
code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(CorporateActionSource, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['create_corporate_action_source_request'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_corporate_action_source" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'create_corporate_action_source_request' is set
if ('create_corporate_action_source_request' not in local_var_params or
local_var_params['create_corporate_action_source_request'] is None):
raise ApiValueError("Missing the required parameter `create_corporate_action_source_request` when calling `create_corporate_action_source`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'create_corporate_action_source_request' in local_var_params:
body_params = local_var_params['create_corporate_action_source_request']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain', 'application/json', 'text/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json-patch+json', 'application/json', 'text/json', 'application/*+json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
# set the LUSID header
header_params['X-LUSID-SDK-Language'] = 'Python'
header_params['X-LUSID-SDK-Version'] = '0.11.3192'
return self.api_client.call_api(
'/api/corporateactionsources', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CorporateActionSource', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_corporate_action_source(self, scope, code, **kwargs): # noqa: E501
"""[BETA] Delete a corporate action source # noqa: E501
Deletes a single corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_corporate_action_source(scope, code, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of the corporate action source to be deleted (required)
:param str code: The code of the corporate action source to be deleted (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: DeletedEntityResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.delete_corporate_action_source_with_http_info(scope, code, **kwargs) # noqa: E501
def delete_corporate_action_source_with_http_info(self, scope, code, **kwargs): # noqa: E501
"""[BETA] Delete a corporate action source # noqa: E501
Deletes a single corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_corporate_action_source_with_http_info(scope, code, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of the corporate action source to be deleted (required)
:param str code: The code of the corporate action source to be deleted (required)
:param _return_http_data_only: response data without the HTTP status
code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(DeletedEntityResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['scope', 'code'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_corporate_action_source" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
if ('scope' in local_var_params and
len(local_var_params['scope']) > 64):
raise ApiValueError("Invalid value for parameter `scope` when calling `delete_corporate_action_source`, length must be less than or equal to `64`") # noqa: E501
if ('scope' in local_var_params and
len(local_var_params['scope']) < 1):
raise ApiValueError("Invalid value for parameter `scope` when calling `delete_corporate_action_source`, length must be greater than or equal to `1`") # noqa: E501
if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']): # noqa: E501
raise ApiValueError("Invalid value for parameter `scope` when calling `delete_corporate_action_source`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) > 64):
raise ApiValueError("Invalid value for parameter `code` when calling `delete_corporate_action_source`, length must be less than or equal to `64`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) < 1):
raise ApiValueError("Invalid value for parameter `code` when calling `delete_corporate_action_source`, length must be greater than or equal to `1`") # noqa: E501
if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']): # noqa: E501
raise ApiValueError("Invalid value for parameter `code` when calling `delete_corporate_action_source`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'scope' in local_var_params:
path_params['scope'] = local_var_params['scope'] # noqa: E501
if 'code' in local_var_params:
path_params['code'] = local_var_params['code'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain', 'application/json', 'text/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
# set the LUSID header
header_params['X-LUSID-SDK-Language'] = 'Python'
header_params['X-LUSID-SDK-Version'] = '0.11.3192'
return self.api_client.call_api(
'/api/corporateactionsources/{scope}/{code}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeletedEntityResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_corporate_actions(self, scope, code, corporate_action_ids, **kwargs): # noqa: E501
"""[EXPERIMENTAL] Delete corporate actions # noqa: E501
Delete one or more corporate actions from a particular corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_corporate_actions(scope, code, corporate_action_ids, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of the corporate action source (required)
:param str code: The code of the corporate action source (required)
:param list[str] corporate_action_ids: The IDs of the corporate actions to delete (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: DeletedEntityResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.delete_corporate_actions_with_http_info(scope, code, corporate_action_ids, **kwargs) # noqa: E501
def delete_corporate_actions_with_http_info(self, scope, code, corporate_action_ids, **kwargs): # noqa: E501
"""[EXPERIMENTAL] Delete corporate actions # noqa: E501
Delete one or more corporate actions from a particular corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_corporate_actions_with_http_info(scope, code, corporate_action_ids, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of the corporate action source (required)
:param str code: The code of the corporate action source (required)
:param list[str] corporate_action_ids: The IDs of the corporate actions to delete (required)
:param _return_http_data_only: response data without the HTTP status
code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(DeletedEntityResponse, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['scope', 'code', 'corporate_action_ids'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_corporate_actions" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'corporate_action_ids' is set
if ('corporate_action_ids' not in local_var_params or
local_var_params['corporate_action_ids'] is None):
raise ApiValueError("Missing the required parameter `corporate_action_ids` when calling `delete_corporate_actions`") # noqa: E501
if ('scope' in local_var_params and
len(local_var_params['scope']) > 64):
raise ApiValueError("Invalid value for parameter `scope` when calling `delete_corporate_actions`, length must be less than or equal to `64`") # noqa: E501
if ('scope' in local_var_params and
len(local_var_params['scope']) < 1):
raise ApiValueError("Invalid value for parameter `scope` when calling `delete_corporate_actions`, length must be greater than or equal to `1`") # noqa: E501
if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']): # noqa: E501
raise ApiValueError("Invalid value for parameter `scope` when calling `delete_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) > 64):
raise ApiValueError("Invalid value for parameter `code` when calling `delete_corporate_actions`, length must be less than or equal to `64`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) < 1):
raise ApiValueError("Invalid value for parameter `code` when calling `delete_corporate_actions`, length must be greater than or equal to `1`") # noqa: E501
if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']): # noqa: E501
raise ApiValueError("Invalid value for parameter `code` when calling `delete_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'scope' in local_var_params:
path_params['scope'] = local_var_params['scope'] # noqa: E501
if 'code' in local_var_params:
path_params['code'] = local_var_params['code'] # noqa: E501
query_params = []
if 'corporate_action_ids' in local_var_params:
query_params.append(('corporateActionIds', local_var_params['corporate_action_ids'])) # noqa: E501
collection_formats['corporateActionIds'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain', 'application/json', 'text/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
# set the LUSID header
header_params['X-LUSID-SDK-Language'] = 'Python'
header_params['X-LUSID-SDK-Version'] = '0.11.3192'
return self.api_client.call_api(
'/api/corporateactionsources/{scope}/{code}/corporateactions', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DeletedEntityResponse', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_corporate_actions(self, scope, code, **kwargs): # noqa: E501
"""[BETA] Get corporate actions # noqa: E501
Get corporate actions from a particular corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_corporate_actions(scope, code, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of the corporate action source. (required)
:param str code: The code of the corporate action source. (required)
:param str from_effective_at: Optional. The start effective date of the data range.
:param str to_effective_at: Optional. The end effective date of the data range.
:param datetime as_at: Optional. The AsAt date of the data.
:param list[str] sort_by: Optional. Order the results by these fields. Use the '-' sign to denote descending order, e.g. -MyFieldName
:param int limit: Optional. When paginating, limit the results to this number.
:param str filter: Optional. Expression to filter the result set. For example, to filter on the Announcement Date, use \"announcementDate eq '2020-03-06'\". Read more about filtering results from LUSID here: https://support.lusid.com/filtering-results-from-lusid.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: ResourceListOfCorporateAction
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_corporate_actions_with_http_info(scope, code, **kwargs) # noqa: E501
def get_corporate_actions_with_http_info(self, scope, code, **kwargs): # noqa: E501
"""[BETA] Get corporate actions # noqa: E501
Get corporate actions from a particular corporate action source. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_corporate_actions_with_http_info(scope, code, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str scope: The scope of the corporate action source. (required)
:param str code: The code of the corporate action source. (required)
:param str from_effective_at: Optional. The start effective date of the data range.
:param str to_effective_at: Optional. The end effective date of the data range.
:param datetime as_at: Optional. The AsAt date of the data.
:param list[str] sort_by: Optional. Order the results by these fields. Use the '-' sign to denote descending order, e.g. -MyFieldName
:param int limit: Optional. When paginating, limit the results to this number.
:param str filter: Optional. Expression to filter the result set. For example, to filter on the Announcement Date, use \"announcementDate eq '2020-03-06'\". Read more about filtering results from LUSID here: https://support.lusid.com/filtering-results-from-lusid.
:param _return_http_data_only: response data without the HTTP status
code and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(ResourceListOfCorporateAction, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['scope', 'code', 'from_effective_at', 'to_effective_at', 'as_at', 'sort_by', 'limit', 'filter'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_corporate_actions" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
if ('scope' in local_var_params and
len(local_var_params['scope']) > 64):
raise ApiValueError("Invalid value for parameter `scope` when calling `get_corporate_actions`, length must be less than or equal to `64`") # noqa: E501
if ('scope' in local_var_params and
len(local_var_params['scope']) < 1):
raise ApiValueError("Invalid value for parameter `scope` when calling `get_corporate_actions`, length must be greater than or equal to `1`") # noqa: E501
if 'scope' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['scope']): # noqa: E501
raise ApiValueError("Invalid value for parameter `scope` when calling `get_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) > 64):
raise ApiValueError("Invalid value for parameter `code` when calling `get_corporate_actions`, length must be less than or equal to `64`") # noqa: E501
if ('code' in local_var_params and
len(local_var_params['code']) < 1):
raise ApiValueError("Invalid value for parameter `code` when calling `get_corporate_actions`, length must be greater than or equal to `1`") # noqa: E501
if 'code' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_]+$', local_var_params['code']): # noqa: E501
raise ApiValueError("Invalid value for parameter `code` when calling `get_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_]+$/`") # noqa: E501
if ('from_effective_at' in local_var_params and
len(local_var_params['from_effective_at']) > 256):
raise ApiValueError("Invalid value for parameter `from_effective_at` when calling `get_corporate_actions`, length must be less than or equal to `256`") # noqa: E501
if ('from_effective_at' in local_var_params and
len(local_var_params['from_effective_at']) < 0):
raise ApiValueError("Invalid value for parameter `from_effective_at` when calling `get_corporate_actions`, length must be greater than or equal to `0`") # noqa: E501
if 'from_effective_at' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_\+:\.]+$', local_var_params['from_effective_at']): # noqa: E501
raise ApiValueError("Invalid value for parameter `from_effective_at` when calling `get_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_\+:\.]+$/`") # noqa: E501
if ('to_effective_at' in local_var_params and
len(local_var_params['to_effective_at']) > 256):
raise ApiValueError("Invalid value for parameter `to_effective_at` when calling `get_corporate_actions`, length must be less than or equal to `256`") # noqa: E501
if ('to_effective_at' in local_var_params and
len(local_var_params['to_effective_at']) < 0):
raise ApiValueError("Invalid value for parameter `to_effective_at` when calling `get_corporate_actions`, length must be greater than or equal to `0`") # noqa: E501
if 'to_effective_at' in local_var_params and not re.search(r'^[a-zA-Z0-9\-_\+:\.]+$', local_var_params['to_effective_at']): # noqa: E501
raise ApiValueError("Invalid value for parameter `to_effective_at` when calling `get_corporate_actions`, must conform to the pattern `/^[a-zA-Z0-9\-_\+:\.]+$/`") # noqa: E501
if ('filter' in local_var_params and
len(local_var_params['filter']) > 2147483647):
raise ApiValueError("Invalid value for parameter `filter` when calling `get_corporate_actions`, length must be less than or equal to `2147483647`") # noqa: E501
if ('filter' in local_var_params and
len(local_var_params['filter']) < 0):
raise ApiValueError("Invalid value for parameter `filter` when calling `get_corporate_actions`, length must be greater than or equal to `0`") # noqa: E501
if 'filter' in local_var_params and not re.search(r'^[\s\S]*$', local_var_params['filter']): # noqa: E501
raise ApiValueError("Invalid value for parameter `filter` when calling `get_corporate_actions`, must conform to the pattern `/^[\s\S]*$/`") # noqa: E501
collection_formats = {}
path_params = {}
if 'scope' in local_var_params:
path_params['scope'] = local_var_params['scope'] # noqa: E501
if 'code' in local_var_params:
path_params['code'] = local_var_params['code'] # noqa: E501
query_params = []
if 'from_effective_at' in local_var_params:
query_params.append(('fromEffectiveAt', local_var_params['from_effective_at'])) # noqa: E501
if 'to_effective_at' in local_var_params:
query_params.append(('toEffectiveAt', local_var_params['to_effective_at'])) # noqa: E501
if 'as_at' in local_var_params:
query_params.append(('asAt', local_var_params['as_at'])) # noqa: E501
if 'sort_by' in local_var_params:
query_params.append(('sortBy', local_var_params['sort_by'])) # noqa: E501
collection_formats['sortBy'] = 'multi' # noqa: E501
if 'limit' in local_var_params:
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'filter' in local_var_params:
query_params.append(('filter', local_var_params['filter'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain', 'application/json', 'text/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
# set the LUSID header
header_params['X-LUSID-SDK-Language'] = 'Python'
header_params['X-LUSID-SDK-Version'] = '0.11.3192'
return self.api_client.call_api(
'/api/corporateactionsources/{scope}/{code}/corporateactions', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ResourceListOfCorporateAction', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def list_corporate_action_sources(self, **kwargs): # noqa: E501
"""[BETA] List corporate action sources # noqa: E501
Gets a list of all corporate action sources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_corporate_action_sources(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param datetime as_at: Optional. The AsAt date of the data.
:param list[str] sort_by: Optional. Order the results by these fields. Use the '-' sign to denote descending order, e.g. -MyFieldName
:param int limit: Optional. When paginating, limit the number of returned results to this many. If not specified, a default of 100 is used.
:param str filter: Optional. Expression to filter the result set. For example, to filter on the Display Name, use \"displayName eq 'string'\". Read more about filtering results from LUSID here: https://support.lusid.com/filtering-results-from-lusid.
:param str page: Optional. The pagination token to use to continue listing items from a previous call. Page values are returned from list calls, and must be supplied exactly as returned. Additionally, when specifying this value, the filter, asAt, and limit must not be modified.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: PagedResourceListOfCorporateActionSource
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.list_corporate_action_sources_with_http_info(**kwargs) # noqa: E501
def list_corporate_action_sources_with_http_info(self, **kwargs): # noqa: E501
"""[BETA] List corporate action sources # noqa: E501
Gets a list of all corporate action sources # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_corporate_action_sources_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param datetime as_at: Optional. The AsAt date of the data
:param list[str] sort_by: Optional. Order the results by these fields. Use the '-' sign to denote descending order e.g. -MyFieldName
:param int limit: Optional. When paginating, limit the number of returned results to this many. If not specified, a default of 100 is used.
:param str filter: Optional. Expression to filter the result set. For example, to filter on the Display Name, use \"displayName eq 'string'\" Read more about filtering results from LUSID here https://support.lusid.com/filtering-results-from-lusid.
:param str page: Optional. The pagination token to use to continue listing items from a previous call. Page values are returned from list calls, and must be supplied exactly as returned. Additionally, when specifying this value, the filter, asAt, and limit must not be modified.
:param _return_http_data_only: response data without head status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(PagedResourceListOfCorporateActionSource, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = ['as_at', 'sort_by', 'limit', 'filter', 'page'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_corporate_action_sources" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
if ('filter' in local_var_params and
len(local_var_params['filter']) > 2147483647):
raise ApiValueError("Invalid value for parameter `filter` when calling `list_corporate_action_sources`, length must be less than or equal to `2147483647`") # noqa: E501
if ('filter' in local_var_params and
len(local_var_params['filter']) < 0):
raise ApiValueError("Invalid value for parameter `filter` when calling `list_corporate_action_sources`, length must be greater than or equal to `0`") # noqa: E501
if 'filter' in local_var_params and not re.search(r'^[\s\S]*$', local_var_params['filter']): # noqa: E501
raise ApiValueError("Invalid value for parameter `filter` when calling `list_corporate_action_sources`, must conform to the pattern `/^[\s\S]*$/`") # noqa: E501
if ('page' in local_var_params and
len(local_var_params['page']) > 500):
raise ApiValueError("Invalid value for parameter `page` when calling `list_corporate_action_sources`, length must be less than or equal to `500`") # noqa: E501
if ('page' in local_var_params and
len(local_var_params['page']) < 1):
raise ApiValueError("Invalid value for parameter `page` when calling `list_corporate_action_sources`, length must be greater than or equal to `1`") # noqa: E501
if 'page' in local_var_params and not re.search(r'^[a-zA-Z0-9\+\/]*={0,3}$', local_var_params['page']): # noqa: E501
raise ApiValueError("Invalid value for parameter `page` when calling `list_corporate_action_sources`, must conform to the pattern `/^[a-zA-Z0-9\+\/]*={0,3}$/`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'as_at' in local_var_params:
query_params.append(('asAt', local_var_params['as_at'])) # noqa: E501
if 'sort_by' in local_var_params:
query_params.append(('sortBy', local_var_params['sort_by'])) # noqa: E501
collection_formats['sortBy'] = 'multi' # noqa: E501
if 'limit' in local_var_params:
query_params.append(('limit', local_var_params['limit'])) # noqa: E501
if 'filter' in local_var_params:
query_params.append(('filter', local_var_params['filter'])) # noqa: E501
if 'page' in local_var_params:
query_params.append(('page', local_var_params['page'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['text/plain', 'application/json', 'text/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
# set the LUSID header
header_params['X-LUSID-SDK-Language'] = 'Python'
header_params['X-LUSID-SDK-Version'] = '0.11.3192'
return self.api_client.call_api(
'/api/corporateactionsources', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PagedResourceListOfCorporateActionSource', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
| 58.179954 | 422 | 0.643005 | 6,237 | 51,082 | 5.04826 | 0.049062 | 0.047259 | 0.077368 | 0.032014 | 0.961793 | 0.956997 | 0.9475 | 0.94064 | 0.925554 | 0.917741 | 0 | 0.019378 | 0.271622 | 51,082 | 877 | 423 | 58.246294 | 0.826861 | 0.406405 | 0 | 0.716216 | 1 | 0.087838 | 0.334978 | 0.102922 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029279 | false | 0 | 0.011261 | 0 | 0.06982 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
83f57613cda49d061965dac1f60b8bc7c31f9ed1 | 210 | py | Python | codes_/0459_Repeated_Substring_Pattern.py | SaitoTsutomu/leetcode | 4656d66ab721a5c7bc59890db9a2331c6823b2bf | [
"MIT"
] | null | null | null | codes_/0459_Repeated_Substring_Pattern.py | SaitoTsutomu/leetcode | 4656d66ab721a5c7bc59890db9a2331c6823b2bf | [
"MIT"
] | null | null | null | codes_/0459_Repeated_Substring_Pattern.py | SaitoTsutomu/leetcode | 4656d66ab721a5c7bc59890db9a2331c6823b2bf | [
"MIT"
] | null | null | null | # %% [459. Repeated Substring Pattern](https://leetcode.com/problems/repeated-substring-pattern/)
class Solution:
def repeatedSubstringPattern(self, s: str) -> bool:
return re.match(r"(.+)\1+$", s)
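A quick sanity check of the regex approach above (standalone sketch; the class and method names mirror the snippet, and `import re` is assumed since the original omits it):

```python
import re


def repeated_substring_pattern(s: str) -> bool:
    # Backreference \1 forces the first capture to repeat until end of string.
    return bool(re.match(r"(.+)\1+$", s))


# "abab" is "ab" repeated; "aba" has no repeating unit.
print(repeated_substring_pattern("abab"))  # True
print(repeated_substring_pattern("aba"))   # False
```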
| 42 | 97 | 0.680952 | 25 | 210 | 5.72 | 0.84 | 0.237762 | 0.335664 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022099 | 0.138095 | 210 | 4 | 98 | 52.5 | 0.767956 | 0.452381 | 0 | 0 | 0 | 0 | 0.070796 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
f7f083696b538c5236c621c57ce8917e258014c2 | 1,067 | py | Python | venv/lib/python2.7/abc.py | sunlum/Deep-Semantic-Space-NST | 468ac2590385f48e65df12c1a3c9db0ed8d49477 | [
"MIT"
] | null | null | null | venv/lib/python2.7/abc.py | sunlum/Deep-Semantic-Space-NST | 468ac2590385f48e65df12c1a3c9db0ed8d49477 | [
"MIT"
] | null | null | null | venv/lib/python2.7/abc.py | sunlum/Deep-Semantic-Space-NST | 468ac2590385f48e65df12c1a3c9db0ed8d49477 | [
"MIT"
] | null | null | null | XSym
0031
a8ae748378938322ffe624347b368329
/anaconda2/lib/python2.7/abc.py
| 213.4 | 992 | 0.060918 | 9 | 1,067 | 7.222222 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.450704 | 0.933458 | 1,067 | 5 | 992 | 213.4 | 0.464789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
790ee4ce06810fca709eb06e736f55e192057ee7 | 162 | py | Python | tests/test_utils/__init__.py | matrig/genrl | 25eb018f18a9a1d0865c16e5233a2a7ccddbfd78 | [
"MIT"
] | 390 | 2020-05-03T17:34:02.000Z | 2022-03-05T11:29:07.000Z | tests/test_utils/__init__.py | matrig/genrl | 25eb018f18a9a1d0865c16e5233a2a7ccddbfd78 | [
"MIT"
] | 306 | 2020-05-03T05:53:53.000Z | 2022-03-12T00:27:28.000Z | tests/test_utils/__init__.py | matrig/genrl | 25eb018f18a9a1d0865c16e5233a2a7ccddbfd78 | [
"MIT"
] | 64 | 2020-05-05T20:23:30.000Z | 2022-03-30T08:43:10.000Z | from tests.test_utils.test_logger import TestLogger
from tests.test_utils.test_models import TestClassicalModel
from tests.test_utils.test_utils import TestUtils
| 40.5 | 59 | 0.888889 | 24 | 162 | 5.75 | 0.416667 | 0.26087 | 0.282609 | 0.391304 | 0.478261 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 162 | 3 | 60 | 54 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
7925c1bde737d22bed03ea693d3a57d442dce590 | 39 | py | Python | devel/lib/python3/dist-packages/my_custom_srv_msg_pkg/srv/__init__.py | euivmar/pruebasROS | 97954e87eb2400a42b997ed9b38f1ee2bf282b69 | [
"MIT"
] | null | null | null | devel/lib/python3/dist-packages/my_custom_srv_msg_pkg/srv/__init__.py | euivmar/pruebasROS | 97954e87eb2400a42b997ed9b38f1ee2bf282b69 | [
"MIT"
] | null | null | null | devel/lib/python3/dist-packages/my_custom_srv_msg_pkg/srv/__init__.py | euivmar/pruebasROS | 97954e87eb2400a42b997ed9b38f1ee2bf282b69 | [
"MIT"
] | null | null | null | from ._MyCustomServiceMessage import *
| 19.5 | 38 | 0.846154 | 3 | 39 | 10.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 39 | 1 | 39 | 39 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f73839ebce6f8d749f7e86c9380c3350213d6360 | 19,983 | py | Python | evolved5g/swagger_client/api/location_frontend_api.py | EVOLVED-5G/SDK-CLI | 0f289c7b21c14c3e349164d21cc78d9b6af0a237 | [
"Apache-2.0"
] | 3 | 2021-10-19T14:37:14.000Z | 2021-11-01T10:43:33.000Z | evolved5g/swagger_client/api/location_frontend_api.py | skolome/evolved5g_cli | b202a878befe22b8dda66ee05610408777f4f006 | [
"Apache-2.0"
] | 14 | 2021-11-02T10:30:56.000Z | 2022-03-10T11:30:59.000Z | evolved5g/swagger_client/api/location_frontend_api.py | skolome/evolved5g_cli | b202a878befe22b8dda66ee05610408777f4f006 | [
"Apache-2.0"
] | 1 | 2021-11-16T16:20:31.000Z | 2021-11-16T16:20:31.000Z | # coding: utf-8
"""
NEF_Emulator
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen) # noqa: E501
OpenAPI spec version: 0.1.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from evolved5g.swagger_client.api_client import ApiClient
class LocationFrontendApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_path_api_v1_frontend_location_post(self, body, **kwargs): # noqa: E501
"""Create Path # noqa: E501
Create new path. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_path_api_v1_frontend_location_post(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param PathCreate body: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_path_api_v1_frontend_location_post_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.create_path_api_v1_frontend_location_post_with_http_info(body, **kwargs) # noqa: E501
return data
def create_path_api_v1_frontend_location_post_with_http_info(self, body, **kwargs): # noqa: E501
"""Create Path # noqa: E501
Create new path. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_path_api_v1_frontend_location_post_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param PathCreate body: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_path_api_v1_frontend_location_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `create_path_api_v1_frontend_location_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['OAuth2PasswordBearer'] # noqa: E501
return self.api_client.call_api(
'/api/v1/frontend/location/', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Path', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_path_api_v1_frontend_location_id_delete(self, id, **kwargs): # noqa: E501
"""Delete Path # noqa: E501
Delete a path.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_path_api_v1_frontend_location_id_delete(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_path_api_v1_frontend_location_id_delete_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.delete_path_api_v1_frontend_location_id_delete_with_http_info(id, **kwargs) # noqa: E501
return data
def delete_path_api_v1_frontend_location_id_delete_with_http_info(self, id, **kwargs): # noqa: E501
"""Delete Path # noqa: E501
Delete a path.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_path_api_v1_frontend_location_id_delete_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_path_api_v1_frontend_location_id_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `delete_path_api_v1_frontend_location_id_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['OAuth2PasswordBearer'] # noqa: E501
return self.api_client.call_api(
'/api/v1/frontend/location/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Path', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def read_path_api_v1_frontend_location_id_get(self, id, **kwargs): # noqa: E501
"""Read Path # noqa: E501
Get path by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.read_path_api_v1_frontend_location_id_get(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.read_path_api_v1_frontend_location_id_get_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.read_path_api_v1_frontend_location_id_get_with_http_info(id, **kwargs) # noqa: E501
return data
def read_path_api_v1_frontend_location_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Read Path # noqa: E501
Get path by ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.read_path_api_v1_frontend_location_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method read_path_api_v1_frontend_location_id_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `read_path_api_v1_frontend_location_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['OAuth2PasswordBearer'] # noqa: E501
return self.api_client.call_api(
'/api/v1/frontend/location/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Path', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def read_paths_api_v1_frontend_location_get(self, **kwargs): # noqa: E501
"""Read Paths # noqa: E501
Retrieve paths. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.read_paths_api_v1_frontend_location_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int skip:
:param int limit:
:return: list[Path]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.read_paths_api_v1_frontend_location_get_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.read_paths_api_v1_frontend_location_get_with_http_info(**kwargs) # noqa: E501
return data
def read_paths_api_v1_frontend_location_get_with_http_info(self, **kwargs): # noqa: E501
"""Read Paths # noqa: E501
Retrieve paths. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.read_paths_api_v1_frontend_location_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int skip:
:param int limit:
:return: list[Path]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['skip', 'limit'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method read_paths_api_v1_frontend_location_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'skip' in params:
query_params.append(('skip', params['skip'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['OAuth2PasswordBearer'] # noqa: E501
return self.api_client.call_api(
'/api/v1/frontend/location/', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[Path]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_path_api_v1_frontend_location_id_put(self, body, id, **kwargs): # noqa: E501
"""Update Path # noqa: E501
Update a path.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_path_api_v1_frontend_location_id_put(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param PathUpdate body: (required)
:param int id: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_path_api_v1_frontend_location_id_put_with_http_info(body, id, **kwargs) # noqa: E501
else:
(data) = self.update_path_api_v1_frontend_location_id_put_with_http_info(body, id, **kwargs) # noqa: E501
return data
def update_path_api_v1_frontend_location_id_put_with_http_info(self, body, id, **kwargs): # noqa: E501
"""Update Path # noqa: E501
Update a path.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_path_api_v1_frontend_location_id_put_with_http_info(body, id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param PathUpdate body: (required)
:param int id: (required)
:return: Path
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body', 'id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_path_api_v1_frontend_location_id_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `update_path_api_v1_frontend_location_id_put`") # noqa: E501
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `update_path_api_v1_frontend_location_id_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['OAuth2PasswordBearer'] # noqa: E501
return self.api_client.call_api(
'/api/v1/frontend/location/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Path', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 38.062857 | 143 | 0.61172 | 2,366 | 19,983 | 4.862637 | 0.06847 | 0.052151 | 0.050847 | 0.082138 | 0.954542 | 0.949848 | 0.941156 | 0.919774 | 0.895089 | 0.885702 | 0 | 0.020315 | 0.297953 | 19,983 | 524 | 144 | 38.135496 | 0.799772 | 0.303858 | 0 | 0.794224 | 1 | 0 | 0.182704 | 0.070649 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039711 | false | 0.018051 | 0.01444 | 0 | 0.111913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
f758ced5a2bb541850827028d347ba02ae8f2e33 | 1,351 | py | Python | nicos_mlz/sans1/setups/pilz.py | ebadkamil/nicos | 0355a970d627aae170c93292f08f95759c97f3b5 | [
"CC-BY-3.0",
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | nicos_mlz/sans1/setups/pilz.py | ebadkamil/nicos | 0355a970d627aae170c93292f08f95759c97f3b5 | [
"CC-BY-3.0",
"Apache-2.0",
"CC-BY-4.0"
] | 1 | 2021-08-18T10:55:42.000Z | 2021-08-18T10:55:42.000Z | nicos_mlz/sans1/setups/pilz.py | ISISComputingGroup/nicos | 94cb4d172815919481f8c6ee686f21ebb76f2068 | [
"CC-BY-3.0",
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | description = 'Inputs from the Pilz control box'
group = 'lowlevel'
tango_base = 'tango://sans1hw.sans1.frm2:10000/sans1/modbus/'
# TODO: Mapping and description of the devices
devices = dict(
iatt1 = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'iatt1',
),
iatt2 = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'iatt2',
),
ishutter = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'ishutter',
),
door = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'door',
),
irc = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'irc',
),
eatt1 = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'eatt1',
),
eatt2 = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'eatt2',
),
eatt3 = device('nicos.devices.tango.NamedDigitalInput',
description = 'Unknown',
tangodevice = tango_base + 'eatt3',
),
)
| 31.418605 | 62 | 0.635085 | 123 | 1,351 | 6.902439 | 0.284553 | 0.095406 | 0.169611 | 0.216726 | 0.734982 | 0.734982 | 0.734982 | 0.734982 | 0.734982 | 0.734982 | 0 | 0.018393 | 0.235381 | 1,351 | 42 | 63 | 32.166667 | 0.803485 | 0.032568 | 0 | 0.432432 | 0 | 0 | 0.366284 | 0.262069 | 0 | 0 | 0 | 0.02381 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f764c17c2e33a44a7c94028d2e191b1722c3c319 | 28,338 | py | Python | sdk/python/pulumi_azure/devtest/global_vm_shutdown_schedule.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2018-06-18T00:19:44.000Z | 2022-02-20T05:32:57.000Z | sdk/python/pulumi_azure/devtest/global_vm_shutdown_schedule.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 663 | 2018-06-18T21:08:46.000Z | 2022-03-31T20:10:11.000Z | sdk/python/pulumi_azure/devtest/global_vm_shutdown_schedule.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 41 | 2018-07-19T22:37:38.000Z | 2022-03-14T10:56:26.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['GlobalVMShutdownScheduleArgs', 'GlobalVMShutdownSchedule']
@pulumi.input_type
class GlobalVMShutdownScheduleArgs:
def __init__(__self__, *,
daily_recurrence_time: pulumi.Input[str],
notification_settings: pulumi.Input['GlobalVMShutdownScheduleNotificationSettingsArgs'],
timezone: pulumi.Input[str],
virtual_machine_id: pulumi.Input[str],
enabled: Optional[pulumi.Input[bool]] = None,
location: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a GlobalVMShutdownSchedule resource.
:param pulumi.Input[str] daily_recurrence_time: The time each day when the schedule takes effect. Must match the format HHmm where HH is 00-23 and mm is 00-59 (e.g. 0930, 2300, etc.)
:param pulumi.Input[str] timezone: The time zone ID (e.g. Pacific Standard time). Refer to this guide for a [full list of accepted time zone names](https://jackstromberg.com/2017/01/list-of-time-zones-consumed-by-azure/).
:param pulumi.Input[str] virtual_machine_id: The resource ID of the target ARM-based Virtual Machine. Changing this forces a new resource to be created.
:param pulumi.Input[bool] enabled: Whether to enable the schedule. Possible values are `true` and `false`. Defaults to `true`.
:param pulumi.Input[str] location: The location where the schedule is created. Changing this forces a new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
pulumi.set(__self__, "daily_recurrence_time", daily_recurrence_time)
pulumi.set(__self__, "notification_settings", notification_settings)
pulumi.set(__self__, "timezone", timezone)
pulumi.set(__self__, "virtual_machine_id", virtual_machine_id)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if location is not None:
pulumi.set(__self__, "location", location)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="dailyRecurrenceTime")
def daily_recurrence_time(self) -> pulumi.Input[str]:
"""
The time each day when the schedule takes effect. Must match the format HHmm where HH is 00-23 and mm is 00-59 (e.g. 0930, 2300, etc.)
"""
return pulumi.get(self, "daily_recurrence_time")
@daily_recurrence_time.setter
def daily_recurrence_time(self, value: pulumi.Input[str]):
pulumi.set(self, "daily_recurrence_time", value)
@property
@pulumi.getter(name="notificationSettings")
def notification_settings(self) -> pulumi.Input['GlobalVMShutdownScheduleNotificationSettingsArgs']:
return pulumi.get(self, "notification_settings")
@notification_settings.setter
def notification_settings(self, value: pulumi.Input['GlobalVMShutdownScheduleNotificationSettingsArgs']):
pulumi.set(self, "notification_settings", value)
@property
@pulumi.getter
def timezone(self) -> pulumi.Input[str]:
"""
The time zone ID (e.g. Pacific Standard Time). Refer to this guide for a [full list of accepted time zone names](https://jackstromberg.com/2017/01/list-of-time-zones-consumed-by-azure/).
"""
return pulumi.get(self, "timezone")
@timezone.setter
def timezone(self, value: pulumi.Input[str]):
pulumi.set(self, "timezone", value)
@property
@pulumi.getter(name="virtualMachineId")
def virtual_machine_id(self) -> pulumi.Input[str]:
"""
The resource ID of the target ARM-based Virtual Machine. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "virtual_machine_id")
@virtual_machine_id.setter
def virtual_machine_id(self, value: pulumi.Input[str]):
pulumi.set(self, "virtual_machine_id", value)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to enable the schedule. Possible values are `true` and `false`. Defaults to `true`.
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
The location where the schedule is created. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _GlobalVMShutdownScheduleState:
def __init__(__self__, *,
daily_recurrence_time: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
location: Optional[pulumi.Input[str]] = None,
notification_settings: Optional[pulumi.Input['GlobalVMShutdownScheduleNotificationSettingsArgs']] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
timezone: Optional[pulumi.Input[str]] = None,
virtual_machine_id: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering GlobalVMShutdownSchedule resources.
:param pulumi.Input[str] daily_recurrence_time: The time each day when the schedule takes effect. Must match the format HHmm where HH is 00-23 and mm is 00-59 (e.g. 0930, 2300, etc.)
:param pulumi.Input[bool] enabled: Whether to enable the schedule. Possible values are `true` and `false`. Defaults to `true`.
:param pulumi.Input[str] location: The location where the schedule is created. Changing this forces a new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] timezone: The time zone ID (e.g. Pacific Standard Time). Refer to this guide for a [full list of accepted time zone names](https://jackstromberg.com/2017/01/list-of-time-zones-consumed-by-azure/).
:param pulumi.Input[str] virtual_machine_id: The resource ID of the target ARM-based Virtual Machine. Changing this forces a new resource to be created.
"""
if daily_recurrence_time is not None:
pulumi.set(__self__, "daily_recurrence_time", daily_recurrence_time)
if enabled is not None:
pulumi.set(__self__, "enabled", enabled)
if location is not None:
pulumi.set(__self__, "location", location)
if notification_settings is not None:
pulumi.set(__self__, "notification_settings", notification_settings)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if timezone is not None:
pulumi.set(__self__, "timezone", timezone)
if virtual_machine_id is not None:
pulumi.set(__self__, "virtual_machine_id", virtual_machine_id)
@property
@pulumi.getter(name="dailyRecurrenceTime")
def daily_recurrence_time(self) -> Optional[pulumi.Input[str]]:
"""
The time each day when the schedule takes effect. Must match the format HHmm where HH is 00-23 and mm is 00-59 (e.g. 0930, 2300, etc.)
"""
return pulumi.get(self, "daily_recurrence_time")
@daily_recurrence_time.setter
def daily_recurrence_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "daily_recurrence_time", value)
@property
@pulumi.getter
def enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to enable the schedule. Possible values are `true` and `false`. Defaults to `true`.
"""
return pulumi.get(self, "enabled")
@enabled.setter
def enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enabled", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
The location where the schedule is created. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter(name="notificationSettings")
def notification_settings(self) -> Optional[pulumi.Input['GlobalVMShutdownScheduleNotificationSettingsArgs']]:
return pulumi.get(self, "notification_settings")
@notification_settings.setter
def notification_settings(self, value: Optional[pulumi.Input['GlobalVMShutdownScheduleNotificationSettingsArgs']]):
pulumi.set(self, "notification_settings", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter
def timezone(self) -> Optional[pulumi.Input[str]]:
"""
The time zone ID (e.g. Pacific Standard Time). Refer to this guide for a [full list of accepted time zone names](https://jackstromberg.com/2017/01/list-of-time-zones-consumed-by-azure/).
"""
return pulumi.get(self, "timezone")
@timezone.setter
def timezone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "timezone", value)
@property
@pulumi.getter(name="virtualMachineId")
def virtual_machine_id(self) -> Optional[pulumi.Input[str]]:
"""
The resource ID of the target ARM-based Virtual Machine. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "virtual_machine_id")
@virtual_machine_id.setter
def virtual_machine_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "virtual_machine_id", value)
class GlobalVMShutdownSchedule(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
daily_recurrence_time: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
location: Optional[pulumi.Input[str]] = None,
notification_settings: Optional[pulumi.Input[pulumi.InputType['GlobalVMShutdownScheduleNotificationSettingsArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
timezone: Optional[pulumi.Input[str]] = None,
virtual_machine_id: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages automated shutdown schedules for Azure VMs that are not within an Azure DevTest Lab. While this is part of the DevTest Labs service in Azure,
this resource applies only to standard VMs, not DevTest Lab VMs. To manage automated shutdown schedules for DevTest Lab VMs, reference the
`devtest.Schedule` resource.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_virtual_network = azure.network.VirtualNetwork("exampleVirtualNetwork",
address_spaces=["10.0.0.0/16"],
location=example_resource_group.location,
resource_group_name=example_resource_group.name)
example_subnet = azure.network.Subnet("exampleSubnet",
resource_group_name=example_resource_group.name,
virtual_network_name=example_virtual_network.name,
address_prefixes=["10.0.2.0/24"])
example_network_interface = azure.network.NetworkInterface("exampleNetworkInterface",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
ip_configurations=[azure.network.NetworkInterfaceIpConfigurationArgs(
name="testconfiguration1",
subnet_id=example_subnet.id,
private_ip_address_allocation="Dynamic",
)])
example_linux_virtual_machine = azure.compute.LinuxVirtualMachine("exampleLinuxVirtualMachine",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
network_interface_ids=[example_network_interface.id],
size="Standard_B2s",
source_image_reference=azure.compute.LinuxVirtualMachineSourceImageReferenceArgs(
publisher="Canonical",
offer="UbuntuServer",
sku="16.04-LTS",
version="latest",
),
os_disk=azure.compute.LinuxVirtualMachineOsDiskArgs(
name="myosdisk",
caching="ReadWrite",
storage_account_type="Standard_LRS",
),
admin_username="testadmin",
admin_password="Password1234!",
disable_password_authentication=False)
example_global_vm_shutdown_schedule = azure.devtest.GlobalVMShutdownSchedule("exampleGlobalVMShutdownSchedule",
virtual_machine_id=example_linux_virtual_machine.id,
location=example_resource_group.location,
enabled=True,
daily_recurrence_time="1100",
timezone="Pacific Standard Time",
notification_settings=azure.devtest.GlobalVMShutdownScheduleNotificationSettingsArgs(
enabled=True,
time_in_minutes=60,
webhook_url="https://sample-webhook-url.example.com",
))
```
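The `daily_recurrence_time` value must be four digits in 24-hour `HHmm` form. A client-side check can be sketched as follows (a hypothetical helper, not part of this SDK):

```python
import re

# HHmm: HH is 00-23, mm is 00-59 (e.g. "0930", "2300")
_HHMM = re.compile(r"(?:[01][0-9]|2[0-3])[0-5][0-9]")

def is_valid_daily_recurrence_time(value: str) -> bool:
    # fullmatch rejects extra characters such as "09:30" or "09300"
    return _HHMM.fullmatch(value) is not None
```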
## Import
An existing Dev Test Global Shutdown Schedule can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:devtest/globalVMShutdownSchedule:GlobalVMShutdownSchedule example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/sample-rg/providers/Microsoft.DevTestLab/schedules/shutdown-computevm-SampleVM
```
The name of the resource within the `resource id` will always follow the format `shutdown-computevm-<VM Name>`, where `<VM Name>` is replaced by the name of the target Virtual Machine.
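The `resource id` above follows the usual ARM path of alternating keys and values, so its components can be pulled apart with a small sketch (hypothetical helper, not part of this SDK):

```python
def parse_arm_id(resource_id: str) -> dict:
    # Split "/subscriptions/<sub>/resourceGroups/<rg>/.../schedules/<name>"
    # into {"subscriptions": "<sub>", "resourceGroups": "<rg>", ...}
    parts = resource_id.strip("/").split("/")
    return dict(zip(parts[::2], parts[1::2]))
```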
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] daily_recurrence_time: The time each day when the schedule takes effect. Must match the format HHmm where HH is 00-23 and mm is 00-59 (e.g. 0930, 2300, etc.)
:param pulumi.Input[bool] enabled: Whether to enable the schedule. Possible values are `true` and `false`. Defaults to `true`.
:param pulumi.Input[str] location: The location where the schedule is created. Changing this forces a new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] timezone: The time zone ID (e.g. Pacific Standard Time). Refer to this guide for a [full list of accepted time zone names](https://jackstromberg.com/2017/01/list-of-time-zones-consumed-by-azure/).
:param pulumi.Input[str] virtual_machine_id: The resource ID of the target ARM-based Virtual Machine. Changing this forces a new resource to be created.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: GlobalVMShutdownScheduleArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages automated shutdown schedules for Azure VMs that are not within an Azure DevTest Lab. While this is part of the DevTest Labs service in Azure,
this resource applies only to standard VMs, not DevTest Lab VMs. To manage automated shutdown schedules for DevTest Lab VMs, reference the
`devtest.Schedule` resource.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_virtual_network = azure.network.VirtualNetwork("exampleVirtualNetwork",
address_spaces=["10.0.0.0/16"],
location=example_resource_group.location,
resource_group_name=example_resource_group.name)
example_subnet = azure.network.Subnet("exampleSubnet",
resource_group_name=example_resource_group.name,
virtual_network_name=example_virtual_network.name,
address_prefixes=["10.0.2.0/24"])
example_network_interface = azure.network.NetworkInterface("exampleNetworkInterface",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
ip_configurations=[azure.network.NetworkInterfaceIpConfigurationArgs(
name="testconfiguration1",
subnet_id=example_subnet.id,
private_ip_address_allocation="Dynamic",
)])
example_linux_virtual_machine = azure.compute.LinuxVirtualMachine("exampleLinuxVirtualMachine",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
network_interface_ids=[example_network_interface.id],
size="Standard_B2s",
source_image_reference=azure.compute.LinuxVirtualMachineSourceImageReferenceArgs(
publisher="Canonical",
offer="UbuntuServer",
sku="16.04-LTS",
version="latest",
),
os_disk=azure.compute.LinuxVirtualMachineOsDiskArgs(
name="myosdisk",
caching="ReadWrite",
storage_account_type="Standard_LRS",
),
admin_username="testadmin",
admin_password="Password1234!",
disable_password_authentication=False)
example_global_vm_shutdown_schedule = azure.devtest.GlobalVMShutdownSchedule("exampleGlobalVMShutdownSchedule",
virtual_machine_id=example_linux_virtual_machine.id,
location=example_resource_group.location,
enabled=True,
daily_recurrence_time="1100",
timezone="Pacific Standard Time",
notification_settings=azure.devtest.GlobalVMShutdownScheduleNotificationSettingsArgs(
enabled=True,
time_in_minutes=60,
webhook_url="https://sample-webhook-url.example.com",
))
```
## Import
An existing Dev Test Global Shutdown Schedule can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:devtest/globalVMShutdownSchedule:GlobalVMShutdownSchedule example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/sample-rg/providers/Microsoft.DevTestLab/schedules/shutdown-computevm-SampleVM
```
The name of the resource within the `resource id` will always follow the format `shutdown-computevm-<VM Name>`, where `<VM Name>` is replaced by the name of the target Virtual Machine.
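Since the schedule name is derived mechanically from the VM name, the expected name for a given VM can be sketched as (hypothetical helper, not part of this SDK):

```python
def expected_schedule_name(vm_name: str) -> str:
    # Schedule names always follow shutdown-computevm-<VM Name>
    return f"shutdown-computevm-{vm_name}"
```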
:param str resource_name: The name of the resource.
:param GlobalVMShutdownScheduleArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(GlobalVMShutdownScheduleArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
daily_recurrence_time: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
location: Optional[pulumi.Input[str]] = None,
notification_settings: Optional[pulumi.Input[pulumi.InputType['GlobalVMShutdownScheduleNotificationSettingsArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
timezone: Optional[pulumi.Input[str]] = None,
virtual_machine_id: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = GlobalVMShutdownScheduleArgs.__new__(GlobalVMShutdownScheduleArgs)
if daily_recurrence_time is None and not opts.urn:
raise TypeError("Missing required property 'daily_recurrence_time'")
__props__.__dict__["daily_recurrence_time"] = daily_recurrence_time
__props__.__dict__["enabled"] = enabled
__props__.__dict__["location"] = location
if notification_settings is None and not opts.urn:
raise TypeError("Missing required property 'notification_settings'")
__props__.__dict__["notification_settings"] = notification_settings
__props__.__dict__["tags"] = tags
if timezone is None and not opts.urn:
raise TypeError("Missing required property 'timezone'")
__props__.__dict__["timezone"] = timezone
if virtual_machine_id is None and not opts.urn:
raise TypeError("Missing required property 'virtual_machine_id'")
__props__.__dict__["virtual_machine_id"] = virtual_machine_id
super(GlobalVMShutdownSchedule, __self__).__init__(
'azure:devtest/globalVMShutdownSchedule:GlobalVMShutdownSchedule',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
daily_recurrence_time: Optional[pulumi.Input[str]] = None,
enabled: Optional[pulumi.Input[bool]] = None,
location: Optional[pulumi.Input[str]] = None,
notification_settings: Optional[pulumi.Input[pulumi.InputType['GlobalVMShutdownScheduleNotificationSettingsArgs']]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
timezone: Optional[pulumi.Input[str]] = None,
virtual_machine_id: Optional[pulumi.Input[str]] = None) -> 'GlobalVMShutdownSchedule':
"""
Get an existing GlobalVMShutdownSchedule resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] daily_recurrence_time: The time each day when the schedule takes effect. Must match the format HHmm where HH is 00-23 and mm is 00-59 (e.g. 0930, 2300, etc.)
:param pulumi.Input[bool] enabled: Whether to enable the schedule. Possible values are `true` and `false`. Defaults to `true`.
:param pulumi.Input[str] location: The location where the schedule is created. Changing this forces a new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
:param pulumi.Input[str] timezone: The time zone ID (e.g. Pacific Standard Time). Refer to this guide for a [full list of accepted time zone names](https://jackstromberg.com/2017/01/list-of-time-zones-consumed-by-azure/).
:param pulumi.Input[str] virtual_machine_id: The resource ID of the target ARM-based Virtual Machine. Changing this forces a new resource to be created.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _GlobalVMShutdownScheduleState.__new__(_GlobalVMShutdownScheduleState)
__props__.__dict__["daily_recurrence_time"] = daily_recurrence_time
__props__.__dict__["enabled"] = enabled
__props__.__dict__["location"] = location
__props__.__dict__["notification_settings"] = notification_settings
__props__.__dict__["tags"] = tags
__props__.__dict__["timezone"] = timezone
__props__.__dict__["virtual_machine_id"] = virtual_machine_id
return GlobalVMShutdownSchedule(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="dailyRecurrenceTime")
def daily_recurrence_time(self) -> pulumi.Output[str]:
"""
The time each day when the schedule takes effect. Must match the format HHmm where HH is 00-23 and mm is 00-59 (e.g. 0930, 2300, etc.)
"""
return pulumi.get(self, "daily_recurrence_time")
@property
@pulumi.getter
def enabled(self) -> pulumi.Output[Optional[bool]]:
"""
Whether to enable the schedule. Possible values are `true` and `false`. Defaults to `true`.
"""
return pulumi.get(self, "enabled")
@property
@pulumi.getter
def location(self) -> pulumi.Output[str]:
"""
The location where the schedule is created. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@property
@pulumi.getter(name="notificationSettings")
def notification_settings(self) -> pulumi.Output['outputs.GlobalVMShutdownScheduleNotificationSettings']:
return pulumi.get(self, "notification_settings")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter
def timezone(self) -> pulumi.Output[str]:
"""
The time zone ID (e.g. Pacific Standard Time). Refer to this guide for a [full list of accepted time zone names](https://jackstromberg.com/2017/01/list-of-time-zones-consumed-by-azure/).
"""
return pulumi.get(self, "timezone")
@property
@pulumi.getter(name="virtualMachineId")
def virtual_machine_id(self) -> pulumi.Output[str]:
"""
The resource ID of the target ARM-based Virtual Machine. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "virtual_machine_id")