hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
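The header above describes one row per source file: repository metadata, the raw `content`, and a battery of `qsc_*` quality signals. A hypothetical filtering sketch over such rows (the parquet file name is an assumption; the column names are taken from the header):

```python
# Hypothetical filtering sketch, assuming the rows of this table are loaded
# into a pandas DataFrame with the column names listed in the header above.
import pandas as pd

df = pd.read_parquet("code_rows.parquet")  # assumed export of this table
kept = df[
    (df["lang"] == "Python")
    & (df["alphanum_fraction"] > 0.25)  # drop mostly-symbol files
    & (df["max_line_length"] < 1000)    # drop minified/one-line blobs
]
print(len(kept), "rows kept of", len(df))
```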
abb5e98762102a68f6572ca79ebf088c7dd9b6c6 | 355 | py | Python | beacon_controller/__init__.py | NCATS-Tangerine/translator-knowledge-beacon | e0e36c2a6e77e998812e132838e62f8d1b154c88 | [
"MIT"
] | 6 | 2017-06-16T19:33:39.000Z | 2021-05-31T19:41:28.000Z | beacon_controller/__init__.py | NCATS-Tangerine/translator-knowledge-beacon | e0e36c2a6e77e998812e132838e62f8d1b154c88 | [
"MIT"
] | 51 | 2017-05-22T15:55:21.000Z | 2021-06-08T00:11:24.000Z | beacon_controller/__init__.py | NCATS-Tangerine/translator-knowledge-beacon | e0e36c2a6e77e998812e132838e62f8d1b154c88 | [
"MIT"
] | 2 | 2018-03-01T22:18:09.000Z | 2018-09-29T18:37:44.000Z | from config import config
from . import biolink_model
from .concepts_controller import get_concept_details, get_concepts, get_exact_matches_to_concept_list
from .statements_controller import get_statement_details, get_statements
from .metadata_controller import get_concept_categories, get_knowledge_map, get_predicates
from .main_controller import main
| 44.375 | 101 | 0.887324 | 49 | 355 | 6 | 0.44898 | 0.217687 | 0.193878 | 0.176871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084507 | 355 | 7 | 102 | 50.714286 | 0.904615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
552598af2336b3cd74d70bac9edf3c921c939955 | 25010 | py | Python | src/sdk/python/OsduClient/api/search_api.py | mstest123/self-managed-osdu_from_Daniel | 10a0c1d25804caa920bf18c6c7c1d8e711c63756 | [
"MIT"
] | 3 | 2021-11-05T20:52:54.000Z | 2021-11-23T23:02:29.000Z | src/sdk/python/OsduClient/api/search_api.py | mstest123/self-managed-osdu_from_Daniel | 10a0c1d25804caa920bf18c6c7c1d8e711c63756 | [
"MIT"
] | 4 | 2021-11-05T19:57:08.000Z | 2021-12-14T13:59:04.000Z | src/sdk/python/OsduClient/api/search_api.py | mstest123/self-managed-osdu_from_Daniel | 10a0c1d25804caa920bf18c6c7c1d8e711c63756 | [
"MIT"
] | 36 | 2021-08-31T20:58:25.000Z | 2022-03-30T17:02:57.000Z | # coding: utf-8
"""
self-managed-osdu
Rest API Documentation for Self Managed OSDU # noqa: E501
OpenAPI spec version: 0.11.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from OsduClient.api_client import ApiClient
class SearchApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def delete_index(self, osdu_account_id, kind, **kwargs): # noqa: E501
"""Deletes all documents from index for given 'kind'. # noqa: E501
The API can be used to purge all indexed documents for a kind. Required access level to use this API is service.search.admin # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_index(osdu_account_id, kind, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param str kind: Kind of the record. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_index_with_http_info(osdu_account_id, kind, **kwargs) # noqa: E501
else:
(data) = self.delete_index_with_http_info(osdu_account_id, kind, **kwargs) # noqa: E501
return data
def delete_index_with_http_info(self, osdu_account_id, kind, **kwargs): # noqa: E501
"""Deletes all documents from index for given 'kind'. # noqa: E501
The API can be used to purge all indexed documents for a kind. Required access level to use this API is service.search.admin # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_index_with_http_info(osdu_account_id, kind, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param str kind: Kind of the record. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['osdu_account_id', 'kind', 'osdu_on_behalf_of'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_index" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'osdu_account_id' is set
if self.api_client.client_side_validation and ('osdu_account_id' not in params or
params['osdu_account_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `osdu_account_id` when calling `delete_index`") # noqa: E501
# verify the required parameter 'kind' is set
if self.api_client.client_side_validation and ('kind' not in params or
params['kind'] is None): # noqa: E501
raise ValueError("Missing the required parameter `kind` when calling `delete_index`") # noqa: E501
collection_formats = {}
path_params = {}
if 'kind' in params:
path_params['kind'] = params['kind'] # noqa: E501
query_params = []
header_params = {}
if 'osdu_account_id' in params:
header_params['OSDU-Account-Id'] = params['osdu_account_id'] # noqa: E501
if 'osdu_on_behalf_of' in params:
header_params['OSDU-On-Behalf-Of'] = params['osdu_on_behalf_of'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/api/search/v2/index/{kind}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_kind_schema(self, osdu_account_id, kind, **kwargs): # noqa: E501
"""Returns the index schema for given 'kind'. # noqa: E501
The API returns the schema for a given kind which is used to find what attributes are indexed and their respective data types (at index time). Required access levels to use this API are service.search.user, service.search.admin # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_kind_schema(osdu_account_id, kind, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param str kind: Kind of the record. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: str
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_kind_schema_with_http_info(osdu_account_id, kind, **kwargs) # noqa: E501
else:
(data) = self.get_kind_schema_with_http_info(osdu_account_id, kind, **kwargs) # noqa: E501
return data
def get_kind_schema_with_http_info(self, osdu_account_id, kind, **kwargs): # noqa: E501
"""Returns the index schema for given 'kind'. # noqa: E501
The API returns the schema for a given kind which is used to find what attributes are indexed and their respective data types (at index time). Required access levels to use this API are service.search.user, service.search.admin # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_kind_schema_with_http_info(osdu_account_id, kind, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param str kind: Kind of the record. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: str
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['osdu_account_id', 'kind', 'osdu_on_behalf_of'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_kind_schema" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'osdu_account_id' is set
if self.api_client.client_side_validation and ('osdu_account_id' not in params or
params['osdu_account_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `osdu_account_id` when calling `get_kind_schema`") # noqa: E501
# verify the required parameter 'kind' is set
if self.api_client.client_side_validation and ('kind' not in params or
params['kind'] is None): # noqa: E501
raise ValueError("Missing the required parameter `kind` when calling `get_kind_schema`") # noqa: E501
collection_formats = {}
path_params = {}
if 'kind' in params:
path_params['kind'] = params['kind'] # noqa: E501
query_params = []
header_params = {}
if 'osdu_account_id' in params:
header_params['OSDU-Account-Id'] = params['osdu_account_id'] # noqa: E501
if 'osdu_on_behalf_of' in params:
header_params['OSDU-On-Behalf-Of'] = params['osdu_on_behalf_of'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/api/search/v2/index/schema/{kind}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='str', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def query_records(self, osdu_account_id, body, **kwargs): # noqa: E501
"""Queries the index for the specified kind using the input query string. # noqa: E501
The API supports full text search on string fields, range queries on date, numeric or string fields, along with geo-spatial search. Required access levels to use this API are service.search.user, service.search.admin. In addition, users must be a member of data groups to access the data. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.query_records(osdu_account_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param SearchQueryRequest body: Specifies the API parameters. The only required parameter is the kind which needs to be formatted correctly. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: SearchQueryResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.query_records_with_http_info(osdu_account_id, body, **kwargs) # noqa: E501
else:
(data) = self.query_records_with_http_info(osdu_account_id, body, **kwargs) # noqa: E501
return data
def query_records_with_http_info(self, osdu_account_id, body, **kwargs): # noqa: E501
"""Queries the index for the specified kind using the input query string. # noqa: E501
The API supports full text search on string fields, range queries on date, numeric or string fields, along with geo-spatial search. Required access levels to use this API are service.search.user, service.search.admin. In addition, users must be a member of data groups to access the data. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.query_records_with_http_info(osdu_account_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param SearchQueryRequest body: Specifies the API parameters. The only required parameter is the kind which needs to be formatted correctly. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: SearchQueryResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['osdu_account_id', 'body', 'osdu_on_behalf_of'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method query_records" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'osdu_account_id' is set
if self.api_client.client_side_validation and ('osdu_account_id' not in params or
params['osdu_account_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `osdu_account_id` when calling `query_records`") # noqa: E501
# verify the required parameter 'body' is set
if self.api_client.client_side_validation and ('body' not in params or
params['body'] is None): # noqa: E501
raise ValueError("Missing the required parameter `body` when calling `query_records`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
if 'osdu_account_id' in params:
header_params['OSDU-Account-Id'] = params['osdu_account_id'] # noqa: E501
if 'osdu_on_behalf_of' in params:
header_params['OSDU-On-Behalf-Of'] = params['osdu_on_behalf_of'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/api/search/v2/query', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SearchQueryResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def query_with_cursor(self, osdu_account_id, body, **kwargs): # noqa: E501
"""Query the index using cursor and optionally return only requested fields. # noqa: E501
The API supports full text search on string fields, range queries on date, numeric or string fields, along with geo-spatial search. Required access levels to use this API are service.search.user, service.search.admin. In addition, users must be a member of data groups to access the data. It can be used to retrieve large numbers of results (or even all results) from a single search request, in much the same way as you would use a cursor on a traditional database. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.query_with_cursor(osdu_account_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param SearchCursorQueryRequest body: Specifies the API parameters. The only required parameter is the kind which needs to be formatted correctly. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: SearchCursorQueryResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.query_with_cursor_with_http_info(osdu_account_id, body, **kwargs) # noqa: E501
else:
(data) = self.query_with_cursor_with_http_info(osdu_account_id, body, **kwargs) # noqa: E501
return data
def query_with_cursor_with_http_info(self, osdu_account_id, body, **kwargs): # noqa: E501
"""Query the index using cursor and optionally return only requested fields. # noqa: E501
The API supports full text search on string fields, range queries on date, numeric or string fields, along with geo-spatial search. Required access levels to use this API are service.search.user, service.search.admin. In addition, users must be a member of data groups to access the data. It can be used to retrieve large numbers of results (or even all results) from a single search request, in much the same way as you would use a cursor on a traditional database. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.query_with_cursor_with_http_info(osdu_account_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str osdu_account_id: Account ID is the active OSDU account (OSDU account or customer's account) which the users choose to use with the Search API. (required)
:param SearchCursorQueryRequest body: Specifies the API parameters. The only required parameter is the kind which needs to be formatted correctly. (required)
:param str osdu_on_behalf_of: On behalf email or token is the token/email of the original user making the call. For now, only email is supported but eventually, primary usage will be token.
:return: SearchCursorQueryResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['osdu_account_id', 'body', 'osdu_on_behalf_of'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method query_with_cursor" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'osdu_account_id' is set
if self.api_client.client_side_validation and ('osdu_account_id' not in params or
params['osdu_account_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `osdu_account_id` when calling `query_with_cursor`") # noqa: E501
# verify the required parameter 'body' is set
if self.api_client.client_side_validation and ('body' not in params or
params['body'] is None): # noqa: E501
raise ValueError("Missing the required parameter `body` when calling `query_with_cursor`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
if 'osdu_account_id' in params:
header_params['OSDU-Account-Id'] = params['osdu_account_id'] # noqa: E501
if 'osdu_on_behalf_of' in params:
header_params['OSDU-On-Behalf-Of'] = params['osdu_on_behalf_of'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/api/search/v2/query_with_cursor', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SearchCursorQueryResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 52.212944 | 488 | 0.648061 | 3,262 | 25,010 | 4.771613 | 0.075107 | 0.04266 | 0.053453 | 0.021587 | 0.961773 | 0.960617 | 0.956826 | 0.948988 | 0.948988 | 0.948988 | 0 | 0.014462 | 0.272891 | 25,010 | 478 | 489 | 52.322176 | 0.841463 | 0.4499 | 0 | 0.806324 | 1 | 0 | 0.206842 | 0.029975 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035573 | false | 0 | 0.01581 | 0 | 0.102767 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
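The docstrings in the generated `SearchApi` above spell out its calling convention: synchronous by default, or asynchronous via `async_req=True` followed by `thread.get()`. A minimal usage sketch, assuming the package layout implied by the file path; the account id and kind strings are placeholders, not real OSDU values:

```python
# Minimal usage sketch for the generated SearchApi, following its docstrings.
# The account id and kind values below are placeholders.
from OsduClient.api_client import ApiClient
from OsduClient.api.search_api import SearchApi

api = SearchApi(ApiClient())  # passing api_client=None would construct one too

# Synchronous call: returns the response data directly.
schema = api.get_kind_schema("my-account-id", "my-partition:wks:wellbore:1.0.0")

# Asynchronous call: returns a thread-like handle; block on .get() for the result.
thread = api.get_kind_schema("my-account-id", "my-partition:wks:wellbore:1.0.0",
                             async_req=True)
schema = thread.get()
```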
e999855643fed2f87a51356c3311801b86dada85 | 123 | py | Python | newtonnet/data/__init__.py | THGLab/NewtonNet | fcf2af848a1c998bd08096dcefb58a5610eda03c | [
"MIT"
] | null | null | null | newtonnet/data/__init__.py | THGLab/NewtonNet | fcf2af848a1c998bd08096dcefb58a5610eda03c | [
"MIT"
] | null | null | null | newtonnet/data/__init__.py | THGLab/NewtonNet | fcf2af848a1c998bd08096dcefb58a5610eda03c | [
"MIT"
] | null | null | null | """
"""
from newtonnet.data.loader import *
from newtonnet.data.neighbors import *
from newtonnet.data.parse_raw import * | 17.571429 | 38 | 0.756098 | 16 | 123 | 5.75 | 0.5 | 0.423913 | 0.554348 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121951 | 123 | 7 | 39 | 17.571429 | 0.851852 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
75a44306a934709e7ac8ce197b175fc424814f9b | 4130 | py | Python | phaedo/core/fields.py | utamaru-hiro/phaedo | f54030e7511521b7c1a063b158f2f1052681fedc | [
"MIT"
] | null | null | null | phaedo/core/fields.py | utamaru-hiro/phaedo | f54030e7511521b7c1a063b158f2f1052681fedc | [
"MIT"
] | null | null | null | phaedo/core/fields.py | utamaru-hiro/phaedo | f54030e7511521b7c1a063b158f2f1052681fedc | [
"MIT"
] | null | null | null | from typing import (Callable, Generic, Optional, Type, TypeVar, Union)
from .validation import Validator, ValidationError
from ._registry import DomainModelRegistry
T = TypeVar('T')
class Field(Generic[T]):
def __init__(
self,
field_type: Union[str, Type[T]],
name: str = None,
validators: list[Validator] = None,
converter: Callable[[Optional[T]], T] = None,
load=True,
dump=True,
default: T = None
) -> None:
self._field_type = field_type
self._name = name
self._validators = validators
self._converter = converter
self._load = load
self._dump = dump
self._default = default
self._attribute_name = None
def set_attribute_name(self, attribute_name: str) -> None:
self._attribute_name = attribute_name
def get_attribute_name(self) -> str:
return self._attribute_name
def get_type(self) -> Type:
if isinstance(self._field_type, str):
return DomainModelRegistry.get(self._field_type)
return self._field_type
@staticmethod
def is_list() -> bool:
return False
def can_load(self) -> bool:
return self._load
def can_dump(self) -> bool:
return self._dump
def default_value(self) -> Optional[T]:
return self._default
def __get__(self, obj, obj_type=None) -> T:
return obj.__data__.get(self._attribute_name, None)
def __set__(self, obj, value: Optional[T]) -> None:
if self._validators is not None:
key = self._name if self._name is not None else self._attribute_name
for validator in self._validators:
if not validator.validate(value):
raise ValidationError(key, validator.message())
if self._converter is not None:
obj.__data__[self._attribute_name] = self._converter(value)
else:
obj.__data__[self._attribute_name] = value
class ListField(Generic[T]):
def __init__(
self,
field_type: Union[str, Type[T]],
name: str = None,
validators: list[Validator] = None,
converter: Callable[[Optional[T]], T] = None,
load=True,
dump=True,
default: list[T] = None
) -> None:
self._field_type = field_type
self._name = name
self._validators = validators
self._converter = converter
self._load = load
self._dump = dump
self._default = default
self._attribute_name = None
def set_attribute_name(self, attribute_name: str):
self._attribute_name = attribute_name
def get_attribute_name(self):
return self._attribute_name
def get_type(self) -> Type:
if isinstance(self._field_type, str):
return DomainModelRegistry.get(self._field_type)
return self._field_type
@staticmethod
def is_list() -> bool:
return True
def can_load(self) -> bool:
return self._load
def can_dump(self) -> bool:
return self._dump
def default_value(self) -> Optional[list[T]]:
return self._default
def __get__(self, obj, obj_type=None) -> list[T]:
return obj.__data__.get(self._attribute_name, None)
def __set__(self, obj, value: Optional[list[T]]) -> None:
attribute_value_list: list[T] = []
if value is not None:
for element in value:
if self._validators is not None:
key = self._name if self._name is not None else self._attribute_name
for validator in self._validators:
if not validator.validate(element): # validate each element, not the whole list
raise ValidationError(key, validator.message())
if self._converter is not None:
attribute_value_list.append(self._converter(element))
else:
attribute_value_list.append(element)
obj.__data__[self._attribute_name] = attribute_value_list
| 30.820896 | 88 | 0.598547 | 477 | 4,130 | 4.861635 | 0.115304 | 0.117723 | 0.109961 | 0.036223 | 0.831393 | 0.800345 | 0.800345 | 0.800345 | 0.800345 | 0.800345 | 0 | 0 | 0.31138 | 4,130 | 133 | 89 | 31.052632 | 0.815401 | 0 | 0 | 0.730769 | 0 | 0 | 0.000242 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.192308 | false | 0 | 0.028846 | 0.115385 | 0.394231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 8 |
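The `Field`/`ListField` descriptors above run validators and converters on assignment and store values in the owning object's `__data__` dict. A minimal sketch of how they behave; the `Person` class and `NotEmpty` validator are hypothetical, and since the machinery that normally calls `set_attribute_name()` (presumably a model metaclass) is not shown, both it and `__data__` are wired by hand here:

```python
# Minimal sketch of the Field/ListField descriptors above. Person and
# NotEmpty are hypothetical; set_attribute_name() and __data__ are wired
# by hand because the real model base/metaclass is not shown.
class NotEmpty:
    """Stand-in with the validate()/message() interface Field expects."""
    def validate(self, value):
        return bool(value)
    def message(self):
        return "must not be empty"

class Person:
    name = Field(str, validators=[NotEmpty()])
    tags = ListField(str, default=[])
    def __init__(self):
        self.__data__ = {}

# vars() reaches the descriptors without triggering __get__ on class access.
vars(Person)["name"].set_attribute_name("name")
vars(Person)["tags"].set_attribute_name("tags")

p = Person()
p.name = "Ada"        # validated, then stored in p.__data__["name"]
p.tags = ["x", "y"]   # each element passes through ListField.__set__
assert p.name == "Ada" and p.tags == ["x", "y"]
# p.name = ""         # would raise ValidationError("name", "must not be empty")
```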
75bc1fb325fc6b1386f226f0e4407b357d712433 | 301 | py | Python | spanet/network/jet_reconstruction/__init__.py | Alexanders101/SPANet | 20731bb271b23f0746243e79203ff6b77556c852 | [
"BSD-3-Clause"
] | 13 | 2021-05-20T15:13:01.000Z | 2021-11-24T22:12:45.000Z | spanet/network/jet_reconstruction/__init__.py | Alexanders101/SPANet | 20731bb271b23f0746243e79203ff6b77556c852 | [
"BSD-3-Clause"
] | null | null | null | spanet/network/jet_reconstruction/__init__.py | Alexanders101/SPANet | 20731bb271b23f0746243e79203ff6b77556c852 | [
"BSD-3-Clause"
] | 7 | 2021-06-28T12:18:17.000Z | 2022-01-27T20:05:06.000Z | from spanet.network.jet_reconstruction.jet_reconstruction_training import JetReconstructionTraining
from spanet.network.jet_reconstruction.jet_reconstruction_validation import JetReconstructionValidation
class JetReconstructionModel(JetReconstructionTraining, JetReconstructionValidation):
pass
| 43 | 103 | 0.906977 | 25 | 301 | 10.68 | 0.52 | 0.254682 | 0.127341 | 0.149813 | 0.382022 | 0.382022 | 0.382022 | 0 | 0 | 0 | 0 | 0 | 0.059801 | 301 | 6 | 104 | 50.166667 | 0.943463 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 7 |
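`JetReconstructionModel` above composes its behaviour purely through multiple inheritance: each mixin contributes its methods and the subclass body is empty. A minimal sketch of the same pattern, with hypothetical stand-in mixins:

```python
# Sketch of the mixin composition used by JetReconstructionModel; the
# mixins here are hypothetical stand-ins for the training/validation classes.
class TrainingMixin:
    def training_step(self):
        return "train"

class ValidationMixin:
    def validation_step(self):
        return "validate"

class Model(TrainingMixin, ValidationMixin):
    pass  # inherits both behaviours, like JetReconstructionModel above

m = Model()
assert m.training_step() == "train" and m.validation_step() == "validate"
```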
fb2f36c55ee709bbb12f57c7b9adb84785a45d5e | 11429 | py | Python | tuta/trainers.py | PseudoLabs-Demo/TUTA_table_understanding | d0f3fe2f15c56a5ea9f593b210296f170fc74558 | [
"MIT"
] | null | null | null | tuta/trainers.py | PseudoLabs-Demo/TUTA_table_understanding | d0f3fe2f15c56a5ea9f593b210296f170fc74558 | [
"MIT"
] | null | null | null | tuta/trainers.py | PseudoLabs-Demo/TUTA_table_understanding | d0f3fe2f15c56a5ea9f593b210296f170fc74558 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Trainers for Pre-training Model Variants
"""
import time
import torch
from utils import save_model
def train_base(args, gpu_id, rank, loader, model, optimizer, scheduler):
model.train()
start_time = time.time()
total_loss = 0.
total_mlm_loss, total_mlm_crt, total_mlm_cnt = 0., 0., 0.
total_sep_loss, total_sep_crt, total_sep_cnt = 0., 0., 0.
total_tok_loss, total_tok_crt, total_tok_cnt = 0., 0., 0.
total_tcr_loss, total_tcr_crt, total_tcr_cnt = 0., 0., 0.
total_rand_num, total_rand_den = 0., 0.
steps = 1
total_steps = args.total_steps
loader_iter = iter(loader)
while True:
if steps == total_steps + 1:
break
token_id, num_mag, num_pre, num_top, num_low, \
token_order, pos_top, pos_left, format_vec, indicator, \
mlm_label, clc_label, tcr_label = next(loader_iter)
while (clc_label.size()[0] == 0) or (torch.min(clc_label) >= 0) or (torch.max(mlm_label) == -1):
token_id, num_mag, num_pre, num_top, num_low, \
token_order, pos_top, pos_left, format_vec, indicator, \
mlm_label, clc_label, tcr_label = next(loader_iter)
model.zero_grad()
if gpu_id is not None:
token_id = token_id.cuda(gpu_id)
num_mag = num_mag.cuda(gpu_id)
num_pre = num_pre.cuda(gpu_id)
num_top = num_top.cuda(gpu_id)
num_low = num_low.cuda(gpu_id)
token_order = token_order.cuda(gpu_id)
pos_top = pos_top.cuda(gpu_id)
pos_left = pos_left.cuda(gpu_id)
format_vec = format_vec.cuda(gpu_id)
indicator = indicator.cuda(gpu_id)
mlm_label = mlm_label.cuda(gpu_id)
clc_label = clc_label.cuda(gpu_id)
tcr_label = tcr_label.cuda(gpu_id)
# forward
mlm_triple, sep_triple, tok_triple, tcr_triple = model(
token_id, num_mag, num_pre, num_top, num_low,
token_order, pos_top, pos_left, format_vec, indicator,
mlm_label, clc_label, tcr_label
)
mlm_loss, mlm_crt, mlm_cnt = mlm_triple
sep_loss, sep_crt, sep_cnt = sep_triple
tok_loss, tok_crt, tok_cnt = tok_triple
tcr_loss, tcr_crt, tcr_cnt = tcr_triple
# backward
loss = mlm_loss + (sep_loss + tok_loss) * args.clc_weight + tcr_loss
loss.backward()
optimizer.step()
scheduler.step()
total_mlm_loss += mlm_loss.item()
total_mlm_crt += mlm_crt.item()
total_mlm_cnt += mlm_cnt.item()
total_sep_loss += sep_loss.item()
total_sep_crt += sep_crt.item()
total_sep_cnt += sep_cnt.item()
total_tok_loss += tok_loss.item()
total_tok_crt += tok_crt.item()
total_tok_cnt += tok_cnt.item()
rand_num = torch.sum((clc_label > 0).long(), dim=-1)
rand_den = torch.sum(rand_num, dim=-1).item() // 2
rand_num = torch.sum((rand_num > 0).long(), dim=-1).item()
total_rand_num += rand_num
total_rand_den += rand_den
total_tcr_loss += tcr_loss.item()
total_tcr_crt += tcr_crt.item()
total_tcr_cnt += tcr_cnt.item()
total_loss = total_mlm_loss + \
(total_sep_loss + total_tok_loss) * \
args.clc_weight + total_tcr_loss
if steps % args.report_steps == 0 and (not args.dist_train or (args.dist_train and rank == 0)):
elapsed = time.time() - start_time
done_tokens = \
args.batch_size * token_id.size(1) * args.report_steps * args.world_size \
if args.dist_train \
else args.batch_size * token_id.size(1) * args.report_steps
print("| {:8d}/{:8d} steps"
"| {:8.2f} tokens/s"
"| total_loss {:7.2f}"
"| mlm_loss {:7.2f}"
"| mlm_acc {:.3f}"
"| sep_loss {:7.2f}"
"| sep_acc {:.3f}"
"| tok_loss {:7.2f}"
"| tok_acc {:.3f}"
"| rand_acc {:.3f}"
"| tcr_loss {:7.2f}"
"| tcr_acc {:.3f}".format(
steps, total_steps,
done_tokens / elapsed,
total_loss / args.report_steps,
total_mlm_loss / args.report_steps,
total_mlm_crt / total_mlm_cnt,
total_sep_loss / args.report_steps,
total_sep_crt / total_sep_cnt,
total_tok_loss / args.report_steps,
total_tok_crt / total_tok_cnt,
total_rand_num / total_rand_den,
total_tcr_loss / args.report_steps,
total_tcr_crt / total_tcr_cnt))
total_loss = 0.
total_mlm_loss, total_mlm_crt, total_mlm_cnt = 0., 0., 0.
total_sep_loss, total_sep_crt, total_sep_cnt = 0., 0., 0.
total_tok_loss, total_tok_crt, total_tok_cnt = 0., 0., 0.
total_tcr_loss, total_tcr_crt, total_tcr_cnt = 0., 0., 0.
total_rand_num, total_rand_den = 0., 0.
start_time = time.time()
if steps % args.save_checkpoint_steps == 0 and (not args.dist_train or (args.dist_train and rank == 0)):
save_model(model, args.output_model_path + "-" + str(steps))
steps += 1
def train_tuta(args, gpu_id, rank, loader, model, optimizer, scheduler):
model.train()
start_time = time.time()
total_loss = 0.
total_mlm_loss, total_mlm_crt, total_mlm_cnt = 0., 0., 0.
total_sep_loss, total_sep_crt, total_sep_cnt = 0., 0., 0.
total_tok_loss, total_tok_crt, total_tok_cnt = 0., 0., 0.
total_tcr_loss, total_tcr_crt, total_tcr_cnt = 0., 0., 0.
total_rand_num, total_rand_den = 0., 0.
steps = 1
total_steps = args.total_steps
loader_iter = iter(loader)
while True:
if steps == total_steps + 1:
break
token_id, num_mag, num_pre, num_top, num_low, \
token_order, pos_row, pos_col, pos_top, pos_left, format_vec, indicator, \
mlm_label, clc_label, tcr_label = next(loader_iter)
while (clc_label.size()[0] == 0) or (torch.min(clc_label) >= 0) or (torch.max(mlm_label) == -1):
token_id, num_mag, num_pre, num_top, num_low, \
token_order, pos_row, pos_col, pos_top, pos_left, format_vec, indicator, \
mlm_label, clc_label, tcr_label = next(loader_iter)
model.zero_grad()
if gpu_id is not None:
token_id = token_id.cuda(gpu_id)
num_mag = num_mag.cuda(gpu_id)
num_pre = num_pre.cuda(gpu_id)
num_top = num_top.cuda(gpu_id)
num_low = num_low.cuda(gpu_id)
token_order = token_order.cuda(gpu_id)
pos_row = pos_row.cuda(gpu_id)
pos_col = pos_col.cuda(gpu_id)
pos_top = pos_top.cuda(gpu_id)
pos_left = pos_left.cuda(gpu_id)
format_vec = format_vec.cuda(gpu_id)
indicator = indicator.cuda(gpu_id)
mlm_label = mlm_label.cuda(gpu_id)
clc_label = clc_label.cuda(gpu_id)
tcr_label = tcr_label.cuda(gpu_id)
# forward
mlm_triple, sep_triple, tok_triple, tcr_triple = model(
token_id, num_mag, num_pre, num_top, num_low,
token_order, pos_row, pos_col, pos_top, pos_left, format_vec, indicator,
mlm_label, clc_label, tcr_label
)
mlm_loss, mlm_crt, mlm_cnt = mlm_triple
sep_loss, sep_crt, sep_cnt = sep_triple
tok_loss, tok_crt, tok_cnt = tok_triple
tcr_loss, tcr_crt, tcr_cnt = tcr_triple
# backward
loss = mlm_loss + (sep_loss + tok_loss) * args.clc_weight + tcr_loss
loss.backward()
optimizer.step()
scheduler.step()
total_mlm_loss += mlm_loss.item()
total_mlm_crt += mlm_crt.item()
total_mlm_cnt += mlm_cnt.item()
total_sep_loss += sep_loss.item()
total_sep_crt += sep_crt.item()
total_sep_cnt += sep_cnt.item()
total_tok_loss += tok_loss.item()
total_tok_crt += tok_crt.item()
total_tok_cnt += tok_cnt.item()
rand_num = torch.sum((clc_label > 0).long(), dim=-1)
rand_den = torch.sum(rand_num, dim=-1).item() // 2
rand_num = torch.sum((rand_num > 0).long(), dim=-1).item()
total_rand_num += rand_num
total_rand_den += rand_den
total_tcr_loss += tcr_loss.item()
total_tcr_crt += tcr_crt.item()
total_tcr_cnt += tcr_cnt.item()
total_loss = total_mlm_loss + \
(total_sep_loss + total_tok_loss) * \
args.clc_weight + total_tcr_loss
if steps % args.report_steps == 0 and (not args.dist_train or (args.dist_train and rank == 0)):
elapsed = time.time() - start_time
done_tokens = \
args.batch_size * token_id.size(1) * args.report_steps * args.world_size \
if args.dist_train \
else args.batch_size * token_id.size(1) * args.report_steps
print("| {:8d}/{:8d} steps"
"| {:8.2f} tokens/s"
"| total_loss {:7.2f}"
"| mlm_loss {:7.2f}"
"| mlm_acc {:.3f}"
"| sep_loss {:7.2f}"
"| sep_acc {:.3f}"
"| tok_loss {:7.2f}"
"| tok_acc {:.3f}"
"| rand_acc {:.3f}"
"| tcr_loss {:7.2f}"
"| tcr_acc {:.3f}".format(
steps, total_steps,
done_tokens / elapsed,
total_loss / args.report_steps,
total_mlm_loss / args.report_steps,
total_mlm_crt / total_mlm_cnt,
total_sep_loss / args.report_steps,
total_sep_crt / total_sep_cnt,
total_tok_loss / args.report_steps,
total_tok_crt / total_tok_cnt,
total_rand_num / total_rand_den,
total_tcr_loss / args.report_steps,
total_tcr_crt / total_tcr_cnt))
total_loss = 0.
total_mlm_loss, total_mlm_crt, total_mlm_cnt = 0., 0., 0.
total_sep_loss, total_sep_crt, total_sep_cnt = 0., 0., 0.
total_tok_loss, total_tok_crt, total_tok_cnt = 0., 0., 0.
total_tcr_loss, total_tcr_crt, total_tcr_cnt = 0., 0., 0.
total_rand_num, total_rand_den = 0., 0.
start_time = time.time()
if steps % args.save_checkpoint_steps == 0 and (not args.dist_train or (args.dist_train and rank == 0)):
print("saving model")
save_model(model, args.output_model_path + "-" + str(steps))
steps += 1
print("the model wasn't saved")
TRAINERS = {
"base": train_base,
"tuta": train_tuta,
"tuta_explicit": train_tuta,
}
| 40.672598 | 113 | 0.550967 | 1,544 | 11,429 | 3.685233 | 0.071891 | 0.013357 | 0.044288 | 0.016872 | 0.95536 | 0.95536 | 0.95536 | 0.95536 | 0.95536 | 0.95536 | 0 | 0.01839 | 0.343425 | 11,429 | 280 | 114 | 40.817857 | 0.739872 | 0.010325 | 0 | 0.922414 | 0 | 0 | 0.043293 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008621 | false | 0 | 0.012931 | 0 | 0.021552 | 0.017241 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
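The module closes with a small `TRAINERS` registry mapping pre-training variants to their loops; `train_tuta` differs from `train_base` only in feeding explicit row/column coordinates (`pos_row`, `pos_col`) to the model. A dispatch sketch, where the `args.target` attribute is an assumption and the remaining arguments mirror the trainer signatures:

```python
# Dispatch sketch for the TRAINERS registry above. args.target is assumed;
# the other arguments are exactly those the trainer functions accept.
def run_pretraining(args, gpu_id, rank, loader, model, optimizer, scheduler):
    trainer = TRAINERS[args.target]  # "base", "tuta", or "tuta_explicit"
    trainer(args, gpu_id, rank, loader, model, optimizer, scheduler)
```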
fb342cf20d862ee4c492d0460e06d6f9a19c444d | 108 | py | Python | pyuploadtool/build_systems/exceptions.py | knarfS/pyuploadtool | 72aaf58f1aff35566960cf19b0f95239d245982f | [
"MIT"
] | 6 | 2020-12-12T14:09:28.000Z | 2021-06-29T06:23:11.000Z | pyuploadtool/build_systems/exceptions.py | knarfS/pyuploadtool | 72aaf58f1aff35566960cf19b0f95239d245982f | [
"MIT"
] | 7 | 2020-12-11T19:50:03.000Z | 2022-01-08T00:24:42.000Z | pyuploadtool/build_systems/exceptions.py | knarfS/pyuploadtool | 72aaf58f1aff35566960cf19b0f95239d245982f | [
"MIT"
] | 3 | 2020-12-12T14:00:49.000Z | 2021-06-19T12:31:11.000Z | from pyuploadtool.exceptions import PyUploadtoolError
class BuildSystemError(PyUploadtoolError):
pass
| 18 | 53 | 0.842593 | 9 | 108 | 10.111111 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12037 | 108 | 5 | 54 | 21.6 | 0.957895 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
349dee35bf01db0778d0d3670f99f0aa4365c869 | 340 | py | Python | product_management_models/product_supplies/models.py | reimibeta/django-product-management-models | f51e94cc6ae605ea21706ffe2baedc53b980112f | [
"Apache-2.0"
] | null | null | null | product_management_models/product_supplies/models.py | reimibeta/django-product-management-models | f51e94cc6ae605ea21706ffe2baedc53b980112f | [
"Apache-2.0"
] | null | null | null | product_management_models/product_supplies/models.py | reimibeta/django-product-management-models | f51e94cc6ae605ea21706ffe2baedc53b980112f | [
"Apache-2.0"
] | null | null | null | # product supply
from product_management_models.product_supplies.class_models.product_supply import *
# product supply delivery
from product_management_models.product_supplies.class_models.product_supply_deliveries import *
# product supply stock
from product_management_models.product_supplies.class_models.product_supply_stock import *
| 37.777778 | 95 | 0.882353 | 43 | 340 | 6.581395 | 0.255814 | 0.275618 | 0.222615 | 0.286219 | 0.699647 | 0.699647 | 0.699647 | 0.699647 | 0.699647 | 0.699647 | 0 | 0 | 0.073529 | 340 | 8 | 96 | 42.5 | 0.898413 | 0.173529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
9ba91f162154c3093d31954a0ee2506db650d787 | 35363 | py | Python | tests/models/test_process.py | d-cat-support/fusion-platform-python-sdk | 6f98a60f33a962f6a10861da15affbc28bf4a17a | [
"MIT"
] | null | null | null | tests/models/test_process.py | d-cat-support/fusion-platform-python-sdk | 6f98a60f33a962f6a10861da15affbc28bf4a17a | [
"MIT"
] | null | null | null | tests/models/test_process.py | d-cat-support/fusion-platform-python-sdk | 6f98a60f33a962f6a10861da15affbc28bf4a17a | [
"MIT"
] | null | null | null | #
# Process model class test file.
#
# @author Matthew Casey
#
# (c) Digital Content Analysis Technology Ltd 2022
#
import json
import pytest
import requests
import requests_mock
import uuid
import fusion_platform
from tests.custom_test_case import CustomTestCase
from fusion_platform.common.utilities import json_default
from fusion_platform.models.data import Data
from fusion_platform.models.model import Model, ModelError
from fusion_platform.models.process import Process, ProcessInputSchema, ProcessOptionSchema, ProcessSchema
from fusion_platform.models.process_execution import ProcessExecution
from fusion_platform.session import Session, RequestError
class TestProcess(CustomTestCase):
"""
Process model tests.
"""
def test_init(self):
"""
Test initialisation of the process model class to ensure no exceptions are raised.
"""
process = Process(Session())
self.assertIsNotNone(process)
def test_create(self):
"""
Tests that a process object can be created.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
path = Process._PATH_CREATE.format(organisation_id=organisation_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_NEW.format(organisation_id=organisation_id)}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process._new(organisation_id=organisation_id)
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
process.create()
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
process.create()
with pytest.raises(ModelError):
mock.post(f"{Session.API_URL_DEFAULT}{path}", text='{}')
process.create()
mock.post(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.create()
schema = ProcessSchema()
for key in content:
if (Model._METADATA_HIDE not in schema.fields[key].metadata) and (content[key] is not None):
self.assertEqual(json.dumps(content[key], default=json_default), json.dumps(getattr(process, key), default=json_default))
def test_delete(self):
"""
Tests that an object can be deleted from the API.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_DELETE.format(organisation_id=organisation_id, process_id=process_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)}",
text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
with pytest.raises(RequestError):
mock.delete(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
process.delete()
with pytest.raises(RequestError):
mock.delete(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
process.delete()
mock.delete(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.delete()
def test_execute_no_wait(self):
"""
Tests that a process can be executed without waiting.
"""
with open(self.fixture_path('process.json'), 'r') as file:
process_content = json.loads(file.read())
with open(self.fixture_path('process_execution.json'), 'r') as file:
execution_content = json.loads(file.read())
session = Session()
organisation_id = process_content.get('organisation_id')
process_id = process_content.get(Model._FIELD_ID)
process_execution_id = execution_content.get(Model._FIELD_ID)
execute_path = Process._PATH_EXECUTE.format(organisation_id=organisation_id, process_id=process_id)
get_path = Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)
executions_path = Process._PATH_EXECUTIONS.format(organisation_id=organisation_id, process_id=process_id)
execution_path = ProcessExecution._PATH_GET.format(organisation_id=organisation_id, process_execution_id=process_execution_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", exc=requests.exceptions.ConnectTimeout)
process.execute()
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", status_code=400)
process.execute()
with pytest.raises(ModelError):
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", text='{}')
process.execute()
process_content['process_status'] = Process._PROCESS_STATUS_EXECUTE
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process.execute()
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", exc=requests.exceptions.ConnectTimeout)
process.wait_for_next_execution()
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", status_code=400)
process.wait_for_next_execution()
with pytest.raises(ModelError):
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text='{}')
process.wait_for_next_execution()
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{executions_path}", exc=requests.exceptions.ConnectTimeout)
process.wait_for_next_execution()
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{executions_path}", status_code=400)
process.wait_for_next_execution()
with pytest.raises(ModelError):
mock.get(f"{Session.API_URL_DEFAULT}{executions_path}", text='{}')
process.wait_for_next_execution()
mock.get(f"{Session.API_URL_DEFAULT}{executions_path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [execution_content]}))
process.wait_for_next_execution()
for process_execution in process.executions:
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{execution_path}", exc=requests.exceptions.ConnectTimeout)
process_execution.check_complete()
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{execution_path}", status_code=400)
process_execution.check_complete()
with pytest.raises(ModelError):
mock.get(f"{Session.API_URL_DEFAULT}{execution_path}", text='{}')
process_execution.check_complete()
mock.get(f"{Session.API_URL_DEFAULT}{execution_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: execution_content}))
self.assertTrue(process_execution.check_complete())
def test_execute_wait(self):
"""
Tests that a process can be executed with waiting.
"""
with open(self.fixture_path('process.json'), 'r') as file:
process_content = json.loads(file.read())
with open(self.fixture_path('process_execution.json'), 'r') as file:
execution_content = json.loads(file.read())
session = Session()
organisation_id = process_content.get('organisation_id')
process_id = process_content.get(Model._FIELD_ID)
process_execution_id = execution_content.get(Model._FIELD_ID)
execute_path = Process._PATH_EXECUTE.format(organisation_id=organisation_id, process_id=process_id)
get_path = Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)
executions_path = Process._PATH_EXECUTIONS.format(organisation_id=organisation_id, process_id=process_id)
execution_path = ProcessExecution._PATH_GET.format(organisation_id=organisation_id, process_execution_id=process_execution_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", exc=requests.exceptions.ConnectTimeout)
process.execute(wait=True)
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", status_code=400)
process.execute(wait=True)
with pytest.raises(ModelError):
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", text='{}')
process.execute(wait=True)
process_content['process_status'] = Process._PROCESS_STATUS_EXECUTE
mock.post(f"{Session.API_URL_DEFAULT}{execute_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
mock.get(f"{Session.API_URL_DEFAULT}{executions_path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [execution_content]}))
mock.get(f"{Session.API_URL_DEFAULT}{execution_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: execution_content}))
process.execute(wait=True)
def test_executions(self):
"""
Tests the executions property retrieves process execution items.
"""
with open(self.fixture_path('process.json'), 'r') as file:
process_content = json.loads(file.read())
with open(self.fixture_path('process_execution.json'), 'r') as file:
execution_content = json.loads(file.read())
session = Session()
organisation_id = process_content.get('organisation_id')
process_id = process_content.get(Model._FIELD_ID)
path = Process._PATH_EXECUTIONS.format(organisation_id=organisation_id, process_id=process_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)}",
text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
next(process.executions)
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
next(process.executions)
with pytest.raises(StopIteration):
mock.get(f"{Session.API_URL_DEFAULT}{path}", text='{}')
next(process.executions)
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [execution_content]}))
for execution in process.executions:
self.assertEqual(str(process_id), str(execution.process_id))
def test_find_executions(self):
"""
Tests the finding of process execution items.
"""
with open(self.fixture_path('process.json'), 'r') as file:
process_content = json.loads(file.read())
with open(self.fixture_path('process_execution.json'), 'r') as file:
execution_content = json.loads(file.read())
session = Session()
organisation_id = process_content.get('organisation_id')
process_id = process_content.get(Model._FIELD_ID)
process_execution_id = execution_content.get(Model._FIELD_ID)
group_id = execution_content.get(Model._FIELD_GROUP_ID)
path = Process._PATH_EXECUTIONS.format(organisation_id=organisation_id, process_id=process_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)}",
text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
process.find_executions()
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
process.find_executions()
with pytest.raises(StopIteration):
mock.get(f"{Session.API_URL_DEFAULT}{path}", text='{}')
first, generator = process.find_executions()
self.assertIsNone(first)
next(generator)
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [execution_content]}))
first, generator = process.find_executions(id=process_execution_id, group_id=group_id)
self.assertIsNotNone(first)
for execution in generator:
self.assertEqual(first.attributes, execution.attributes)
def test_get(self):
"""
Tests that an object can be retrieved from the API.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
process.get(organisation_id=organisation_id, process_id=process_id)
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
process.get(organisation_id=organisation_id, process_id=process_id)
with pytest.raises(ModelError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", text='{}')
process.get(organisation_id=organisation_id, process_id=process_id)
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
process.get()
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
def test_inputs(self):
"""
Tests that the inputs can be obtained from a process model.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
inputs = process.inputs
self.assertIsNotNone(inputs)
for input in inputs:
self.assertIsNotNone(input.ssd_id)
def test_model_from_api_id(self):
"""
Tests that an object can be created from an API endpoint.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)
with requests_mock.Mocker() as mock:
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
Process._model_from_api_id(session, id=process_id, organisation_id=organisation_id)
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
Process._model_from_api_id(session, id=process_id, organisation_id=organisation_id)
with pytest.raises(ModelError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", text='{}')
Process._model_from_api_id(session, id=process_id, organisation_id=organisation_id)
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process = Process._model_from_api_id(session, id=process_id, organisation_id=organisation_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
def test_models_from_api_ids(self):
"""
Tests that objects can be created from an API endpoint.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
processes = Process._models_from_api_ids(session, [{Model._FIELD_ID: process_id, 'organisation_id': organisation_id}])
self.assertIsNotNone(processes)
for process in processes:
self.assertEqual(str(process_id), str(process.id))
def test_models_from_api_path(self):
"""
Tests that objects can be created from an API endpoint returning a list.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
process_id = content.get(Model._FIELD_ID)
path = '/path'
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [content]}))
processes = Process._models_from_api_path(session, path)
self.assertIsNotNone(processes)
for process in processes:
self.assertEqual(str(process_id), str(process.id))
def test_new(self):
"""
        Tests that a new template object can be created from an API endpoint, with validation using a Marshmallow schema.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
wrong_content = {}
for key in content:
wrong_content[f"new_{key}"] = content[key]
session = Session()
organisation_id = content.get('organisation_id')
path = Process._PATH_NEW.format(organisation_id=organisation_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
process._new(organisation_id=organisation_id)
with pytest.raises(RequestError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
process._new(organisation_id=organisation_id)
with pytest.raises(ModelError):
mock.get(f"{Session.API_URL_DEFAULT}{path}", text='{}')
process._new(organisation_id=organisation_id)
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: wrong_content}))
process._new(organisation_id=organisation_id)
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process._new(organisation_id=organisation_id)
schema = ProcessSchema()
for key in content:
if (Model._METADATA_HIDE not in schema.fields[key].metadata) and (content[key] is not None):
self.assertEqual(json.dumps(content[key], default=json_default), json.dumps(getattr(process, key), default=json_default))
def test_options(self):
"""
Tests that the options can be obtained from a process model.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
options = process.options
self.assertIsNotNone(options)
for option in options:
self.assertIsNotNone(option.ssd_id)
def test_schema(self):
"""
Tests that a process model can be loaded into the schema.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
model = ProcessSchema().load(content)
self.assertIsNotNone(model)
for key in content:
self.assertEqual(json.dumps(content[key], default=json_default), json.dumps(model[key], default=json_default))
def test_stop(self):
"""
Tests that a process can be stopped.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_STOP.format(organisation_id=organisation_id, process_id=process_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)}",
text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
process.stop()
with pytest.raises(RequestError):
mock.post(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
process.stop()
with pytest.raises(ModelError):
mock.post(f"{Session.API_URL_DEFAULT}{path}", text='{}')
process.stop()
mock.post(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.stop()
def test_update(self):
"""
        Tests that an object can be updated via the API.
"""
with open(self.fixture_path('process.json'), 'r') as file:
content = json.loads(file.read())
session = Session()
organisation_id = content.get('organisation_id')
process_id = content.get(Model._FIELD_ID)
path = Process._PATH_PATCH.format(organisation_id=organisation_id, process_id=process_id)
name = 'New Name'
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)}",
text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertIsNotNone(process)
self.assertEqual(str(process_id), str(process.id))
with pytest.raises(RequestError):
mock.patch(f"{Session.API_URL_DEFAULT}{path}", exc=requests.exceptions.ConnectTimeout)
process.update(name=name)
with pytest.raises(RequestError):
mock.patch(f"{Session.API_URL_DEFAULT}{path}", status_code=400)
process.update(name=name)
with pytest.raises(ModelError):
mock.patch(f"{Session.API_URL_DEFAULT}{path}", text='{}')
process.update(name=name)
self.assertNotEqual(name, process.name)
content['process_status'] = Process._PROCESS_STATUS_EXECUTE
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)}",
text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertEqual(Process._PROCESS_STATUS_EXECUTE, process.process_status)
with pytest.raises(ModelError):
mock.patch(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.update(name=name)
content['process_status'] = 'draft'
mock.get(f"{Session.API_URL_DEFAULT}{Process._PATH_GET.format(organisation_id=organisation_id, process_id=process_id)}",
text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.get(organisation_id=organisation_id, process_id=process_id)
self.assertNotEqual(Process._PROCESS_STATUS_EXECUTE, process.process_status)
content['name'] = name
mock.patch(f"{Session.API_URL_DEFAULT}{path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: content}))
process.update(name=name)
self.assertEqual(name, process.name)
def test_update_input(self):
"""
Tests setting an input on a template process.
"""
with open(self.fixture_path('process.json'), 'r') as file:
process_content = json.loads(file.read())
input_content = process_content['inputs'][0]
with open(self.fixture_path('data.json'), 'r') as file:
data_content = json.loads(file.read())
with open(self.fixture_path('data_file.json'), 'r') as file:
file_content = json.loads(file.read())
session = Session()
organisation_id = process_content.get('organisation_id')
data_id = uuid.uuid4()
get_path = Process._PATH_NEW.format(organisation_id=organisation_id)
files_path = Data._PATH_FILES.format(organisation_id=organisation_id, data_id=data_id)
process = Process(session)
self.assertIsNotNone(process)
data = Data(session)
data_content['id'] = data_id
data._set_model(data_content)
with requests_mock.Mocker() as mock:
with pytest.raises(ModelError):
process.update(input_number=1, data=None)
with pytest.raises(ModelError):
process_content['process_status'] = Process._PROCESS_STATUS_EXECUTE
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process._new(organisation_id=organisation_id)
input = next(process.inputs)
process.update(input=input, data=data)
process_content['process_status'] = 'draft'
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process._new(organisation_id=organisation_id)
with pytest.raises(ModelError):
process.update(input_number=2, data=data)
with pytest.raises(ModelError):
input_content['input'] = 2
input = Model(None, schema=ProcessInputSchema())
input._set_model(input_content)
process.update(input=input, data=data)
del file_content['publishable']
mock.get(f"{Session.API_URL_DEFAULT}{files_path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [file_content]}))
input = next(process.inputs)
with pytest.raises(ModelError):
process.update(input=input, data=data)
file_content['publishable'] = False
mock.get(f"{Session.API_URL_DEFAULT}{files_path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [file_content]}))
with pytest.raises(ModelError):
file_content['file_type'] = fusion_platform.FILE_TYPE_GEOTIFF
mock.get(f"{Session.API_URL_DEFAULT}{files_path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [file_content]}))
process.update(input_number=1, data=data)
file_content['file_type'] = fusion_platform.FILE_TYPE_ESRI_SHAPEFILE
mock.get(f"{Session.API_URL_DEFAULT}{files_path}", text=json.dumps({Model._RESPONSE_KEY_LIST: [file_content]}))
self.assertEqual(str(process_content['inputs'][0]['id']), str(next(process.inputs).id))
process.update(input_number=1, data=data)
self.assertEqual(str(data_id), str(next(process.inputs).id))
process._new(organisation_id=organisation_id)
self.assertEqual(str(process_content['inputs'][0]['id']), str(next(process.inputs).id))
process.update(input=input, data=data)
self.assertEqual(str(data_id), str(next(process.inputs).id))
def test_update_option(self):
"""
Tests setting an option on a template process.
"""
with open(self.fixture_path('process.json'), 'r') as file:
process_content = json.loads(file.read())
option_content = process_content['options'][0]
session = Session()
organisation_id = process_content.get('organisation_id')
get_path = Process._PATH_NEW.format(organisation_id=organisation_id)
process = Process(session)
self.assertIsNotNone(process)
with requests_mock.Mocker() as mock:
with pytest.raises(ModelError):
process_content['process_status'] = Process._PROCESS_STATUS_EXECUTE
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process._new(organisation_id=organisation_id)
option = next(process.options)
process.update(option=option, value=None)
process_content['process_status'] = 'draft'
mock.get(f"{Session.API_URL_DEFAULT}{get_path}", text=json.dumps({Model._RESPONSE_KEY_MODEL: process_content}))
process._new(organisation_id=organisation_id)
with pytest.raises(ModelError):
option_content['name'] = 'rubbish'
option = Model(None, schema=ProcessOptionSchema())
option._set_model(option_content)
process.update(option=option, value=None)
option_content['name'] = 'latest_date'
with pytest.raises(ModelError):
option = Model(None, schema=ProcessOptionSchema())
option._set_model(option_content)
process.update(option=option, value='rubbish')
option = next(process.options)
self.assertEqual(True, next(process.options).value)
process.update(option_name='latest_date', value=False)
self.assertEqual(False, next(process.options).value)
process._new(organisation_id=organisation_id)
self.assertEqual(True, next(process.options).value)
process.update(option=option, value=False)
self.assertEqual(False, next(process.options).value)
process.update(option=option, value=None)
self.assertEqual(None, next(process.options).value)
| 45.105867 | 164 | 0.65085 | 4,123 | 35,363 | 5.329614 | 0.043415 | 0.100664 | 0.046555 | 0.052244 | 0.879312 | 0.864658 | 0.845681 | 0.824656 | 0.81196 | 0.798307 | 0 | 0.002078 | 0.237763 | 35,363 | 783 | 165 | 45.163474 | 0.813133 | 0.034641 | 0 | 0.77551 | 0 | 0 | 0.126756 | 0.102289 | 0 | 0 | 0 | 0 | 0.128015 | 1 | 0.03525 | false | 0 | 0.024119 | 0 | 0.061224 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9ba9f9d1ee73f99223587084d47e6bccd7272833 | 688 | py | Python | tests/utils/conftest.py | zaczw/stochastic | 7de6ec2f9050120adfcffeebc94bfc17ec916150 | [
"MIT"
] | 268 | 2018-01-17T18:45:20.000Z | 2022-03-28T06:05:30.000Z | tests/utils/conftest.py | zaczw/stochastic | 7de6ec2f9050120adfcffeebc94bfc17ec916150 | [
"MIT"
] | 42 | 2018-07-11T02:17:43.000Z | 2021-11-27T03:27:32.000Z | tests/utils/conftest.py | zaczw/stochastic | 7de6ec2f9050120adfcffeebc94bfc17ec916150 | [
"MIT"
] | 56 | 2018-02-20T09:32:50.000Z | 2022-02-15T15:39:37.000Z | """Pytest fixtures."""
import pytest
@pytest.fixture(params=[4, 4.2, "4", -4])
def increments_fixture(request):
return request.param
@pytest.fixture(params=[4, 4.2, "4"])
def number_fixture(request):
return request.param
@pytest.fixture(params=[4, 0, -4])
def positive_number_fixture(request):
return request.param
@pytest.fixture(params=["PARAMETER_NAME"])
def parameter_name_fixture(request):
return request.param
@pytest.fixture(params=[4, 0, -4])
def nonnegative_number_fixture(request):
return request.param
@pytest.fixture(params=[16])
def n(request):
return request.param
@pytest.fixture(params=[1])
def end(request):
return request.param
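# Illustrative usage sketch (editorial addition, not part of the original
# conftest.py): pytest runs a test once per entry in a fixture's `params`,
# so this hypothetical test would execute four times, receiving 4, 4.2,
# "4" and -4 in turn from `increments_fixture`.
def test_increments_fixture_fanout(increments_fixture):
    # Each parametrized value is handed to the test unchanged.
    assert increments_fixture in (4, 4.2, "4", -4)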
| 18.105263 | 42 | 0.718023 | 94 | 688 | 5.159574 | 0.234043 | 0.187629 | 0.274227 | 0.360825 | 0.734021 | 0.734021 | 0.734021 | 0.589691 | 0.589691 | 0.235052 | 0 | 0.0301 | 0.130814 | 688 | 37 | 43 | 18.594595 | 0.780936 | 0.023256 | 0 | 0.409091 | 0 | 0 | 0.024024 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.318182 | false | 0 | 0.045455 | 0.318182 | 0.681818 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
9bbe3ca02890c8a1af1af4eeacbba80a21ab0fe6 | 140,730 | py | Python | src/frr/tests/topotests/multicast_pim_static_rp_topo1/test_multicast_pim_static_rp.py | zhouhaifeng/vpe | 9c644ffd561988e5740021ed26e0f7739844353d | [
"Apache-2.0"
] | null | null | null | src/frr/tests/topotests/multicast_pim_static_rp_topo1/test_multicast_pim_static_rp.py | zhouhaifeng/vpe | 9c644ffd561988e5740021ed26e0f7739844353d | [
"Apache-2.0"
] | null | null | null | src/frr/tests/topotests/multicast_pim_static_rp_topo1/test_multicast_pim_static_rp.py | zhouhaifeng/vpe | 9c644ffd561988e5740021ed26e0f7739844353d | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
#
# Copyright (c) 2019 by VMware, Inc. ("VMware")
# Used Copyright (c) 2018 by Network Device Education Foundation,
# Inc. ("NetDEF") in this file.
#
# Permission to use, copy, modify, and/or distribute this software
# for any purpose with or without fee is hereby granted, provided
# that the above copyright notice and this permission notice appear
# in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND VMWARE DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL VMWARE BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY
# DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS,
# WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS
# ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE
# OF THIS SOFTWARE.
#
"""
Following tests are covered to test Multicast basic functionality:
Topology:
_______r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
| |
|_____________|
r4
Test steps
- Create topology (setup module)
- Bring up topology
TC_1 : Verify upstream interfaces(IIF) and join state are updated properly
after adding and deleting the static RP
TC_2 : Verify IIF and OIL in "show ip pim state" updated properly after
adding and deleting the static RP
TC_3: (*, G) Mroute entry are cleared when static RP gets deleted
TC_4: Verify (*,G) prune is send towards the RP after deleting the static RP
TC_5: Verify OIF entry for RP is cleared when RP becomes unreachable
TC_6: Verify IIF and OIL in "show ip pim state" updated properly when RP
becomes unreachable
TC_7 : Verify upstream interfaces(IIF) and join state are updated properly
after adding and deleting the static RP
TC_8: Verify (*,G) prune is send towards the RP when RP becomes unreachable
TC_9 : Verify RP configured after IGMP join received, PIM join towards RP is
sent immediately
TC_10 : Verify RP becomes reachable after IGMP join received, PIM join
towards RP is sent immediately
TC_11 : Verify PIM join send towards the higher preferred RP
TC_12 : Verify PIM prune send towards the lower preferred RP
TC_13 : Verify RPF interface is updated in mroute (kernel) when higher
preferred overlapping RP configured
TC_14 : Verify IIF and OIL in "show ip pim state" updated properly when higher
preferred overlapping RP configured
TC_15 : Verify upstream interfaces(IIF) and join state are updated when higher
preferred overlapping RP is configured
TC_16 : Verify join is send to lower preferred RP, when higher preferred RP
gets deleted
TC_17 : Verify prune is send to higher preferred RP when higher preferred RP
gets deleted
TC_18 : Verify RPF interface updated in mroute when higher preferred RP gets
deleted
TC_19 : Verify IIF and OIL in "show ip pim state" updated when higher
preferred overlapping RP is deleted
TC_20 : Verify PIM upstream IIF updated when higher preferred overlapping RP
deleted
TC_21_1 : Verify OIF and RPF for (*,G) and (S,G) when static RP is configured
on the LHR router
TC_21_2 : Verify OIF and RPF for (*,G) and (S,G) when static RP is configured
on the LHR router
TC_22_1 : Verify OIF and RPF for (*,G) and (S,G) when static RP is configured
on the FHR router
TC_22_2 : Verify OIF and RPF for (*,G) and (S,G) when static RP is configured
on the FHR router
TC_23 : Verify (*,G) and (S,G) populated correctly when RPT and SPT path are
different
TC_24 : Verify (*,G) and (S,G) populated correctly when SPT and RPT share the
same path
TC_25 : Verify (*,G) and (S,G) populated correctly after clearing the PIM,
IGMP joins and mroutes
TC_26 : Restart the PIMd process and verify PIM joins , and mroutes entries
TC_27 : Configure multiple groups (10 grps) with same RP address
TC_28 : Configure multiple groups (10 grps) with different RP address
TC_29 : Verify IIF and OIL in updated in mroute when upstream interface
configure as RP
TC_30 : Verify IIF and OIL change to other path after shut the primary path
TC_31 : Verify RP info and (*,G) mroute after deleting the RP and shut / no
shut the RPF interface.
TC_32 : Verify RP info and (*,G) mroute after deleting the RP and shut / no
shut the RPF interface.
"""
import os
import sys
import time
from time import sleep
import datetime
import pytest
# Save the Current Working Directory to find configuration files.
CWD = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.join(CWD, "../"))
sys.path.append(os.path.join(CWD, "../lib/"))
# Required to instantiate the topology builder class.
# pylint: disable=C0413
# Import topogen and topotest helpers
from lib.topogen import Topogen, get_topogen
from lib.topolog import logger
from lib.topojson import build_topo_from_json, build_config_from_json
from lib.common_config import (
start_topology,
write_test_header,
write_test_footer,
reset_config_on_routers,
step,
shutdown_bringup_interface,
kill_router_daemons,
start_router_daemons,
create_static_routes,
topo_daemons,
)
from lib.pim import (
create_pim_config,
verify_igmp_groups,
verify_upstream_iif,
verify_join_state_and_timer,
verify_ip_mroutes,
verify_pim_neighbors,
verify_pim_interface_traffic,
verify_pim_rp_info,
verify_pim_state,
clear_ip_pim_interface_traffic,
clear_ip_igmp_interfaces,
clear_ip_pim_interfaces,
clear_ip_mroute,
clear_ip_mroute_verify,
McastTesterHelper,
)
pytestmark = [pytest.mark.pimd, pytest.mark.staticd]
# Global variables
GROUP_RANGE_ALL = "224.0.0.0/4"
GROUP_RANGE = "225.1.1.1/32"
GROUP_RANGE_LIST_1 = [
"225.1.1.1/32",
"225.1.1.2/32",
"225.1.1.3/32",
"225.1.1.4/32",
"225.1.1.5/32",
]
GROUP_RANGE_LIST_2 = [
"225.1.1.6/32",
"225.1.1.7/32",
"225.1.1.8/32",
"225.1.1.9/32",
"225.1.1.10/32",
]
GROUP_ADDRESS = "225.1.1.1"
GROUP_ADDRESS_LIST_1 = ["225.1.1.1", "225.1.1.2", "225.1.1.3", "225.1.1.4", "225.1.1.5"]
GROUP_ADDRESS_LIST_2 = [
"225.1.1.6",
"225.1.1.7",
"225.1.1.8",
"225.1.1.9",
"225.1.1.10",
]
STAR = "*"
SOURCE_ADDRESS = "10.0.6.2"
SOURCE = "Static"
def build_topo(tgen):
"""Build function"""
# Building topology from json file
build_topo_from_json(tgen, TOPO)
def setup_module(mod):
"""
Sets up the pytest environment
* `mod`: module name
"""
testsuite_run_time = time.asctime(time.localtime(time.time()))
logger.info("Testsuite start time: %s", testsuite_run_time)
logger.info("=" * 40)
topology = """
_______r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
| |
|_____________|
r4
"""
logger.info("Master Topology: \n %s", topology)
logger.info("Running setup_module to create topology")
# This function initiates the topology build with Topogen...
json_file = "{}/multicast_pim_static_rp.json".format(CWD)
tgen = Topogen(json_file, mod.__name__)
global TOPO
TOPO = tgen.json_topo
# ... and here it calls Mininet initialization functions.
# get list of daemons needs to be started for this suite.
daemons = topo_daemons(tgen, TOPO)
# Starting topology, create tmp files which are loaded to routers
    # to start daemons and then start routers
start_topology(tgen, daemons)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
# Creating configuration from JSON
build_config_from_json(tgen, TOPO)
# Verify PIM neighbors
result = verify_pim_neighbors(tgen, TOPO)
assert result is True, "setup_module :Failed \n Error:" " {}".format(result)
# XXX Replace this using "with McastTesterHelper()... " in each test if possible.
global app_helper
app_helper = McastTesterHelper(tgen)
logger.info("Running setup_module() done")
def teardown_module():
"""Teardown the pytest environment"""
logger.info("Running teardown_module to delete topology")
tgen = get_topogen()
app_helper.cleanup()
    # Stop topology and remove tmp files
tgen.stop_topology()
logger.info("Testsuite end time: %s", time.asctime(time.localtime(time.time())))
logger.info("=" * 40)
#####################################################
#
# Testcases
#
#####################################################
def verify_mroute_repopulated(uptime_before, uptime_after):
"""
API to compare uptime for mroutes
Parameters
----------
* `uptime_before` : Uptime dictionary for any particular instance
* `uptime_after` : Uptime dictionary for any particular instance
"""
for group in uptime_before.keys():
for source in uptime_before[group].keys():
if set(uptime_before[group]) != set(uptime_after[group]):
errormsg = (
"mroute (%s, %s) has not come"
" up after mroute clear [FAILED!!]" % (source, group)
)
return errormsg
d_1 = datetime.datetime.strptime(uptime_before[group][source], "%H:%M:%S")
d_2 = datetime.datetime.strptime(uptime_after[group][source], "%H:%M:%S")
if d_2 >= d_1:
errormsg = "mroute (%s, %s) is not " "repopulated [FAILED!!]" % (
source,
group,
)
return errormsg
logger.info("mroute (%s, %s) is " "repopulated [PASSED!!]", source, group)
return True
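def _example_verify_mroute_repopulated():
    """Illustrative usage sketch (editorial addition, not part of the suite).

    The uptime dictionaries are keyed by group, then by source, holding
    "%H:%M:%S" uptime strings as reported by the mroute show output; the
    sample values below are hypothetical. A strictly smaller uptime after
    the clear means the mroute came back up, so the call returns True.
    """
    uptime_before = {"225.1.1.1": {"10.0.6.2": "00:05:12"}}
    uptime_after = {"225.1.1.1": {"10.0.6.2": "00:00:03"}}
    assert verify_mroute_repopulated(uptime_before, uptime_after) is True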
def verify_state_incremented(state_before, state_after):
"""
API to compare interface traffic state incrementing
Parameters
----------
* `state_before` : State dictionary for any particular instance
* `state_after` : State dictionary for any particular instance
"""
for router, state_data in state_before.items():
for state, _ in state_data.items():
if state_before[router][state] >= state_after[router][state]:
errormsg = (
"[DUT: %s]: state %s value has not"
" incremented, Initial value: %s, "
"Current value: %s [FAILED!!]"
% (
router,
state,
state_before[router][state],
state_after[router][state],
)
)
return errormsg
logger.info(
"[DUT: %s]: State %s value is "
"incremented, Initial value: %s, Current value: %s"
" [PASSED!!]",
router,
state,
state_before[router][state],
state_after[router][state],
)
return True
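def _example_verify_state_incremented():
    """Illustrative usage sketch (editorial addition, not part of the suite).

    The state dictionaries are snapshots returned by
    verify_pim_interface_traffic(), mapping router name to counter name to
    value; the counter values below are hypothetical. Every counter must
    strictly increase between the two snapshots for the check to pass.
    """
    state_before = {"r1": {"pruneTx": 1, "joinTx": 5}}
    state_after = {"r1": {"pruneTx": 3, "joinTx": 8}}
    assert verify_state_incremented(state_before, state_after) is True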
def test_add_delete_static_RP_p0(request):
"""
TC_1_P0 : Verify upstream interfaces(IIF) and join state are updated
properly after adding and deleting the static RP
TC_2_P0 : Verify IIF and OIL in "show ip pim state" updated properly
after adding and deleting the static RP
TC_3_P0: (*, G) Mroute entry are cleared when static RP gets deleted
TC_4_P0: Verify (*,G) prune is send towards the RP after deleting the
static RP
Topology used:
r0------r1-----r2
iperf DUT RP
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("pre-configuration to send IGMP join and multicast traffic")
step("Enable IGMP on r1 interface and send IGMP " "join (225.1.1.1) to r1")
step("Configure r2 loopback interface as RP")
step("Enable PIM between r1 and r3")
step("r1: Verify show ip igmp group without any IGMP join")
dut = "r1"
interface = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, interface, GROUP_ADDRESS, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: igmp group present without any IGMP join \n Error: {}".format(
tc_name, result
)
)
step("r1: Verify show ip pim interface traffic without any IGMP join")
state_dict = {"r1": {"r1-r2-eth1": ["pruneTx"]}}
state_before = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_before, dict
), "Testcase {} : Failed \n state_before is not dictionary\n Error: {}".format(
        tc_name, state_before
)
step("r0 : Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify RP info")
dut = "r1"
iif = "r1-r2-eth1"
rp_address = "1.0.2.17"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify ip pim join")
    result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
    assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Delete RP configuration")
# Delete RP configuration
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify RP info")
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE, expected=False
)
assert (
result is not True
), "Testcase {} : Failed \n " "r1: RP info present \n Error: {}".format(
tc_name, result
)
step("r1: Verify upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: upstream IIF interface present \n Error: {}".format(tc_name, result)
)
step("r1: Verify upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: upstream join state is up and join timer is running \n Error: {}".format(
tc_name, result
)
)
step("r1: Verify PIM state")
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS, expected=False)
assert result is not True, "Testcase {} :Failed \n Error: {}".format(
tc_name, result
)
step("r1: Verify ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert (
result is not True
), "Testcase {} : Failed \n " "r1: mroutes are still present \n Error: {}".format(
tc_name, result
)
step("r1: Verify show ip pim interface traffic without any IGMP join")
state_after = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_after, dict
), "Testcase {} : Failed \n state_before is not dictionary \n Error: {}".format(
tc_name, result
)
result = verify_state_incremented(state_before, state_after)
assert result is True, "Testcase{} : Failed Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_SPT_RPT_path_same_p1(request):
"""
TC_24_P1 : Verify (*,G) and (S,G) populated correctly when SPT and RPT
share the same path
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1 r3-----r5
r1 : LHR
r2 : RP
r3 : FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
dut = "r1"
intf = "r1-r3-eth2"
shutdown_bringup_interface(tgen, dut, intf, False)
intf = "r1-r4-eth3"
shutdown_bringup_interface(tgen, dut, intf, False)
dut = "r3"
intf = "r3-r1-eth0"
shutdown_bringup_interface(tgen, dut, intf, False)
intf = "r3-r4-eth2"
shutdown_bringup_interface(tgen, dut, intf, False)
step("Enable IGMP on r1 interface and send IGMP join (225.1.1.1) to R1")
step("Configure RP on r2 (loopback interface) for the group range" " 224.0.0.0/4")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send multicast traffic from R3")
step("r2: Verify RP info")
dut = "r2"
rp_address = "1.0.2.17"
iif = "lo"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
dut = "r1"
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream IIF interface")
dut = "r2"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream IIF interface")
iif = "r2-r3-eth1"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S, G) upstream join state is up and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r2-eth1"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_not_reachable_static_RP_p0(request):
"""
TC_5_P0: Verify OIF entry for RP is cleared when RP becomes unreachable
TC_6_P0: Verify IIF and OIL in "show ip pim state" updated properly when
RP becomes unreachable
TC_7_P0 : Verify upstream interfaces(IIF) and join state are updated
properly after adding and deleting the static RP
TC_8_P0: Verify (*,G) prune is send towards the RP when RP becomes
unreachable
Topology used:
r0------r1-----r2
iperf DUT RP
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
dut = "r1"
intf = "r1-r3-eth2"
shutdown_bringup_interface(tgen, dut, intf, False)
dut = "r1"
intf = "r1-r4-eth3"
shutdown_bringup_interface(tgen, dut, intf, False)
step(
"r1: (*,G) prune is not sent towards the RP interface, verify using"
"show ip pim interface traffic"
)
state_dict = {"r1": {"r1-r2-eth1": ["pruneTx"]}}
state_before = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_before, dict
), "Testcase{} : Failed \n state_before is not dictionary \n " "Error: {}".format(
tc_name, state_before
)
step("Enable IGMP on r1 interface and send IGMP " "join (225.1.1.1) to r1")
step("Configure r2 loopback interface as RP")
step("Enable PIM between r1 and r2")
step("r0 : Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify rp info")
dut = "r1"
iif = "r1-r2-eth1"
rp_address = "1.0.2.17"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify PIM state")
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 :Verify ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Make RP un-reachable")
dut = "r1"
input_dict = {
dut: {
"static_routes": [
{"network": "1.0.2.17/32", "next_hop": "10.0.1.2", "delete": True}
]
}
}
result = create_static_routes(tgen, input_dict)
assert result is True, "Testcase {} : Failed \n Error: {}".format(tc_name, result)
step("r1: Check RP detail using show ip pim rp-info OIF should be unknown")
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, "Unknown", rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step(
"r1 : OIL should be same and IIF should be cleared on R1 verify"
"using show ip pim state"
)
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"OIL is not same and IIF is not cleared on R1 \n Error: {}".format(
tc_name, result
)
)
step("r1: upstream IIF should be unknown , verify using show ip pim" "upstream")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: upstream IIF is not unknown \n Error: {}".format(tc_name, result)
)
step(
"r1: join state should not be joined and join timer should stop,"
"verify using show ip pim upstream"
)
result = verify_join_state_and_timer(
tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: join state is joined and timer is not stopped \n Error: {}".format(
tc_name, result
)
)
step(
"r1: (*,G) prune is sent towards the RP interface, verify using"
"show ip pim interface traffic"
)
state_after = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_after, dict
), "Testcase{} : Failed \n state_before is not dictionary \n " "Error: {}".format(
tc_name, result
)
result = verify_state_incremented(state_before, state_after)
assert result is True, "Testcase{} : Failed Error: {}".format(tc_name, result)
step("r1: (*, G) cleared from mroute table using show ip mroute")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: (*, G) are not cleared from mroute table \n Error: {}".format(
tc_name, result
)
)
logger.info("Expected behavior: %s", result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_add_RP_after_join_received_p1(request):
"""
TC_9_P1 : Verify RP configured after IGMP join received, PIM join towards
RP is sent immediately
Topology used:
r0------r1-----r2
iperf DUT RP
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on R1 interface")
step("Configure r2 loopback interface as RP")
step("Enable PIM between r1 and r2")
step("Delete RP configuration from r1")
step("r1: Delete RP configuration")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify rp-info")
dut = "r1"
rp_address = "1.0.2.17"
iif = "r1-r2-eth1"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE, expected=False
)
assert (
result is not True
), "Testcase {} : Failed \n " "r1: rp-info is present \n Error: {}".format(
tc_name, result
)
step("joinTx value before join sent")
state_dict = {"r1": {"r1-r2-eth1": ["joinTx"]}}
state_before = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_before, dict
), "Testcase{} : Failed \n state_before is not dictionary \n " "Error: {}".format(
tc_name, result
)
step("r0 : Send IGMP join (225.1.1.1) to r1, when rp is not configured" "in r1")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: IGMP group is received on R1 verify using show ip igmp groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: upstream IFF interface is present \n Error: {}".format(tc_name, result)
)
step("r1: Verify upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: upstream join state is joined and timer is running \n Error: {}".format(
tc_name, result
)
)
step("r1: Verify PIM state")
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS, expected=False)
assert (
result is not True
), "Testcase {} : Failed \n " "r1: PIM state is up\n Error: {}".format(
tc_name, result
)
step("r1: Verify ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert (
result is not True
), "Testcase {} : Failed \n " "r1: mroutes are still present\n Error: {}".format(
tc_name, result
)
step("r1: Configure static RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify rp-info")
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify PIM state")
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
logger.info("Expected behavior: %s", result)
state_after = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_after, dict
), "Testcase{} : Failed \n state_before is not dictionary \n " "Error: {}".format(
tc_name, result
)
result = verify_state_incremented(state_before, state_after)
assert result is True, "Testcase {} : Failed Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_reachable_static_RP_after_join_p0(request):
"""
TC_10_P0 : Verify RP becomes reachable after IGMP join received, PIM join
towards RP is sent immediately
Topology used:
r0------r1-----r3
iperf DUT RP
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface and send IGMP " "join (225.1.1.1) to r1")
step("Configure r2 loopback interface as RP")
step("Enable PIM between r1 and r2")
step("r1 : Verify pim interface traffic")
state_dict = {"r1": {"r1-r2-eth1": ["joinTx"]}}
state_before = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_before, dict
), "Testcase{} : Failed \n state_before is not dictionary \n " "Error: {}".format(
tc_name, state_before
)
step("r1: Make RP un-reachable")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
intf = "r1-r3-eth2"
shutdown_bringup_interface(tgen, dut, intf, False)
intf = "r1-r4-eth3"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: Verify rp-info")
rp_address = "1.0.2.17"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_ADDRESS, "Unknown", rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Send IGMP join for 225.1.1.1")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify upstream IIF interface")
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: upstream IIF interface is present\n Error: {}".format(tc_name, result)
)
step("r1 : Verify upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, STAR, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: upstream join state is joined and timer is running\n Error: {}".format(
tc_name, result
)
)
step("r1 : Verify PIM state")
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS, expected=False)
assert (
result is not True
), "Testcase {} : Failed \n " "r1: PIM state is up\n Error: {}".format(
tc_name, result
)
step("r1 : Verify ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert (
result is not True
), "Testcase {} : Failed \n " "r1: mroutes are still present\n Error: {}".format(
tc_name, result
)
step("r1: Make RP reachable")
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, True)
intf = "r1-r3-eth2"
shutdown_bringup_interface(tgen, dut, intf, True)
intf = "r1-r4-eth3"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1 : Verify rp-info")
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify PIM state")
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
logger.info("Expected behavior: %s", result)
step("r1 : Verify pim interface traffic")
state_after = verify_pim_interface_traffic(tgen, state_dict)
assert isinstance(
state_after, dict
), "Testcase{} : Failed \n state_before is not dictionary \n " "Error: {}".format(
tc_name, result
)
result = verify_state_incremented(state_before, state_after)
assert result is True, "Testcase{} : Failed Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_send_join_on_higher_preffered_rp_p1(request):
"""
    TC_11_P1 : Verify PIM join is sent towards the higher preferred RP
    TC_12_P1 : Verify PIM prune is sent towards the lower preferred RP
    TC_13_P1 : Verify RPF interface is updated in mroute (kernel) when higher
               preferred overlapping RP configured
    TC_14_P1 : Verify IIF and OIL in "show ip pim state" updated properly when
               higher preferred overlapping RP configured
    TC_15_P1 : Verify upstream interfaces (IIF) and join state are updated when
               higher preferred overlapping RP is configured
    TC_16_P1 : Verify join is sent to lower preferred RP, when higher
               preferred RP gets deleted
    TC_17_P1 : Verify prune is sent to higher preferred RP when higher
               preferred RP gets deleted
    TC_18_P1 : Verify RPF interface updated in mroute when higher preferred RP
               gets deleted
    TC_19_P1 : Verify IIF and OIL in "show ip pim state" updated when higher
               preferred overlapping RP is deleted
    TC_20_P1 : Verify PIM upstream IIF updated when higher preferred
               overlapping RP deleted
Topology used:
_______r2
|
iperf |
r0-----r1
|
|_______r4
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r2 (loopback interface) for the group range " "224.0.0.0/4")
step("Configure RP on r4 (loopback interface) for the group range " "225.1.1.1/32")
step("r3 : Make all interface not reachable")
dut = "r3"
intf = "r3-r1-eth0"
shutdown_bringup_interface(tgen, dut, intf, False)
intf = "r3-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
intf = "r3-r4-eth2"
shutdown_bringup_interface(tgen, dut, intf, False)
dut = "r2"
intf = "r2-r3-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
dut = "r4"
intf = "r4-r3-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
dut = "r1"
intf = "r1-r3-eth2"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1 : Verify joinTx count before sending join")
state_dict = {"r1": {"r1-r4-eth3": ["joinTx"], "r1-r2-eth1": ["pruneTx"]}}
state_before = verify_pim_interface_traffic(tgen, state_dict)
    assert isinstance(
        state_before, dict
    ), "Testcase {} : Failed \n state_before is not a dictionary \n Error: {}".format(
        tc_name, state_before
    )
step("r0 : Send IGMP join for 225.1.1.1")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify IGMP groups")
dut = "r1"
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("Configure static RP for group 225.1.1.1/32")
input_dict = {
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.4.17",
"group_addr_range": ["225.1.1.1/32"],
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify RP info for group 224.0.0.0/4")
rp_address_1 = "1.0.2.17"
iif = "r1-r2-eth1"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address_1, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify RP info for group 225.1.1.1")
rp_address_2 = "1.0.4.17"
iif = "r1-r4-eth3"
result = verify_pim_rp_info(tgen, TOPO, dut, GROUP_RANGE, iif, rp_address_2, SOURCE)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
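    # Static RP selection follows the longest group-range prefix match, so
    # 225.1.1.1 now maps to r4's RP (225.1.1.1/32) rather than r2's
    # (224.0.0.0/4) -- r4 is the "higher preferred" RP in this test.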
step("r1 : Verify join is sent to higher preferred RP")
step("r1 : Verify prune is sent to lower preferred RP")
state_after = verify_pim_interface_traffic(tgen, state_dict)
    assert isinstance(
        state_after, dict
    ), "Testcase {} : Failed \n state_after is not a dictionary \n Error: {}".format(
        tc_name, state_after
    )
result = verify_state_incremented(state_before, state_after)
assert result is True, "Testcase{} : Failed Error: {}".format(tc_name, result)
step("r1 : Verify ip mroutes")
iif = "r1-r4-eth3"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify PIM state")
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("r1 : Verify joinTx, pruneTx count before RP gets deleted")
state_dict = {"r1": {"r1-r2-eth1": ["joinTx"], "r1-r4-eth3": ["pruneTx"]}}
state_before = verify_pim_interface_traffic(tgen, state_dict)
    assert isinstance(
        state_before, dict
    ), "Testcase {} : Failed \n state_before is not a dictionary \n Error: {}".format(
        tc_name, state_before
    )
step("r1 : Delete RP configuration for 225.1.1.1")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.4.17",
"group_addr_range": ["225.1.1.1/32"],
"delete": True,
}
]
}
}
}
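    # The delete is applied on r1 only; removing its /32 mapping makes group
    # 225.1.1.1 fall back to the /4 RP (r2) from r1's point of view.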
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify rp-info for group 224.0.0.0/4")
iif = "r1-r2-eth1"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address_1, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1 : Verify rp-info for group 225.1.1.1")
iif = "r1-r4-eth3"
    result = verify_pim_rp_info(
        tgen, TOPO, dut, GROUP_RANGE, iif, rp_address_2, SOURCE, expected=False
    )
assert result is not True, (
"Testcase {} : Failed \n "
"r1: rp-info is present for group 225.1.1.1 \n Error: {}".format(
tc_name, result
)
)
    step(
        "r1 : Verify RPF interface updated in mroute when higher preferred "
        "RP gets deleted"
    )
iif = "r1-r2-eth1"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
logger.info("Expected behavior: %s", result)
    step(
        "r1 : Verify IIF and OIL in show ip pim state updated when higher "
        "preferred overlapping RP is deleted"
    )
result = verify_pim_state(tgen, dut, iif, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
    step(
        "r1 : Verify upstream IIF updated when higher preferred overlapping "
        "RP deleted"
    )
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
    step(
        "r1 : Verify upstream join state and join timer updated when higher "
        "preferred overlapping RP deleted"
    )
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
    step(
        "r1 : Verify join is sent to lower preferred RP, when higher "
        "preferred RP gets deleted"
    )
    step(
        "r1 : Verify prune is sent to higher preferred RP when higher"
        " preferred RP gets deleted"
    )
state_after = verify_pim_interface_traffic(tgen, state_dict)
    assert isinstance(
        state_after, dict
    ), "Testcase {} : Failed \n state_after is not a dictionary \n Error: {}".format(
        tc_name, state_after
    )
result = verify_state_incremented(state_before, state_after)
assert result is True, "Testcase{} : Failed Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
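

# For reference, a hypothetical sketch (an assumption about intent, not the
# library's actual implementation) of how the RP dicts used throughout this
# file map to FRR CLI: each entry expands to one "ip pim rp <rp_addr>
# <prefix>" line per prefix in group_addr_range, with "delete": True emitting
# the "no" form instead.
def _rp_dict_to_cli(rp_entry):
    """Illustrative only: render one rp dict as FRR config lines."""
    lines = []
    for prefix in rp_entry["group_addr_range"]:
        cmd = "ip pim rp {} {}".format(rp_entry["rp_addr"], prefix)
        lines.append("no " + cmd if rp_entry.get("delete") else cmd)
    return lines

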
def test_RP_configured_as_LHR_1_p1(request):
"""
    TC_21_1_P1: Verify OIF and RPF for (*,G) and (S,G) when static RP is
    configured on the LHR router
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
r1 : LHR/RP
r3 : FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r1 (loopback interface) for the group range" " 224.0.0.0/4")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send the IGMP join from r0")
step("Send multicast traffic from r5")
step("r1 , r2, r3, r4: Delete existing RP configuration" "configure r1(LHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
}
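    # All four routers drop the original RP (1.0.2.17) first; the next block
    # configures r1's loopback (1.0.1.17) as the RP everywhere, making the
    # LHR and the RP the same router.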
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Configure r1(LHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
shutdown_bringup_interface(tgen, "r1", "lo", False)
sleep(5)
shutdown_bringup_interface(tgen, "r1", "lo", True)
sleep(5)
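    # Flapping r1's loopback forces the routers to re-resolve reachability to
    # the new RP address hosted on it; the sleeps give rp-info and RPF state
    # time to settle before the verification below.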
step("r1: Verify RP info")
dut = "r1"
rp_address = "1.0.1.17"
iif = "lo"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S, G) upstream join state is joined and join"
" timer is running \n Error: {}".format(tc_name, result)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_RP_configured_as_LHR_2_p1(request):
"""
    TC_21_2_P1: Verify OIF and RPF for (*,G) and (S,G) when static RP is
    configured on the LHR router
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
r1 : LHR/RP
r3 : FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r1 (loopback interface) for the group range" " 224.0.0.0/4")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send multicast traffic from r5")
step("Send the IGMP join from r0")
step("r1, r2, r3, r4: Delete existing RP configuration," "configure r1(LHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1, r2, r3, r4: Configure r1(LHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.1.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify RP info")
dut = "r1"
rp_address = "1.0.1.17"
iif = "lo"
result = verify_pim_rp_info(tgen, TOPO, dut, GROUP_ADDRESS, iif, rp_address, SOURCE)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_RP_configured_as_FHR_1_p1(request):
"""
    TC_22_1_P1: Verify OIF and RPF for (*,G) and (S,G) when static RP is
    configured on the FHR router
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
r1 : LHR
r3 : FHR/RP
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r2 (loopback interface) for the group range" " 225.1.1.0/24")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send the IGMP join from r0")
step("Send multicast traffic from r5")
step("r1, r2, r3, r4: Delete existing RP configuration" "configure r3(FHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1, r2, r3, r4: Configure r3(FHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify RP info")
dut = "r1"
rp_address = "1.0.3.17"
iif = "r1-r3-eth2"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_RP_configured_as_FHR_2_p2(request):
"""
    TC_22_2_P2: Verify OIF and RPF for (*,G) and (S,G) when static RP is
    configured on the FHR router
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
r1 : LHR
r3 : FHR/RP
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r2 (loopback interface) for the group range" " 225.1.1.0/24")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send multicast traffic from r5")
step("Send the IGMP join from r0")
step("r1, r2, r3, r4: Delete existing RP configuration" "configure r3(FHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1, r2, r3, r4: Configure r3(FHR) as RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r3": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.3.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify RP info")
dut = "r1"
rp_address = "1.0.3.17"
iif = "r1-r3-eth2"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_SPT_RPT_path_different_p1(request):
"""
    TC_23_P1: Verify (*,G) and (S,G) populated correctly when RPT and SPT
    paths are different
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
r1: LHR
r2: RP
r3: FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface and send IGMP join (225.1.1.1) to r1")
step("Configure RP on r2 (loopback interface) for the group range" " 224.0.0.0/4")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send multicast traffic from r3")
step("r2: Verify RP info")
dut = "r2"
rp_address = "1.0.2.17"
iif = "lo"
result = verify_pim_rp_info(tgen, TOPO, dut, GROUP_ADDRESS, iif, rp_address, SOURCE)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
dut = "r1"
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream IIF interface")
dut = "r2"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream IIF interface")
dut = "r2"
iif = "r2-r3-eth1"
result = verify_upstream_iif(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, joinState="NotJoined"
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
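    # On the RP itself the (S, G) entry is expected to carry an empty OIL,
    # since traffic reaches the receiver over the SPT (r3 -> r1) rather than
    # through r2; "none" is how these tests denote an empty OIL.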
step("r2: Verify (S, G) ip mroutes")
oif = "none"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_clear_pim_configuration_p1(request):
"""
    TC_25_P1: Verify (*,G) and (S,G) populated correctly after clearing the
    PIM, IGMP and mroute joins
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
| |
|_____________|
r4
r1 : LHR
r2 : RP
r3 : FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r2 (loopback interface) for the group range" " 224.0.0.0/4")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send the IGMP join from r0")
step("Send multicast traffic from r5")
step("r2: Verify RP info")
dut = "r2"
rp_address = "1.0.2.17"
oif = "lo"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, oif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
dut = "r1"
iif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, iif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups timer restarted")
result = clear_ip_igmp_interfaces(tgen, dut)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify PIM neighbor timer restarted")
result = clear_ip_pim_interfaces(tgen, dut)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify PIM mroute timer restarted")
result = clear_ip_mroute_verify(tgen, dut)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
# Uncomment next line for debugging
# tgen.mininet_cli()
write_test_footer(tc_name)
def test_restart_pimd_process_p2(request):
"""
TC_26_P2: Restart the PIMd process and verify PIM upstream and mroutes
entries
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
| |
|_____________|
r4
r1 : LHR
r2 : RP
r3 : FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface and send IGMP join (225.1.1.1) to R1")
step("Configure RP on r3 (loopback interface) for the group range" " 224.0.0.0/4")
step("Enable the PIM on all the interfaces of r1, r2, r3 and r4 routers")
step("Send multicast traffic from R3")
step("Restart the PIMd process")
step("r2: Verify RP info")
dut = "r2"
rp_address = "1.0.2.17"
oif = "lo"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, oif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
dut = "r1"
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", GROUP_ADDRESS, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream IIF interface")
dut = "r2"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
dut = "r1"
iif = "r1-r2-eth1"
oil = "r1-r0-eth0"
logger.info("waiting for 10 sec to make sure old mroute time is higher")
sleep(10)
# Why do we then wait 60 seconds below before checking the routes?
uptime_before = verify_ip_mroutes(
tgen, dut, STAR, GROUP_ADDRESS, iif, oil, return_uptime=True, mwait=60
)
    assert isinstance(uptime_before, dict), "Testcase {} : Failed \n Error: {}".format(
        tc_name, uptime_before
    )
step("r1: Kill pimd process")
kill_router_daemons(tgen, "r1", ["pimd"])
step("r1 : Start pimd process")
start_router_daemons(tgen, "r1", ["pimd"])
logger.info("Waiting for 5sec to get PIMd restarted and mroute" " re-learned..")
sleep(5)
    # mwait=10 appears to sample the re-learned entry's uptime shortly after
    # pimd restarts, keeping it well below the pre-restart value.
uptime_after = verify_ip_mroutes(
tgen, dut, STAR, GROUP_ADDRESS, iif, oil, return_uptime=True, mwait=10
)
    assert isinstance(uptime_after, dict), "Testcase {} : Failed \n Error: {}".format(
        tc_name, uptime_after
    )
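    # verify_mroute_repopulated compares the two uptime snapshots: a smaller
    # post-restart uptime means the mroute was freshly re-learned by the new
    # pimd process rather than left over from before the restart.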
result = verify_mroute_repopulated(uptime_before, uptime_after)
assert result is True, "Testcase{} : Failed Error: {}".format(tc_name, result)
write_test_footer(tc_name)
def test_multiple_groups_same_RP_address_p2(request):
"""
TC_27_P2: Configure multiple groups (10 grps) with same RP address
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
r1 : LHR
r2 : RP
r3 : FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface and send IGMP join (225.1.1.1) to r1")
step("Configure RP on r2 (loopback interface) for the group range" "225.1.1.0/24")
step("Enable the PIM on all the interfaces of r1-r2-r3")
step("Send multicast traffic from r5 to all the groups")
step("r1 : Remove the groups to RP mapping one by one")
step("r1: Shut the upstream interfaces")
step("r1: No shut the upstream interfaces")
step("r1: Configure the RP again")
step("r1: Shut the receiver interfaces")
step("r1: No Shut the receiver interfaces")
step("r2: Verify RP info")
step("r2: verify rp-info")
dut = "r2"
rp_address = "1.0.2.17"
oif = "lo"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, oif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
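    # The two group lists are combined so a single join covers all 10 groups,
    # every one of which maps to the same RP (1.0.2.17) under 224.0.0.0/4.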
group_address_list = GROUP_ADDRESS_LIST_1 + GROUP_ADDRESS_LIST_2
step("r0: Send IGMP join for 10 groups")
result = app_helper.run_join("r0", group_address_list, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
dut = "r1"
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", group_address_list, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream IIF interface")
dut = "r2"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream IIF interface")
dut = "r2"
iif = "r2-r3-eth1"
result = verify_upstream_iif(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list, joinState="NotJoined"
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r2: Verify (S, G) ip mroutes")
oif = "none"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Delete RP configuration")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Shut the interface r1-r2-eth1 from R1 to R2")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: No Shut the interface r1-r2-eth1 from R1 to R2")
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1: Configure RP")
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Shut the interface r1-r0-eth0 from R1 to R2")
intf = "r1-r0-eth0"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: No Shut the interface r1-r0-eth0 from R1 to R2")
intf = "r1-r0-eth0"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream IIF interface")
dut = "r2"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream IIF interface")
dut = "r2"
iif = "r2-r3-eth1"
result = verify_upstream_iif(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list, joinState="NotJoined"
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r2: Verify (S, G) ip mroutes")
oif = "none"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, group_address_list, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, SOURCE_ADDRESS, group_address_list, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
write_test_footer(tc_name)
def test_multiple_groups_different_RP_address_p2(request):
"""
    TC_28_P2: Verify IIF and OIL are updated in mroute when the upstream
    interface is configured as RP
Topology used:
________r2_____
| |
iperf | | iperf
r0-----r1-------------r3-----r5
| |
|_____________|
r4
r1 : LHR
r2 & r4 : RP
r3 : FHR
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Delete existing RP configuration")
input_dict = {
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
input_dict = {
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_LIST_1,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.4.17",
"group_addr_range": GROUP_RANGE_LIST_2,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
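    # The group space is now split between two RPs: r2 (1.0.2.17) serves
    # GROUP_RANGE_LIST_1 and r4 (1.0.4.17) serves GROUP_RANGE_LIST_2, so the
    # checks below run each half against a different upstream.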
step("r2: Verify RP info")
dut = "r2"
rp_address = "1.0.2.17"
oif = "lo"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_LIST_1, oif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify RP info")
dut = "r4"
rp_address = "1.0.4.17"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_LIST_2, oif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
group_address_list = GROUP_ADDRESS_LIST_1 + GROUP_ADDRESS_LIST_2
step("r0: Send IGMP join")
result = app_helper.run_join("r0", group_address_list, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
dut = "r1"
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, group_address_list)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r5: Send multicast traffic for group 225.1.1.1")
result = app_helper.run_traffic("r5", group_address_list, "r3")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_1, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream IIF interface")
dut = "r2"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_1, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream IIF interface")
iif = "r2-r3-eth1"
result = verify_upstream_iif(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, joinState="NotJoined"
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r2: Verify (S, G) ip mroutes")
oif = "none"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
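
    # The same per-router walk is repeated for group range 2, whose RP is r4.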
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r4-eth3"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_2, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (*, G) upstream IIF interface")
dut = "r4"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (*, G) ip mroutes")
oif = "r4-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_2, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (S, G) upstream IIF interface")
iif = "r4-r3-eth1"
result = verify_upstream_iif(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, joinState="NotJoined"
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r4: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r4: Verify (S, G) ip mroutes")
oif = "none"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, expected=False
)
assert result is not True, "Testcase {} :Failed \n Error: {}".format(
tc_name, result
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
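
    # Next, delete and re-apply both static RP configurations, then flap the
    # RPF and receiver-facing interfaces on r1 before re-verifying state.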
step("Delete RP configuration")
input_dict = {
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_LIST_1,
"delete": True,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.4.17",
"group_addr_range": GROUP_RANGE_LIST_2,
"delete": True,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1, r2, r3, r4: Re-configure RP")
input_dict = {
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_LIST_1,
}
]
}
},
"r4": {
"pim": {
"rp": [
{
"rp_addr": "1.0.4.17",
"group_addr_range": GROUP_RANGE_LIST_2,
}
]
}
},
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Shut the interface r1-r2-eth1 from R1 to R2")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: No shut the interface r1-r2-eth1 from R1 to R2")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1: Shut the interface r1-r2-eth1 from R1 to R4")
dut = "r1"
intf = "r1-r4-eth3"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: No shut the interface r1-r2-eth1 from R1 to r4")
dut = "r1"
intf = "r1-r4-eth3"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1: Shut the interface r1-r0-eth0 from R1 to R0")
dut = "r1"
intf = "r1-r0-eth0"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: No Shut the interface r1-r0-eth0 from R1 to R0")
dut = "r1"
intf = "r1-r0-eth0"
shutdown_bringup_interface(tgen, dut, intf, True)
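
    # After the interface flaps everything should converge back, so the full
    # verification pass from above is re-run unchanged.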
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r2-eth1"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_1, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream IIF interface")
dut = "r2"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_1, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream IIF interface")
iif = "r2-r3-eth1"
result = verify_upstream_iif(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, joinState="NotJoined"
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r2: Verify (S, G) ip mroutes")
oif = "none"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_1, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream IIF interface")
dut = "r1"
iif = "r1-r4-eth3"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes")
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_2, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream IIF interface")
iif = "r1-r3-eth2"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (S, G) ip mroutes")
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (*, G) upstream IIF interface")
dut = "r4"
iif = "lo"
result = verify_upstream_iif(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (*, G) upstream join state and join timer")
result = verify_join_state_and_timer(tgen, dut, iif, STAR, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (*, G) ip mroutes")
oif = "r4-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS_LIST_2, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (S, G) upstream IIF interface")
iif = "r4-r3-eth1"
result = verify_upstream_iif(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, joinState="NotJoined"
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r4: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r4: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r4: Verify (S, G) ip mroutes")
oif = "none"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream IIF interface")
dut = "r3"
iif = "r3-r5-eth3"
result = verify_upstream_iif(tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (S, G) upstream join state and join timer")
result = verify_join_state_and_timer(
tgen, dut, iif, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, expected=False
)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (S,G) upstream state is joined and join timer is running\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (S, G) ip mroutes")
oif = "r3-r1-eth0"
result = verify_ip_mroutes(
tgen, dut, SOURCE_ADDRESS, GROUP_ADDRESS_LIST_2, iif, oif
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
write_test_footer(tc_name)
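
# Illustrative sketch only (not part of the original suite): the repeated
# "call helper / assert on result" pairs above could be folded into a small
# hypothetical helper, e.g.:
#
#     def check(result, tc_name, expect=True):
#         if expect:
#             assert result is True, "Testcase {} :Failed \n Error: {}".format(
#                 tc_name, result
#             )
#         else:
#             assert result is not True, "Testcase {} :Failed \n Error: {}".format(
#                 tc_name, result
#             )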

def test_shutdown_primary_path_p1(request):
    """
    TC_30_P1: Verify IIF and OIL change to other path after shut the primary
    path

    Topology used:
                ________r2_____
                |              |
        iperf   |              |
         r0-----r1-------------r3
    """

    tgen = get_topogen()
    tc_name = request.node.name
    write_test_header(tc_name)

    # Don't run this test if we have any failure.
    if tgen.routers_have_failure():
        pytest.skip(tgen.errors)

    step("Creating configuration from JSON")
    reset_config_on_routers(tgen)
    app_helper.stop_all_hosts()
    clear_ip_mroute(tgen)
    clear_ip_pim_interface_traffic(tgen, TOPO)

    # Steps to execute
    step("Enable IGMP on r1 interface")
    step("Configure RP on r2 (loopback interface) for the group range 224.0.0.0/4")
    step("r1: Shut the link from r1 to r2")
    step("r3: Shut the link from r1 to r3")
    step("r1: No shut the link from r1 to r2")
    step("r3: No shut the link from r1 to r3")

    step("r1: Verify RP info")
    dut = "r1"
    rp_address = "1.0.2.17"
    iif = "r1-r2-eth1"
    result = verify_pim_rp_info(
        tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
    )
    assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)

    step("r0: Send IGMP join")
    result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
    assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)

    step("r1: Verify IGMP groups")
    oif = "r1-r0-eth0"
    result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
    assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)

    step("r1: Verify (*, G) ip mroutes")
    result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
    assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)

    step("r2: Verify (*, G) ip mroutes")
    dut = "r2"
    iif = "lo"
    oif = "r2-r1-eth0"
    result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
    assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)

    step("r1: Shut the interface r1-r2-eth1 from R1 to R2")
    dut = "r1"
    intf = "r1-r2-eth1"
    shutdown_bringup_interface(tgen, dut, intf, False)

    step(
        "Verify after shut the R1 to R2 link , verify join is reaching to RP"
        " via other path"
    )
    logger.info("Waiting for 110 sec only if test run with crucible")
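
    # r1 should now pick r1-r3-eth2 as its RPF interface toward the RP, so
    # the (*, G) IIF moves from r1-r2-eth1 to r1-r3-eth2 and r2 forwards
    # toward r3 instead of directly to r1.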
step("r1: Verify (*, G) ip mroutes")
dut = "r1"
iif = "r1-r3-eth2"
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
dut = "r2"
iif = "lo"
oif = "r2-r3-eth1"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (*, G) ip mroutes")
dut = "r3"
iif = "r3-r2-eth1"
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Shut the link from R1 to R3 from R3 node")
dut = "r3"
intf = "r3-r1-eth0"
shutdown_bringup_interface(tgen, dut, intf, False)
step(
"Verify after shut of R1 to R3 link , verify (*,G) entries got"
" cleared from all the node R1, R2, R3"
)
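
    # With both of r1's uplinks down there is no route to the RP, so the
    # expected=False calls below assert that the (*, G) mroutes are gone.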
step("r1: Verify (*, G) ip mroutes")
dut = "r1"
iif = "r1-r3-eth2"
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: (*,G) mroutes are not cleared after shut of R1 to R3 link\n Error: {}".format(
tc_name, result
)
)
step("r2: Verify (*, G) ip mroutes")
dut = "r2"
iif = "lo"
oif = "r2-r3-eth1"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (*,G) mroutes are not cleared after shut of R1 to R3 link\n Error: {}".format(
tc_name, result
)
)
step("r3: Verify (*, G) ip mroutes")
dut = "r3"
iif = "r3-r2-eth1"
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r3: (*,G) mroutes are not cleared after shut of R1 to R3 link\n Error: {}".format(
tc_name, result
)
)
step("r3: No shutdown the link from R1 to R3 from R3 node")
dut = "r3"
intf = "r3-r1-eth0"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1: Verify (*, G) ip mroutes")
dut = "r1"
iif = "r1-r3-eth2"
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
dut = "r2"
iif = "lo"
oif = "r2-r3-eth1"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r3: Verify (*, G) ip mroutes")
dut = "r3"
iif = "r3-r2-eth1"
oif = "r3-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: No shutdown the link from R1 to R2 from R1 node")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1: Verify (*, G) ip mroutes")
dut = "r1"
iif = "r1-r2-eth1"
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes")
dut = "r2"
iif = "lo"
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
write_test_footer(tc_name)
def test_delete_RP_shut_noshut_upstream_interface_p1(request):
"""
TC_31_P1: Verify RP info and (*,G) mroute after deleting the RP and shut /
no shut the RPF interface.
Topology used:
________r2_____
| |
iperf | |
r0-----r1-------------r3
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r2 (loopback interface) for the group range" " 224.0.0.0/4")
step("r1: Delete the RP config")
step("r1: Shut and no shut the upstream interface (R1-R2) connected link")
step("r1: Shut and no shut the OIL interface")
step("r1: Verify RP info")
dut = "r1"
rp_address = "1.0.2.17"
iif = "r1-r2-eth1"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
dut = "r1"
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes created")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes created")
dut = "r2"
iif = "lo"
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Delete RP configuration")
# Delete RP configuration
input_dict = {
"r1": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Shut the interface r1-r2-eth1 from R1 to R2")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: No shutdown the interface r1-r2-eth1 from R1 to R2")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, True)
step("r1: Shutdown the OIL interface r1-r0-eth0 from R1 to R0 ")
dut = "r1"
intf = "r1-r0-eth0"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: No shutdown the OIL interface r1-r0-eth0 from R1 to R0")
dut = "r1"
intf = "r1-r0-eth0"
shutdown_bringup_interface(tgen, dut, intf, True)
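
    # With the RP configuration deleted, flapping the upstream and OIL
    # interfaces must not re-create the (*, G) state; the checks below
    # assert that the mroutes stay cleared.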
step("r1: Verify (*, G) ip mroutes cleared")
dut = "r1"
iif = "r1-r2-eth1"
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: (*,G) mroutes are not cleared after shut of R1 to R0 link\n Error: {}".format(
tc_name, result
)
)
step("r2: Verify (*, G) ip mroutes cleared")
dut = "r2"
iif = "lo"
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (*,G) mroutes are not cleared after shut of R1 to R0 link\n Error: {}".format(
tc_name, result
)
)
write_test_footer(tc_name)
def test_delete_RP_shut_noshut_RP_interface_p1(request):
"""
TC_32_P1: Verify RP info and (*,G) mroute after deleting the RP and shut/
no shut the RPF inteface
Topology used:
________r2_____
| |
iperf | |
r0-----r1-------------r3
"""
tgen = get_topogen()
tc_name = request.node.name
write_test_header(tc_name)
# Don"t run this test if we have any failure.
if tgen.routers_have_failure():
pytest.skip(tgen.errors)
step("Creating configuration from JSON")
reset_config_on_routers(tgen)
app_helper.stop_all_hosts()
clear_ip_mroute(tgen)
clear_ip_pim_interface_traffic(tgen, TOPO)
step("Enable IGMP on r1 interface")
step("Configure RP on r2 (lo) for the group range" " 224.0.0.0/4")
step("r2: Delete the RP configuration")
step("r2: Shut the RP interface (lo)")
step("r1: Shut the interface(r1-r2-eth1, r1-r3-eth2) towards rp")
step("r1: Verify RP info")
dut = "r1"
rp_address = "1.0.2.17"
iif = "r1-r2-eth1"
result = verify_pim_rp_info(
tgen, TOPO, dut, GROUP_RANGE_ALL, iif, rp_address, SOURCE
)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r0: Send IGMP join")
result = app_helper.run_join("r0", GROUP_ADDRESS, "r1")
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify IGMP groups")
oif = "r1-r0-eth0"
result = verify_igmp_groups(tgen, dut, oif, GROUP_ADDRESS)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r1: Verify (*, G) ip mroutes created")
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Verify (*, G) ip mroutes created")
dut = "r2"
iif = "lo"
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Delete RP configuration")
# Delete RP configuration
input_dict = {
"r2": {
"pim": {
"rp": [
{
"rp_addr": "1.0.2.17",
"group_addr_range": GROUP_RANGE_ALL,
"delete": True,
}
]
}
}
}
result = create_pim_config(tgen, TOPO, input_dict)
assert result is True, "Testcase {} :Failed \n Error: {}".format(tc_name, result)
step("r2: Shut the RP interface lo")
dut = "r2"
intf = "lo"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: Shut the interface r1-r2-eth1 towards RP")
dut = "r1"
intf = "r1-r2-eth1"
shutdown_bringup_interface(tgen, dut, intf, False)
step("r1: Shut the interface r1-r3-eth2 towards RP")
dut = "r1"
intf = "r1-r3-eth2"
shutdown_bringup_interface(tgen, dut, intf, False)
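
    # RP config deleted and every path toward 1.0.2.17 shut down: both
    # routers should age out their (*, G) state.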
step("r1: Verify (*, G) ip mroutes cleared")
dut = "r1"
iif = "r1-r2-eth1"
oif = "r1-r0-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r1: (*,G) mroutes are not cleared after shut of R1 to R2 and R3 link\n Error: {}".format(
tc_name, result
)
)
step("r2: Verify (*, G) ip mroutes cleared")
dut = "r2"
iif = "lo"
oif = "r2-r1-eth0"
result = verify_ip_mroutes(tgen, dut, STAR, GROUP_ADDRESS, iif, oif, expected=False)
assert result is not True, (
"Testcase {} : Failed \n "
"r2: (*,G) mroutes are not cleared after shut of R1 to R2 and R3 link\n Error: {}".format(
tc_name, result
)
)
write_test_footer(tc_name)
if __name__ == "__main__":
ARGS = ["-s"] + sys.argv[1:]
sys.exit(pytest.main(ARGS))
| 35.663964 | 98 | 0.600192 | 18,911 | 140,730 | 4.287822 | 0.024219 | 0.029894 | 0.056113 | 0.073378 | 0.929828 | 0.921899 | 0.913747 | 0.905361 | 0.899688 | 0.891697 | 0 | 0.026386 | 0.276657 | 140,730 | 3,945 | 99 | 35.673004 | 0.770178 | 0.094948 | 0 | 0.737149 | 0 | 0.009253 | 0.303558 | 0.000658 | 0 | 0 | 0 | 0 | 0.120288 | 1 | 0.007882 | false | 0.000685 | 0.00377 | 0 | 0.013365 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fd350836644a8c543f252799388687653c4c82cc | 3,272 | py | Python | client/bdomain_service.py | zero-os/0-metadata | f316c55c7e041b5f37a6ab5791b28ed19bd46b1a | [
"Apache-2.0"
] | null | null | null | client/bdomain_service.py | zero-os/0-metadata | f316c55c7e041b5f37a6ab5791b28ed19bd46b1a | [
"Apache-2.0"
] | 18 | 2018-01-26T11:14:27.000Z | 2018-02-16T16:15:42.000Z | client/bdomain_service.py | zero-os/0-metadata | f316c55c7e041b5f37a6ab5791b28ed19bd46b1a | [
"Apache-2.0"
] | null | null | null | # DO NOT EDIT THIS FILE. This file will be overwritten when re-running go-raml.
from .bdomain import bdomain
from .unhandled_api_error import UnhandledAPIError
from .unmarshall_error import UnmarshallError


class BdomainService:
    def __init__(self, client):
        self.client = client

    def deleteBdomain(self, id, headers=None, query_params=None, content_type="application/json"):
        """
        Delete bdomain
        It is method for DELETE /bdomain/{id}
        """
        uri = self.client.base_url + "/bdomain/" + id
        return self.client.delete(uri, None, headers, query_params, content_type)

    def getBdomain(self, id, headers=None, query_params=None, content_type="application/json"):
        """
        Get bdomain, id=int
        It is method for GET /bdomain/{id}
        """
        uri = self.client.base_url + "/bdomain/" + id
        resp = self.client.get(uri, None, headers, query_params, content_type)
        try:
            if resp.status_code == 200:
                return bdomain(resp.json()), resp
            message = 'unknown status code={}'.format(resp.status_code)
            raise UnhandledAPIError(response=resp, code=resp.status_code,
                                    message=message)
        except ValueError as msg:
            raise UnmarshallError(resp, msg)
        except UnhandledAPIError as uae:
            raise uae
        except Exception as e:
            # str(e) replaces the Python 2-only e.message attribute.
            raise UnmarshallError(resp, str(e))

    def updateBdomain(self, data, id, headers=None, query_params=None, content_type="application/json"):
        """
        Update bdomain
        It is method for POST /bdomain/{id}
        """
        uri = self.client.base_url + "/bdomain/" + id
        resp = self.client.post(uri, data, headers, query_params, content_type)
        try:
            if resp.status_code == 200:
                return bdomain(resp.json()), resp
            message = 'unknown status code={}'.format(resp.status_code)
            raise UnhandledAPIError(response=resp, code=resp.status_code,
                                    message=message)
        except ValueError as msg:
            raise UnmarshallError(resp, msg)
        except UnhandledAPIError as uae:
            raise uae
        except Exception as e:
            raise UnmarshallError(resp, str(e))

    def listBdomain(self, headers=None, query_params=None, content_type="application/json"):
        """
        Get a list of bdomains
        It is method for GET /bdomain
        """
        uri = self.client.base_url + "/bdomain"
        resp = self.client.get(uri, None, headers, query_params, content_type)
        try:
            if resp.status_code == 200:
                resps = []
                for elem in resp.json():
                    resps.append(bdomain(elem))
                return resps, resp
            message = 'unknown status code={}'.format(resp.status_code)
            raise UnhandledAPIError(response=resp, code=resp.status_code,
                                    message=message)
        except ValueError as msg:
            raise UnmarshallError(resp, msg)
        except UnhandledAPIError as uae:
            raise uae
        except Exception as e:
            raise UnmarshallError(resp, str(e))
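
# Illustrative usage sketch (not part of the generated file): assuming a
# hypothetical http_client object that exposes base_url plus get/post/delete
# with the signatures used above, the service could be exercised like this:
#
#     service = BdomainService(http_client)
#     domain, resp = service.getBdomain("42")
#     domains, resp = service.listBdomain()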
| 38.952381 | 104 | 0.595966 | 366 | 3,272 | 5.229508 | 0.221311 | 0.062696 | 0.065831 | 0.045977 | 0.778997 | 0.758098 | 0.723615 | 0.704807 | 0.704807 | 0.684953 | 0 | 0.004002 | 0.312653 | 3,272 | 83 | 105 | 39.421687 | 0.847043 | 0.088631 | 0 | 0.701754 | 1 | 0 | 0.057753 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.087719 | false | 0 | 0.052632 | 0 | 0.22807 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b5d16a922b032b50d8bda988fa68c4a2453f60e9 | 152 | py | Python | tests/data/simple_study/transform_module_simple.py | kids-first/kf-lib-data-ingest | 92889efef082c64744a00a9c110d778da7383959 | ["Apache-2.0"] | 3 | 2018-10-30T17:56:44.000Z | 2020-05-27T16:18:05.000Z | tests/data/simple_study/transform_module_simple.py | kids-first/kf-lib-data-ingest | 92889efef082c64744a00a9c110d778da7383959 | ["Apache-2.0"] | 344 | 2018-11-01T16:47:56.000Z | 2022-02-23T20:36:21.000Z | tests/data/simple_study/transform_module_simple.py | kids-first/kf-lib-data-ingest | 92889efef082c64744a00a9c110d778da7383959 | ["Apache-2.0"] | 1 | 2020-08-19T21:25:25.000Z | 2020-08-19T21:25:25.000Z | from kf_lib_data_ingest.config import DEFAULT_KEY


def transform_function(mapped_df_dict):
    return {DEFAULT_KEY: list(mapped_df_dict.values())[0]}
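
# Illustrative note (names assumed): given mapped_df_dict = {"stage_1": df},
# transform_function returns {DEFAULT_KEY: df}, i.e. the first extracted
# DataFrame keyed under the ingest library's default key.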
| 21.714286 | 58 | 0.802632 | 24 | 152 | 4.666667 | 0.791667 | 0.178571 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007353 | 0.105263 | 152 | 6 | 59 | 25.333333 | 0.816176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
bd070de83b1c3fec1637ccd58a5648855ac0dec0 | 4,649 | py | Python | tests/operators/contain_test.py | sgissinger/grappa | 51157a828d5cfdc731cada9b16255eaaf1cabbe6 | ["MIT"] | 137 | 2017-03-28T10:19:07.000Z | 2022-01-30T19:21:32.000Z | tests/operators/contain_test.py | sgissinger/grappa | 51157a828d5cfdc731cada9b16255eaaf1cabbe6 | ["MIT"] | 47 | 2017-03-19T23:08:48.000Z | 2021-01-25T15:18:10.000Z | tests/operators/contain_test.py | grappa-project/grappa | f1861e1572e68f031977e86a5d9eba1957bd164e | ["MIT"] | 17 | 2017-03-28T10:39:13.000Z | 2021-07-23T20:50:15.000Z | from array import array
import pytest


def test_should_contain(should):
    'hello world' | should.contain('world') | should.contain('hello')
    'hello world' | should.contain('w') | should.contain('o')
    [1, 2, 3] | should.contain(1) | should.contain(3)
    ('foo', 'bar', 123) | should.contain('bar') | should.contain(123)
    {'foo', 'bar', 123} | should.contain('bar') | should.contain(123)
    [{'foo': 1}] | should.contain({'foo': 1})
    array('i', [1, 2, 3]) | should.contain(1) | should.contain(2)
    {'foo': 'bar', 'fuu': 2} | should.contain('bar') | should.contain(2)

    with pytest.raises(AssertionError):
        'hello world' | should.contain('planet')

    with pytest.raises(AssertionError):
        'hello world' | should.contain('t')

    with pytest.raises(AssertionError):
        [1, 2, 3] | should.contain(4)

    with pytest.raises(AssertionError):
        ('foo', 'bar', 123) | should.contain('baz')

    with pytest.raises(AssertionError):
        {'foo', 'bar', 123} | should.contain('baz')

    with pytest.raises(AssertionError):
        [{'foo': 1}] | should.contain({'foo': 2})

    with pytest.raises(AssertionError):
        array('i', [1, 2, 3]) | should.contain(4)

    with pytest.raises(AssertionError):
        {'foo': 'bar', 'fuu': 2} | should.contain('baz')
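
# grappa overloads the | (pipe) operator, so "value | should.contain(x)"
# evaluates the assertion immediately and raises AssertionError on mismatch;
# that is why the negative cases above are wrapped in pytest.raises.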


def test_should_contain_any(should):
    'hello world' | should.contain('world', 'hello')
    'hello world' | should.contain(('world', 'hello'))
    'hello world' | should.contain(['world', 'hello'])
    'hello world' | should.contain({'world', 'hello'})
    'hello world' | should.contain('w', 'o')
    'hello world' | should.contain(('w', 'o'))
    'hello world' | should.contain(['w', 'o'])
    'hello world' | should.contain({'w', 'o'})
    [1, 2, 3] | should.contain(1, 3)
    [1, 2, 3] | should.contain((1, 3))
    [1, 2, 3] | should.contain([1, 3])
    [1, 2, 3] | should.contain({1, 3})
    ('foo', 'bar', 123) | should.contain('bar', 123)
    {'foo', 'bar', 123} | should.contain(('bar', 123))
    {'foo', 'bar', 123} | should.contain({'bar', 123})
    {'foo', 'bar', 123} | should.contain(['bar', 123])
    [{'foo': 1}, {'bar': 2}] | should.contain({'foo': 1}, {'bar': 2})
    [{'foo': 1}, {'bar': 2}] | should.contain(({'foo': 1}, {'bar': 2}))
    [{'foo': 1}, {'bar': 2}] | should.contain([{'foo': 1}, {'bar': 2}])
    array('i', [1, 2, 3]) | should.contain(1, 2)
    array('i', [1, 2, 3]) | should.contain((1, 2))
    array('i', [1, 2, 3]) | should.contain({1, 2})
    array('i', [1, 2, 3]) | should.contain([1, 2])
    {'foo': 'bar', 'fuu': 'bor'} | should.contain('bar', 'bor')
    {'foo': 'bar', 'fuu': 'bor'} | should.contain(('bar', 'bor'))
    {'foo': 'bar', 'fuu': 'bor'} | should.contain(['bar', 'bor'])
    {'foo': 'bar', 'fuu': 'bor'} | should.contain({'bar', 'bor'})


def test_should_not_contain_any(should):
    'hello planet' | should._not.contain('world', 'hello')
    'hello planet' | should._not.contain(('world', 'hello'))
    'hello planet' | should._not.contain(['world', 'hello'])
    'hello planet' | should._not.contain({'world', 'hello'})
    'hello planet' | should._not.contain('w', 'o')
    'hello planet' | should._not.contain(('w', 'o'))
    'hello planet' | should._not.contain(['w', 'o'])
    'hello planet' | should._not.contain({'w', 'o'})
    [1, 2, 3] | should._not.contain(1, 4)
    [1, 2, 3] | should._not.contain((1, 4))
    [1, 2, 3] | should._not.contain([1, 4])
    [1, 2, 3] | should._not.contain({1, 4})
    ('foo', 'bar', 123) | should._not.contain('baz', 123)
    {'foo', 'bar', 123} | should._not.contain(('baz', 123))
    {'foo', 'bar', 123} | should._not.contain({'baz', 123})
    {'foo', 'bar', 123} | should._not.contain(['baz', 123])
    [{'foo': 1}, {'bar': 2}] | should._not.contain({'foo': 1}, {'baz': 2})
    [{'foo': 1}, {'bar': 2}] | should._not.contain(({'foo': 1}, {'baz': 2}))
    [{'foo': 1}, {'bar': 2}] | should._not.contain([{'foo': 1}, {'baz': 2}])
    array('i', [1, 2, 3]) | should._not.contain(1, 4)
    array('i', [1, 2, 3]) | should._not.contain((1, 4))
    array('i', [1, 2, 3]) | should._not.contain({1, 4})
    array('i', [1, 2, 3]) | should._not.contain([1, 4])
    {'foo': 'bar', 'fuu': 'bor'} | should._not.contain('baz', 'bor')
    {'foo': 'bar', 'fuu': 'bor'} | should._not.contain(('baz', 'bor'))
    {'foo': 'bar', 'fuu': 'bor'} | should._not.contain(['baz', 'bor'])
    {'foo': 'bar', 'fuu': 'bor'} | should._not.contain({'baz', 'bor'})


def test_should_contain_failures(should):
    with pytest.raises(AssertionError):
        () | should.contain('bar')

    with pytest.raises(AssertionError):
        1 | should.contain('bar')
| 37.491935 | 76 | 0.54743 | 627 | 4,649 | 3.99681 | 0.049442 | 0.285315 | 0.178771 | 0.071828 | 0.903831 | 0.846768 | 0.803272 | 0.80008 | 0.731445 | 0.731445 | 0 | 0.051989 | 0.189073 | 4,649 | 123 | 77 | 37.796748 | 0.612732 | 0 | 0 | 0.113636 | 0 | 0 | 0.154657 | 0 | 0 | 0 | 0 | 0 | 0.113636 | 1 | 0.045455 | false | 0 | 0.022727 | 0 | 0.068182 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1fc827a9689323823fb3421220f8563d701ca73a | 146 | py | Python | apps/users/tests/base_test_utils.py | pixelpassion/django-saas-boilerplate | 8888d67181c760708edb18a4832d9002340878fa | ["MIT"] | 37 | 2020-11-30T17:05:00.000Z | 2022-03-25T11:03:23.000Z | apps/users/tests/base_test_utils.py | gd-js/django-saas-boilerplate | 8888d67181c760708edb18a4832d9002340878fa | ["MIT"] | 5 | 2021-04-08T21:58:32.000Z | 2021-06-10T19:59:56.000Z | apps/users/tests/base_test_utils.py | gd-js/django-saas-boilerplate | 8888d67181c760708edb18a4832d9002340878fa | ["MIT"] | 7 | 2021-04-24T14:17:16.000Z | 2022-02-08T13:38:12.000Z | def mock_users_email_service_function(mocker, func_name):
    return mocker.patch(f"apps.users.email_service.UsersSaasyEmailService.{func_name}")
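
# Usage sketch (assumes pytest-mock's "mocker" fixture): calling
# mock_users_email_service_function(mocker, "send_welcome_email") patches that
# method on UsersSaasyEmailService and returns the resulting MagicMock so the
# test can assert on its calls. "send_welcome_email" is a hypothetical name.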
| 48.666667 | 87 | 0.835616 | 20 | 146 | 5.75 | 0.7 | 0.173913 | 0.295652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061644 | 146 | 2 | 88 | 73 | 0.839416 | 0 | 0 | 0 | 0 | 0 | 0.40411 | 0.40411 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
1fedeb82e87e3517ad61cb74926e0f2617d1994c | 4,107 | py | Python | features/steps/image-visualization.py | eaton-lab/toyplot | 472f2f2f1bc048e485ade44d75c3ace310be4b41 | ["BSD-3-Clause"] | 438 | 2015-01-06T20:54:02.000Z | 2022-03-15T00:39:33.000Z | features/steps/image-visualization.py | eaton-lab/toyplot | 472f2f2f1bc048e485ade44d75c3ace310be4b41 | ["BSD-3-Clause"] | 184 | 2015-01-26T17:04:47.000Z | 2022-02-19T16:29:00.000Z | features/steps/image-visualization.py | eaton-lab/toyplot | 472f2f2f1bc048e485ade44d75c3ace310be4b41 | ["BSD-3-Clause"] | 45 | 2015-07-06T18:00:27.000Z | 2022-02-14T12:46:17.000Z | # Copyright 2014, Sandia Corporation. Under the terms of Contract
# DE-AC04-94AL85000 with Sandia Corporation, the U.S. Government retains certain
# rights in this software.
from behave import *
import os
import nose.tools
import numpy
import PIL.Image
import toyplot.color
import testing
art_dir = os.path.abspath(os.path.dirname(__file__))
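
# Each @given/@when handler below is matched by its quoted step text (behave
# dispatches on the decorator string, so every function can share the name
# step_impl) and stashes its artifact on the shared behave context object.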


@given(u'a numpy 1 bit L image')
def step_impl(context):
    context.image = testing.read_png(os.path.join(art_dir, "toyplot-8-L.png"))
    context.image = context.image > 128
    nose.tools.assert_equal(context.image.shape, (256, 256, 1))
    nose.tools.assert_equal(context.image.dtype, "bool")


@given(u'a numpy 8 bit L image')
def step_impl(context):
    context.image = testing.read_png(os.path.join(art_dir, "toyplot-8-L.png"))
    nose.tools.assert_equal(context.image.shape, (256, 256, 1))
    nose.tools.assert_equal(context.image.dtype, "uint8")


@given(u'a numpy 8 bit L image with colormap')
def step_impl(context):
    context.image = testing.read_png(os.path.join(art_dir, "toyplot-8-L.png"))
    nose.tools.assert_equal(context.image.shape, (256, 256, 1))
    nose.tools.assert_equal(context.image.dtype, "uint8")
    context.image = (context.image, toyplot.color.brewer.map("BlueRed"))


@given(u'a numpy 8 bit LA image')
def step_impl(context):
    context.image = testing.read_png(os.path.join(art_dir, "toyplot-8-LA.png"))
    nose.tools.assert_equal(context.image.shape, (256, 256, 2))
    nose.tools.assert_equal(context.image.dtype, "uint8")


@given(u'a numpy 8 bit RGB image')
def step_impl(context):
    context.image = testing.read_png(os.path.join(art_dir, "toyplot-8-RGB.png"))
    nose.tools.assert_equal(context.image.shape, (256, 256, 3))
    nose.tools.assert_equal(context.image.dtype, "uint8")


@given(u'a numpy 8 bit RGBA image')
def step_impl(context):
    context.image = testing.read_png(os.path.join(art_dir, "toyplot-8-RGBA.png"))
    nose.tools.assert_equal(context.image.shape, (256, 256, 4))
    nose.tools.assert_equal(context.image.dtype, "uint8")


@given(u'a pillow 8 bit L image')
def step_impl(context):
    context.image = PIL.Image.open(os.path.join(art_dir, "toyplot-8-L.png"))
    nose.tools.assert_equal(context.image.size, (256, 256))
    nose.tools.assert_equal(context.image.mode, "L")


@given(u'a pillow 8 bit L image with colormap')
def step_impl(context):
    context.image = PIL.Image.open(os.path.join(art_dir, "toyplot-8-L.png"))
    nose.tools.assert_equal(context.image.size, (256, 256))
    nose.tools.assert_equal(context.image.mode, "L")
    context.image = (context.image, toyplot.color.brewer.map("BlueRed"))


@given(u'a pillow 8 bit RGB image')
def step_impl(context):
    context.image = PIL.Image.open(os.path.join(art_dir, "toyplot-8-RGB.png"))
    nose.tools.assert_equal(context.image.size, (256, 256))
    nose.tools.assert_equal(context.image.mode, "RGB")


@given(u'a pillow 8 bit RGBA image')
def step_impl(context):
    context.image = PIL.Image.open(os.path.join(art_dir, "toyplot-8-RGBA.png"))
    nose.tools.assert_equal(context.image.size, (256, 256))
    nose.tools.assert_equal(context.image.mode, "RGBA")


@given(u'a non-square numpy 8 bit L image')
def step_impl(context):
    numpy.random.seed(1234)
    context.image = numpy.random.uniform(0, 1, size=(10, 5)).repeat(50, axis=0).repeat(50, axis=1)
    nose.tools.assert_equal(context.image.shape, (500, 250))
    nose.tools.assert_equal(context.image.dtype, "float64")


@given(u'a non-square numpy 8 bit L image with colormap')
def step_impl(context):
    numpy.random.seed(1234)
    context.image = numpy.random.uniform(0, 1, size=(10, 5)).repeat(50, axis=0).repeat(50, axis=1)
    nose.tools.assert_equal(context.image.shape, (500, 250))
    nose.tools.assert_equal(context.image.dtype, "float64")
    context.image = (context.image, toyplot.color.linear.map("Blackbody"))


@given(u'a canvas background color')
def step_impl(context):
    context.canvas.style = {"background-color": "lightgray"}


@when(u'the image is added to the canvas')
def step_impl(context):
    context.canvas.image(context.image)
| 34.225 | 98 | 0.718286 | 666 | 4,107 | 4.340841 | 0.15015 | 0.186787 | 0.124524 | 0.166033 | 0.844345 | 0.844345 | 0.800761 | 0.800761 | 0.791768 | 0.790038 | 0 | 0.043212 | 0.126613 | 4,107 | 119 | 99 | 34.512605 | 0.762754 | 0.040662 | 0 | 0.5375 | 0 | 0 | 0.164972 | 0 | 0 | 0 | 0 | 0 | 0.3 | 1 | 0.175 | false | 0 | 0.0875 | 0 | 0.2625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1f174d740ceb5a799b0a4c35ac39191333d4d561 | 118 | py | Python | cumulusci/robotframework/locators_53.py | emregunel/CumulusCI | 2cc0b88c50ebde372a17a6849eab268d12a20cfb | ["BSD-3-Clause"] | 163 | 2018-09-13T18:49:34.000Z | 2022-03-25T08:37:15.000Z | cumulusci/robotframework/locators_53.py | emregunel/CumulusCI | 2cc0b88c50ebde372a17a6849eab268d12a20cfb | ["BSD-3-Clause"] | 1,280 | 2018-09-11T20:09:37.000Z | 2022-03-31T18:40:21.000Z | cumulusci/robotframework/locators_53.py | emregunel/CumulusCI | 2cc0b88c50ebde372a17a6849eab268d12a20cfb | ["BSD-3-Clause"] | 93 | 2018-09-13T07:29:22.000Z | 2022-03-26T23:15:48.000Z | import copy
from cumulusci.robotframework import locators_52
lex_locators = copy.deepcopy(locators_52.lex_locators)
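
# The v53 locator set starts as a deep copy of the previous release's map;
# deepcopy ensures any release-specific overrides added later mutate only
# this module's lex_locators, never locators_52's.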
| 19.666667 | 54 | 0.855932 | 16 | 118 | 6.0625 | 0.5625 | 0.206186 | 0.268041 | 0.43299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037383 | 0.09322 | 118 | 5 | 55 | 23.6 | 0.869159 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1f440f883cd898edb1edcb87c84935c6b8be9e5a | 118,457 | py | Python | src/beale_wordlist.py | nwillhite/diceware | 6018162cc26da190964e5d757432bf9ebed0e896 | ["MIT"] | null | null | null | src/beale_wordlist.py | nwillhite/diceware | 6018162cc26da190964e5d757432bf9ebed0e896 | ["MIT"] | null | null | null | src/beale_wordlist.py | nwillhite/diceware | 6018162cc26da190964e5d757432bf9ebed0e896 | ["MIT"
] | null | null | null | beale_wordlist = {65536: '2001', 46422: 'puppy', 65542: '20th', 65543: '21', 65544: '21st', 65545: '22', 65546: '222', 13655: 'berra', 65551: '2222', 65552: '22nd', 13656: 'berry', 65554: '234', 65555: '2345', 65556: '23rd', 65561: '24', 65562: '2468', 16411: 'chess', 16412: 'chest', 16413: 'chevy', 16414: 'chew', 16415: 'chews', 16416: 'chewy', 16421: 'chi', 16422: 'chic', 16423: 'chick', 16424: 'chide', 16425: 'chief', 16426: 'child', 16431: 'chile', 16432: 'chili', 16433: 'chill', 16434: 'chilly', 16435: 'chime', 16436: 'chimp', 16441: 'chin', 16442: 'china', 16443: 'chip', 16444: 'chips', 16445: 'chirp', 16446: 'chisel', 13664: 'beset', 16451: 'chit', 16452: 'chive', 16453: 'chloe', 16454: 'chock', 16455: 'choir', 16456: 'choke', 65611: '26', 65612: '26th', 16461: 'chomp', 16462: 'chop', 16463: 'chopin', 16464: 'chops', 16465: 'choral', 16466: 'chord', 65621: '29', 65622: '29th', 65623: '2:00', 65624: '2:30', 65625: '2nd', 65626: '3', 65631: '3%', 65632: '3/4', 65633: '3/8', 65634: '30', 65635: '30%', 65636: '300', 65553: '23', 65641: '3000', 63613: 'wilkes', 65643: '31', 65644: '31st', 65645: '32', 65646: '32nd', 64363: 'xo', 65651: '33', 65652: '333', 65653: '3333', 65654: '33rd', 65655: '34', 65656: '345', 13643: 'bends', 65661: '3456', 65662: '34th', 16511: 'chore', 16512: 'chose', 16513: 'chosen', 16514: 'chow', 16515: 'chris', 16516: 'chub', 16521: 'chuck', 16522: 'chug', 16523: 'chum', 16524: 'chump', 16525: 'chunk', 16526: 'churn', 16531: 'chute', 16532: 'ci', 16533: 'cia', 16534: 'ciao', 16535: 'cicada', 16536: 'cider', 16541: 'cigar', 16542: 'cilia', 16543: 'cinch', 16544: 'cindy', 16545: 'cipher', 16546: 'circa', 65563: '24th', 61465: 'trots', 16551: 'circe', 16552: 'cite', 16553: 'citrus', 16554: 'city', 16555: 'civet', 16556: 'civic', 65565: '25%', 16561: 'civil', 16562: 'cj', 16563: 'ck', 16564: 'cl', 16565: 'clad', 16566: 'claim', 56424: 'teeth', 66662: ';', 12555: 'athens', 54121: 'sky', 16611: 'clam', 16612: 'clammy', 16613: 'clamp', 16614: 'clan', 16615: 'clang', 16616: 'clank', 16621: 'clap', 16622: 'claps', 16623: 'clara', 16624: 'clark', 16625: 'clash', 16626: 'clasp', 16631: 'class', 16632: 'claus', 16633: 'clause', 16634: 'claw', 16635: 'claws', 16636: 'clay', 63614: 'will', 16641: 'clean', 16642: 'clear', 16643: 'cleat', 16644: 'clef', 16645: 'cleft', 16646: 'clem', 16651: 'cleo', 16652: 'clerk', 16653: 'clever', 16654: 'cliche', 16655: 'click', 16656: 'cliff', 62523: 'vessel', 16661: 'climb', 16662: 'cling', 16663: 'clink', 16664: 'clip', 16665: 'cloak', 16666: 'clock', 55112: 'spits', 55113: 'spitz', 33111: 'hoax', 33112: 'hobby', 33113: 'hobo', 33114: 'hock', 33115: 'hockey', 33116: 'hoe', 33121: 'hog', 33122: 'hogan', 33123: 'hogs', 33124: 'hoist', 33125: 'hold', 33126: 'holds', 33131: 'holdup', 33132: 'hole', 33133: 'holes', 33134: 'holly', 33135: 'holmes', 33136: 'holy', 33141: 'home', 33142: 'honda', 33143: 'hone', 33144: 'honey', 33145: 'honk', 33146: 'honor', 33151: 'hooch', 33152: 'hood', 33153: 'hoof', 33154: 'hook', 33155: 'hooks', 33156: 'hookup', 33161: 'hoop', 33162: 'hoot', 33163: 'hop', 33164: 'hope', 33165: 'hopes', 33166: 'hops', 33211: 'horde', 33212: 'horn', 33213: 'horny', 33214: 'horse', 33215: 'hose', 33216: 'host', 33221: 'hot', 33222: 'hotel', 33223: 'hotrod', 33224: 'hound', 33225: 'hour', 33226: 'house', 33231: 'hovel', 33232: 'hover', 33233: 'how', 33234: 'howdy', 33235: 'howl', 33236: 'howls', 65614: '27th', 33241: 'hoyle', 33242: 'hp', 33243: 'hq', 33244: 'hr', 33245: 'hrh', 33246: 'hs', 65616: '28th', 
33251: 'ht', 33252: 'hu', 33253: 'hub', 33254: 'hubbub', 33255: 'hubby', 33256: 'hubs', 61665: 'tycoon', 33261: 'hue', 33262: 'hues', 33263: 'huey', 33264: 'huff', 33265: 'hug', 33266: 'huge', 12565: 'atoms', 54123: 'slab', 56211: 'tack', 33311: 'hugh', 33312: 'hugo', 33313: 'hugs', 33314: 'huh', 33315: 'hula', 33316: 'hulk', 33321: 'hull', 33322: 'hum', 33323: 'human', 33324: 'humid', 33325: 'humor', 33326: 'hump', 63616: 'willy', 33331: 'humps', 33332: 'hums', 33333: 'humus', 33334: 'hun', 33335: 'hunch', 33336: 'hung', 56212: 'tacky', 33341: 'hunk', 33342: 'hunt', 33343: 'hunts', 33344: 'hurl', 33345: 'huron', 33346: 'hurrah', 66115: '38th', 66116: '39', 33351: 'hurry', 33352: 'hurt', 33353: 'hush', 33354: 'husk', 33355: 'husky', 33356: 'hut', 66125: '4', 66126: '4%', 33361: 'hutch', 33362: 'hv', 33363: 'hw', 33364: 'hwy', 33365: 'hx', 33366: 'hy', 66135: '40th', 66136: '41', 66141: '41st', 66142: '42', 66143: '42nd', 66144: '43', 66145: '4321', 66146: '43rd', 66151: '44', 66152: '444', 66153: '4444', 66154: '44th', 66155: '45', 66156: '45%', 66161: '456', 66162: '4567', 66163: '45th', 66164: '46', 66165: '46th', 66166: '47', 55123: 'sponge', 65642: '30th', 33411: 'hyde', 33412: 'hydra', 33413: 'hyena', 33414: 'hymn', 33415: 'hymnal', 33416: 'hype', 33421: 'hyper', 33422: 'hypo', 33423: 'hz', 33424: 'i', 33425: "i'd", 33426: "i'll", 33431: "i'm", 33432: "i's", 33433: "i've", 33434: 'ia', 33435: 'ian', 33436: 'ib', 33441: 'ibid', 33442: 'ibm', 33443: 'ibsen', 33444: 'ic', 33445: 'icbm', 33446: 'ice', 66215: '49th', 66216: '4:00', 33451: 'iced', 33452: 'icicle', 33453: 'icing', 33454: 'icky', 33455: 'icon', 33456: 'icons', 66225: '5/8', 66226: '50', 33461: 'icy', 33462: 'id', 33463: 'ida', 33464: 'idaho', 33465: 'idea', 33466: 'ideal', 66235: '51', 66236: '51st', 66241: '52', 66242: '52nd', 66243: '53', 66244: '53rd', 66245: '54', 66246: '54th', 66251: '55', 66252: '55%', 66253: '555', 66254: '5555', 64433: 'xz', 66256: '56', 66261: '567', 66262: '5678', 66263: '56th', 66264: '57', 66265: '57th', 66266: '58', 33511: 'ideas', 33512: 'idiom', 33513: 'idiot', 33514: 'idle', 33515: 'idly', 33516: 'idol', 33521: 'idols', 33522: 'ie', 33523: 'if', 33524: 'iffy', 33525: 'ig', 33526: 'igloo', 33531: 'ignite', 33532: 'igor', 33533: 'ih', 33534: 'ii', 33535: 'iii', 33536: 'iiii', 65664: '35%', 33541: 'ij', 33542: 'ijk', 33543: 'ik', 33544: 'ike', 33545: 'il', 33546: 'iliad', 66315: '5:30', 66316: '5th', 65666: '36', 33551: 'ill', 33552: 'im', 33553: 'image', 33554: 'imbibe', 33555: 'imf', 33556: 'imp', 66325: '600', 66326: '6000', 33561: 'impel', 33562: 'imply', 33563: 'import', 33564: 'imps', 33565: 'in', 33566: 'inane', 66335: '62nd', 66336: '63', 66341: '63rd', 66342: '64', 66343: '65', 66344: '65%', 66345: '65th', 66346: '66', 13666: 'best', 66351: '666', 66352: '6666', 66353: '66th', 66354: '67', 66355: '678', 66356: '6789', 66361: '67th', 66362: '68', 66363: '68th', 66364: '69', 66365: '69th', 63313: 'was', 56221: 'taffy', 33611: 'inc', 33612: 'inca', 33613: 'incest', 33614: 'inch', 33615: 'incur', 33616: 'index', 33621: 'india', 33622: 'indies', 33623: 'indy', 33624: 'inept', 33625: 'inert', 33626: 'infamy', 63314: 'wash', 33631: 'infect', 33632: 'infer', 33633: 'info', 33634: 'ingot', 33635: 'inhale', 33636: 'ink', 56222: 'taft', 33641: 'inky', 33642: 'inlay', 33643: 'inlet', 33644: 'inn', 33645: 'inner', 33646: 'inns', 66415: '7/8', 66416: '70', 33651: 'input', 33652: 'insect', 33653: 'inset', 33654: 'insult', 33655: 'intel', 33656: 'intend', 66425: '71', 63315: 'wasp', 33661: 'inter', 
33662: 'into', 33663: 'intro', 33664: 'invoke', 33665: 'io', 33666: 'ion', 66435: '74', 66436: '74th', 55132: 'spoon', 66441: '75', 66442: '75%', 66443: '75th', 66444: '76', 66445: '76th', 66446: '77', 66451: '777', 66452: '7777', 66453: '77th', 66454: '78', 66455: '789', 63316: 'wasps', 66461: '79', 66462: '79th', 66463: '7:00', 66464: '7:30', 66465: '7th', 66466: '8', 56224: 'tags', 45314: 'pick', 66511: '8%', 66512: '80', 66513: '80%', 66514: '800', 66515: '8000', 66516: '80th', 66521: '81', 66522: '81st', 66523: '82', 66524: '82nd', 66525: '83', 66526: '83rd', 66531: '84', 66532: '84th', 66533: '85', 66534: '85%', 66535: '85th', 66536: '86', 66541: '86th', 66542: '87', 66543: '87th', 66544: '88', 66545: '888', 66546: '8888', 66551: '88th', 66552: '89', 66553: '89th', 66554: '8:00', 66555: '8:30', 66556: '8th', 66561: '9', 66562: '9%', 66563: '9-5', 66564: '90', 31431: 'grace', 66566: '900', 62133: 'ue', 65365: '13', 65665: '35th', 66611: '9000', 66612: '90th', 66613: '91', 66614: '91st', 66615: '92', 66616: '92nd', 66621: '93', 66622: '93rd', 66623: '94', 66624: '94th', 66625: '95', 66626: '95%', 66631: '95th', 64413: 'xs', 66633: '96th', 66634: '97', 66635: '97th', 66636: '98', 66641: '98%', 66642: '98.6', 66643: '9876', 66644: '98th', 66645: '99', 66646: '99%', 66651: '999', 66652: '9999', 66653: '99th', 66654: '9:00', 66655: '9:30', 66656: '9th', 66661: ':', 64414: 'xt', 66663: '=', 66664: '?', 66665: '??', 63323: 'water', 64415: 'xu', 52413: 'sagas', 64416: 'xv', 65366: '13th', 64465: 'yawns', 64421: 'xvi', 34111: 'ions', 34112: 'iota', 34113: 'iou', 34114: 'iowa', 34115: 'ip', 34116: 'iq', 34121: 'ir', 34122: 'ira', 34123: 'iran', 34124: 'iraq', 34125: 'iraqi', 34126: 'irate', 34131: 'ire', 34132: 'irene', 34133: 'iris', 34134: 'irish', 34135: 'irk', 34136: 'irked', 34141: 'irma', 34142: 'iron', 34143: 'irons', 34144: 'irony', 34145: 'irvin', 34146: 'is', 34151: 'isaac', 34152: 'isabel', 34153: 'islam', 34154: 'island', 34155: 'isle', 34156: 'ism', 34161: "isn't", 34162: 'israel', 34163: 'issue', 34164: 'isuzu', 34165: 'it', 34166: "it'd", 65515: '1996', 64424: 'xx', 61151: 'tk', 34211: "it'll", 34212: "it's", 34213: 'italy', 34214: 'itch', 34215: 'itchy', 34216: 'item', 34221: 'items', 34222: 'iu', 34223: 'iud', 34224: 'iv', 34225: 'ivan', 34226: 'ivory', 34231: 'ivy', 34232: 'iw', 34233: 'ix', 34234: 'iy', 34235: 'iz', 34236: 'j', 34241: "j's", 34242: 'ja', 34243: 'jab', 34244: 'jack', 34245: 'jackal', 34246: 'jacob', 45331: 'piers', 34251: 'jade', 34252: 'jaded', 34253: 'jag', 34254: 'jaguar', 34255: 'jail', 34256: 'jam', 34261: 'jamb', 34262: 'james', 34263: 'jan', 34264: 'jane', 34265: 'janet', 34266: 'janis', 62313: 'usia', 66132: '40%', 56421: 'teen', 34311: 'japan', 34312: 'jar', 34313: 'jars', 34314: 'jason', 34315: 'jaunt', 34316: 'java', 34321: 'jaw', 34322: 'jaws', 34323: 'jay', 34324: 'jazz', 34325: 'jazzy', 34326: 'jb', 34331: 'jc', 34332: 'jd', 34333: 'je', 34334: 'jean', 34335: 'jeans', 34336: 'jed', 34341: 'jedi', 34342: 'jeep', 34343: 'jeer', 34344: 'jeers', 34345: 'jeff', 34346: 'jello', 34351: 'jelly', 34352: 'jenny', 34353: 'jerk', 34354: 'jerks', 34355: 'jerky', 34356: 'jerry', 34361: 'jersey', 34362: 'jesse', 34363: 'jest', 34364: 'jesus', 34365: 'jet', 34366: 'jets', 65513: '1994', 64431: 'xy', 34411: 'jew', 34412: 'jewel', 34413: 'jewish', 34414: 'jf', 34415: 'jfk', 34416: 'jg', 34421: 'jh', 34422: 'ji', 34423: 'jiffy', 34424: 'jig', 34425: 'jiggle', 34426: 'jigs', 62314: 'ussr', 34431: 'jill', 34432: 'jilt', 34433: 'jim', 34434: 'jimmy', 34435: 'jinx', 
34436: 'jive', 66133: '400', 34441: 'jj', 34442: 'jjj', 34443: 'jjjj', 34444: 'jk', 34445: 'jkl', 34446: 'jl', 34451: 'jm', 34452: 'jn', 34453: 'jo', 34454: 'joan', 34455: 'job', 34456: 'jobs', 34461: 'jock', 34462: 'jockey', 34463: 'jody', 34464: 'joe', 34465: 'joel', 34466: 'joey', 64434: 'y', 34511: 'jog', 34512: 'jogs', 34513: 'john', 34514: 'join', 34515: 'joins', 34516: 'joint', 34521: 'joke', 34522: 'joker', 34523: 'jokes', 34524: 'jolly', 34525: 'jolt', 34526: 'jonas', 34531: 'jones', 34532: 'jose', 34533: 'josef', 34534: 'josh', 34535: 'joshua', 34536: 'jostle', 34541: 'jot', 34542: 'jots', 34543: 'joust', 34544: 'jove', 34545: 'jowl', 34546: 'jowls', 34551: 'joy', 34552: 'joyce', 34553: 'jp', 34554: 'jq', 34555: 'jr', 34556: 'js', 34561: 'jt', 34562: 'ju', 34563: 'juan', 34564: 'judas', 34565: 'jude', 34566: 'judge', 55163: 'spy', 34611: 'judo', 34612: 'judy', 34613: 'jug', 34614: 'juggle', 34615: 'jugs', 34616: 'juice', 34621: 'juicy', 34622: 'jul', 34623: 'julep', 34624: 'jules', 34625: 'julia', 34626: 'julie', 34631: 'julio', 34632: 'july', 42616: 'ms', 34634: 'jump', 34635: 'jumps', 34636: 'jumpy', 34641: 'jun', 34642: 'june', 34643: 'jung', 34644: 'junk', 34645: 'junky', 34646: 'juno', 34651: 'junta', 34652: 'juror', 34653: 'jury', 34654: 'just', 34655: 'jut', 34656: 'jute', 34661: 'jv', 34662: 'jw', 34663: 'jx', 34664: 'jy', 34665: 'jz', 34666: 'k', 63123: 'vq', 62643: 'vivid', 62425: 'vault', 51111: 'rag', 51112: 'rage', 51113: 'raged', 51114: 'rags', 51115: 'raid', 51116: 'raids', 51121: 'rail', 51122: 'rails', 51123: 'rain', 51124: 'rains', 51125: 'rainy', 51126: 'raise', 51131: 'rake', 51132: 'raked', 51133: 'rakes', 51134: 'rally', 51135: 'ralph', 51136: 'ram', 51141: 'rambo', 51142: 'ramp', 51143: 'rams', 51144: 'ramsey', 51145: 'ran', 51146: 'ranch', 64443: 'yahoo', 51151: 'rand', 51152: 'randy', 51153: 'rang', 51154: 'range', 51155: 'rank', 51156: 'ranks', 55224: 'ssw', 51161: 'rant', 51162: 'rants', 51163: 'raoul', 51164: 'rap', 51165: 'rape', 51166: 'raped', 63141: 'vw', 65535: '2000', 51211: 'rapid', 51212: 'raps', 51213: 'rare', 51214: 'rascal', 51215: 'rash', 51216: 'rat', 62263: 'usage', 51221: 'rate', 51222: 'rated', 51223: 'rates', 51224: 'ratio', 51225: 'rats', 51226: 'rattle', 51231: 'rave', 51232: 'raved', 51233: 'raven', 51234: 'raw', 51235: 'ray', 51236: 'rayon', 51241: 'rays', 51242: 'raze', 51243: 'razor', 51244: 'rb', 51245: 'rc', 51246: 'rd', 51251: 're', 51252: 'reach', 51253: 'read', 51254: 'reads', 51255: 'ready', 51256: 'reagan', 65263: '(r)', 51261: 'real', 51262: 'realm', 51263: 'reap', 51264: 'rear', 51265: 'rebel', 51266: 'rebut', 62265: 'used', 45623: 'pol', 66212: '48', 62513: 'vern', 51311: 'recap', 51312: 'recipe', 51313: 'recur', 51314: 'red', 51315: 'redeem', 51316: 'redo', 51321: 'reduce', 51322: 'reed', 51323: 'reeds', 51324: 'reef', 51325: 'reek', 51326: 'reeks', 51331: 'reel', 51332: 'reels', 51333: 'ref', 51334: 'refer', 51335: 'refs', 51336: 'regal', 51341: 'regs', 51342: 'rehab', 51343: 'reich', 51344: 'reid', 51345: 'reign', 51346: 'rein', 51351: 'reins', 51352: 'reject', 51353: 'relax', 51354: 'relay', 51355: 'relic', 51356: 'rely', 51361: 'rem', 51362: 'remedy', 51363: 'remit', 51364: 'remix', 51365: 'rena', 51366: 'rend', 51411: 'renee', 51412: 'renew', 51413: 'reno', 51414: 'renown', 51415: 'rent', 51416: 'rents', 51421: 'rep', 51422: 'repay', 51423: 'repel', 51424: 'repent', 51425: 'reply', 51426: 'reps', 53541: 'signs', 51431: 'rerun', 51432: 'reset', 51433: 'resin', 51434: 'resort', 51435: 'rest', 51436: 'rests', 51441: 
'retch', 51442: 'return', 51443: 'reuse', 51444: 'rev', 51445: 'reveal', 51446: 'revel', 51451: 'review', 51452: 'rex', 51453: 'rf', 51454: 'rg', 51455: 'rh', 51456: 'rhino', 51461: 'rho', 51462: 'rhoda', 51463: 'rhyme', 51464: 'ri', 51465: 'rib', 51466: 'ribs', 63363: 'wears', 35111: "k's", 35112: 'ka', 35113: 'kafka', 35114: 'kale', 41351: 'maj', 35116: 'kansas', 35121: 'kant', 35122: 'kappa', 35123: 'kaput', 35124: 'karate', 35125: 'karen', 35126: 'karl', 51511: 'rice', 51512: 'rich', 51513: 'rick', 51514: 'ricky', 35131: 'karma', 35132: 'karol', 35133: 'kate', 35134: 'kathy', 35135: 'katie', 35136: 'kay', 51521: 'ride', 51522: 'rider', 51523: 'ridge', 51524: 'rif', 35141: 'kayak', 35142: 'kayo', 35143: 'kazoo', 35144: 'kb', 35145: 'kc', 35146: 'kd', 51531: 'rig', 51532: 'riggs', 51533: 'right', 51534: 'rigid', 51535: 'rigs', 35152: 'keats', 35153: 'kebob', 35154: 'keel', 35155: 'keen', 35156: 'keep', 51541: 'rim', 51542: 'rims', 51543: 'rind', 51544: 'ring', 51545: 'ringo', 35162: 'keg', 35163: 'kegs', 35164: 'keith', 35165: 'kelly', 35166: 'kelp', 51551: 'rink', 51552: 'rinse', 51553: 'rio', 51554: 'riot', 51555: 'riots', 51556: 'rip', 51561: 'ripe', 51562: 'ripen', 51563: 'ripley', 51564: 'rips', 51565: 'rise', 51566: 'risen', 63366: 'web', 65514: '1995', 42635: 'muffin', 35211: 'ken', 35212: 'kennel', 35213: 'kent', 35214: 'kept', 35215: 'kerry', 35216: 'kettle', 35221: 'kevin', 35222: 'key', 35223: 'keyed', 35224: 'keys', 35225: 'kf', 35226: 'kg', 51611: 'risk', 51612: 'risky', 51613: 'rite', 51614: 'ritual', 35231: 'kgb', 51616: 'river', 35233: 'khaki', 35234: 'khan', 35235: 'khz', 35236: 'ki', 51621: 'rivet', 51622: 'rj', 51623: 'rk', 51624: 'rl', 35241: 'kibitz', 35242: 'kick', 35243: 'kicks', 35244: 'kid', 35245: 'kidney', 35246: 'kids', 51631: 'rna', 51632: 'ro', 51633: 'roach', 51634: 'road', 51635: 'roads', 35252: 'kills', 35253: 'kiln', 35254: 'kilo', 35255: 'kilt', 35256: 'kilts', 51641: 'roar', 51642: 'roast', 51643: 'rob', 51644: 'robe', 35261: 'kim', 35262: 'kin', 35263: 'kind', 35264: 'kinds', 35265: 'king', 35266: 'kings', 51651: 'robs', 51652: 'rock', 51653: 'rocket', 51654: 'rocks', 51655: 'rocky', 51656: 'rod', 51661: 'rode', 51662: 'rodeo', 51663: 'rods', 51664: 'roger', 51665: 'rogue', 51666: 'role', 11343: 'adobe', 44112: 'octave', 44113: 'od', 44114: 'odd', 35311: 'kink', 35312: 'kinky', 35313: 'kiosk', 35314: 'kirby', 35315: 'kirk', 35316: 'kiss', 35321: 'kit', 35322: 'kite', 35323: 'kites', 35324: 'kitty', 35325: 'kiwi', 35326: 'kj', 35331: 'kk', 35332: 'kkk', 35333: 'kkkk', 35334: 'kl', 35335: 'klan', 35336: 'klaus', 35341: 'klaxon', 35342: 'klein', 35343: 'klm', 35344: 'klutz', 35345: 'km', 35346: 'kn', 63235: 'wally', 35351: 'knack', 35352: 'knave', 35353: 'knead', 35354: 'knee', 35355: 'kneel', 35356: 'knees', 11354: 'adult', 35361: 'knelt', 35362: 'knew', 44123: 'oe', 35364: 'knight', 35365: 'knit', 35366: 'knits', 44124: 'of', 63242: 'walt', 63243: 'walton', 35411: 'knob', 35412: 'knobs', 35413: 'knock', 35414: 'knot', 35415: 'knots', 35416: 'know', 11364: 'aerial', 35421: 'known', 35422: 'knows', 35423: 'knox', 35424: 'ko', 35425: 'koala', 35426: 'koan', 44134: 'ogle', 35431: 'kodak', 35432: 'kong', 35433: 'kook', 35434: 'kooks', 35435: 'kooky', 35436: 'koran', 35441: 'korea', 35442: 'kp', 35443: 'kq', 35444: 'kr', 35445: 'kraft', 35446: 'kraut', 35451: 'kris', 35452: 'ks', 35453: 'kt', 35454: 'ku', 35455: 'kudo', 35456: 'kudos', 35461: 'kudzu', 35462: 'kurt', 35463: 'kv', 35464: 'kw', 35465: 'kx', 35466: 'ky', 63255: 'warm', 66431: '72', 52335: 'ry', 
63261: 'warn', 35511: 'kz', 35512: 'l', 35513: "l's", 35514: 'la', 35515: 'lab', 35516: 'label', 52341: 'rye', 35521: 'labor', 35522: 'labs', 35523: 'lace', 35524: 'laces', 35525: 'lack', 35526: 'lacks', 63265: 'wars', 35531: 'lacy', 35532: 'lad', 35533: 'ladder', 35534: 'ladle', 35535: 'lads', 35536: 'lady', 35541: 'lag', 35542: 'lager', 35543: 'lagoon', 35544: 'lags', 35545: 'laid', 35546: 'lair', 41423: 'mantle', 35551: 'lake', 35552: 'lakes', 35553: 'lam', 35554: 'lamar', 35555: 'lamb', 35556: 'lambs', 35561: 'lame', 35562: 'lamp', 35563: 'lamps', 35564: 'lana', 35565: 'lance', 35566: 'land', 64613: 'yore', 52353: 'sack', 35611: 'lands', 35612: 'lane', 35613: 'lanky', 35614: 'laos', 35615: 'lap', 35616: 'lapel', 35621: 'laps', 35622: 'lapse', 35623: 'lara', 35624: 'lard', 35625: 'large', 35626: 'lark', 35631: 'larks', 35632: 'larry', 35633: 'larva', 35634: 'larynx', 35635: 'laser', 35636: 'lash', 35641: 'lass', 35642: 'lasso', 35643: 'last', 35644: 'latch', 35645: 'late', 35646: 'later', 56131: 'swish', 35651: 'latest', 35652: 'latex', 35653: 'lathe', 35654: 'latin', 35655: 'laud', 35656: 'laugh', 56321: 'tardy', 35661: 'launch', 35662: 'laura', 35663: 'lava', 35664: 'law', 35665: 'lawn', 35666: 'lawns', 66111: '36th', 65564: '25', 65566: '25th', 52111: 'roll', 52112: 'rolls', 52113: 'roman', 52114: 'rome', 52115: 'romeo', 52116: 'romp', 66112: '37', 52121: 'ron', 52122: 'roof', 52123: 'rook', 52124: 'rookie', 52125: 'room', 52126: 'rooms', 52131: 'roomy', 52132: 'roost', 52133: 'root', 52134: 'roots', 52135: 'rope', 52136: 'rosa', 55111: 'spite', 52141: 'rose', 52142: 'ross', 52143: 'rosy', 52144: 'rot', 52145: 'rote', 52146: 'roth', 66113: '37th', 52151: 'rots', 52152: 'rouge', 52153: 'rough', 52154: 'round', 52155: 'rouse', 52156: 'rout', 55114: 'splat', 52161: 'route', 52162: 'rover', 52163: 'row', 52164: 'rowdy', 52165: 'rows', 52166: 'roy', 66114: '38', 54112: 'skit', 55121: 'spoil', 55122: 'spoke', 52211: 'royal', 52212: 'rp', 52213: 'rpg', 52214: 'rq', 52215: 'rr', 52216: 'rrr', 55124: 'spoof', 52221: 'rrrr', 52222: 'rs', 52223: 'rst', 52224: 'rsvp', 52225: 'rt', 52226: 'ru', 52231: 'rub', 52232: 'rube', 52233: 'rubs', 52234: 'ruby', 52235: 'rude', 52236: 'rudy', 52241: 'rufus', 52242: 'rug', 52243: 'rugged', 52244: 'rugs', 52245: 'ruin', 52246: 'ruins', 52251: 'rule', 52252: 'ruler', 52253: 'rules', 52254: 'rum', 52255: 'rummy', 52256: 'rumor', 55131: 'spool', 52261: 'rump', 52262: 'rumpus', 52263: 'run', 52264: 'rune', 52265: 'runes', 52266: 'rung', 55133: 'spore', 11443: 'aglow', 55134: 'sport', 44212: 'ole', 11445: 'agnew', 44214: 'olive', 63415: 'weds', 63331: 'waved', 52311: 'runs', 52312: 'runt', 52313: 'runway', 52314: 'rural', 52315: 'ruse', 52316: 'rush', 52321: 'russ', 52322: 'rust', 52323: 'rusts', 52324: 'rusty', 52325: 'rut', 52326: 'ruth', 52331: 'ruts', 52332: 'rv', 52333: 'rw', 52334: 'rx', 44221: 'omaha', 52336: 'ryan', 11454: 'aha', 52342: 'rz', 52343: 's', 52344: "s's", 52345: 'sa', 52346: 'saber', 44223: 'omen', 52351: 'sable', 52352: 'sac', 11456: 'ahead', 52354: 'sacks', 52355: 'sacred', 52356: 'sad', 52361: 'saddle', 52362: 'sadly', 52363: 'safari', 52364: 'safe', 52365: 'safer', 52366: 'safes', 66666: '@', 66121: '39th', 11463: 'ahoy', 11464: 'ai', 44233: 'one', 52411: 'sag', 52412: 'saga', 44234: 'onion', 52414: 'sage', 52415: 'sags', 52416: 'said', 66122: '3:00', 52421: 'sail', 52422: 'sails', 52423: 'saint', 52424: 'sake', 52425: 'sal', 52426: 'salad', 55211: 'squid', 52431: 'salami', 52432: 'sale', 52433: 'sales', 52434: 'salk', 52435: 'sally', 
52436: 'salon', 52441: 'salt', 52442: 'salts', 52443: 'salty', 52444: 'salvo', 52445: 'sam', 52446: 'same', 52451: 'sammy', 52452: 'samuel', 52453: 'sand', 52454: 'sandal', 52455: 'sands', 52456: 'sandy', 55212: 'squint', 52461: 'sane', 52462: 'sang', 52463: 'sank', 52464: 'santa', 52465: 'sap', 52466: 'sappy', 41515: 'masks', 41516: 'mason', 36111: 'laws', 36112: 'lawson', 36113: 'lax', 36114: 'lay', 36115: 'layer', 36116: 'layla', 36121: 'lays', 36122: 'lazy', 36123: 'lb', 36124: 'lbj', 36125: 'lbs', 36126: 'lc', 52511: 'saps', 52512: 'sara', 52513: 'sarah', 52514: 'saran', 36131: 'lcd', 36132: 'ld', 36133: 'le', 36134: 'lead', 36135: 'leads', 36136: 'leaf', 52521: 'sat', 52522: 'satan', 52523: 'satin', 52524: 'sauce', 36141: 'leafy', 36142: 'leah', 36143: 'leak', 36144: 'leaks', 36145: 'leaky', 36146: 'lean', 52531: 'saul', 52532: 'sauna', 52533: 'saute', 52534: 'save', 36151: 'leap', 36152: 'leaps', 36153: 'lear', 36154: 'learn', 36155: 'leary', 36156: 'lease', 52541: 'savvy', 52542: 'saw', 41525: 'match', 52544: 'sawyer', 36161: 'leash', 36162: 'least', 36163: 'leave', 36164: 'led', 41526: 'mate', 36166: 'ledge', 52551: 'says', 52552: 'sb', 52553: 'sc', 52554: 'scab', 52555: 'scald', 52556: 'scale', 52561: 'scalp', 52562: 'scam', 52563: 'scamp', 52564: 'scan', 52565: 'scans', 52566: 'scar', 36211: 'lee', 36212: 'leech', 36213: 'leer', 36214: 'leers', 36215: 'leery', 36216: 'leeway', 41535: 'matt', 36221: 'left', 36222: 'lefty', 36223: 'leg', 36224: 'legacy', 41536: 'matzo', 36226: 'legion', 52611: 'scare', 52612: 'scarf', 52613: 'scars', 52614: 'scary', 36231: 'legs', 36232: 'lei', 36233: 'lemon', 36234: 'len', 36235: 'lend', 36236: 'lends', 52621: 'scent', 52622: 'school', 52623: 'scoff', 52624: 'scold', 36241: 'length', 36242: 'lenin', 36243: 'lenny', 36244: 'lens', 36245: 'lent', 36246: 'leo', 52631: 'scope', 52632: 'scorch', 52633: 'score', 52634: 'scorn', 36251: 'leon', 36252: 'leona', 36253: 'leper', 36254: 'leroy', 36255: 'less', 36256: 'lest', 52641: 'scour', 52642: 'scout', 52643: 'scow', 52644: 'scowl', 36261: 'let', 36262: "let's", 36263: 'lets', 36264: 'letter', 36265: 'levee', 36266: 'level', 52651: 'scrape', 52652: 'screw', 52653: 'scrip', 52654: 'scrod', 52655: 'scrub', 52656: 'scuba', 52661: 'scuff', 52662: 'scum', 41545: 'maw', 52664: 'sd', 52665: 'sdi', 52666: 'se', 41546: 'max', 66123: '3:30', 66124: '3rd', 36311: 'lever', 36312: 'levis', 36313: 'levy', 36314: 'lewd', 36315: 'lewis', 36316: 'lf', 36321: 'lg', 36322: 'lh', 36323: 'li', 36324: 'liar', 36325: 'liars', 36326: 'lib', 36331: 'libel', 36332: 'libido', 36333: 'libya', 36334: 'lice', 36335: 'lick', 36336: 'licks', 66131: '40', 36341: 'lid', 36342: 'lids', 36343: 'lie', 36344: 'lied', 36345: 'lien', 36346: 'lies', 36351: 'lieu', 36352: 'lieut', 36353: 'life', 36354: 'lift', 36355: 'light', 36356: 'like', 66134: '4000', 36361: 'liked', 36362: 'likes', 36363: 'lil', 36364: 'lilac', 36365: 'lilt', 36366: 'lily', 55213: 'squirm', 55214: 'sr', 55222: 'ssss', 62315: 'usual', 54132: 'slams', 36411: 'lima', 36412: 'limb', 36413: 'limbo', 36414: 'limbs', 36415: 'lime', 36416: 'limit', 36421: 'limp', 36422: 'limps', 36423: 'linda', 36424: 'line', 36425: 'linen', 36426: 'lines', 55223: 'sst', 36431: 'lingo', 36432: 'link', 36433: 'lint', 36434: 'linus', 36435: 'lion', 36436: 'lip', 36441: 'lips', 36442: 'liquid', 36443: 'lira', 36444: 'lisa', 36445: 'lisp', 36446: 'list', 36451: 'listen', 36452: 'lists', 36453: 'liszt', 36454: 'lit', 36455: 'litton', 36456: 'live', 36461: 'liver', 36462: 'livid', 36463: 'liz', 36464: 
'liza', 36465: 'lizzie', 36466: 'lj', 55231: 'stabs', 55232: 'stack', 55233: 'stacy', 44311: 'optic', 55234: 'staff', 44312: 'opus', 62546: 'vibes', 11545: 'alarm', 44314: 'or', 36511: 'lk', 36512: 'll', 36513: 'lll', 36514: 'llll', 36515: 'lloyd', 36516: 'lm', 36521: 'lmn', 36522: 'ln', 36523: 'lo', 36524: 'load', 36525: 'loaf', 36526: 'loam', 36531: 'loamy', 36532: 'loan', 36533: 'lob', 36534: 'lobby', 36535: 'lobe', 36536: 'lobs', 36541: 'local', 36542: 'loch', 36543: 'lock', 36544: 'locks', 36545: 'lode', 36546: 'lodge', 11553: 'alden', 36552: 'lofty', 36553: 'log', 36554: 'logan', 36555: 'logic', 36556: 'logo', 44322: 'orbs', 36561: 'logs', 36562: 'loin', 44323: 'orchid', 36564: 'lois', 36565: 'loiter', 36566: 'loki', 11556: 'aleck', 63442: 'wendy', 12355: 'arab', 63413: 'wed', 36611: 'lola', 36612: 'loll', 36613: 'lone', 36614: 'loner', 36615: 'long', 36616: 'longs', 44332: 'ornery', 36621: 'look', 36622: 'looks', 36623: 'loom', 36624: 'loon', 36625: 'loony', 36626: 'loop', 44334: 'os', 36631: 'loose', 36632: 'loot', 36633: 'lop', 36634: 'lopez', 36635: 'lops', 36636: 'lord', 36641: 'lore', 36642: 'loren', 36643: 'lose', 36644: 'loser', 36645: 'loses', 36646: 'loss', 36651: 'lost', 36652: 'lot', 36653: 'lots', 36654: 'lotto', 36655: 'lotus', 36656: 'lou', 56655: 'tidy', 36661: 'loud', 36662: 'louis', 36663: 'louise', 36664: 'louse', 36665: 'lousy', 36666: 'lout', 56323: 'tarp', 52535: 'saved', 52536: 'saves', 63416: 'wee', 13665: 'bess', 41615: 'meadow', 41616: 'meal', 63462: 'wham', 63463: 'wharf', 63464: 'what', 53111: 'sea', 53112: 'seal', 53113: 'seals', 53114: 'seam', 52543: 'saws', 53116: 'seamy', 63466: 'whee', 53121: 'sean', 53122: 'sear', 53123: 'sears', 53124: 'seas', 53125: 'season', 53126: 'seat', 52545: 'sax', 53131: 'seats', 53132: 'sect', 52546: 'say', 53134: 'sedan', 53135: 'seduce', 53136: 'see', 53141: 'seed', 53142: 'seeds', 41625: 'meat', 53144: 'seek', 53145: 'seeks', 53146: 'seem', 41626: 'meaty', 36165: 'leda', 53152: 'seen', 53153: 'seep', 53154: 'seer', 53155: 'seers', 53156: 'sees', 53161: 'seethe', 53162: 'seize', 53163: 'self', 53164: 'sell', 53165: 'sells', 53166: 'semen', 41635: 'medley', 41636: 'meek', 53211: 'semi', 53212: 'send', 53213: 'sends', 53214: 'sense', 53215: 'sent', 53216: 'sentry', 53221: 'sep', 53222: 'sepia', 53223: 'sequel', 53224: 'sequin', 53225: 'serb', 53226: 'serf', 53231: 'serum', 53232: 'serve', 53233: 'servo', 53234: 'set', 53235: 'seth', 53236: 'sets', 53241: 'setup', 53242: 'seven', 53243: 'sever', 53244: 'severe', 53245: 'sew', 53246: 'sewed', 53251: 'sewer', 53252: 'sewn', 53253: 'sews', 53254: 'sex', 53255: 'sexy', 53256: 'sf', 53261: 'sg', 53262: 'sgt', 41645: 'melee', 53264: 'shack', 53265: 'shade', 53266: 'shady', 41646: 'mellow', 66223: '5', 66224: '5%', 64514: 'yea', 63423: 'week', 53311: 'shaft', 53312: 'shaggy', 53313: 'shake', 53314: 'shaken', 53315: 'shaky', 53316: 'shall', 53321: 'sham', 53322: 'shame', 53323: 'shank', 53324: 'shape', 53325: 'share', 53326: 'shari', 66232: '500', 53331: 'shark', 53332: 'sharp', 53333: 'shave', 53334: 'shaw', 53335: 'shawl', 53336: 'she', 55311: 'stat', 53341: "she'd", 53342: "she's", 53343: 'shea', 53344: 'sheaf', 53345: 'shear', 53346: 'sheath', 53351: 'shed', 53352: 'sheds', 53353: 'sheep', 53354: 'sheer', 53355: 'sheet', 53356: 'sheik', 55314: 'statue', 53361: 'shelf', 53362: 'shell', 53363: 'shh', 53364: 'shift', 53365: 'shifty', 53366: 'shin', 44331: 'orgy', 52515: 'sase', 63511: 'wheel', 45423: 'pitch', 55321: 'steady', 55322: 'steak', 53411: 'shine', 53412: 'shins', 53413: 
'shiny', 53414: 'ship', 52516: 'sash', 53416: 'shirk', 55324: 'steam', 53421: 'shirt', 53422: 'shock', 53423: 'shoe', 53424: 'shoes', 53425: 'shone', 53426: 'shoo', 44333: 'orphan', 53431: 'shook', 53432: 'shoot', 53433: 'shop', 53434: 'shops', 53435: 'shore', 53436: 'short', 53441: 'shot', 53442: 'shots', 53443: 'shout', 53444: 'shove', 53445: 'show', 53446: 'shown', 53451: 'shows', 53452: 'shrank', 53453: 'shred', 53454: 'shrew', 53455: 'shriek', 53456: 'shrub', 55331: 'steep', 53461: 'shrug', 53462: 'shuck', 53463: 'shun', 53464: 'shut', 53465: 'shuts', 53466: 'shy', 66255: '55th', 55333: 'stein', 44411: 'ovals', 55334: 'stella', 65223: 'zs', 11644: 'alma', 11645: 'almost', 44414: 'ovens', 61414: 'trap', 36225: 'legal', 53512: 'si', 53513: 'sic', 53514: 'sick', 53515: 'sicko', 53516: 'sid', 65613: '27', 53521: 'side', 53522: 'siege', 53523: 'siesta', 53524: 'sieve', 53525: 'sift', 53526: 'sifts', 53531: 'sigh', 53532: 'sighs', 53533: 'sight', 53534: 'sigma', 44421: 'ow', 53536: 'signal', 44422: 'owe', 53542: 'silk', 53543: 'silks', 53544: 'silky', 53545: 'sill', 53546: 'silly', 11655: 'aloof', 53551: 'silo', 53552: 'silt', 44424: 'owens', 53554: 'simms', 53555: 'simon', 53556: 'simons', 53561: 'sims', 53562: 'sin', 53563: 'since', 53564: 'sinew', 53565: 'sing', 53566: 'sings', 65615: '28', 63433: 'welch', 53613: 'sins', 65364: '12th', 11663: 'alps', 44432: 'owls', 11665: 'alsop', 53611: 'sink', 53612: 'sinks', 44434: 'owned', 53614: 'sinus', 53615: 'sip', 53616: 'sips', 53621: 'sir', 53622: 'sire', 53623: 'siren', 53624: 'sis', 53625: 'sit', 53626: 'site', 53631: 'sites', 53632: 'sits', 53633: 'six', 53634: 'sixgun', 45431: 'pity', 53636: 'sixty', 53641: 'size', 53642: 'sizes', 53643: 'sj', 53644: 'sk', 53645: 'skate', 53646: 'skew', 53651: 'ski', 53652: 'skid', 53653: 'skids', 53654: 'skies', 53655: 'skill', 53656: 'skim', 53661: 'skimpy', 53662: 'skims', 53663: 'skin', 53664: 'skip', 53665: 'skips', 53666: 'skirt', 52635: 'scot', 64513: 'ye', 52636: 'scott', 52525: 'saucy', 45433: 'pixel', 63563: 'wield', 63564: 'wife', 63565: 'wig', 52526: 'saudi', 63566: 'wigs', 52645: 'scram', 52646: 'scrap', 64625: 'yoyo', 64516: 'year', 66311: '58th', 66312: '59', 66313: '59th', 66314: '5:00', 64666: 'zaps', 52663: 'scurry', 66321: '6', 66322: '6%', 66323: '60', 21111: 'clod', 21112: 'clog', 21113: 'clone', 21114: 'close', 21115: 'closet', 21116: 'clot', 21121: 'cloth', 21122: 'cloud', 21123: 'clout', 21124: 'clove', 21125: 'clown', 21126: 'cloy', 21131: 'club', 21132: 'clubs', 21133: 'cluck', 21134: 'clue', 21135: 'clues', 21136: 'clump', 21141: 'clumsy', 21142: 'clung', 21143: 'clyde', 21144: 'cm', 21145: 'cn', 21146: 'co', 21151: 'co2', 21152: 'coach', 21153: 'coal', 21154: 'coast', 21155: 'coat', 21156: 'coats', 21161: 'coax', 21162: 'cob', 21163: 'cobble', 21164: 'cobol', 21165: 'cobra', 21166: 'coca', 66333: '61st', 55411: 'stone', 66334: '62', 55412: 'stony', 55413: 'stood', 61263: 'topaz', 55414: 'stool', 63446: 'wes', 21211: 'cock', 21212: 'cockle', 21213: 'cocky', 21214: 'cocoa', 21215: 'cod', 21216: 'coda', 21221: 'coddle', 21222: 'code', 21223: 'coded', 21224: 'codes', 21225: 'cody', 21226: 'coed', 21231: 'cog', 21232: 'cogent', 21233: 'cogs', 21234: 'cohen', 21235: 'coif', 21236: 'coil', 55422: 'store', 21241: 'coils', 21242: 'coin', 21243: 'coins', 21244: 'coke', 21245: 'cola', 21246: 'colby', 55424: 'storm', 21251: 'cold', 21252: 'cole', 21253: 'colon', 21254: 'colony', 21255: 'color', 21256: 'colt', 21261: 'coma', 21262: 'comb', 21263: 'combat', 21264: 'combo', 21265: 'come', 
21266: 'comet', 61266: 'tops', 55431: 'stout', 55432: 'stove', 55433: 'stow', 55434: 'strafe', 21311: 'comfy', 21312: 'comic', 21313: 'comma', 21314: 'con', 21315: 'conch', 21316: 'condo', 21321: 'cone', 21322: 'coney', 21323: 'congo', 21324: 'conic', 21325: 'convex', 21326: 'convoy', 21331: 'conway', 21332: 'coo', 21333: 'cook', 21334: 'cooky', 21335: 'cool', 21336: 'coon', 21341: 'coop', 21342: 'cooper', 21343: 'coors', 21344: 'coos', 21345: 'coot', 21346: 'cop', 54115: 'skull', 54116: 'skunk', 21351: 'cope', 21352: 'copes', 21353: 'copper', 21354: 'copra', 21355: 'cops', 21356: 'copy', 54125: 'slack', 54126: 'slain', 21361: 'coral', 21362: 'cord', 21363: 'cords', 21364: 'core', 21365: 'cork', 21366: 'corn', 54135: 'slap', 54136: 'slaps', 54141: 'slash', 54142: 'slate', 54143: 'slater', 54144: 'slave', 54145: 'slaw', 54146: 'slay', 54151: 'sled', 54152: 'sleds', 54153: 'sleek', 54154: 'sleep', 54155: 'sleet', 54156: 'slept', 54161: 'slew', 54162: 'slice', 54163: 'slick', 54164: 'slid', 54165: 'slide', 54166: 'slim', 21411: 'corny', 21412: 'corp', 21413: 'corps', 21414: 'cortex', 21415: 'cost', 21416: 'costs', 21421: 'cot', 21422: 'couch', 21423: 'cough', 21424: 'could', 21425: 'count', 21426: 'coup', 21431: 'coupe', 21432: 'court', 21433: 'cousin', 21434: 'cove', 21435: 'coven', 21436: 'cover', 21441: 'covet', 21442: 'cow', 21443: 'cowboy', 21444: 'cowl', 21445: 'cows', 21446: 'cox', 54215: 'slips', 54216: 'slit', 21451: 'coy', 21452: 'coyote', 21453: 'cozy', 21454: 'cp', 21455: 'cpa', 21456: 'cpr', 54225: 'slop', 54226: 'slope', 21461: 'cpu', 21462: 'cq', 21463: 'cr', 21464: 'crab', 21465: 'crack', 21466: 'craft', 54235: 'sloth', 54236: 'slots', 54241: 'slow', 54242: 'slows', 54243: 'slug', 54244: 'slugs', 54245: 'slum', 54246: 'slump', 54251: 'slums', 54252: 'slung', 54253: 'slur', 54254: 'slurp', 54255: 'slurs', 54256: 'sly', 54261: 'slyly', 54262: 'sm', 54263: 'smack', 54264: 'small', 54265: 'smart', 54266: 'smash', 21511: 'crag', 21512: 'craig', 21513: 'cram', 21514: 'cramp', 21515: 'crane', 21516: 'crank', 21521: 'crap', 21522: 'craps', 21523: 'crash', 21524: 'crass', 21525: 'crate', 21526: 'crater', 21531: 'crave', 21532: 'crawl', 21533: 'craze', 21534: 'crazy', 21535: 'creak', 21536: 'cream', 63664: 'wisp', 21541: 'credit', 21542: 'credo', 21543: 'creed', 21544: 'creek', 21545: 'creep', 21546: 'creole', 54315: 'smith', 54316: 'smock', 63666: 'wit', 21551: 'crepe', 21552: 'crept', 21553: 'cress', 21554: 'crest', 21555: 'crete', 21556: 'crew', 54325: 'smug', 54326: 'smut', 21561: 'crib', 21562: 'cried', 21563: 'crime', 21564: 'crimp', 21565: 'crisp', 21566: 'croak', 54335: 'snail', 54336: 'snake', 54341: 'snap', 54342: 'snaps', 54343: 'snare', 54344: 'snarl', 54345: 'snatch', 54346: 'sneak', 54351: 'sneer', 54352: 'sniff', 54353: 'snip', 54354: 'snipe', 54355: 'snob', 54356: 'snobs', 54361: 'snoop', 54362: 'snore', 54363: 'snort', 54364: 'snot', 54365: 'snout', 54366: 'snow', 14523: 'bonus', 21611: 'crock', 21612: 'crocus', 21613: 'crone', 21614: 'crony', 21615: 'crook', 21616: 'croon', 21621: 'crop', 21622: 'crops', 21623: 'cross', 21624: 'crow', 21625: 'crowd', 21626: 'crown', 21631: 'crows', 21632: 'crt', 21633: 'crud', 21634: 'crude', 21635: 'cruel', 21636: 'crumb', 21641: 'crunch', 21642: 'crush', 21643: 'crust', 21644: 'crux', 21645: 'cry', 21646: 'crypt', 54415: 'snuff', 54416: 'snug', 21651: 'cs', 21652: 'ct', 21653: 'cu', 21654: 'cub', 21655: 'cuba', 21656: 'cuban', 54425: 'soapy', 54426: 'soar', 21661: 'cube', 21662: 'cubic', 21663: 'cubs', 21664: 'cud', 21665: 'cuddle', 
21666: 'cue', 54435: 'social', 54436: 'sock', 54441: 'socks', 54442: 'sod', 14535: 'boom', 54444: 'sofa', 54445: 'soft', 54446: 'soften', 54451: 'soggy', 54452: 'soil', 54453: 'soils', 54454: 'sol', 54455: 'solar', 54456: 'sold', 54461: 'sole', 54462: 'solemn', 54463: 'solid', 54464: 'solo', 54465: 'solve', 54466: 'somber', 66422: '700', 66423: '7000', 14541: 'boone', 66424: '70th', 66426: '71st', 54511: 'some', 54512: 'son', 54513: 'sonar', 54514: 'song', 54515: 'songs', 54516: 'sonny', 54521: 'sons', 54522: 'sony', 54523: 'soon', 54524: 'soot', 54525: 'sop', 54526: 'sore', 66432: '72nd', 54531: 'sorry', 54532: 'sort', 54533: 'sorts', 54534: 'sos', 54535: 'sot', 54536: 'soul', 54541: 'sound', 54542: 'soup', 54543: 'soupy', 54544: 'sour', 54545: 'source', 54546: 'south', 63465: 'wheat', 54551: 'sow', 54552: 'sown', 54553: 'sows', 54554: 'sox', 54555: 'soy', 54556: 'soyuz', 54561: 'sp', 54562: 'spa', 54563: 'space', 54564: 'spade', 54565: 'spain', 54566: 'spam', 54611: 'span', 54612: 'spank', 54613: 'spans', 54614: 'spar', 54615: 'spare', 54616: 'spark', 54621: 'sparks', 54622: 'spas', 54623: 'spasm', 54624: 'spat', 54625: 'spawn', 54626: 'spay', 54631: 'speak', 54632: 'spear', 54633: 'spec', 54634: 'speck', 54635: 'sped', 54636: 'speed', 54641: 'spell', 54642: 'spend', 54643: 'spent', 54644: 'sperm', 54645: 'spew', 54646: 'sphinx', 54651: 'spice', 54652: 'spicy', 54653: 'spies', 54654: 'spike', 54655: 'spiky', 54656: 'spill', 54661: 'spin', 54662: 'spine', 54663: 'spins', 54664: 'spiny', 54665: 'spire', 54666: 'spit', 66456: '78th', 64563: 'yodel', 56121: 'swim', 64564: 'yoga', 64566: 'yoke', 65112: 'zc', 22111: 'cues', 22112: 'cuff', 22113: 'cull', 22114: 'cult', 22115: 'cults', 22116: 'cup', 65113: 'zd', 22121: 'cupful', 22122: 'cupid', 22123: 'cups', 22124: 'cur', 22125: 'curb', 22126: 'curd', 22131: 'cure', 22132: 'cured', 22133: 'curfew', 22134: 'curie', 22135: 'curio', 22136: 'curl', 22141: 'curls', 22142: 'curry', 22143: 'curse', 22144: 'curt', 22145: 'curve', 22146: 'cusp', 65114: 'ze', 22151: 'cuss', 22152: 'cut', 22153: 'cute', 22154: 'cutlet', 22155: 'cuts', 22156: 'cv', 56122: 'swims', 22161: 'cw', 22162: 'cx', 22163: 'cy', 22164: 'cycle', 22165: 'cynic', 22166: 'cyrus', 22211: 'cyst', 22212: 'cz', 22213: 'czar', 22214: 'czech', 22215: 'd', 22216: 'd&d', 22221: "d's", 22222: 'd-day', 22223: 'da', 22224: 'dab', 22225: 'dad', 22226: 'daddy', 22231: 'daffy', 22232: 'daft', 22233: 'dagger', 22234: 'dahlia', 22235: 'daily', 22236: 'dairy', 22241: 'dais', 22242: 'daisy', 22243: 'dale', 22244: 'dally', 22245: 'dam', 22246: 'dame', 22251: 'damn', 22252: 'damon', 22253: 'damp', 22254: 'damsel', 22255: 'dan', 22256: 'dana', 22261: 'dance', 22262: 'dandy', 22263: 'dane', 22264: 'dang', 22265: 'dank', 22266: 'danny', 22311: 'dante', 22312: 'dare', 22313: 'dared', 22314: 'dares', 22315: 'dark', 22316: 'darken', 22321: 'darn', 22322: 'dart', 22323: 'darts', 22324: 'darwin', 22325: 'daryl', 22326: 'dash', 22331: 'data', 22332: 'date', 22333: 'dates', 22334: 'datum', 22335: 'daub', 22336: 'daunt', 22341: 'dave', 22342: 'david', 22343: 'davis', 22344: 'davy', 22345: 'dawn', 22346: 'day', 55115: 'split', 55116: 'spock', 22351: 'days', 22352: 'daze', 22353: 'dazed', 22354: 'db', 22355: 'dbms', 22356: 'dc', 55125: 'spook', 55126: 'spooky', 65121: 'zebra', 22361: 'dd', 22362: 'ddd', 22363: 'dddd', 22364: 'dds', 22365: 'ddt', 22366: 'de', 55135: 'spot', 55136: 'spots', 55141: 'spout', 55142: 'sprain', 55143: 'spray', 55144: 'spree', 55145: 'sprig', 55146: 'spruce', 55151: 'spry', 55152: 'spud', 
55153: 'spun', 55154: 'spunk', 55155: 'spur', 55156: 'spurn', 65122: 'zeke', 55161: 'spurs', 55162: 'spurt', 14655: 'boys', 55164: 'sq', 55165: 'squad', 55166: 'squat', 64515: 'yeah', 22411: 'deacon', 22412: 'dead', 22413: 'deaf', 22414: 'deal', 22415: 'deals', 22416: 'dealt', 22421: 'dean', 22422: 'dear', 22423: 'death', 22424: 'debby', 22425: 'debit', 22426: 'debra', 22431: 'debris', 22432: 'debt', 22433: 'debts', 22434: 'debug', 22435: 'debut', 22436: 'dec', 22441: 'decal', 22442: 'decay', 22443: 'deck', 22444: 'decor', 22445: 'decoy', 22446: 'decree', 55215: 'ss', 55216: 'sse', 22451: 'decry', 22452: 'dee', 22453: 'deed', 22454: 'deeds', 22455: 'deejay', 22456: 'deem', 55225: 'st', 55226: 'stab', 22461: 'deep', 22462: 'deer', 22463: 'def', 22464: 'defect', 22465: 'defer', 22466: 'deform', 55235: 'stag', 55236: 'stage', 55241: 'stain', 55242: 'stair', 55243: 'stake', 55244: 'stale', 55245: 'stalk', 55246: 'stall', 55251: 'stamp', 55252: 'stan', 55253: 'stance', 55254: 'stand', 55255: 'stank', 55256: 'star', 55261: 'stare', 55262: 'stark', 55263: 'starr', 55264: 'stars', 55265: 'start', 55266: 'stash', 55221: 'sss', 22511: 'deft', 22512: 'defy', 22513: 'deify', 22514: 'deity', 22515: 'del', 22516: 'delay', 22521: 'delhi', 22522: 'deli', 22523: 'delia', 22524: 'della', 22525: 'delta', 22526: 'deluxe', 22531: 'delve', 22532: 'demo', 22533: 'demon', 22534: 'demur', 22535: 'den', 22536: 'denial', 22541: 'denim', 22542: 'denny', 22543: 'dense', 22544: 'dent', 22545: 'dents', 22546: 'deny', 55315: 'stay', 55316: 'stays', 22551: 'depot', 22552: 'dept', 22553: 'depth', 22554: 'deputy', 22555: 'derby', 22556: 'derek', 55325: 'steed', 55326: 'steel', 66565: '90%', 22561: 'desist', 22562: 'desk', 22563: 'desks', 22564: 'detach', 22565: 'deter', 22566: 'detox', 55335: 'stem', 55336: 'stems', 55341: 'step', 55342: 'steps', 55343: 'stern', 55344: 'steve', 55345: 'stew', 55346: 'stick', 55351: 'stiff', 55352: 'still', 55353: 'sting', 55354: 'stingy', 55355: 'stink', 55356: 'stint', 55361: 'stir', 55362: 'stirs', 55363: 'stock', 55364: 'stoke', 55365: 'stole', 55366: 'stomp', 22611: 'deuce', 22612: 'devil', 22613: 'devoid', 22614: 'dew', 22615: 'dewey', 22616: 'dewy', 22621: 'df', 22622: 'dg', 22623: 'dh', 22624: 'di', 22625: 'dial', 22626: 'dials', 22631: 'diana', 22632: 'diane', 22633: 'diaper', 22634: 'diary', 22635: 'dibs', 22636: 'dice', 22641: 'dick', 22642: 'did', 22643: 'die', 22644: 'died', 22645: 'diego', 22646: 'dies', 55415: 'stoop', 55416: 'stop', 22651: 'diesel', 22652: 'diet', 22653: 'diets', 22654: 'dig', 22655: 'digit', 22656: 'digs', 55425: 'stormy', 55426: 'story', 22661: 'dike', 22662: 'dilate', 22663: 'dill', 22664: 'dim', 22665: 'dime', 22666: 'dimes', 55435: 'strap', 55436: 'straw', 55441: 'stray', 55442: 'strep', 55443: 'strike', 55444: 'strip', 55445: 'stroll', 55446: 'strum', 55451: 'strut', 55452: 'stu', 55453: 'stuart', 55454: 'stub', 55455: 'stuck', 55456: 'stud', 65132: 'zf', 55461: 'study', 55312: 'state', 55463: 'stuffy', 55464: 'stump', 55465: 'stun', 55466: 'stung', 36551: 'loft', 55313: 'stats', 55511: 'stunk', 55512: 'stuns', 55513: 'stunt', 55514: 'sty', 55515: 'style', 55516: 'styx', 65134: 'zh', 55521: 'su', 55522: 'suave', 55523: 'sub', 55524: 'subs', 55525: 'subtle', 55526: 'such', 55531: 'suck', 55532: 'sucks', 55533: 'suds', 55534: 'sue', 55535: 'sued', 55536: 'suede', 36563: 'loins', 55541: 'sues', 55542: 'suey', 55543: 'sugar', 55544: 'suit', 55545: 'suite', 55546: 'suits', 55551: 'sulk', 55552: 'sulks', 55553: 'sulky', 55554: 'sultry', 55555: 'sum', 55556: 
'sumac', 55561: 'summon', 55562: 'sumo', 55563: 'sums', 55564: 'sun', 55565: 'sung', 55566: 'sunk', 46431: 'pus', 55611: 'sunny', 55612: 'suns', 55613: 'sunset', 55614: 'sunup', 55615: 'sup', 55616: 'super', 61364: 'tracy', 55621: 'supt', 55622: 'sure', 55623: 'surf', 55624: 'surge', 55625: 'susan', 55626: 'sushi', 55631: 'susie', 55632: 'sutton', 55633: 'suzy', 55634: 'sv', 55635: 'sven', 55636: 'sw', 55641: 'swab', 55642: 'swag', 55643: 'swam', 55644: 'swami', 55645: 'swamp', 55646: 'swampy', 55651: 'swan', 55652: 'swank', 55653: 'swans', 55654: 'swap', 55655: 'swarm', 55656: 'swat', 65214: 'zoos', 55661: 'sway', 55662: 'sways', 55663: 'swear', 55664: 'sweat', 55665: 'sweaty', 55666: 'swede', 66231: '50%', 56411: 'tear', 62413: 'vans', 56412: 'tease', 66632: '96', 62414: 'vapor', 66233: '5000', 62415: 'vary', 66234: '50th', 55323: 'steal', 62416: 'vase', 23111: 'dimly', 23112: 'dims', 23113: 'din', 23114: 'dinah', 23115: 'dine', 23116: 'diner', 62566: 'viii', 23121: 'ding', 23122: 'dingo', 23123: 'dingy', 23124: 'dint', 23125: 'diode', 23126: 'dip', 23131: 'dips', 23132: 'dire', 23133: 'dirge', 23134: 'dirk', 23135: 'dirt', 23136: 'dirty', 31163: 'glance', 23141: 'disc', 23142: 'disco', 23143: 'dish', 23144: 'disk', 23145: 'disney', 23146: 'ditch', 23151: 'ditto', 23152: 'ditty', 23153: 'diva', 23154: 'divan', 23155: 'dive', 23156: 'dives', 23161: 'divot', 23162: 'dixie', 23163: 'dizzy', 23164: 'dj', 23165: 'dk', 23166: 'dl', 23211: 'dm', 23212: 'dn', 23213: 'dna', 23214: 'do', 23215: 'dobro', 23216: 'doc', 63513: 'where', 23221: 'dock', 23222: 'docket', 23223: 'doctor', 23224: 'dodge', 23225: 'dodo', 23226: 'doe', 23231: 'does', 23232: 'doff', 23233: 'dog', 23234: 'dogma', 23235: 'dogs', 23236: 'doily', 23241: 'doing', 23242: 'dolby', 23243: 'dole', 23244: 'doll', 23245: 'dolly', 23246: 'dolt', 63514: 'whew', 23251: 'dome', 23252: 'domed', 23253: 'domino', 23254: 'don', 23255: "don't", 23256: 'done', 23261: 'donna', 23262: 'donor', 23263: 'donut', 23264: 'doom', 23265: 'door', 23266: 'dope', 63515: 'which', 55332: 'steer', 63516: 'whiff', 23311: 'dopey', 23312: 'dora', 23313: 'doris', 23314: 'dorm', 23315: 'dose', 23316: 'dot', 53511: 'shyly', 23321: 'dote', 23322: 'dots', 23323: 'double', 23324: 'doubt', 23325: 'doug', 23326: 'dough', 61232: 'tom', 23331: 'douse', 23332: 'dove', 23333: 'doves', 23334: 'dowel', 23335: 'down', 23336: 'dowry', 23341: 'doze', 23342: 'dozen', 23343: 'dp', 23344: 'dq', 23345: 'dr', 23346: 'drab', 56115: 'swift', 56116: 'swig', 23351: 'draft', 23352: 'drag', 23353: 'drain', 23354: 'drake', 23355: 'drama', 23356: 'drank', 56125: 'swipe', 56126: 'swirl', 23361: 'drape', 23362: 'draw', 23363: 'drawl', 23364: 'drawn', 23365: 'dread', 23366: 'dream', 56135: 'swore', 56136: 'sworn', 66211: '47th', 56141: 'swum', 56142: 'swung', 56143: 'sx', 56144: 'sy', 56145: 'sybil', 56146: 'symbol', 56151: 'syrup', 56152: 'sz', 56153: 't', 56154: 't&a', 56155: "t's", 56156: 'ta', 56161: 'tab', 56162: 'table', 56163: 'tablet', 56164: 'taboo', 56165: 'tabs', 56166: 'tabu', 23411: 'dreamy', 23412: 'dregs', 23413: 'dress', 23414: 'dressy', 23415: 'drew', 23416: 'dried', 23421: 'drier', 23422: 'dries', 23423: 'drift', 23424: 'drill', 23425: 'drink', 23426: 'drip', 23431: 'drips', 23432: 'drive', 23433: 'droid', 23434: 'droll', 23435: 'drone', 23436: 'drool', 23441: 'droop', 23442: 'drop', 23443: 'drops', 23444: 'drove', 23445: 'drown', 23446: 'dru', 56215: 'tactic', 56216: 'tad', 23451: 'drub', 23452: 'drug', 23453: 'drugs', 23454: 'druid', 23455: 'drum', 23456: 'drums', 56225: 
'tail', 56226: 'tails', 23461: 'drunk', 23462: 'dry', 23463: 'dryad', 23464: 'ds', 23465: 'dt', 23466: 'du', 56235: 'tale', 56236: 'tales', 56241: 'talk', 56242: 'talks', 56243: 'tall', 56244: 'tally', 56245: 'talon', 56246: 'tame', 56251: 'tamer', 56252: 'tamper', 56253: 'tan', 56254: 'tang', 56255: 'tango', 56256: 'tangy', 56261: 'tank', 56262: 'tanks', 56263: 'tans', 56264: 'tanya', 56265: 'tao', 56266: 'tap', 31223: 'glint', 61262: 'top', 23511: 'dual', 23512: 'duane', 23513: 'dub', 23514: 'dublin', 23515: 'duck', 23516: 'ducks', 64614: 'york', 23521: 'duct', 23522: 'dud', 23523: 'dude', 23524: 'due', 23525: 'duel', 23526: 'dues', 61265: 'topple', 23531: 'duet', 23532: 'duff', 23533: 'dug', 23534: 'duke', 23535: 'dull', 23536: 'dully', 23541: 'duly', 23542: 'dumb', 23543: 'dumbo', 23544: 'dummy', 23545: 'dump', 23546: 'dumps', 56315: 'taps', 56316: 'tar', 23551: 'dumpy', 23552: 'dun', 23553: 'dunce', 23554: 'dune', 23555: 'dung', 23556: 'dunk', 56325: 'tart', 56326: 'tarts', 65161: 'zm', 23561: 'duo', 23562: 'dupe', 23563: 'during', 23564: 'dusk', 23565: 'dusky', 23566: 'dust', 56335: 'tater', 56336: 'tattle', 56341: 'tau', 56342: 'taunt', 56343: 'taut', 56344: 'tavern', 56345: 'tax', 56346: 'taxi', 56351: 'tb', 56352: 'tba', 56353: 'tbsp', 56354: 'tc', 56355: 'td', 56356: 'te', 56361: 'tea', 56362: 'teach', 56363: 'teacup', 56364: 'teak', 56365: 'team', 56366: 'teams', 23611: 'dusty', 23612: 'dutch', 23613: 'duty', 23614: 'dv', 42165: 'mile', 23616: 'dwarf', 52615: 'scat', 23621: 'dwell', 23622: 'dwelt', 23623: 'dwight', 23624: 'dx', 23625: 'dy', 23626: 'dyad', 23631: 'dye', 23632: 'dyed', 23633: 'dying', 23634: 'dylan', 23635: 'dynamo', 23636: 'dz', 23641: 'e', 23642: "e's", 23643: 'ea', 23644: 'each', 23645: 'eager', 23646: 'eagle', 56415: 'teddy', 56416: 'tee', 52616: 'scene', 23651: 'ear', 23652: 'earl', 23653: 'early', 23654: 'earn', 23655: 'earns', 23656: 'ears', 56425: 'tell', 56426: 'tells', 23661: 'earth', 23662: 'ease', 23663: 'easel', 23664: 'east', 23665: 'easy', 23666: 'eat', 31251: 'gm', 56436: 'tempt', 56441: 'ten', 56442: 'tend', 56443: 'tends', 56444: 'tenor', 56445: 'tens', 56446: 'tense', 56451: 'tent', 51526: 'rift', 56453: 'tents', 56454: 'term', 56455: 'terms', 56456: 'terra', 56461: 'terry', 56462: 'terse', 56463: 'test', 56464: 'tests', 56465: 'testy', 56466: 'tex', 65516: '1997', 56511: 'texan', 56512: 'texas', 56513: 'text', 56514: 'tf', 56515: 'tg', 56516: 'tgif', 56521: 'th', 56522: 'thai', 56523: 'than', 56524: 'thank', 56525: 'that', 56526: 'thaw', 56531: 'thaws', 56532: 'the', 56533: 'theft', 56534: 'their', 56535: 'them', 56536: 'theme', 56541: 'then', 56542: 'there', 56543: 'these', 56544: 'theta', 56545: 'they', 56546: 'thick', 53115: 'seams', 56551: 'thief', 56552: 'thigh', 56553: 'thin', 56554: 'thing', 56555: 'think', 56556: 'thins', 56561: 'third', 56562: 'this', 56563: 'tho', 56564: 'thong', 56565: 'thor', 56566: 'thorn', 61313: 'torch', 61314: 'tore', 61315: 'torn', 61316: 'torso', 56611: 'thorny', 56612: 'those', 56613: 'thread', 56614: 'three', 56615: 'threw', 56616: 'throb', 56621: 'throw', 56622: 'throws', 56623: 'thru', 56624: 'thu', 56625: 'thud', 56626: 'thug', 56631: 'thumb', 56632: 'thump', 56633: 'thur', 56634: 'thus', 44131: 'offer', 56636: 'ti', 56641: 'tiara', 56642: 'tibet', 56643: 'tic', 56644: 'tick', 56645: 'ticket', 56646: 'ticks', 56651: 'tics', 56652: 'tidal', 56653: 'tidbit', 56654: 'tide', 53133: 'sects', 56656: 'tie', 56661: 'tied', 56662: 'tier', 56663: 'ties', 56664: 'tiger', 56665: 'tight', 56666: 'tile', 52625: 
'scoop', 53143: 'seedy', 52626: 'scoot', 42225: 'mind', 53151: 'seems', 63663: 'wish', 31311: 'goal', 62463: 'vera', 42235: 'mini', 24111: 'eaten', 24112: 'eater', 24113: 'eats', 24114: 'eave', 24115: 'eaves', 24116: 'eb', 24121: 'ebb', 24122: 'ebony', 24123: 'ec', 24124: 'echo', 24125: 'ed', 24126: 'eddie', 61365: 'trade', 24131: 'eddy', 24132: 'eden', 24133: 'edgar', 24134: 'edge', 24135: 'edges', 24136: 'edgy', 24141: 'edible', 24142: 'edict', 24143: 'edify', 24144: 'edit', 24145: 'edith', 24146: 'editor', 24151: 'edits', 24152: 'edna', 24153: 'edsel', 24154: 'edwin', 24155: 'ee', 24156: 'eee', 24161: 'eeee', 24162: 'eeg', 24163: 'eel', 24164: 'eerie', 24165: 'ef', 24166: 'efface', 44133: 'og', 61363: 'tract', 24211: 'efg', 24212: 'eflat', 24213: 'eft', 24214: 'eg', 24215: 'egg', 24216: 'eggs', 31343: 'good', 61264: 'topic', 24221: 'ego', 24222: 'egress', 24223: 'egret', 24224: 'egypt', 24225: 'eh', 24226: 'ei', 24231: 'eight', 24232: 'ej', 24233: 'eject', 24234: 'ek', 24235: 'ekg', 24236: 'el', 64114: 'witty', 24241: 'elate', 24242: 'elbow', 24243: 'elder', 24244: 'elect', 24245: 'elegy', 24246: 'elena', 24251: 'eleven', 24252: 'elf', 24253: 'elfin', 24254: 'eli', 24255: 'elide', 24256: 'eliot', 24261: 'elite', 24262: 'eliza', 24263: 'elk', 24264: 'elks', 24265: 'ella', 24266: 'ellen', 64212: 'worm', 61366: 'trail', 51546: 'rings', 64122: 'wm', 64123: 'wn', 64124: 'wnw', 65231: 'zv', 24311: 'elm', 24312: 'elmer', 24313: 'elms', 24314: 'elope', 24315: 'elroy', 24316: 'else', 24321: 'elsie', 24322: 'elton', 24323: 'elude', 24324: 'elves', 24325: 'elvis', 24326: 'ely', 24331: 'em', 24332: 'email', 24333: 'embalm', 24334: 'embed', 24335: 'ember', 24336: 'emcee', 62465: 'verbs', 64131: 'woes', 24341: 'emery', 24342: 'emil', 24343: 'emile', 24344: 'emily', 24345: 'emit', 24346: 'emits', 24351: 'emma', 24352: 'emmy', 24353: 'emote', 24354: 'employ', 24355: 'empty', 24356: 'emu', 64134: 'wolf', 24361: 'en', 24362: 'enact', 24363: 'enamel', 24364: 'end', 24365: 'ended', 24366: 'endow', 24411: 'ends', 24412: 'enema', 24413: 'enemy', 24414: 'enigma', 24415: 'enjoy', 24416: 'enmity', 24421: 'ennui', 24422: 'enoch', 24423: 'ensue', 24424: 'enter', 24425: 'entrap', 24426: 'entry', 61415: 'traps', 24431: 'envoy', 24432: 'envy', 24433: 'eo', 24434: 'eon', 24435: 'eons', 24436: 'ep', 24441: 'epic', 24442: 'epics', 24443: 'epoch', 24444: 'epoxy', 24445: 'epsom', 24446: 'eq', 24451: 'equal', 24452: 'equip', 24453: 'er', 24454: 'era', 24455: 'erase', 24456: 'erect', 24461: 'ergo', 24462: 'eric', 24463: 'erica', 24464: 'erie', 24465: 'erik', 24466: 'erin', 13363: 'bass', 24511: 'ernest', 24512: 'ernie', 24513: 'erode', 24514: 'eros', 24515: 'err', 24516: 'errand', 24521: 'errol', 24522: 'error', 24523: 'erupt', 24524: 'es', 24525: 'esp', 24526: 'espy', 24531: 'esq', 24532: 'essay', 24533: 'ester', 24534: 'et', 24535: 'eta', 24536: 'etc', 24541: 'etch', 24542: 'ethel', 24543: 'ether', 24544: 'ethic', 24545: 'ethos', 24546: 'ethyl', 24551: 'etude', 24552: 'eu', 24553: 'eureka', 24554: 'ev', 24555: 'eva', 24556: 'evade', 24561: 'evans', 24562: 'eve', 24563: 'even', 24564: 'event', 24565: 'ever', 24566: 'every', 24611: 'evict', 24612: 'evil', 24613: 'evita', 24614: 'evoke', 24615: 'evolve', 24616: 'ew', 24621: 'ewe', 24622: 'ex', 24623: 'exact', 24624: 'exalt', 24625: 'exam', 24626: 'exams', 24631: 'excel', 24632: 'excess', 24633: 'exec', 24634: 'exert', 24635: 'exile', 24636: 'exist', 24641: 'exit', 24642: 'exits', 24643: 'exodus', 24644: 'expel', 24645: 'expo', 24646: 'extant', 24651: 'extent', 24652: 
'extol', 24653: 'extra', 24654: 'exult', 24655: 'exxon', 24656: 'ey', 24661: 'eye', 24662: 'eyed', 24663: 'eyes', 24664: 'ez', 24665: 'ezra', 24666: 'f', 53263: 'sh', 61463: 'troop', 61464: 'trot', 41111: 'love', 41112: 'loved', 41113: 'lover', 41114: 'low', 41115: 'lower', 41116: 'lowry', 61466: 'trout', 41121: 'lox', 41122: 'loyal', 41123: 'lp', 41124: 'lq', 41125: 'lr', 41126: 'ls', 41131: 'lsd', 41132: 'lt', 41133: 'ltd', 41134: 'lu', 41135: 'luau', 41136: 'lucas', 41141: 'luce', 41142: 'lucia', 41143: 'lucid', 41144: 'luck', 41145: 'lucky', 41146: 'lucy', 41151: 'ludwig', 41152: 'lug', 41153: 'luger', 41154: 'lugs', 41155: 'luis', 41156: 'luke', 41161: 'lull', 41162: 'lulu', 41163: 'lump', 41164: 'lumps', 41165: 'lumpy', 41166: 'luna', 64111: 'witch', 64323: 'wyman', 46231: 'prime', 64211: 'world', 64112: 'with', 41211: 'lunar', 41212: 'lunch', 41213: 'lung', 41214: 'lunge', 41215: 'lungs', 41216: 'lurch', 41221: 'lure', 41222: 'lurid', 41223: 'lurk', 41224: 'lurks', 41225: 'lush', 41226: 'lust', 41231: 'lusty', 41232: 'lute', 41233: 'luxury', 41234: 'lv', 41235: 'lw', 41236: 'lx', 64113: 'wits', 41241: 'ly', 41242: 'lye', 41243: 'lying', 41244: 'lyle', 41245: 'lymph', 41246: 'lynch', 41251: 'lynn', 41252: 'lynx', 41253: 'lyre', 41254: 'lyric', 41255: 'lz', 41256: 'm', 41261: 'm&m', 41262: "m's", 41263: 'm-16', 41264: 'ma', 41265: "ma'am", 41266: 'mabel', 64222: 'worst', 64223: 'worth', 64224: 'would', 12343: 'apply', 12344: 'apr', 12345: 'april', 45114: 'peach', 41311: 'mac', 41312: 'macaw', 41313: 'mace', 41314: 'macho', 41315: 'macro', 41316: 'mad', 41321: 'madam', 41322: 'made', 41323: 'madly', 41324: 'madman', 41325: 'mafia', 41326: 'magic', 64232: 'wow', 41331: 'magma', 41332: 'magnet', 41333: 'magoo', 41334: 'magpie', 41335: 'maid', 41336: 'maids', 41341: 'mail', 41342: 'maim', 41343: 'maims', 41344: 'main', 41345: 'maine', 41346: 'maize', 12353: 'aqua', 41352: 'major', 41353: 'make', 41354: 'malady', 41355: 'male', 41356: 'malice', 12354: 'ar', 41361: 'mall', 41362: 'malls', 41363: 'malt', 41364: 'mama', 41365: 'mambo', 41366: 'mammal', 12356: 'arabs', 61513: 'truck', 64664: 'zag', 61514: 'trudge', 64213: 'worms', 41411: 'man', 41412: 'mane', 41413: 'mango', 41414: 'mania', 41415: 'manic', 41416: 'manly', 12364: 'arcade', 41421: 'manna', 41422: 'manor', 12365: 'arch', 41424: 'many', 41425: 'mao', 41426: 'map', 12366: 'archer', 41431: 'maple', 41432: 'maps', 41433: 'mar', 41434: 'marble', 41435: 'march', 41436: 'marco', 41441: 'mare', 41442: 'mares', 41443: 'marge', 41444: 'margo', 41445: 'maria', 41446: 'marie', 41451: 'marine', 41452: 'mario', 41453: 'mark', 41454: 'marks', 41455: 'marlin', 41456: 'marrow', 41461: 'marry', 41462: 'mars', 41463: 'marsh', 41464: 'mart', 41465: 'marty', 41466: 'martyr', 61525: 'truth', 65212: 'zoom', 61526: 'try', 64121: 'wl', 25111: 'f#', 25112: "f's", 25113: 'fa', 25114: 'fable', 25115: 'fabric', 25116: 'face', 25121: 'faces', 25122: 'facet', 25123: 'facile', 25124: 'fact', 25125: 'facts', 25126: 'fad', 41511: 'marx', 41512: 'mary', 41513: 'mash', 41514: 'mask', 25131: 'fade', 25132: 'fads', 25133: 'fail', 25134: 'faint', 25135: 'fair', 25136: 'fairy', 41521: 'mass', 41522: 'mast', 41523: 'masts', 41524: 'mat', 25141: 'faith', 25142: 'fake', 25143: 'faker', 25144: 'fall', 25145: 'false', 25146: 'fame', 41531: 'mated', 41532: 'mates', 41533: 'math', 41534: 'mats', 25151: 'fan', 25152: 'fancy', 25153: 'fang', 25154: 'fangs', 25155: 'fanny', 25156: 'fans', 41541: 'maud', 41542: 'maude', 41543: 'maul', 41544: 'mauls', 25161: 'far', 25162: 'farce', 
25163: 'fare', 25164: 'farm', 25165: 'farms', 25166: 'fast', 41551: 'maxim', 41552: 'may', 41553: 'maybe', 41554: 'mayhem', 41555: 'mayo', 41556: 'mayor', 41561: 'mazda', 41562: 'maze', 41563: 'mazes', 41564: 'mb', 41565: 'mba', 41566: 'mc', 61545: 'tubes', 64366: 'xr', 25211: 'fat', 25212: 'fatal', 25213: 'fate', 25214: 'father', 25215: 'fats', 25216: 'fatty', 25221: 'fault', 25222: 'fauna', 25223: 'faust', 25224: 'faux', 25225: 'fawn', 25226: 'fax', 41611: 'mccoy', 41612: 'mcgee', 41613: 'md', 41614: 'me', 25231: 'faze', 25232: 'fb', 25233: 'fbi', 25234: 'fc', 25235: 'fd', 25236: 'fe', 41621: 'meals', 41622: 'mean', 41623: 'means', 41624: 'meant', 25241: 'fear', 25242: 'fears', 25243: 'feast', 25244: 'feat', 25245: 'feb', 25246: 'fed', 41631: 'mecca', 41632: 'medal', 41633: 'media', 41634: 'medic', 25251: 'fee', 25252: 'feeble', 25253: 'feed', 25254: 'feeds', 25255: 'feel', 25256: 'feels', 41641: 'meet', 41642: 'meets', 41643: 'meg', 41644: 'meld', 25261: 'fees', 25262: 'feet', 25263: 'feign', 25264: 'feint', 25265: 'felice', 25266: 'felix', 41651: 'melody', 41652: 'melon', 41653: 'melt', 41654: 'melts', 41655: 'memo', 41656: 'memoir', 15135: 'bravo', 41661: 'men', 41662: 'mend', 41663: 'mends', 41664: 'menu', 41665: 'meow', 41666: 'mercy', 53415: 'ships', 61562: 'tulip', 25311: 'fell', 25312: 'felon', 25313: 'felt', 25314: 'femur', 25315: 'fence', 25316: 'fend', 31526: 'grim', 25321: 'fern', 25322: 'ferry', 25323: 'fetal', 25324: 'fetch', 25325: 'fete', 25326: 'fetid', 61565: 'tune', 25331: 'fetus', 25332: 'feud', 25333: 'fever', 25334: 'few', 25335: 'fez', 25336: 'ff', 25341: 'fff', 25342: 'ffff', 25343: 'fg', 25344: 'fgh', 25345: 'fh', 25346: 'fi', 25351: 'fiat', 25352: 'fib', 25353: 'fiber', 25354: 'fickle', 25355: 'fido', 25356: 'field', 25361: 'fiend', 25362: 'fiery', 25363: 'fife', 25364: 'fifth', 25365: 'fifty', 25366: 'fig', 31535: 'grins', 31536: 'grip', 56113: 'swell', 56114: 'swept', 65222: 'zr', 25411: 'fight', 25412: 'figs', 25413: 'fiji', 25414: 'filch', 25415: 'file', 25416: 'filed', 64311: 'wwii', 25421: 'files', 25422: 'filet', 25423: 'fill', 25424: 'filler', 25425: 'filly', 25426: 'film', 25431: 'films', 25432: 'filmy', 25433: 'filth', 25434: 'fin', 25435: 'final', 25436: 'finale', 31546: 'grog', 25441: 'finch', 25442: 'find', 25443: 'fine', 25444: 'fined', 25445: 'finer', 25446: 'finite', 56124: 'swing', 25451: 'fink', 25452: 'finn', 25453: 'finny', 25454: 'fir', 25455: 'fire', 25456: 'firm', 25461: 'first', 25462: 'fish', 25463: 'fishy', 25464: 'fist', 25465: 'fit', 25466: 'fits', 64321: 'wyatt', 64322: 'wylie', 54314: 'smirk', 56132: 'swiss', 56133: 'swoop', 12443: 'arose', 56134: 'sword', 25511: 'five', 25512: 'fix', 25513: 'fixed', 25514: 'fizz', 25515: 'fj', 25516: 'fjord', 45213: 'per', 25521: 'fk', 25522: 'fl', 25523: 'flab', 25524: 'flag', 25525: 'flail', 25526: 'flair', 25531: 'flak', 25532: 'flake', 25533: 'flaky', 25534: 'flame', 25535: 'flank', 25536: 'flap', 64331: "x's", 25541: 'flare', 25542: 'flash', 25543: 'flask', 25544: 'flat', 25545: 'flavor', 25546: 'flaw', 25551: 'flax', 25552: 'flay', 25553: 'flea', 25554: 'fled', 25555: 'flee', 25556: 'fleet', 64334: 'xc', 25561: 'flesh', 25562: 'flew', 25563: 'flex', 25564: 'flick', 25565: 'flier', 25566: 'flies', 12453: 'artery', 12454: 'arthur', 12455: 'artie', 12456: 'arts', 64341: 'xerox', 64432: 'xyz', 25611: 'flinch', 25612: 'fling', 25613: 'flint', 25614: 'flip', 25615: 'flirt', 25616: 'flit', 25621: 'flo', 25622: 'float', 25623: 'flock', 25624: 'flog', 25625: 'flood', 25626: 'floor', 12463: 'as', 
25631: 'flop', 25632: 'floppy', 25633: 'flora', 25634: 'flour', 25635: 'flow', 25636: 'flown', 12465: 'ascend', 25641: 'floyd', 25642: 'flu', 25643: 'flub', 25644: 'flue', 25645: 'fluff', 25646: 'fluid', 25651: 'fluke', 25652: 'flung', 25653: 'flush', 25654: 'flute', 25655: 'flux', 25656: 'fly', 25661: 'flyer', 25662: 'fm', 25663: 'fn', 25664: 'fo', 25665: 'foal', 25666: 'foam', 64353: 'xj', 64354: 'xk', 63414: 'wedge', 61625: 'tutor', 61413: 'tramp', 65232: 'zw', 61626: 'tutu', 54321: 'smog', 42515: 'morse', 66324: '60%', 42516: 'morsel', 65233: 'zx', 42111: 'mere', 42112: 'merge', 42113: 'merit', 42114: 'merry', 42115: 'mesa', 42116: 'mesh', 42121: 'mess', 42122: 'messy', 42123: 'met', 42124: 'metal', 42125: 'meteor', 42126: 'meter', 64365: 'xq', 42131: 'metro', 42132: 'meyer', 42133: 'mf', 42134: 'mg', 42135: 'mgm', 42136: 'mgmt', 61636: 'twain', 42141: 'mh', 42142: 'mi', 42143: 'mia', 42144: 'miami', 42145: 'mice', 42146: 'mickey', 42151: 'micro', 42152: 'mid', 42153: 'midas', 42154: 'midst', 42155: 'mig', 42156: 'might', 42525: 'mossy', 42161: 'migs', 42162: 'mike', 42163: 'mild', 42164: 'mildew', 42526: 'most', 42166: 'miles', 54324: 'smooth', 61645: 'twigs', 61646: 'twin', 64145: 'wonder', 42211: 'milk', 42212: 'milky', 42213: 'mill', 42214: 'mills', 42215: 'milo', 42216: 'mime', 42535: 'moths', 42221: 'mimes', 42222: 'mimi', 42223: 'mimic', 42224: 'mince', 42536: 'motif', 42226: 'minds', 42231: 'mine', 42232: 'mined', 42233: 'miner', 42234: 'mines', 15231: 'brood', 42236: 'mink', 42241: 'minnow', 42242: 'minor', 42243: 'mint', 42244: 'mints', 42245: 'minty', 42246: 'minus', 42251: 'mirage', 42252: 'mire', 42253: 'mired', 42254: 'mirth', 42255: 'mirv', 42256: 'misc', 42261: 'miser', 42262: 'misery', 42263: 'miss', 42264: 'mist', 42265: 'mists', 42266: 'misty', 42545: 'mourn', 42546: 'mouse', 31625: 'guess', 61663: 'tx', 31626: 'guest', 61664: 'ty', 42311: 'mit', 42312: 'mite', 42313: 'mites', 42314: 'mitt', 42315: 'mitts', 42316: 'mix', 61666: 'tying', 66331: '60th', 42321: 'mixed', 42322: 'mixer', 42323: 'mixes', 42324: 'mixup', 42325: 'mj', 42326: 'mk', 42331: 'ml', 42332: 'mm', 42333: 'mmm', 42334: 'mmmm', 42335: 'mn', 42336: 'mno', 42341: 'mo', 42342: 'moan', 42343: 'moans', 42344: 'moat', 42345: 'mob', 42346: 'mobil', 42351: 'mobs', 42352: 'moby', 42353: 'mock', 42354: 'mocks', 42355: 'mod', 42356: 'mode', 42361: 'model', 42362: 'modem', 42363: 'moe', 42364: 'mogul', 42365: 'moist', 42366: 'mojo', 56213: 'taco', 62514: 'verna', 56214: 'tact', 15255: 'bryan', 62515: 'verne', 64411: 'xray', 64412: 'xrays', 42411: 'molar', 42412: 'mold', 42413: 'molds', 42414: 'mole', 42415: 'moles', 42416: 'molly', 15262: 'btu', 42422: 'molten', 42423: 'mom', 42424: 'momma', 42425: 'mommy', 42426: 'mon', 56223: 'tag', 42431: 'mona', 42432: 'money', 42433: 'monk', 42434: 'monkey', 42435: 'mono', 42436: 'month', 42441: 'monty', 42442: 'moo', 42443: 'mooch', 42444: 'mood', 42445: 'moods', 42446: 'moody', 42451: 'moon', 42452: 'moons', 42453: 'moor', 42454: 'moore', 42455: 'moose', 42456: 'mop', 42461: 'mope', 42462: 'mopes', 42463: 'mops', 42464: 'moral', 42465: 'morale', 42466: 'morbid', 64422: 'xvii', 56231: 'taint', 56232: 'take', 56233: 'taken', 45311: 'phrase', 56234: 'takes', 26111: 'foamy', 26112: 'fob', 26113: 'focal', 26114: 'focus', 26115: 'fodder', 26116: 'foe', 12545: 'astor', 26121: 'foes', 26122: 'fog', 26123: 'foggy', 26124: 'fogy', 26125: 'foil', 26126: 'foist', 42511: 'more', 42512: 'morn', 42513: 'moron', 42514: 'morph', 26131: 'fold', 26132: 'folio', 26133: 'folk', 26134: 'folly', 
26135: 'fond', 26136: 'font', 42521: 'mort', 42522: 'mosaic', 42523: 'moses', 42524: 'moss', 26141: 'food', 26142: 'fool', 26143: 'foot', 26144: 'fop', 26145: 'for', 26146: 'foray', 42531: 'mote', 42532: 'motel', 42533: 'moth', 42534: 'mother', 26151: 'force', 26152: 'ford', 26153: 'fore', 26154: 'forge', 26155: 'forgot', 26156: 'fork', 42541: 'motor', 42542: 'motto', 42543: 'mound', 42544: 'mount', 26161: 'form', 26162: 'forms', 26163: 'fort', 26164: 'forte', 26165: 'forth', 26166: 'forty', 12553: 'atari', 42552: 'mouth', 42553: 'move', 42554: 'moved', 42555: 'moves', 42556: 'movie', 12554: 'ate', 42561: 'mow', 42562: 'mowed', 42563: 'mower', 42564: 'mows', 42565: 'moxie', 42566: 'mp', 12556: 'atlas', 62316: 'usurp', 26211: 'forum', 26212: 'fossil', 26213: 'foul', 26214: 'found', 26215: 'fount', 26216: 'four', 26221: 'fowl', 26222: 'fox', 26223: 'foxes', 26224: 'foxy', 26225: 'foyer', 26226: 'fp', 42611: 'mpg', 42612: 'mph', 42613: 'mq', 42614: 'mr', 42615: 'mrs', 26232: 'fr', 26233: 'frail', 26234: 'frame', 26235: 'france', 26236: 'frank', 42621: 'msdos', 42622: 'msg', 42623: 'mt', 42624: 'mu', 26241: 'franz', 26242: 'frau', 26243: 'fraud', 26244: 'fray', 26245: 'freak', 26246: 'fred', 42631: 'mucus', 42632: 'mud', 42633: 'muddy', 42634: 'muff', 26251: 'free', 26252: 'freed', 26253: 'freer', 26254: 'frenzy', 26255: 'freon', 26256: 'fresh', 42641: 'muggy', 42642: 'mugs', 42643: 'mulch', 42644: 'mule', 26261: 'fret', 26262: 'freud', 26263: 'fri', 26264: 'friar', 26265: 'fried', 26266: 'fries', 42651: 'mum', 42652: 'mumble', 42653: 'mummy', 42654: 'mumps', 42655: 'munch', 42656: 'mural', 42661: 'muriel', 42662: 'murk', 42663: 'murky', 42664: 'muse', 42665: 'muses', 42666: 'mush', 63615: 'wills', 53535: 'sign', 64333: 'xb', 26311: 'frill', 26312: 'frilly', 26313: 'frisky', 26314: 'fritz', 26231: 'fq', 26316: 'frog', 62525: 'vests', 64461: 'yard', 26321: 'frogs', 26322: 'from', 26323: 'frond', 26324: 'front', 26325: 'frost', 26326: 'froth', 26331: 'frown', 26332: 'froze', 26333: 'fruit', 26334: 'fry', 26335: 'fs', 26336: 'ft', 64464: 'yawn', 26341: 'fu', 26342: 'fudge', 26343: 'fuel', 26344: 'fugue', 26345: 'fuji', 26346: 'full', 62526: 'vet', 64466: 'yb', 26351: 'fully', 26352: 'fumble', 26353: 'fume', 26354: 'fumes', 26355: 'fun', 26356: 'fund', 63451: 'west', 26361: 'funds', 26362: 'fungi', 26363: 'funk', 26364: 'funky', 26365: 'funny', 26366: 'fur', 42625: 'much', 42626: 'muck', 53553: 'silver', 26411: 'furl', 26412: 'furry', 26413: 'furs', 26414: 'fury', 26415: 'fuse', 26416: 'fuss', 26421: 'fussy', 26422: 'fuzz', 26423: 'fuzzy', 26424: 'fv', 26425: 'fw', 26426: 'fx', 26431: 'fy', 26432: 'fyi', 26433: 'fz', 26434: 'g', 26435: "g's", 26436: 'ga', 26441: 'gab', 26442: 'gable', 26443: 'gadget', 26444: 'gaea', 26445: 'gaffe', 26446: 'gag', 26451: 'gags', 26452: 'gail', 26453: 'gaily', 26454: 'gain', 26455: 'gait', 26456: 'gal', 26461: 'gala', 26462: 'galaxy', 26463: 'gale', 26464: 'gall', 26465: 'gallop', 26466: 'gam', 42645: 'mules', 64455: 'yanks', 42646: 'mull', 26511: 'game', 26512: 'games', 26513: 'gamma', 26514: 'gamut', 26515: 'gamy', 26516: 'gander', 26521: 'gang', 26522: 'gangs', 26523: 'gap', 26524: 'gape', 26525: 'gapes', 26526: 'gaps', 26531: 'garb', 26532: 'gargle', 26533: 'garish', 26534: 'gary', 26535: 'gas', 26536: 'gash', 26541: 'gasp', 26542: 'gasps', 26543: 'gassy', 26544: 'gate', 26545: 'gates', 26546: 'gator', 26551: 'gauche', 26552: 'gaudy', 26553: 'gauge', 26554: 'gaunt', 26555: 'gauze', 26556: 'gave', 26561: 'gavel', 26562: 'gawk', 26563: 'gawky', 26564: 'gay', 26565: 
'gaze', 26566: 'gazed', 31525: 'grill', 56311: 'tape', 56312: 'taped', 56313: 'taper', 56314: 'tapes', 64335: 'xd', 26611: 'gazes', 26612: 'gb', 26613: 'gc', 26614: 'gd', 26615: 'ge', 26616: 'gear', 26621: 'gears', 26622: 'gee', 26623: 'geese', 26624: 'gel', 26625: 'geld', 26626: 'gem', 26631: 'gems', 26632: 'gene', 26633: 'genes', 26634: 'genie', 26635: 'genre', 26636: 'gent', 56322: 'target', 62153: 'um', 26641: 'gentry', 26642: 'geo', 26643: 'gerbil', 26644: 'germ', 26645: 'germs', 26646: 'get', 62536: 'vf', 56324: 'tarry', 26651: 'gets', 26652: 'gf', 26653: 'gg', 26654: 'ggg', 26655: 'gggg', 26656: 'gh', 26661: 'ghetto', 26662: 'ghi', 26663: 'ghost', 26664: 'ghoul', 26665: 'ghq', 26666: 'gi', 64521: 'yearn', 65265: ')', 66332: '61', 56331: 'task', 56332: 'taste', 56333: 'tasty', 12643: 'avail', 56334: 'tate', 12644: 'avert', 12645: 'avery', 12646: 'avian', 43111: 'mushy', 43112: 'music', 43113: 'musk', 43114: 'musky', 43115: 'muslim', 43116: 'muss', 43121: 'must', 43122: 'musty', 43123: 'mute', 43124: 'muted', 43125: 'mutt', 43126: 'muzak', 43131: 'mv', 43132: 'mw', 43133: 'mx', 43134: 'my', 43135: 'mylar', 43136: 'mynah', 43141: 'myob', 43142: 'myopia', 43143: 'myra', 43144: 'myron', 43145: 'myself', 43146: 'myth', 45421: 'pit', 43152: 'mz', 43153: 'n', 43154: "n's", 43155: 'na', 43156: 'nab', 45422: 'pita', 43161: 'nabs', 43162: 'nacl', 43163: 'nag', 43164: 'nags', 43165: 'nail', 43166: 'nails', 12656: 'avow', 26315: 'frock', 43211: 'naive', 43212: 'naked', 43213: 'name', 43214: 'named', 43215: 'names', 43216: 'nan', 45432: 'pivot', 43221: 'nancy', 43222: 'naomi', 43223: 'nap', 43224: 'nape', 43225: 'napkin', 43226: 'naps', 12666: 'awash', 43231: 'nasa', 43232: 'nasal', 43233: 'nash', 43234: 'nasty', 43235: 'nat', 43236: 'natal', 43241: 'nate', 43242: 'nato', 43243: 'nature', 43244: 'nausea', 43245: 'naval', 43246: 'navel', 43251: 'navy', 43252: 'nay', 43253: 'nazi', 43254: 'nb', 43255: 'nc', 43256: 'nd', 43261: 'ne', 43262: 'near', 43263: 'nearby', 43264: 'neat', 43265: 'neck', 43266: 'necks', 64555: 'yl', 13443: 'bbs', 63135: 'vvv', 53635: 'sixth', 13444: 'bc', 43311: 'ned', 43312: 'need', 43313: 'needs', 43314: 'needy', 43315: 'negate', 43316: 'negro', 43321: 'neigh', 43322: 'neil', 43323: 'nell', 43324: 'neon', 43325: 'nerd', 43326: 'nerve', 64565: 'yogi', 43331: 'nest', 43332: 'nests', 43333: 'net', 43334: 'nets', 43335: 'never', 43336: 'new', 43341: 'newly', 43342: 'news', 43343: 'newt', 43344: 'next', 43345: 'nf', 43346: 'ng', 43351: 'nguyen', 43352: 'nh', 43353: 'ni', 43354: 'nice', 43355: 'nicer', 43356: 'nick', 13446: 'bd', 43361: 'nickel', 43362: 'nico', 43363: 'niece', 43364: 'nifty', 43365: 'night', 43366: 'nil', 23615: 'dw', 43411: 'nile', 43412: 'nina', 43413: 'nine', 43414: 'ninja', 43415: 'ninth', 43416: 'niobe', 43421: 'nip', 43422: 'nips', 43423: 'nitwit', 43424: 'nix', 43425: 'nixon', 43426: 'nj', 43431: 'nk', 43432: 'nl', 43433: 'nm', 43434: 'nn', 43435: 'nne', 43436: 'nnn', 43441: 'nnnn', 43442: 'nnw', 43443: 'no', 43444: 'noah', 43445: 'noble', 43446: 'nod', 65211: 'zoo', 43451: 'node', 43452: 'nods', 43453: 'noel', 43454: 'noise', 43455: 'noisy', 43456: 'nomad', 43461: 'none', 43462: 'nono', 43463: 'nook', 43464: 'noon', 43465: 'noose', 43466: 'nop', 43511: 'nope', 43512: 'nor', 43513: 'nora', 43514: 'norm', 43515: 'norma', 43516: 'north', 43521: 'norway', 43522: 'nose', 43523: 'nosy', 43524: 'not', 43525: 'notch', 43526: 'note', 43531: 'noted', 43532: 'notes', 43533: 'noun', 43534: 'nouns', 43535: 'nov', 43536: 'nova', 43541: 'novak', 43542: 'novel', 43543: 
'now', 43544: 'np', 43545: 'nq', 43546: 'nr', 43551: 'ns', 43552: 'nt', 43553: 'nu', 43554: 'nuance', 43555: 'nude', 43556: 'nudge', 43561: 'nuke', 43562: 'null', 43563: 'numb', 43564: 'nun', 43565: 'nuns', 43566: 'nurse', 56413: 'tech', 56414: 'ted', 55462: 'stuff', 12363: 'arc', 43611: 'nut', 43612: 'nutmeg', 43613: 'nuts', 43614: 'nutty', 43615: 'nv', 43616: 'nw', 43621: 'nx', 43622: 'ny', 43623: 'nyc', 43624: 'nylon', 43625: 'nymph', 43626: 'nz', 56423: 'tees', 43631: 'o', 43632: "o's", 43633: 'oa', 43634: 'oaf', 43635: 'oak', 43636: 'oaken', 43641: 'oar', 43642: 'oars', 43643: 'oasis', 43644: 'oat', 43645: 'oath', 43646: 'oats', 43651: 'ob', 43652: 'obese', 43653: 'obey', 43654: 'obeys', 43655: 'obit', 43656: 'object', 43661: 'oboe', 43662: 'oc', 43663: 'occur', 43664: 'ocean', 43665: 'ocr', 43666: 'oct', 56431: 'temp', 56432: 'temper', 56433: 'temple', 56434: 'tempo', 56435: 'temps', 65322: '1/2', 64635: 'yt', 65213: 'zooms', 56452: 'tenth', 62563: 'vigil', 65541: '2020', 61613: 'turf', 64654: 'yyy', 65432: '17th', 62564: 'vigor', 11111: 'a', 11112: "a's", 11113: 'a-1', 11114: 'a-z', 11115: 'aa', 11116: 'aaa', 11121: 'aaaa', 11122: 'aaron', 11123: 'ab', 11124: 'aback', 11125: 'abacus', 11126: 'abase', 13464: 'bean', 11131: 'abash', 11132: 'abate', 11133: 'abbey', 11134: 'abbot', 11135: 'abbr', 11136: 'abby', 11141: 'abc', 11142: "abc's", 11143: 'abcd', 11144: 'abduct', 11145: 'abdul', 11146: 'abe', 64663: 'za', 11151: 'abed', 11152: 'abel', 11153: 'abet', 11154: 'abhor', 11155: 'abide', 11156: 'ablaze', 64665: 'zap', 11161: 'able', 11162: 'abm', 11163: 'abner', 11164: 'aboard', 11165: 'abode', 11166: 'abort', 13466: 'bear', 34633: 'jumbo', 11211: 'about', 11212: 'above', 11213: 'abram', 11214: 'absent', 11215: 'absorb', 11216: 'abuse', 11221: 'abut', 11222: 'abyss', 11223: 'ac', 11224: 'ac/dc', 11225: 'accept', 11226: 'accuse', 11231: 'ace', 11232: 'aces', 11233: 'ache', 11234: 'ached', 11235: 'aches', 11236: 'achoo', 61614: 'turk', 11241: 'achy', 11242: 'acid', 11243: 'acidic', 11244: 'acids', 11245: 'acme', 11246: 'acne', 65433: '18', 11251: 'acorn', 11252: 'acquit', 11253: 'acre', 11254: 'acres', 11255: 'acrid', 11256: 'act', 11261: 'acted', 11262: 'actor', 11263: 'acts', 11264: 'acute', 11265: 'ad', 11266: 'ada', 65324: '1/4', 15533: 'bz', 11311: 'adage', 11312: 'adagio', 11313: 'adair', 11314: 'adam', 11315: 'adams', 11316: 'adapt', 11321: 'add', 11322: 'added', 11323: 'adder', 11324: 'addict', 11325: 'addle', 11326: 'adds', 11331: 'adele', 11332: 'adept', 11333: 'adieu', 11334: 'adios', 11335: 'adjust', 11336: 'adler', 11341: 'admit', 11342: 'ado', 44111: 'octal', 11344: 'adolf', 11345: 'adonis', 11346: 'adopt', 44115: 'odds', 44116: 'ode', 11351: 'adore', 11352: 'adorn', 11353: 'ads', 44122: 'odors', 11355: 'advent', 11356: 'adverb', 44125: 'off', 44126: 'offend', 11361: 'advise', 11362: 'ae', 11363: 'aeiou', 44132: 'often', 11365: 'aesop', 11366: 'af', 44135: 'ogled', 44136: 'ogles', 44141: 'ogre', 44142: 'oh', 44143: 'ohio', 44144: 'oho', 44145: 'oi', 44146: 'oil', 44151: 'oiled', 44152: 'oils', 44153: 'oily', 44154: 'oink', 44155: 'oj', 44156: 'ok', 44161: 'okay', 44162: 'okays', 44163: 'okra', 44164: 'ol', 44165: 'olaf', 44166: 'old', 63665: 'wispy', 11411: 'afar', 11412: 'affair', 11413: 'afghan', 11414: 'afire', 11415: 'afoot', 11416: 'afraid', 11421: 'africa', 11422: 'afro', 11423: 'aft', 11424: 'after', 11425: 'ag', 11426: 'again', 11431: 'agate', 11432: 'age', 11433: 'aged', 11434: 'agenda', 11435: 'agent', 11436: 'ages', 11441: 'agile', 11442: 'aging', 44211: 
'older', 11444: 'agnes', 44213: 'olga', 11446: 'ago', 44215: 'olson', 44216: 'om', 11451: 'agony', 11452: 'agree', 11453: 'ah', 44222: 'omega', 11455: 'ahab', 44224: 'omens', 44225: 'omit', 44226: 'omits', 11461: 'ahem', 11462: 'ahmed', 44231: 'on', 44232: 'once', 11465: 'aid', 11466: 'aide', 44235: 'only', 44236: 'onset', 44241: 'onto', 44242: 'onward', 44243: 'oo', 44244: 'ooo', 44245: 'oooo', 44246: 'oops', 44251: 'ooze', 44252: 'oozed', 44253: 'op', 44254: 'opal', 44255: 'opals', 44256: 'opec', 44261: 'open', 44262: 'opens', 44263: 'opera', 44264: 'opium', 44265: 'opq', 44266: 'opt', 64214: 'wormy', 11511: 'aided', 11512: 'ail', 11513: 'aim', 11514: 'aimed', 11515: 'aims', 11516: "ain't", 11521: 'air', 11522: 'airman', 11523: 'airway', 11524: 'airy', 11525: 'aisle', 11526: 'aj', 11531: 'ajar', 11532: 'ajax', 11533: 'ak', 11534: 'aka', 11535: 'akers', 11536: 'akin', 11541: 'akqj', 11542: 'akron', 11543: 'al', 11544: 'alan', 44313: 'oq', 11546: 'alas', 44315: 'oral', 44316: 'orb', 11551: 'alaska', 11552: 'album', 44321: 'orbit', 11554: 'ale', 11555: 'alec', 44324: 'order', 44325: 'ore', 44326: 'organ', 11561: 'alert', 11562: 'alex', 11563: 'alexa', 11564: 'alexei', 11565: 'algae', 11566: 'alger', 44335: 'oscar', 44336: 'ot', 44341: 'other', 44342: 'otis', 44343: 'otter', 44344: 'otto', 44345: 'ou', 44346: 'ouch', 44351: 'ought', 44352: 'ouija', 44353: 'ounce', 44354: 'our', 44355: 'ours', 44356: 'oust', 44361: 'out', 44362: 'outdo', 44363: 'outer', 44364: 'outlaw', 44365: 'ov', 44366: 'oval', 11611: 'ali', 11612: 'alias', 11613: 'alibi', 11614: 'alice', 11615: 'alien', 11616: 'alight', 11621: 'align', 11622: 'alike', 11623: 'alive', 11624: 'alkali', 11625: 'all', 11626: 'allah', 11631: 'allan', 11632: 'allen', 11633: 'alley', 11634: 'allied', 11635: 'allot', 11636: 'allow', 11641: 'alloy', 11642: 'allure', 11643: 'ally', 44412: 'ovary', 44413: 'oven', 11646: 'alms', 44415: 'over', 44416: 'overt', 11651: 'aloft', 11652: 'aloha', 11653: 'alone', 11654: 'along', 44423: 'owed', 11656: 'aloud', 44425: 'owes', 44426: 'owing', 11661: 'alp', 11662: 'alpha', 44431: 'owl', 11664: 'also', 44433: 'own', 11666: 'altar', 44435: 'owner', 44436: 'owns', 44441: 'ox', 44442: 'oxen', 44443: 'oxide', 44444: 'oy', 44445: 'oz', 44446: 'ozone', 44451: 'p', 44452: "p's", 44453: 'pa', 44454: 'pablo', 44455: 'pace', 44456: 'paces', 44461: 'pack', 44462: 'packet', 44463: 'packs', 44464: 'pact', 44465: 'pad', 44466: 'paddy', 65312: '+', 64221: 'worse', 65313: '-', 44511: 'pads', 44512: 'pagan', 44513: 'page', 44514: 'pages', 44515: 'paid', 44516: 'pail', 44521: 'pain', 44522: 'pains', 44523: 'paint', 44524: 'pair', 44525: 'pajama', 44526: 'pal', 44531: 'pale', 44532: 'palm', 44533: 'palms', 44534: 'pals', 44535: 'pam', 44536: 'pan', 65314: '0', 44541: 'panama', 44542: 'panda', 44543: 'pane', 44544: 'panel', 44545: 'pang', 44546: 'panic', 44551: 'pans', 44552: 'pansy', 44553: 'pant', 44554: 'pants', 44555: 'papa', 44556: 'paper', 44561: 'pappy', 44562: 'par', 44563: 'pardon', 44564: 'pare', 44565: 'paris', 44566: 'park', 65315: '007', 44611: 'parks', 44612: 'parse', 44613: 'part', 44614: 'parts', 44615: 'party', 44616: 'pascal', 44621: 'pass', 44622: 'past', 44623: 'paste', 44624: 'pasty', 44625: 'pat', 44626: 'patch', 44631: 'path', 44632: 'paths', 44633: 'patio', 44634: 'pats', 44635: 'patsy', 44636: 'patton', 63364: 'weary', 44641: 'patty', 44642: 'paul', 44643: 'paula', 44644: 'pause', 44645: 'pave', 44646: 'paved', 44651: 'paves', 44652: 'paw', 44653: 'pawed', 44654: 'pawn', 44655: 'pawns', 44656: 'paws', 
44661: 'pay', 44662: 'payday', 44663: 'pb', 44664: 'pc', 44665: 'pd', 44666: 'pdq', 12564: 'atomic', 66411: '6:30', 61111: 'tiled', 61112: 'tiles', 61113: 'till', 61114: 'tilt', 61115: 'tim', 61116: 'time', 61121: 'times', 61122: 'timex', 61123: 'timid', 61124: 'tin', 61125: 'tina', 61126: 'tinge', 66412: '6th', 61131: 'tinny', 61132: 'tint', 61133: 'tiny', 61134: 'tip', 61135: 'tipoff', 61136: 'tips', 61141: 'tipsy', 61142: 'tire', 61143: 'tired', 61144: 'tires', 61145: 'title', 61146: 'tj', 15653: 'canal', 61152: 'tl', 61153: 'tlc', 61154: 'tm', 61155: 'tn', 61156: 'tnt', 66413: '7', 61161: 'to', 61162: 'toad', 61163: 'toads', 61164: 'toast', 61165: 'toby', 61166: 'today', 64231: 'woven', 54411: 'snows', 63365: 'weave', 66414: '7%', 54412: 'snowy', 61211: 'todd', 61212: 'toe', 61213: 'toes', 61214: 'tofu', 61215: 'toga', 61216: 'toil', 61221: 'toilet', 61222: 'toils', 61223: 'token', 61224: 'tokyo', 61225: 'told', 61226: 'toll', 64233: 'wp', 61231: 'tolls', 54413: 'snub', 61233: 'tomb', 61234: 'tombs', 61235: 'tommy', 61236: 'ton', 61241: 'tonal', 61242: 'tone', 61243: 'toni', 61244: 'tonic', 61245: 'tons', 61246: 'tonsil', 61251: 'tony', 61252: 'too', 61253: 'took', 61254: 'tool', 61255: 'tools', 61256: 'toot', 64234: 'wq', 61261: 'tooth', 54414: 'snubs', 12111: 'alter', 12112: 'altho', 12113: 'alto', 12114: 'alum', 12115: 'alumni', 12116: 'alvin', 12121: 'alyx', 12122: 'am', 12123: 'am/fm', 12124: 'amass', 12125: 'amaze', 12126: 'amber', 12131: 'amble', 12132: 'ambush', 12133: 'amen', 12134: 'amend', 12135: 'ames', 12136: 'amid', 12141: 'amigo', 12142: 'amino', 12143: 'amish', 12144: 'amiss', 12145: 'amity', 12146: 'ammo', 12151: 'amok', 12152: 'among', 12153: 'amos', 12154: 'amour', 12155: 'amp', 12156: 'ampere', 61311: 'topsy', 61312: 'torah', 12161: 'ample', 12162: 'amply', 12163: 'amps', 12164: 'amulet', 12165: 'amuse', 12166: 'amy', 61321: 'tort', 61322: 'tory', 61323: 'toss', 61324: 'tot', 61325: 'total', 61326: 'tote', 61331: 'totem', 61332: 'tots', 61333: 'touch', 61334: 'tough', 61335: 'tour', 61336: 'tours', 61341: 'tout', 61342: 'tow', 61343: 'towel', 61344: 'tower', 61345: 'town', 61346: 'tows', 61351: 'toxic', 61352: 'toy', 61353: 'toys', 61354: 'tp', 61355: 'tq', 61356: 'tr', 61361: 'trace', 61362: 'track', 12211: 'an', 12212: 'anal', 12213: 'anchor', 12214: 'and', 12215: 'andes', 12216: 'andre', 12221: 'andrew', 12222: 'andy', 12223: 'anew', 12224: 'angel', 12225: 'angelo', 12226: 'anger', 12231: 'angie', 12232: 'angle', 12233: 'angles', 12234: 'anglo', 12235: 'angry', 12236: 'angst', 12241: 'angus', 12242: 'anita', 12243: 'ankle', 12244: 'ann', 12245: 'anna', 12246: 'anne', 66421: '70%', 12251: 'annex', 12252: 'annie', 12253: 'annoy', 12254: 'annul', 12255: 'anon', 12256: 'answer', 61411: 'train', 61412: 'trait', 12261: 'ant', 12262: 'ante', 12263: 'anti', 12264: 'antic', 12265: 'anton', 12266: 'ants', 61421: 'tray', 61422: 'trays', 61423: 'tread', 61424: 'treat', 61425: 'treble', 61426: 'tree', 61431: 'trees', 61432: 'trek', 61433: 'trench', 61434: 'trend', 61435: 'trial', 61436: 'tribe', 62122: 'u', 61441: 'trick', 61442: 'tricky', 61443: 'tried', 61444: 'tries', 61445: 'trig', 61446: 'trill', 61451: 'trim', 61452: 'trims', 61453: 'trio', 61454: 'trip', 61455: 'tripe', 61456: 'trips', 61461: 'trite', 61462: 'troll', 12311: 'anus', 12312: 'anvil', 12313: 'any', 12314: 'anyhow', 12315: 'anyway', 12316: 'ao', 65331: '10%', 54421: 'so', 12321: 'aok', 12322: 'aorta', 12323: 'ap', 12324: 'apart', 12325: 'apathy', 12326: 'ape', 12331: 'apes', 12332: 'apex', 12333: 'aphid', 
12334: 'aplomb', 12335: 'appeal', 12336: 'appear', 12341: 'append', 12342: 'apple', 45111: 'pe', 45112: 'pea', 45113: 'peace', 12346: 'apron', 45115: 'peak', 45116: 'peaks', 54422: 'soak', 12351: 'apt', 12352: 'aq', 45121: 'pear', 45122: 'pearl', 45123: 'pears', 45124: 'peas', 45125: 'pebble', 45126: 'pecan', 61511: 'troy', 61512: 'truce', 12361: 'araby', 12362: 'arbor', 45131: 'peck', 45132: 'pecks', 45133: 'pedal', 45134: 'pedro', 45135: 'pee', 45136: 'peed', 61521: 'truly', 61515: 'trudy', 61523: 'truss', 42636: 'mug', 45141: 'peek', 45142: 'peel', 45143: 'peep', 45144: 'peer', 45145: 'peeve', 45146: 'peg', 61531: 'ts', 54423: 'soaks', 61533: 'tsp', 61534: 'tt', 45151: 'peggy', 45152: 'pegs', 45153: 'pelt', 45154: 'pen', 45155: 'penal', 45156: 'pencil', 61541: 'tu', 61542: 'tub', 61543: 'tuba', 61544: 'tube', 45161: 'penn', 45162: 'penny', 45163: 'pens', 45164: 'peony', 45165: 'people', 45166: 'pep', 61551: 'tuck', 61516: 'true', 61553: 'tues', 61554: 'tuft', 61555: 'tufts', 61556: 'tug', 61561: 'tugs', 54424: 'soap', 12411: 'arcs', 12412: 'ardent', 12413: 'are', 12414: 'area', 12415: 'areas', 12416: 'arena', 12421: 'argon', 12422: 'argue', 12423: 'aria', 12424: 'arid', 12425: 'arise', 12426: 'ark', 12431: 'arlene', 12432: 'arm', 12433: 'armed', 12434: 'armor', 12435: 'arms', 12436: 'army', 12441: 'arnold', 12442: 'aroma', 45211: 'peppy', 12444: 'array', 12445: 'arrive', 12446: 'arrow', 45215: 'percy', 45216: 'perez', 12451: 'arson', 12452: 'art', 45221: 'peril', 45222: 'period', 45223: 'perk', 45224: 'perks', 45225: 'perky', 45226: 'perm', 61611: 'tunic', 61612: 'tunnel', 12461: 'arty', 12462: 'aryan', 45231: 'perry', 12464: 'asap', 45233: 'peru', 45234: 'peso', 45235: 'pest', 45236: 'pests', 61621: 'tush', 61622: 'tusk', 61623: 'tusks', 61624: 'tut', 45241: 'pet', 45242: 'petal', 45243: 'pete', 45244: 'peter', 45245: 'pets', 45246: 'petty', 61631: 'tuv', 61632: 'tux', 61633: 'tv', 61634: 'tw', 45251: 'pf', 45252: 'pfc', 45253: 'pg', 45254: 'ph', 45255: 'phase', 45256: 'phd', 61641: 'tweak', 61642: 'tweed', 61643: 'twice', 61644: 'twig', 45261: 'phi', 45262: 'phil', 45263: 'phlox', 45264: 'phone', 45265: 'phony', 45266: 'photo', 61651: 'twine', 61652: 'twins', 61653: 'twirl', 61654: 'twist', 61655: 'twisty', 61656: 'twit', 64132: 'wok', 61661: 'two', 61662: 'twos', 12511: 'ash', 12512: 'ashen', 12513: 'ashes', 12514: 'ashley', 12515: 'ashy', 12516: 'asia', 12521: 'asian', 12522: 'aside', 12523: 'ask', 12524: 'asked', 12525: 'askew', 12526: 'asks', 32125: 'gw', 12531: 'asleep', 12532: 'asp', 12533: 'aspen', 12534: 'aspire', 12535: 'ass', 12536: 'asses', 12541: 'asset', 12542: 'assn', 12543: 'assure', 45312: 'pi', 45313: 'piano', 12546: 'astral', 45315: 'picks', 45316: 'pickup', 12551: 'at', 12552: 'at&t', 45321: 'picky', 45322: 'picnic', 45323: 'pie', 45324: 'piece', 45325: 'pier', 45326: 'pierce', 12561: 'atm', 12562: 'atoll', 12563: 'atom', 45332: 'pies', 45333: 'piety', 45334: 'pig', 45335: 'piggy', 45336: 'pigs', 45341: 'pike', 45342: 'pile', 45343: 'piles', 45344: 'pill', 45345: 'pills', 45346: 'pilot', 61522: 'trunk', 31516: 'grew', 45351: 'pimp', 45352: 'pimple', 45353: 'pin', 45354: 'pinch', 45355: 'pine', 45356: 'pines', 45361: 'ping', 45362: 'pink', 45363: 'pinko', 45364: 'pins', 45365: 'pint', 45366: 'pinto', 62614: 'vines', 66433: '73', 12611: 'atop', 12612: 'attic', 12613: 'attire', 12614: 'attn', 12615: 'au', 12616: 'audio', 65333: '100%', 12621: 'audit', 12622: 'audrey', 12623: 'aug', 12624: 'augur', 12625: 'august', 12626: 'auk', 12631: 'aunt', 12632: 'aunts', 12633: 
'aura', 12634: 'aural', 12635: 'austin', 12636: 'auto', 66434: '73rd', 61524: 'trust', 12641: 'autumn', 12642: 'av', 45411: 'pinup', 45412: 'pious', 45413: 'pip', 45414: 'pipe', 45415: 'piper', 45416: 'pirate', 12651: 'aviate', 12652: 'avid', 12653: 'avis', 12654: 'avoid', 12655: 'avon', 45424: 'pith', 45425: 'pithy', 45426: 'pits', 12661: 'aw', 12662: 'await', 12663: 'awake', 12664: 'award', 12665: 'aware', 45434: 'pixie', 45435: 'pizza', 45436: 'pj', 54313: 'smile', 45441: 'pk', 45442: 'pl', 45443: 'place', 45444: 'plague', 45445: 'plaid', 45446: 'plain', 45451: 'plan', 45452: 'plane', 45453: 'planet', 45454: 'plank', 45455: 'plant', 45456: 'plate', 45461: 'plato', 45462: 'play', 45463: 'plays', 45464: 'plaza', 45465: 'plea', 45466: 'plead', 64423: 'xw', 45511: 'pleas', 45512: 'pleat', 45513: 'pledge', 45514: 'plod', 45515: 'plods', 45516: 'plop', 45521: 'plot', 45522: 'plots', 45523: 'plow', 45524: 'plows', 45525: 'ploy', 45526: 'ploys', 45531: 'pluck', 45532: 'plug', 45533: 'plugs', 45534: 'plum', 45535: 'plume', 45536: 'plump', 45541: 'plums', 45542: 'plus', 45543: 'plush', 45544: 'pluto', 45545: 'ply', 45546: 'pm', 45551: 'pms', 45552: 'pn', 45553: 'po', 45554: 'poach', 45555: 'pobox', 45556: 'pod', 45561: 'pods', 45562: 'poe', 45563: 'poem', 45564: 'poems', 45565: 'poet', 45566: 'poetry', 45611: 'pogo', 45612: 'poi', 45613: 'point', 45614: 'poise', 45615: 'poison', 45616: 'poke', 45621: 'poked', 45622: 'pokes', 42421: 'molt', 45624: 'polar', 45625: 'pole', 45626: 'poles', 45631: 'police', 45632: 'polio', 45633: 'polk', 45634: 'polka', 45635: 'poll', 45636: 'polls', 45641: 'polo', 45642: 'pomp', 45643: 'pond', 45644: 'ponds', 45645: 'pony', 45646: 'pooch', 61532: 'tsar', 45651: 'pooh', 45652: 'pool', 45653: 'pools', 45654: 'poop', 45655: 'poor', 45656: 'pop', 45661: 'pope', 45662: 'poppy', 45663: 'pops', 45664: 'porch', 45665: 'pore', 45666: 'pores', 62625: 'virus', 15161: 'bribe', 62111: 'tyke', 62112: 'tyler', 62113: 'type', 62114: 'typed', 62115: 'types', 62116: 'typo', 62626: 'visa', 62121: 'tz', 61535: 'ttt', 62123: "u's", 62124: 'u-2', 62125: 'ua', 62126: 'ub', 64263: 'wu', 62131: 'uc', 62132: 'ud', 54443: 'soda', 62134: 'uf', 62135: 'ufo', 62136: 'ug', 62141: 'ugh', 62142: 'ugly', 62143: 'uh', 62144: 'ui', 62145: 'uj', 62146: 'uk', 62151: 'ul', 62152: 'ulcer', 61536: 'tttt', 62154: 'umpire', 62155: 'un', 62156: 'uncle', 64264: 'wv', 62161: 'uncut', 62162: 'under', 62163: 'undo', 62164: 'undue', 62165: 'unfit', 62166: 'unify', 65663: '35', 65422: '16', 64265: 'ww', 64463: 'yarn', 62211: 'union', 62212: 'unit', 62213: 'unite', 62214: 'units', 62215: 'unity', 62216: 'unix', 64266: 'wwi', 62221: 'untie', 62222: 'until', 62223: 'unto', 62224: 'unwed', 62225: 'uo', 62226: 'up', 62231: 'uphill', 62232: 'uphold', 62233: 'upi', 62234: 'upon', 62235: 'upper', 62236: 'uproar', 62241: 'ups', 62242: 'upset', 62243: 'uptake', 62244: 'uq', 62245: 'ur', 62246: 'urban', 62251: 'urge', 62252: 'urged', 62253: 'urges', 62254: 'urine', 62255: 'urn', 62256: 'us', 62261: 'usa', 62262: 'usaf', 13111: 'away', 13112: 'awe', 13113: 'awed', 13114: 'awful', 13115: 'awl', 13116: 'awn', 13121: 'awoke', 13122: 'awol', 13123: 'awry', 13124: 'ax', 13125: 'axe', 13126: 'axes', 13131: 'axiom', 13132: 'axis', 13133: 'axle', 13134: 'ay', 13135: 'aye', 13136: 'az', 62264: 'use', 13141: 'aztec', 13142: 'azure', 13143: 'b', 13144: 'b&w', 13145: "b's", 13146: 'b-52', 43151: 'myths', 62266: 'useful', 13151: 'ba', 13152: 'baal', 13153: 'babe', 13154: 'babel', 13155: 'babes', 13156: 'baboon', 62311: 'uses', 62312: 
'usher', 13161: 'baby', 13162: 'bach', 13163: 'back', 13164: 'backup', 13165: 'bacon', 13166: 'bad', 62321: 'ut', 62322: 'utah', 62323: 'utmost', 62324: 'utter', 62325: 'uu', 62326: 'uuu', 32233: 'hands', 62633: 'visor', 62331: 'uuuu', 62332: 'uv', 62333: 'uvula', 62334: 'uvw', 62335: 'uw', 62336: 'ux', 65332: '100', 62341: 'uy', 62342: 'uz', 62343: 'v', 62344: "v's", 62345: 'v-8', 62346: 'va', 62351: 'vacuum', 62352: 'vague', 62353: 'vain', 62354: 'val', 62355: 'vale', 62356: 'valet', 62361: 'valid', 62362: 'valor', 13211: 'badge', 13212: 'badly', 13213: 'baffle', 13214: 'bag', 13215: 'bagel', 13216: 'baggy', 13221: 'bags', 13222: 'bah', 13223: 'bahama', 13224: 'bail', 13225: 'bait', 13226: 'bake', 13231: 'baker', 13232: 'bakes', 13233: 'bald', 13234: 'bale', 13235: 'bali', 13236: 'balk', 62635: 'vital', 13241: 'balkan', 13242: 'ball', 13243: 'balled', 13244: 'ballot', 13245: 'balls', 13246: 'balm', 32245: 'hardy', 13251: 'balmy', 13252: 'balsa', 13253: 'bambi', 13254: 'ban', 13255: 'banal', 13256: 'banana', 62411: 'vance', 62412: 'vane', 13261: 'band', 13262: 'bandit', 13263: 'bands', 13264: 'bandy', 13265: 'bane', 13266: 'bang', 62421: 'vases', 62422: 'vast', 62423: 'vat', 62424: 'vats', 56635: 'thyme', 62426: 'vb', 62431: 'vc', 62432: 'vcr', 62433: 'vd', 62434: 've', 62435: 'veal', 62436: 'veep', 62441: 'veer', 62442: 'veers', 62443: 'veggie', 62444: 'veil', 62445: 'vein', 62446: 'veins', 62451: 'venal', 62452: 'vend', 61546: 'tubs', 62454: 'vends', 62455: 'venom', 62456: 'vent', 62461: 'vents', 62462: 'venus', 13311: 'bangs', 13312: 'banish', 13313: 'banjo', 13314: 'bank', 13315: 'banks', 13316: 'bar', 13321: 'barb', 13322: 'barbs', 13323: 'bard', 13324: 'bare', 13325: 'barf', 13326: 'barge', 13331: 'bark', 13332: 'barks', 13333: 'barley', 13334: 'barn', 13335: 'barnes', 13336: 'baron', 13341: 'barony', 13342: 'barry', 13343: 'bars', 46112: 'porn', 46113: 'porous', 46114: 'port', 46115: 'pose', 46116: 'posed', 13351: 'base', 13352: 'bash', 46121: 'poses', 46122: 'posh', 46123: 'posse', 13356: 'basis', 46125: 'posts', 46126: 'posy', 62511: 'verge', 62512: 'verify', 13361: 'bask', 13362: 'basket', 46131: 'pot', 13364: 'baste', 46133: 'pots', 46134: 'potts', 46135: 'pouch', 46136: 'pound', 62521: 'verve', 62522: 'very', 54111: 'skis', 62524: 'vest', 46141: 'pour', 46142: 'pours', 46143: 'pout', 46144: 'pouts', 46145: 'pow', 46146: 'powder', 62531: 'veto', 62532: 'vets', 62533: 'vex', 62534: 'vexed', 46151: 'power', 46152: 'pox', 46153: 'pp', 46154: 'ppm', 46155: 'ppp', 46156: 'pppp', 54114: 'skulk', 62542: 'vh', 62543: 'vi', 62544: 'via', 46161: 'pq', 46162: 'pqr', 46163: 'pr', 46164: 'prank', 46165: 'prawn', 46166: 'pray', 62551: 'vic', 62552: 'vice', 62553: 'vices', 62554: 'vicky', 62555: 'video', 62556: 'vie', 62561: 'viet', 62562: 'view', 13411: 'bates', 13412: 'bath', 13413: 'bathe', 13414: 'baths', 13415: 'baton', 13416: 'bats', 13421: 'bauble', 13422: 'baud', 13423: 'bawd', 13424: 'bawdy', 13425: 'bawl', 13426: 'bay', 13431: 'bayer', 13432: 'bayou', 13433: 'bays', 13434: 'bazaar', 13435: 'bb', 13436: 'bbb', 54122: 'sl', 13441: 'bbbb', 13442: 'bbc', 46211: 'prays', 46212: 'preen', 46213: 'prefix', 46214: 'prep', 46215: 'press', 46216: 'prexy', 54124: 'slabs', 13451: 'be', 13452: 'beach', 46221: 'prey', 46222: 'price', 13455: 'beads', 13456: 'beady', 46225: 'prig', 46226: 'prim', 62611: 'vile', 62612: 'vinci', 13461: 'beak', 13462: 'beam', 13463: 'beams', 46232: 'prince', 46233: 'print', 46234: 'prior', 46235: 'prism', 46236: 'prissy', 62621: 'violet', 62622: 'vip', 62623: 'virgil', 
62624: 'virgo', 46241: 'privy', 46242: 'prize', 46243: 'pro', 46244: 'probe', 46245: 'prod', 46246: 'prods', 62631: 'vise', 62632: 'visit', 61552: 'tue', 62634: 'vista', 46251: 'prof', 46252: 'prom', 46253: 'promo', 46254: 'prone', 46255: 'prong', 46256: 'proof', 62641: 'viva', 62642: 'vivian', 54131: 'slam', 62644: 'vixen', 46261: 'prop', 46262: 'propel', 46263: 'props', 46264: 'prose', 46265: 'proud', 46266: 'prove', 62651: 'vl', 62652: 'vlad', 62653: 'vm', 62654: 'vn', 54133: 'slang', 62656: 'vocal', 54134: 'slant', 62662: 'vogue', 13511: 'beard', 13512: 'bears', 13513: 'beast', 13514: 'beat', 13515: 'beats', 13516: 'beau', 64525: 'yellow', 13521: 'beauty', 13522: 'beaver', 13523: 'bebop', 13524: 'beck', 13525: 'becky', 13526: 'bed', 13531: 'beds', 13532: 'bee', 13533: 'beech', 13534: 'beef', 13535: 'beefy', 13536: 'been', 62645: 'vj', 13541: 'beep', 13542: 'beeps', 13543: 'beer', 13544: 'beers', 13545: 'bees', 13546: 'beet', 46315: 'prune', 46316: 'pry', 13551: 'beets', 13552: 'befall', 13553: 'befit', 13554: 'befog', 13555: 'beg', 13556: 'began', 46325: 'pt', 46326: 'pu', 13561: 'beget', 13562: 'beggar', 13563: 'begin', 46332: 'pubic', 13565: 'begun', 13566: 'behind', 46335: 'pucker', 46336: 'puddle', 46341: 'pudgy', 46342: 'puff', 46343: 'puffs', 46344: 'puffy', 46345: 'pug', 46346: 'puke', 46351: 'pull', 46352: 'pulls', 46353: 'pulp', 46354: 'pulse', 46355: 'puma', 46356: 'pump', 46361: 'pumps', 46362: 'pun', 46363: 'punch', 46364: 'punish', 46365: 'punk', 46366: 'punks', 13611: 'beige', 13612: 'being', 13613: 'beirut', 13614: 'belch', 13615: 'belfry', 13616: 'belief', 13621: 'bell', 13622: 'bella', 13623: 'belle', 13624: 'bellow', 13625: 'bells', 13626: 'belly', 13631: 'below', 13632: 'belt', 13633: 'belts', 13634: 'bemoan', 13635: 'ben', 13636: 'bench', 31515: 'greta', 13641: 'bend', 13642: 'bender', 46411: 'punky', 13644: 'benign', 46413: 'punt', 46414: 'punts', 46415: 'puny', 46416: 'pup', 13651: 'benz', 13652: 'beret', 13653: 'berg', 13654: 'berlin', 46423: 'pure', 46424: 'purge', 46425: 'purr', 46426: 'purse', 13661: 'bert', 13662: 'berth', 13663: 'beryl', 46432: 'push', 46433: 'pushy', 46434: 'pussy', 46435: 'put', 46436: 'puts', 46441: 'putt', 46442: 'putty', 46443: 'puzzle', 46444: 'pv', 46445: 'pvc', 46446: 'pw', 46451: 'px', 46452: 'py', 46453: 'pygmy', 46454: 'pyre', 46455: 'pyrex', 46456: 'pz', 64133: 'woke', 46461: 'q', 46462: 'q&a', 46463: "q's", 46464: 'qa', 46465: 'qb', 46466: 'qc', 56422: 'teens', 12566: 'atone', 62363: 'value', 62364: 'valve', 46511: 'qd', 46512: 'qe', 46513: 'qed', 46514: 'qf', 46515: 'qg', 46516: 'qh', 62366: 'van', 46521: 'qi', 46522: 'qj', 46523: 'qk', 46524: 'ql', 46525: 'qm', 46526: 'qn', 46531: 'qo', 46532: 'qp', 46533: 'qq', 46534: 'qqq', 46535: 'qqqq', 46536: 'qr', 46541: 'qrs', 46542: 'qs', 46543: 'qt', 46544: 'qu', 46545: 'quack', 46546: 'quad', 46551: 'quail', 46552: 'quake', 46553: 'quarry', 46554: 'quart', 46555: 'queasy', 46556: 'queen', 62613: 'vine', 46561: 'query', 46562: 'quest', 46563: 'queue', 46564: 'quick', 46565: 'quiet', 46566: 'quill', 61563: 'tumble', 62365: 'vamp', 65111: 'zb', 66214: '49', 61564: 'tuna', 46611: 'quilt', 46612: 'quinn', 46613: 'quip', 46614: 'quips', 46615: 'quirk', 46616: 'quit', 46621: 'quite', 46622: 'quits', 46623: 'quiver', 46624: 'quiz', 46625: 'quota', 46626: 'quote', 46631: 'qv', 46632: 'qw', 46633: 'qx', 46634: 'qy', 46635: 'qz', 46636: 'r', 46641: 'r&b', 46642: 'r&d', 46643: 'r&r', 46644: "r's", 46645: 'ra', 46646: 'rabbi', 46651: 'rabbit', 46652: 'rabid', 46653: 'race', 46654: 'raced', 46655: 
'races', 46656: 'rack', 46661: 'racy', 46662: 'radar', 46663: 'radio', 46664: 'radish', 46665: 'raft', 46666: 'rafts', 61566: 'tuned', 65123: 'zen', 65124: 'zero', 56111: 'sweep', 65126: 'zesty', 12466: 'ascii', 56112: 'sweet', 65131: 'zeta', 63111: 'volvo', 63112: 'vomit', 63113: 'vote', 63114: 'vouch', 63115: 'vow', 63116: 'vowel', 65133: 'zg', 63121: 'vows', 63122: 'vp', 54211: 'slime', 63124: 'vr', 63125: 'vs', 63126: 'vt', 54212: 'slimy', 63131: 'vtol', 63132: 'vu', 63133: 'vulcan', 63134: 'vv', 54213: 'sling', 63136: 'vvvv', 54214: 'slip', 63142: 'vwx', 63143: 'vx', 63144: 'vy', 63145: 'vz', 63146: 'w', 63151: "w's", 63152: 'w/o', 63153: 'wa', 63154: 'wacko', 63155: 'wacky', 63156: 'wad', 63161: 'wade', 63162: 'wades', 63163: 'wafer', 63164: 'waffle', 63165: 'wag', 63166: 'wage', 65224: 'zt', 54221: 'sliver', 54222: 'slob', 54223: 'slog', 54224: 'sloop', 63211: 'wager', 63212: 'wages', 63213: 'wagon', 63214: 'wags', 63215: 'wahoo', 63216: 'waif', 63221: 'wail', 63222: 'wails', 63223: 'waist', 63224: 'wait', 63225: 'wake', 63226: 'waken', 62663: 'voice', 63231: 'waldo', 63232: 'walk', 63233: 'wall', 63234: 'walls', 35115: 'kane', 63236: 'walrus', 63241: 'walsh', 62615: 'vinyl', 54231: 'sloppy', 63244: 'waltz', 63245: 'wand', 63246: 'wang', 54232: 'slops', 63251: 'want', 63252: 'wants', 63253: 'war', 63254: 'ward', 54233: 'slosh', 63256: 'warmth', 54234: 'slot', 63262: 'warns', 14111: 'bet', 14112: 'beta', 14113: 'beth', 14114: 'betray', 14115: 'bets', 14116: 'betsy', 14121: 'bette', 14122: 'betty', 14123: 'bevy', 14124: 'beware', 14125: 'beyond', 14126: 'bf', 32113: 'gusts', 14131: 'bflat', 14132: 'bg', 14133: 'bh', 14134: 'bi', 14135: 'bias', 14136: 'bib', 62665: 'volt', 14141: 'bible', 14142: 'biceps', 14143: 'bid', 14144: 'bide', 14145: 'bids', 14146: 'bier', 14151: 'big', 14152: 'bigamy', 14153: 'bigot', 14154: 'bike', 14155: 'biker', 14156: 'bikini', 63311: 'warts', 63312: 'wary', 14161: 'bile', 14162: 'bilge', 14163: 'bilk', 14164: 'bill', 14165: 'bills', 14166: 'billy', 63321: 'waste', 63322: 'watch', 54323: 'smoky', 63324: 'watt', 63325: 'watts', 63326: 'wave', 51515: 'rico', 63332: 'waver', 63333: 'waves', 63334: 'wavy', 63335: 'wax', 63336: 'waxy', 51516: 'rid', 63341: 'way', 63342: 'wayne', 63343: 'ways', 63344: 'wb', 63345: 'wc', 63346: 'wd', 63351: 'we', 63352: "we'd", 63353: "we'll", 63354: "we're", 63355: "we've", 63356: 'weak', 63361: 'wealth', 63362: 'wear', 14211: 'bimbo', 14212: 'bin', 14213: 'binary', 14214: 'bind', 14215: 'binge', 14216: 'bingo', 14221: 'biped', 14222: 'birch', 14223: 'bird', 14224: 'birdie', 14225: 'birds', 14226: 'birth', 14231: 'bison', 14232: 'bisque', 14233: 'bit', 14234: 'bite', 14235: 'bites', 14236: 'bits', 51525: 'rifle', 62616: 'viola', 14241: 'bitten', 14242: 'biz', 14243: 'bj', 14244: 'bk', 14245: 'bl', 14246: 'blab', 14251: 'black', 14252: 'blade', 14253: 'blah', 14254: 'blair', 14255: 'blake', 14256: 'blame', 63411: 'webb', 63412: 'webs', 14261: 'bland', 14262: 'blank', 14263: 'blare', 14264: 'blast', 14265: 'blat', 14266: 'blaze', 63421: 'weed', 63422: 'weedy', 62453: 'vendor', 63424: 'weeks', 63425: 'weep', 63426: 'weeps', 61416: 'trash', 63431: 'weigh', 63432: 'weird', 56123: 'swine', 63434: 'weld', 63435: 'well', 63436: 'wells', 63441: 'welsh', 45212: 'pepsi', 63443: 'went', 63444: 'wept', 63445: 'were', 44121: 'odor', 35151: 'ke', 63452: 'wet', 63453: 'wets', 63454: 'wf', 63455: 'wg', 63456: 'wh', 51536: 'riley', 63461: 'whale', 14661: 'bp', 14311: 'bldg', 14312: 'bleak', 14313: 'bleat', 14314: 'bled', 14315: 'bleed', 14316: 
'blend', 14321: 'bless', 14322: 'blew', 14323: 'blimp', 14324: 'blind', 14325: 'blink', 14326: 'blip', 14331: 'blips', 14332: 'bliss', 14333: 'blithe', 14334: 'blitz', 14335: 'bloat', 14336: 'blob', 62464: 'verb', 14341: 'blobs', 14342: 'bloc', 14343: 'block', 14344: 'bloke', 14345: 'blond', 14346: 'blonde', 62466: 'verdi', 45214: 'perch', 14351: 'blood', 14352: 'bloom', 14353: 'bloop', 14354: 'blot', 14355: 'blotch', 14356: 'blots', 35161: 'keeps', 63512: 'when', 14361: 'blow', 14362: 'blown', 14363: 'blows', 14364: 'blt', 14365: 'blue', 14366: 'blues', 63521: 'while', 63522: 'whim', 63523: 'whine', 63524: 'whinny', 63525: 'whip', 63526: 'whips', 63531: 'whir', 63532: 'whirl', 63533: 'white', 63534: 'whiz', 63535: 'who', 63536: "who'd", 63541: 'whoa', 63542: 'whole', 63543: 'whom', 63544: 'whoop', 63545: 'whoosh', 63546: 'whose', 63551: 'why', 63552: 'wi', 63553: 'wick', 63554: 'wide', 63555: 'widen', 63556: 'wider', 63561: 'widow', 63562: 'width', 14411: 'bluff', 14412: 'blunt', 14413: 'blur', 14414: 'blurs', 14415: 'blurt', 14416: 'blush', 14421: 'blvd', 14422: 'blythe', 14423: 'bm', 14424: 'bmw', 14425: 'bn', 14426: 'bo', 14431: 'boa', 14432: 'boar', 14433: 'board', 14434: 'boast', 14435: 'boat', 14436: 'boats', 14441: 'bob', 14442: 'bobby', 14443: 'bobcat', 14444: 'bobs', 14445: 'bode', 14446: 'body', 64312: 'www', 14451: 'bog', 14452: 'bogey', 14453: 'boggy', 14454: 'bogs', 14455: 'bogus', 14456: 'boil', 63611: 'wild', 63612: 'wiley', 14461: 'boils', 14462: 'boise', 14463: 'bold', 14464: 'bolt', 14465: 'bolts', 14466: 'bomb', 63621: 'wilma', 63622: 'wilt', 63623: 'wily', 63624: 'wimp', 63625: 'wimpy', 63626: 'win', 64313: 'wwww', 63631: 'wince', 63632: 'winch', 63633: 'wind', 63634: 'windy', 63635: 'wine', 63636: 'wines', 66213: '48th', 63641: 'wing', 63642: 'wings', 63643: 'wink', 63644: 'winks', 63645: 'winnie', 63646: 'wino', 65221: 'zq', 63651: 'wins', 63652: 'winter', 63653: 'wipe', 63654: 'wire', 63655: 'wires', 63656: 'wiry', 64314: 'wx', 63661: 'wise', 63662: 'wiser', 14511: 'bombay', 14512: 'bombs', 14513: 'bond', 14514: 'bone', 14515: 'bones', 14516: 'bong', 14521: 'bongo', 14522: 'bonn', 46111: 'pork', 14524: 'bony', 14525: 'boo', 14526: 'boob', 13344: 'bart', 14531: 'booby', 14532: 'boogie', 14533: 'book', 14534: 'books', 13345: 'barter', 14536: 'boon', 64315: 'wxy', 13346: 'barton', 14542: 'boor', 14543: 'boost', 14544: 'boot', 14545: 'booth', 14546: 'boots', 14551: 'booty', 14552: 'booze', 14553: 'bop', 14554: 'borax', 14555: 'border', 14556: 'bore', 14561: 'bored', 14562: 'bores', 14563: 'borg', 14564: 'boris', 14565: 'born', 14566: 'borneo', 64316: 'wy', 54311: 'smear', 65234: 'zy', 54312: 'smell', 13353: 'basic', 13354: 'basil', 13355: 'basin', 46124: 'post', 14611: 'boron', 14612: 'bosom', 14613: 'boss', 14614: 'bossy', 14615: 'boston', 14616: 'botch', 14621: 'both', 14622: 'bottle', 14623: 'bough', 14624: 'bouncy', 14625: 'bound', 14626: 'bout', 65243: '!!', 14631: 'bovine', 14632: 'bow', 14633: 'bowed', 14634: 'bowel', 14635: 'bowie', 14636: 'bowl', 54322: 'smoke', 14641: 'bowls', 14642: 'bows', 14643: 'box', 14644: 'boxed', 14645: 'boxer', 14646: 'boxes', 46132: 'potato', 14651: 'boxy', 14652: 'boy', 14653: 'boyd', 14654: 'boyle', 13365: 'bat', 14656: 'bozo', 13366: 'batch', 14662: 'bq', 14663: 'br', 14664: 'bra', 14665: 'brace', 14666: 'brad', 65411: '14', 65334: '1000', 54331: 'sn', 54332: 'snack', 54333: 'snafu', 54334: 'snag', 65412: '1492', 31111: 'giant', 31112: 'giddy', 31113: 'gift', 31114: 'gifts', 31115: 'gig', 31116: 'gil', 31121: 'gila', 31122: 
'gild', 31123: 'gill', 31124: 'gills', 31125: 'gilt', 31126: 'gimme', 65413: '14th', 31131: 'gimpy', 31132: 'gin', 31133: 'gina', 31134: 'ginger', 31135: 'gino', 31136: 'gird', 65264: '(tm)', 31141: 'girl', 31142: 'girls', 31143: 'girth', 31144: 'gist', 31145: 'give', 31146: 'given', 62535: 'vexes', 65266: '*', 31151: 'gives', 31152: 'gizmo', 31153: 'gj', 31154: 'gk', 31155: 'gl', 31156: 'glad', 65414: '15', 31161: 'glade', 31162: 'glamor', 51615: 'rival', 31164: 'gland', 31165: 'glare', 31166: 'glass', 35232: 'kh', 62541: 'vg', 64324: 'wynn', 64332: 'xa', 62545: 'vial', 31211: 'glaze', 31212: 'gleam', 31213: 'glean', 31214: 'glee', 31215: 'glen', 31216: 'glenn', 65416: '1500', 31221: 'glib', 31222: 'glide', 51625: 'rm', 31224: 'gloat', 31225: 'glob', 31226: 'globe', 51626: 'rn', 31231: 'gloom', 31232: 'glory', 31233: 'gloss', 31234: 'glove', 31235: 'glow', 31236: 'glows', 31241: 'glue', 31242: 'glued', 31243: 'gluey', 31244: 'gluing', 31245: 'glum', 31246: 'glut', 32515: 'hence', 31252: 'gmt', 31253: 'gn', 31254: 'gnash', 31255: 'gnat', 31256: 'gnaw', 16132: 'carat', 31261: 'gnaws', 31262: 'gnome', 31263: 'gnp', 31264: 'gnu', 31265: 'go', 31266: 'goad', 66221: '4:30', 45232: 'pert', 35251: 'kill', 51636: 'roam', 32525: 'herbs', 31312: 'goals', 31313: 'goat', 31314: 'goats', 31315: 'gob', 31316: 'god', 32526: 'herd', 31321: 'godly', 31322: 'gods', 31323: 'goes', 31324: 'goggle', 31325: 'gogh', 31326: 'gogo', 62565: 'vii', 31331: 'going', 31332: 'gold', 31333: 'golf', 31334: 'golly', 31335: 'gomez', 31336: 'gone', 31341: 'gong', 31342: 'goo', 51645: 'robin', 31344: 'goods', 31345: 'goody', 31346: 'gooey', 64115: 'wj', 64116: 'wk', 51646: 'robot', 31351: 'goof', 31352: 'goofy', 31353: 'goon', 31354: 'goose', 31355: 'gordon', 31356: 'gore', 64125: 'wo', 64126: 'woe', 31361: 'gorge', 31362: 'gory', 31363: 'gosh', 31364: 'gospel', 31365: 'got', 31366: 'gouge', 64135: 'wolff', 64136: 'woman', 32535: 'heron', 64141: 'womb', 64142: 'women', 64143: 'won', 64144: "won't", 16152: 'carp', 64146: 'wong', 64151: 'woo', 64152: 'wood', 64153: 'woods', 64154: 'woody', 64155: 'woof', 64156: 'wool', 64161: 'woos', 64162: 'word', 64163: 'words', 64164: 'wordy', 64165: 'wore', 64166: 'work', 16156: 'cars', 31411: 'gould', 31412: 'gourd', 31413: 'gout', 31414: 'govt', 31415: 'gown', 31416: 'gowns', 65311: '**', 66222: '4th', 31421: 'gp', 31422: 'gpa', 31423: 'gq', 31424: 'gr', 31425: 'grab', 31426: 'grabs', 32545: 'hexed', 31432: 'grad', 31433: 'grade', 31434: 'grady', 31435: 'graft', 31436: 'grail', 32546: 'hey', 31441: 'grain', 31442: 'gram', 31443: 'grams', 31444: 'grand', 31445: 'grant', 31446: 'grape', 64215: 'worn', 64216: 'worry', 65316: '1', 31451: 'graph', 31452: 'grasp', 31453: 'grass', 31454: 'grate', 31455: 'grave', 31456: 'gravel', 64225: 'wound', 64226: 'wove', 31461: 'gravy', 31462: 'gray', 31463: 'graze', 31464: 'great', 31465: 'greed', 31466: 'greedy', 64235: 'wr', 64236: 'wrap', 64241: 'wrath', 64242: 'wreak', 64243: 'wreck', 64244: 'wren', 64245: 'wring', 64246: 'wrist', 65321: '1%', 64251: 'write', 64252: 'writhe', 64253: 'wrong', 64254: 'wrote', 64255: 'wry', 64256: 'ws', 65323: '1/3', 64261: 'wsw', 64262: 'wt', 15111: 'brady', 15112: 'brag', 15113: 'brags', 15114: 'braid', 15115: 'brain', 15116: 'brainy', 15121: 'brake', 15122: 'bran', 15123: 'brand', 15124: 'brandy', 15125: 'brash', 15126: 'brass', 31511: 'greek', 31512: 'green', 31513: 'greet', 31514: 'greg', 15131: 'brassy', 15132: 'brat', 15133: 'brats', 15134: 'brave', 13445: 'bcd', 15136: 'brawl', 31521: 'grey', 31522: 'grid', 31523: 
'grief', 31524: 'grieve', 15141: 'brawn', 15142: 'bray', 15143: 'brazil', 15144: 'bread', 15145: 'break', 15146: 'breath', 31531: 'grime', 31532: 'grimy', 31533: 'grin', 31534: 'grind', 15151: 'bred', 15152: 'breed', 15153: 'breeze', 15154: 'brew', 15155: 'brian', 15156: 'briar', 31541: 'gripe', 31542: 'grips', 31543: 'grist', 31544: 'grit', 31545: 'groan', 15162: 'brick', 15163: 'bride', 15164: 'bridge', 15165: 'brief', 15166: 'brig', 31551: 'groin', 31552: 'groom', 31553: 'groove', 31554: 'grope', 31555: 'gross', 31556: 'group', 64325: 'wz', 64326: 'x', 31561: 'grout', 31562: 'grove', 31563: 'grow', 31564: 'growl', 31565: 'grown', 31566: 'grows', 13453: 'beacon', 64336: 'xe', 13454: 'bead', 64342: 'xf', 64343: 'xg', 64344: 'xh', 64345: 'xi', 64346: 'xii', 46223: 'prick', 64351: 'xiii', 64352: 'xiv', 46224: 'pride', 55421: 'stops', 64355: 'xl', 64356: 'xm', 64361: 'xmas', 64362: 'xn', 15211: 'brim', 15212: 'brine', 15213: 'bring', 15214: 'brink', 15215: 'briny', 15216: 'brisk', 15221: 'broad', 15222: 'broil', 15223: 'broke', 15224: 'broken', 15225: 'bronco', 15226: 'bronx', 31611: 'grub', 31612: 'grubs', 31613: 'gruff', 31614: 'grunt', 31615: 'gs', 15232: 'brook', 15233: 'broom', 15234: 'broth', 15235: 'brow', 15236: 'brown', 31621: 'gu', 31622: 'guam', 31623: 'guano', 31624: 'guard', 15241: 'brows', 15242: 'browse', 15243: 'bruce', 15244: 'bruin', 15245: 'brunch', 15246: 'bruno', 31631: 'gui', 31632: 'guide', 31633: 'guild', 31634: 'guile', 15251: 'brunt', 15252: 'brush', 15253: 'brutal', 15254: 'brute', 13465: 'beans', 15256: 'bs', 31641: 'guitar', 31642: 'gulag', 31643: 'gulf', 31644: 'gull', 31645: 'gulls', 31646: 'gully', 15263: 'bu', 15264: 'bub', 15265: 'buck', 15266: 'bucks', 31651: 'gulp', 31652: 'gum', 31653: 'gumbo', 31654: 'gummy', 31655: 'gun', 31656: 'gunk', 64425: 'xxx', 64426: 'xxxx', 31661: 'guns', 31662: 'guppy', 31663: 'gurgle', 31664: 'guru', 31665: 'gus', 31666: 'gush', 64435: "y'all", 64436: "y's", 64441: 'ya', 64442: 'yacht', 54431: 'soars', 64444: 'yak', 64445: 'yale', 64446: 'yam', 66366: '6:00', 54432: 'sob', 64451: 'yamaha', 64452: 'yams', 64453: 'yang', 64454: 'yank', 54433: 'sober', 64456: 'yap', 54434: 'sobs', 64462: 'yards', 15311: 'bud', 15312: 'buddha', 15313: 'buddy', 15314: 'budge', 15315: 'buds', 15316: 'buff', 15321: 'bug', 15322: 'buggy', 15323: 'bugle', 15324: 'bugs', 15325: 'buick', 15326: 'build', 15331: 'built', 15332: 'bulb', 15333: 'bulbs', 15334: 'bulge', 15335: 'bulk', 15336: 'bulky', 15341: 'bull', 15342: 'bulls', 15343: 'bully', 15344: 'bum', 15345: 'bump', 15346: 'bumps', 65363: '12:30', 15351: 'bumpy', 15352: 'bums', 15353: 'bun', 15354: 'bunch', 15355: 'bunco', 15356: 'bundy', 64511: 'yc', 64512: 'yd', 15361: 'bunk', 15362: 'bunny', 15363: 'buns', 15364: 'bunt', 15365: 'bunts', 15366: 'buoy', 62636: 'vito', 64522: 'yeast', 64523: 'yeats', 64524: 'yell', 61615: 'turkey', 64526: 'yelp', 64531: 'yen', 64532: 'yep', 64533: 'yes', 64534: 'yet', 64535: 'yew', 64536: 'yews', 64541: 'yf', 64542: 'yg', 64543: 'yh', 64544: 'yi', 64545: 'yield', 64546: 'yin', 64551: 'yip', 64552: 'yips', 64553: 'yj', 64554: 'yk', 61616: 'turn', 64556: 'ym', 54113: 'skits', 64561: 'yn', 64562: 'yo', 15411: 'bureau', 15412: 'burg', 15413: 'burger', 15414: 'buried', 15415: 'burke', 15416: 'burly', 15421: 'burma', 15422: 'burn', 15423: 'burns', 15424: 'burnt', 15425: 'burp', 15426: 'burps', 62646: 'vk', 15431: 'burro', 15432: 'burst', 15433: 'burt', 15434: 'burton', 15435: 'bury', 15436: 'bus', 15441: 'bush', 15442: 'bushel', 15443: 'bushy', 15444: 'buss', 15445: 
'bust', 15446: 'busy', 15451: 'but', 15452: 'butane', 15453: 'butch', 15454: 'butt', 15455: 'butte', 15456: 'buxom', 64611: 'yokel', 64612: 'yolk', 15461: 'buy', 15462: 'buyer', 15463: 'buys', 15464: 'buzz', 15465: 'bv', 15466: 'bvm', 16231: 'catsup', 64621: 'young', 64622: 'your', 64623: 'yours', 64624: 'youth', 32616: 'hilly', 64626: 'yp', 64631: 'yq', 64632: 'yr', 64633: 'yrs', 64634: 'ys', 62655: 'vo', 64636: 'ytd', 64641: 'yu', 64642: 'yucca', 64643: 'yuck', 64644: 'yukon', 64645: 'yule', 64646: 'yv', 64651: 'yw', 64652: 'yx', 64653: 'yy', 55423: 'stork', 64655: 'yyyy', 64656: 'yz', 64661: 'z', 64662: "z's", 15511: 'bw', 15512: 'bwana', 15513: 'bx', 15514: 'by', 15515: 'bye', 15516: 'bylaw', 62661: 'vodka', 15521: 'byline', 15522: 'byob', 15523: 'bypass', 15524: 'byrd', 15525: 'byron', 15526: 'byte', 16241: 'caves', 15531: 'bytes', 15532: 'byway', 16242: 'cavort', 15534: 'c', 15535: 'c#', 15536: 'c&w', 62664: 'void', 15541: "c's", 15542: 'c/o', 15543: 'ca', 15544: 'cab', 15545: 'cabal', 15546: 'cabana', 62666: 'volts', 15551: 'cabin', 15552: 'cable', 15553: 'cabot', 15554: 'cache', 15555: 'cackle', 15556: 'cacti', 64615: 'you', 15561: 'caddy', 15562: 'cadet', 15563: 'caesar', 15564: 'cafe', 15565: 'cage', 15566: 'caged', 35363: 'knife', 31616: 'gt', 16251: 'cccp', 16252: 'cd', 62516: 'verse', 16341: 'chap', 15611: 'cages', 15612: 'cagey', 15613: 'cain', 15614: 'cairn', 15615: 'cairo', 15616: 'cajun', 15621: 'cake', 15622: 'cakes', 15623: 'calf', 15624: 'calico', 15625: 'call', 15626: 'calls', 15631: 'callus', 15632: 'calm', 15633: 'calms', 15634: 'calvin', 15635: 'cam', 15636: 'came', 15641: 'camel', 15642: 'cameo', 15643: 'camera', 15644: 'camp', 15645: 'camps', 15646: 'camry', 16261: 'cedar', 15651: 'can', 15652: "can't", 32646: 'hitch', 15654: 'canary', 15655: 'cancer', 15656: 'candle', 65415: '15%', 15661: 'candy', 15662: 'cane', 15663: 'caned', 15664: 'canes', 15665: 'cannot', 15666: 'canny', 65421: '15th', 32654: 'hives', 65423: '1600', 63263: 'warp', 65424: '16th', 46311: 'prow', 32111: 'gust', 32112: 'gusto', 46312: 'prowl', 32114: 'gusty', 32115: 'gut', 32116: 'guts', 46313: 'proxy', 32121: 'gutsy', 32122: 'guy', 32123: 'guys', 32124: 'gv', 46314: 'prude', 32126: 'gwen', 63264: 'warren', 32131: 'gx', 32132: 'gy', 32133: 'gym', 32134: 'gyp', 32135: 'gypsum', 32136: 'gypsy', 65463: '1980', 65431: '1776', 32141: 'gyro', 32142: 'gz', 32143: 'h', 32144: "h's", 32145: 'h2o', 32146: 'ha', 32151: 'habit', 32152: 'hack', 32153: 'had', 32154: 'hag', 32155: 'haha', 32156: 'haiku', 65434: '1800', 32161: 'hail', 32162: 'hair', 32163: 'hairdo', 32164: 'hairs', 32165: 'hairy', 32166: 'haiti', 46321: 'ps', 46322: 'psalm', 46323: 'psi', 46324: 'psych', 63266: 'wart', 32211: 'hal', 32212: 'half', 32213: 'hall', 32214: 'halls', 32215: 'halo', 32216: 'halt', 32221: 'halts', 32222: 'halve', 32223: 'ham', 32224: 'hamlet', 32225: 'hammer', 32226: 'hams', 46331: 'pub', 32231: 'hand', 32232: 'handle', 13564: 'begs', 32234: 'handy', 32235: 'hang', 32236: 'hank', 46333: 'pubs', 32241: 'hanna', 32242: 'hans', 32243: 'happy', 32244: 'hard', 46334: 'puck', 32246: 'hare', 32251: 'harem', 32252: 'hark', 32253: 'harley', 32254: 'harm', 32255: 'harms', 32256: 'harp', 32261: 'harps', 32262: 'harry', 32263: 'harsh', 32264: 'hart', 32265: 'harv', 32266: 'harvey', 32311: 'has', 32312: 'hash', 32313: 'hasp', 32314: 'haste', 32315: 'hasty', 32316: 'hat', 32321: 'hatch', 32322: 'hate', 32323: 'hates', 32324: 'hatred', 32325: 'hats', 32326: 'haul', 15261: 'bt', 32331: 'hauls', 32332: 'haunt', 32333: 'have', 32334: 
'haven', 32335: 'havoc', 32336: 'hawk', 65464: '1985', 32341: 'hawks', 32342: 'hay', 32343: 'haydn', 32344: 'hayes', 32345: 'hazard', 32346: 'haze', 65115: 'zeal', 65116: 'zealot', 65466: '1991', 32351: 'hazel', 32352: 'hazy', 32353: 'hb', 32354: 'hc', 32355: 'hd', 32356: 'hdtv', 65125: 'zest', 61635: 'twa', 32361: 'he', 32362: "he'd", 32363: "he'll", 32364: 'head', 32365: 'heads', 32366: 'heady', 65135: 'zi', 65136: 'zig', 65141: 'ziggy', 65142: 'zigzag', 65143: 'zilch', 65144: 'zinc', 65145: 'zing', 65146: 'zion', 65151: 'zip', 65152: 'zips', 65153: 'ziti', 65154: 'zj', 65155: 'zk', 65156: 'zl', 64364: 'xp', 65162: 'zn', 65163: 'zo', 65164: 'zoe', 65165: 'zone', 65166: 'zoned', 32411: 'heal', 32412: 'heals', 32413: 'heap', 32414: 'heaps', 32415: 'hear', 32416: 'heard', 32421: 'hears', 32422: 'heart', 32423: 'heat', 32424: 'heath', 32425: 'heats', 32426: 'heave', 32431: 'heaven', 32432: 'heavy', 32433: 'hebrew', 32434: 'heck', 32435: 'heckle', 32436: 'hectic', 32441: 'hedge', 32442: 'heed', 32443: 'heel', 32444: 'heels', 32445: 'heft', 32446: 'hefty', 65215: 'zowie', 65216: 'zp', 32451: 'height', 32452: 'heinz', 32453: 'heir', 32454: 'heirs', 32455: 'held', 32456: 'helen', 65225: 'zu', 65226: 'zulu', 32461: 'helga', 32462: 'helix', 32463: 'hell', 32464: 'hello', 32465: 'helm', 32466: 'help', 65235: 'zz', 65236: 'zzz', 65241: 'zzzz', 65242: '!', 16335: 'chant', 65244: '""""', 65245: '#', 65246: '##', 65251: '$', 65252: '$$', 65253: '%', 65254: '%%', 65255: '&', 65256: '(', 65261: '()', 65262: '(c)', 16111: 'canoe', 16112: 'canon', 16113: 'canopy', 16114: 'cans', 16115: 'canto', 16116: 'canvas', 16121: 'canyon', 16122: 'cap', 16123: 'cape', 16124: 'caped', 16125: 'caper', 16126: 'capri', 32511: 'hem', 32512: 'hemp', 32513: 'hems', 32514: 'hen', 16131: 'car', 32516: 'henry', 16133: 'carbon', 16134: 'card', 16135: 'care', 16136: 'cares', 32521: 'hens', 32522: 'hep', 32523: 'her', 32524: 'herb', 16141: 'caress', 16142: 'caret', 16143: 'cargo', 16144: 'carl', 16145: 'carla', 16146: 'carlo', 32531: 'here', 32532: 'hero', 32533: 'herod', 32534: 'heroic', 16151: 'carol', 32536: 'herr', 16153: 'carpet', 16154: 'carrie', 16155: 'carry', 31635: 'guilt', 32541: 'hers', 32542: 'hertz', 32543: 'hew', 32544: 'hex', 16161: 'carson', 16162: 'cart', 16163: 'caruso', 16164: 'carve', 16165: 'case', 16166: 'cases', 32551: 'hf', 32552: 'hg', 32553: 'hh', 32554: 'hhh', 32555: 'hhhh', 32556: 'hi', 65325: '1/8', 65326: '10', 32561: 'hick', 32562: 'hid', 32563: 'hide', 32564: 'hides', 32565: 'high', 32566: 'hij', 65335: '100th', 65336: '101', 31636: 'guise', 65341: '101st', 65342: '10:00', 65343: '10:30', 65344: '10th', 65345: '11', 65346: '111', 65351: '1111', 65352: '11:00', 65353: '11:30', 65354: '11th', 65355: '12', 65356: '123', 65361: '1234', 65362: '12:00', 16211: 'casey', 16212: 'cash', 16213: 'cashew', 16214: 'cask', 16215: 'casket', 16216: 'cast', 16221: 'caste', 16222: 'cat', 16223: 'catch', 16224: 'cater', 16225: 'cathy', 16226: 'cats', 32611: 'hijack', 32612: 'hike', 32613: 'hikes', 32614: 'hill', 32615: 'hills', 16232: 'catty', 16233: 'caulk', 16234: 'cause', 16235: 'cave', 16236: 'cavern', 32621: 'hilt', 32622: 'him', 32623: 'hind', 32624: 'hindu', 32625: 'hinge', 32626: 'hint', 16243: 'cb', 16244: 'cc', 16245: 'ccc', 16246: 'cccc', 32631: 'hints', 32632: 'hip', 32633: 'hippo', 32634: 'hips', 32635: 'hiram', 32636: 'hire', 16253: 'cde', 16254: 'ce', 16255: 'cease', 16256: 'cecil', 32641: 'hired', 32642: 'hires', 32643: 'his', 32644: 'hiss', 32645: 'hit', 16262: 'cede', 16263: 'celery', 16264: 
'celia', 16265: 'cell', 16266: 'cello', 32651: 'hits', 32652: 'hiv', 32653: 'hive', 12544: 'asthma', 32655: 'hj', 32656: 'hk', 65425: '17', 65426: '1700', 32661: 'hl', 32662: 'hm', 32663: 'hn', 32664: 'ho', 32665: 'hoagy', 32666: 'hoard', 65435: '18th', 65436: '19', 65441: '1900', 65442: '1910', 65443: '1920', 65444: '1925', 65445: '1930', 65446: '1935', 42551: 'mousy', 65451: '1940', 65452: '1945', 65453: '1950', 65454: '1955', 65455: '1960', 65456: '1965', 65465: '1990', 64616: "you'd", 65461: '1970', 65462: '1975', 16311: 'census', 16312: 'cent', 16313: 'cents', 16314: 'ceo', 16315: 'cesar', 16316: 'cf', 16321: 'cg', 16322: 'ch', 16323: 'chad', 16324: 'chafe', 16325: 'chaff', 16326: 'chain', 46412: 'puns', 16331: 'chair', 16332: 'chalk', 16333: 'champ', 16334: 'chance', 13645: 'benny', 16336: 'chaos', 13646: 'bent', 16342: 'chapel', 16343: 'char', 16344: 'charm', 16345: 'chart', 16346: 'chase', 16351: 'chasm', 16352: 'chaste', 16353: 'chat', 16354: 'chats', 16355: 'cheap', 16356: 'cheat', 65511: '1992', 65512: '1993', 16361: 'check', 16362: 'cheek', 16363: 'cheeky', 16364: 'cheer', 16365: 'chef', 16366: 'cherub', 65521: '19th', 65522: '1:00', 65523: '1:30', 65524: '1st', 65525: '2', 65526: '2%', 65531: '2/3', 65532: '20', 65533: '20%', 65534: '200', 46421: 'pupil'}
| 59,228.5 | 118,456 | 0.604591 | 15,639 | 118,457 | 4.579385 | 0.984142 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.386208 | 0.131305 | 118,457 | 1 | 118,457 | 118,457 | 0.309758 | 0 | 0 | 0 | 0 | 0 | 0.277763 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | false | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 9 |
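The dictionary in the row above maps five-digit codes whose digits all fall in 1-6 (i.e., five dice rolls) to short words, in the style of a Diceware word list; a full table of this shape would cover all 6**5 = 7776 codes. A minimal sketch of how such a mapping could drive passphrase generation, assuming the dict has been loaded under the name WORDS (a name introduced here for illustration, with only a tiny excerpt of the table inlined):

import secrets

# Hypothetical setup: WORDS stands in for the full code->word table above.
WORDS = {11111: 'a', 43443: 'no', 61166: 'today', 44444: 'oy'}

def roll_code(rng=secrets.SystemRandom()):
    # Five simulated dice rolls packed into one code such as 43443.
    return int(''.join(str(rng.randint(1, 6)) for _ in range(5)))

def passphrase(n_words, words=WORDS):
    # With the full 7776-entry table every roll hits an entry; with this
    # sparse excerpt we simply re-roll until a known code comes up.
    picked = []
    while len(picked) < n_words:
        word = words.get(roll_code())
        if word is not None:
            picked.append(word)
    return ' '.join(picked)

With the full table loaded, passphrase(6) would return six space-separated words of the kind listed above.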
1f463b9ce36433237b51bd191da296c64f228086 | 196 | py | Python | math_utils.py | ctoth/shooter | 2151678355553527a944c29678c13f741c1ee6c9 | [
"MIT"
] | null | null | null | math_utils.py | ctoth/shooter | 2151678355553527a944c29678c13f741c1ee6c9 | [
"MIT"
] | null | null | null | math_utils.py | ctoth/shooter | 2151678355553527a944c29678c13f741c1ee6c9 | [
"MIT"
] | null | null | null | from __future__ import division
import math
def percentage(what, percent):
    return (what / 100.0) * percent
def inverse_percentage(what, percent):
    return 100.0 / 100.0 / what * percent
| 21.777778 | 39 | 0.714286 | 27 | 196 | 5 | 0.481481 | 0.244444 | 0.311111 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075949 | 0.193878 | 196 | 8 | 40 | 24.5 | 0.778481 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 9 |
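For context on the math_utils.py row above: percentage(what, percent) returns percent percent of what, while inverse_percentage reduces to percent / what, since the leading 100.0 / 100.0 is just 1.0. A short usage sketch, assuming the file is importable as math_utils:

from math_utils import percentage, inverse_percentage

assert percentage(200, 25) == 50.0           # 25% of 200
assert inverse_percentage(200, 50) == 0.25   # 50 / 200 -- a fraction, not a
                                             # percentage; multiply by 100
                                             # to read it as 25.0%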
1f5c3164205575be064e8cff430e72fe7294bef4 | 44,998 | py | Python | alien4cloud-cloudify3-provider/src/test/resources/outputs/blueprints/openstack/storage/plugins/custom_wf_plugin/plugin/workflows.py | alien4cloud/alien4cloud-cloudify4-provider | 97faee855255eb0c3ce25bb3075c29acd11a63c5 | [
"Apache-2.0"
] | null | null | null | alien4cloud-cloudify3-provider/src/test/resources/outputs/blueprints/openstack/storage/plugins/custom_wf_plugin/plugin/workflows.py | alien4cloud/alien4cloud-cloudify4-provider | 97faee855255eb0c3ce25bb3075c29acd11a63c5 | [
"Apache-2.0"
] | 3 | 2015-12-04T15:27:22.000Z | 2016-04-08T11:32:43.000Z | alien4cloud-cloudify3-provider/src/test/resources/outputs/blueprints/openstack/storage/plugins/custom_wf_plugin/plugin/workflows.py | alien4cloud/alien4cloud-cloudify4-provider | 97faee855255eb0c3ce25bb3075c29acd11a63c5 | [
"Apache-2.0"
] | 16 | 2015-01-29T10:05:09.000Z | 2019-06-24T19:23:54.000Z | from cloudify.decorators import workflow
from cloudify.workflows import ctx
from cloudify.workflows import tasks as workflow_tasks
from utils import set_state_task
from utils import operation_task
from utils import link_tasks
from utils import CustomContext
from utils import generate_native_node_workflows
from utils import _get_all_nodes
from utils import _get_all_nodes_instances
from utils import _get_all_modified_node_instances
from utils import is_host_node
from workflow import WfStartEvent
from workflow import build_pre_event
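# Note: this module sits under src/test/resources/outputs in the
# alien4cloud-cloudify3-provider repo, i.e. it is generated workflow output.
# Each subworkflow below builds a task graph: set_state_task creates
# lifecycle-state tasks, operation_task creates interface-operation tasks,
# and link_tasks then appears to order them as link_tasks(graph, successor,
# predecessor, ...), so each step waits on the one named second.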
# subworkflow 'install' for host 'Compute'
def install_host_compute(ctx, graph, custom_context):
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'configuring', 'LinuxFileSystem_2_configuring', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'configuring', 'LinuxFileSystem_4_configuring', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'starting', 'LinuxFileSystem_3_starting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'configuring', 'LinuxFileSystem_1_configuring', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'creating', 'LinuxFileSystem_3_creating', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'configuring', 'LinuxFileSystem_3_configuring', custom_context)
    custom_context.register_native_delegate_wf_step('Compute', 'Compute_install')
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'started', 'LinuxFileSystem_4_started', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'started', 'LinuxFileSystem_3_started', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'created', 'LinuxFileSystem_4_created', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_1', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_1', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_3', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_3', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'created', 'LinuxFileSystem_2_created', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_2', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_2', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_4', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_4', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_1', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_1', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'created', 'LinuxFileSystem_3_created', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_3', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_3', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'configured', 'LinuxFileSystem_2_configured', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_2', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_2', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_4', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_4', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'initial', 'LinuxFileSystem_3_initial', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'initial', 'LinuxFileSystem_2_initial', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'creating', 'LinuxFileSystem_2_creating', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'initial', 'LinuxFileSystem_4_initial', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'initial', 'LinuxFileSystem_1_initial', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'created', 'LinuxFileSystem_1_created', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'started', 'LinuxFileSystem_1_started', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'started', 'LinuxFileSystem_2_started', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'configured', 'LinuxFileSystem_3_configured', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'starting', 'LinuxFileSystem_2_starting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'configured', 'LinuxFileSystem_1_configured', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'creating', 'LinuxFileSystem_4_creating', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'starting', 'LinuxFileSystem_1_starting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'starting', 'LinuxFileSystem_4_starting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'configured', 'LinuxFileSystem_4_configured', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'creating', 'LinuxFileSystem_1_creating', custom_context)
    custom_context.register_native_delegate_wf_step('CBS4', 'CBS4_install')
    custom_context.register_native_delegate_wf_step('CBS3', 'CBS3_install')
    custom_context.register_native_delegate_wf_step('CBS1', 'CBS1_install')
    custom_context.register_native_delegate_wf_step('CBS2', 'CBS2_install')
    generate_native_node_workflows(ctx, graph, custom_context, 'install')
    link_tasks(graph, 'configure_LinuxFileSystem_2', 'LinuxFileSystem_2_configuring', custom_context)
    link_tasks(graph, 'configure_LinuxFileSystem_4', 'LinuxFileSystem_4_configuring', custom_context)
    link_tasks(graph, 'start_LinuxFileSystem_3', 'LinuxFileSystem_3_starting', custom_context)
    link_tasks(graph, 'configure_LinuxFileSystem_1', 'LinuxFileSystem_1_configuring', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_created', 'LinuxFileSystem_3_creating', custom_context)
    link_tasks(graph, 'configure_LinuxFileSystem_3', 'LinuxFileSystem_3_configuring', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_initial', 'Compute_install', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_initial', 'Compute_install', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_initial', 'Compute_install', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_initial', 'Compute_install', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_configuring', 'LinuxFileSystem_4_created', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_configured', 'configure_LinuxFileSystem_1', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_started', 'start_LinuxFileSystem_3', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_configuring', 'LinuxFileSystem_2_created', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_configured', 'configure_LinuxFileSystem_2', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_started', 'start_LinuxFileSystem_4', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_started', 'start_LinuxFileSystem_1', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_configuring', 'LinuxFileSystem_3_created', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_configured', 'configure_LinuxFileSystem_3', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_starting', 'LinuxFileSystem_2_configured', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_started', 'start_LinuxFileSystem_2', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_configured', 'configure_LinuxFileSystem_4', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_creating', 'LinuxFileSystem_3_initial', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_creating', 'LinuxFileSystem_2_initial', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_created', 'LinuxFileSystem_2_creating', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_creating', 'LinuxFileSystem_4_initial', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_creating', 'LinuxFileSystem_1_initial', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_configuring', 'LinuxFileSystem_1_created', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_starting', 'LinuxFileSystem_3_configured', custom_context)
    link_tasks(graph, 'start_LinuxFileSystem_2', 'LinuxFileSystem_2_starting', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_starting', 'LinuxFileSystem_1_configured', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_created', 'LinuxFileSystem_4_creating', custom_context)
    link_tasks(graph, 'start_LinuxFileSystem_1', 'LinuxFileSystem_1_starting', custom_context)
    link_tasks(graph, 'start_LinuxFileSystem_4', 'LinuxFileSystem_4_starting', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_starting', 'LinuxFileSystem_4_configured', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_created', 'LinuxFileSystem_1_creating', custom_context)
    link_tasks(graph, 'LinuxFileSystem_4_initial', 'CBS4_install', custom_context)
    link_tasks(graph, 'LinuxFileSystem_3_initial', 'CBS3_install', custom_context)
    link_tasks(graph, 'LinuxFileSystem_1_initial', 'CBS1_install', custom_context)
    link_tasks(graph, 'LinuxFileSystem_2_initial', 'CBS2_install', custom_context)
# subworkflow 'uninstall' for host 'Compute'
def uninstall_host_compute(ctx, graph, custom_context):
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_2')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_4')
    custom_context.add_customized_wf_node('LinuxFileSystem_3')
    custom_context.add_customized_wf_node('LinuxFileSystem_1')
    custom_context.register_native_delegate_wf_step('Compute', 'Compute_uninstall')
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'stopping', 'LinuxFileSystem_4_stopping', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'deleting', 'LinuxFileSystem_2_deleting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'stopping', 'LinuxFileSystem_3_stopping', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'deleted', 'LinuxFileSystem_2_deleted', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'deleted', 'LinuxFileSystem_1_deleted', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'deleting', 'LinuxFileSystem_3_deleting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'stopped', 'LinuxFileSystem_1_stopped', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'deleting', 'LinuxFileSystem_4_deleting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'deleting', 'LinuxFileSystem_1_deleting', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'stopped', 'LinuxFileSystem_2_stopped', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_2', 'stopping', 'LinuxFileSystem_2_stopping', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'stopped', 'LinuxFileSystem_3_stopped', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'deleted', 'LinuxFileSystem_4_deleted', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_4', 'stopped', 'LinuxFileSystem_4_stopped', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_3', 'deleted', 'LinuxFileSystem_3_deleted', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_1', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_1', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_2', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_2', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_3', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_3', custom_context)
    operation_task(ctx, graph, 'LinuxFileSystem_4', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_4', custom_context)
    set_state_task(ctx, graph, 'LinuxFileSystem_1', 'stopping', 'LinuxFileSystem_1_stopping', custom_context)
    custom_context.register_native_delegate_wf_step('CBS2', 'CBS2_uninstall')
    custom_context.register_native_delegate_wf_step('CBS4', 'CBS4_uninstall')
    custom_context.register_native_delegate_wf_step('CBS1', 'CBS1_uninstall')
    custom_context.register_native_delegate_wf_step('CBS3', 'CBS3_uninstall')
    generate_native_node_workflows(ctx, graph, custom_context, 'uninstall')
    link_tasks(graph, 'stop_LinuxFileSystem_4', 'LinuxFileSystem_4_stopping', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_deleted', 'LinuxFileSystem_2_deleting', custom_context)
link_tasks(graph, 'stop_LinuxFileSystem_3', 'LinuxFileSystem_3_stopping', custom_context)
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_2_deleted', custom_context)
link_tasks(graph, 'CBS2_uninstall', 'LinuxFileSystem_2_deleted', custom_context)
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_1_deleted', custom_context)
link_tasks(graph, 'CBS1_uninstall', 'LinuxFileSystem_1_deleted', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_deleted', 'LinuxFileSystem_3_deleting', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_deleting', 'LinuxFileSystem_1_stopped', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_deleted', 'LinuxFileSystem_4_deleting', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_deleted', 'LinuxFileSystem_1_deleting', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_deleting', 'LinuxFileSystem_2_stopped', custom_context)
link_tasks(graph, 'stop_LinuxFileSystem_2', 'LinuxFileSystem_2_stopping', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_deleting', 'LinuxFileSystem_3_stopped', custom_context)
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_4_deleted', custom_context)
link_tasks(graph, 'CBS4_uninstall', 'LinuxFileSystem_4_deleted', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_deleting', 'LinuxFileSystem_4_stopped', custom_context)
link_tasks(graph, 'CBS3_uninstall', 'LinuxFileSystem_3_deleted', custom_context)
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_3_deleted', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_stopped', 'stop_LinuxFileSystem_1', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_stopped', 'stop_LinuxFileSystem_2', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_stopped', 'stop_LinuxFileSystem_3', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_stopped', 'stop_LinuxFileSystem_4', custom_context)
link_tasks(graph, 'stop_LinuxFileSystem_1', 'LinuxFileSystem_1_stopping', custom_context)
def install_host(ctx, graph, custom_context, compute):
options = {}
options['Compute'] = install_host_compute
options[compute](ctx, graph, custom_context)
def uninstall_host(ctx, graph, custom_context, compute):
options = {}
options['Compute'] = uninstall_host_compute
options[compute](ctx, graph, custom_context)
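# A minimal usage sketch of the two dispatch helpers above (assuming a task
# graph obtained from ctx.graph_mode(), as in the workflows below); the host
# name must match a key registered in the options dict:
#
#     graph = ctx.graph_mode()
#     install_host(ctx, graph, custom_context, 'Compute')
#     graph.execute()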
@workflow
def a4c_install(**kwargs):
graph = ctx.graph_mode()
nodes = _get_all_nodes(ctx)
instances = _get_all_nodes_instances(ctx)
custom_context = CustomContext(ctx, instances, nodes)
ctx.internal.send_workflow_event(event_type='a4c_workflow_started', message=build_pre_event(WfStartEvent('install')))
_a4c_install(ctx, graph, custom_context)
return graph.execute()
@workflow
def a4c_uninstall(**kwargs):
graph = ctx.graph_mode()
nodes = _get_all_nodes(ctx)
instances = _get_all_nodes_instances(ctx)
custom_context = CustomContext(ctx, instances, nodes)
ctx.internal.send_workflow_event(event_type='a4c_workflow_started', message=build_pre_event(WfStartEvent('uninstall')))
_a4c_uninstall(ctx, graph, custom_context)
return graph.execute()
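# Example (a sketch assuming the Cloudify 3.x CLI): the two entry points
# above run as ordinary deployment workflows:
#
#     cfy executions start -w a4c_install -d <deployment-id>
#     cfy executions start -w a4c_uninstall -d <deployment-id>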
def _a4c_install(ctx, graph, custom_context):
# following code can be pasted in src/test/python/workflows/tasks.py for simulation
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'configuring', 'LinuxFileSystem_2_configuring', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'configuring', 'LinuxFileSystem_4_configuring', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'starting', 'LinuxFileSystem_3_starting', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'configuring', 'LinuxFileSystem_1_configuring', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'creating', 'LinuxFileSystem_3_creating', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'configuring', 'LinuxFileSystem_3_configuring', custom_context)
custom_context.register_native_delegate_wf_step('Compute', 'Compute_install')
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'started', 'LinuxFileSystem_4_started', custom_context)
custom_context.register_native_delegate_wf_step('CBS4', 'CBS4_install')
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'started', 'LinuxFileSystem_3_started', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'created', 'LinuxFileSystem_4_created', custom_context)
custom_context.register_native_delegate_wf_step('CBS3', 'CBS3_install')
operation_task(ctx, graph, 'LinuxFileSystem_1', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_1', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_3', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_3', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'created', 'LinuxFileSystem_2_created', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_2', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_2', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_4', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_4', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_1', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_1', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'created', 'LinuxFileSystem_3_created', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_3', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_3', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'configured', 'LinuxFileSystem_2_configured', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_2', 'cloudify.interfaces.lifecycle.start', 'start_LinuxFileSystem_2', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_4', 'cloudify.interfaces.lifecycle.configure', 'configure_LinuxFileSystem_4', custom_context)
custom_context.register_native_delegate_wf_step('PublicNetwork', 'PublicNetwork_install')
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'initial', 'LinuxFileSystem_3_initial', custom_context)
custom_context.register_native_delegate_wf_step('CBS1', 'CBS1_install')
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'initial', 'LinuxFileSystem_2_initial', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'creating', 'LinuxFileSystem_2_creating', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'initial', 'LinuxFileSystem_4_initial', custom_context)
custom_context.register_native_delegate_wf_step('CBS2', 'CBS2_install')
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'initial', 'LinuxFileSystem_1_initial', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'created', 'LinuxFileSystem_1_created', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'started', 'LinuxFileSystem_1_started', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'started', 'LinuxFileSystem_2_started', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'configured', 'LinuxFileSystem_3_configured', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'starting', 'LinuxFileSystem_2_starting', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'configured', 'LinuxFileSystem_1_configured', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'creating', 'LinuxFileSystem_4_creating', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'starting', 'LinuxFileSystem_1_starting', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'starting', 'LinuxFileSystem_4_starting', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'configured', 'LinuxFileSystem_4_configured', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'creating', 'LinuxFileSystem_1_creating', custom_context)
generate_native_node_workflows(ctx, graph, custom_context, 'install')
link_tasks(graph, 'LinuxFileSystem_2_configuring', 'LinuxFileSystem_2_created', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_configuring', 'LinuxFileSystem_4_created', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_starting', 'LinuxFileSystem_3_configured', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_configuring', 'LinuxFileSystem_1_created', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_creating', 'LinuxFileSystem_3_initial', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_configuring', 'LinuxFileSystem_3_created', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_started', 'start_LinuxFileSystem_4', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_started', 'start_LinuxFileSystem_3', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_created', 'LinuxFileSystem_4_creating', custom_context)
link_tasks(graph, 'configure_LinuxFileSystem_1', 'LinuxFileSystem_1_configuring', custom_context)
link_tasks(graph, 'start_LinuxFileSystem_3', 'LinuxFileSystem_3_starting', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_created', 'LinuxFileSystem_2_creating', custom_context)
link_tasks(graph, 'configure_LinuxFileSystem_2', 'LinuxFileSystem_2_configuring', custom_context)
link_tasks(graph, 'start_LinuxFileSystem_4', 'LinuxFileSystem_4_starting', custom_context)
link_tasks(graph, 'start_LinuxFileSystem_1', 'LinuxFileSystem_1_starting', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_created', 'LinuxFileSystem_3_creating', custom_context)
link_tasks(graph, 'configure_LinuxFileSystem_3', 'LinuxFileSystem_3_configuring', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_configured', 'configure_LinuxFileSystem_2', custom_context)
link_tasks(graph, 'start_LinuxFileSystem_2', 'LinuxFileSystem_2_starting', custom_context)
link_tasks(graph, 'configure_LinuxFileSystem_4', 'LinuxFileSystem_4_configuring', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_initial', 'Compute_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_initial', 'CBS3_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_initial', 'Compute_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_initial', 'CBS2_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_creating', 'LinuxFileSystem_2_initial', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_initial', 'Compute_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_initial', 'CBS4_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_initial', 'Compute_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_initial', 'CBS1_install', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_created', 'LinuxFileSystem_1_creating', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_started', 'start_LinuxFileSystem_1', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_started', 'start_LinuxFileSystem_2', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_configured', 'configure_LinuxFileSystem_3', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_starting', 'LinuxFileSystem_2_configured', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_configured', 'configure_LinuxFileSystem_1', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_creating', 'LinuxFileSystem_4_initial', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_starting', 'LinuxFileSystem_1_configured', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_starting', 'LinuxFileSystem_4_configured', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_configured', 'configure_LinuxFileSystem_4', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_creating', 'LinuxFileSystem_1_initial', custom_context)
def _a4c_uninstall(ctx, graph, custom_context):
# following code can be pasted in src/test/python/workflows/tasks.py for simulation
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_2')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_4')
custom_context.add_customized_wf_node('LinuxFileSystem_3')
custom_context.add_customized_wf_node('LinuxFileSystem_1')
custom_context.register_native_delegate_wf_step('Compute', 'Compute_uninstall')
custom_context.register_native_delegate_wf_step('PublicNetwork', 'PublicNetwork_uninstall')
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'stopping', 'LinuxFileSystem_4_stopping', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'deleting', 'LinuxFileSystem_2_deleting', custom_context)
custom_context.register_native_delegate_wf_step('CBS2', 'CBS2_uninstall')
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'stopping', 'LinuxFileSystem_3_stopping', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'deleted', 'LinuxFileSystem_2_deleted', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'deleted', 'LinuxFileSystem_1_deleted', custom_context)
custom_context.register_native_delegate_wf_step('CBS4', 'CBS4_uninstall')
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'deleting', 'LinuxFileSystem_3_deleting', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'stopped', 'LinuxFileSystem_1_stopped', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'deleting', 'LinuxFileSystem_4_deleting', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'deleting', 'LinuxFileSystem_1_deleting', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'stopped', 'LinuxFileSystem_2_stopped', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_2', 'stopping', 'LinuxFileSystem_2_stopping', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'stopped', 'LinuxFileSystem_3_stopped', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'deleted', 'LinuxFileSystem_4_deleted', custom_context)
custom_context.register_native_delegate_wf_step('CBS1', 'CBS1_uninstall')
set_state_task(ctx, graph, 'LinuxFileSystem_4', 'stopped', 'LinuxFileSystem_4_stopped', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_3', 'deleted', 'LinuxFileSystem_3_deleted', custom_context)
custom_context.register_native_delegate_wf_step('CBS3', 'CBS3_uninstall')
operation_task(ctx, graph, 'LinuxFileSystem_1', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_1', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_2', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_2', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_3', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_3', custom_context)
operation_task(ctx, graph, 'LinuxFileSystem_4', 'cloudify.interfaces.lifecycle.stop', 'stop_LinuxFileSystem_4', custom_context)
set_state_task(ctx, graph, 'LinuxFileSystem_1', 'stopping', 'LinuxFileSystem_1_stopping', custom_context)
generate_native_node_workflows(ctx, graph, custom_context, 'uninstall')
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_2_deleted', custom_context)
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_1_deleted', custom_context)
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_4_deleted', custom_context)
link_tasks(graph, 'Compute_uninstall', 'LinuxFileSystem_3_deleted', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_deleting', 'LinuxFileSystem_2_stopped', custom_context)
link_tasks(graph, 'CBS2_uninstall', 'LinuxFileSystem_2_deleted', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_deleted', 'LinuxFileSystem_2_deleting', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_deleted', 'LinuxFileSystem_1_deleting', custom_context)
link_tasks(graph, 'CBS4_uninstall', 'LinuxFileSystem_4_deleted', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_deleting', 'LinuxFileSystem_3_stopped', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_stopped', 'stop_LinuxFileSystem_1', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_deleting', 'LinuxFileSystem_4_stopped', custom_context)
link_tasks(graph, 'LinuxFileSystem_1_deleting', 'LinuxFileSystem_1_stopped', custom_context)
link_tasks(graph, 'LinuxFileSystem_2_stopped', 'stop_LinuxFileSystem_2', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_stopped', 'stop_LinuxFileSystem_3', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_deleted', 'LinuxFileSystem_4_deleting', custom_context)
link_tasks(graph, 'CBS1_uninstall', 'LinuxFileSystem_1_deleted', custom_context)
link_tasks(graph, 'LinuxFileSystem_4_stopped', 'stop_LinuxFileSystem_4', custom_context)
link_tasks(graph, 'LinuxFileSystem_3_deleted', 'LinuxFileSystem_3_deleting', custom_context)
link_tasks(graph, 'CBS3_uninstall', 'LinuxFileSystem_3_deleted', custom_context)
link_tasks(graph, 'stop_LinuxFileSystem_1', 'LinuxFileSystem_1_stopping', custom_context)
link_tasks(graph, 'stop_LinuxFileSystem_2', 'LinuxFileSystem_2_stopping', custom_context)
link_tasks(graph, 'stop_LinuxFileSystem_3', 'LinuxFileSystem_3_stopping', custom_context)
link_tasks(graph, 'stop_LinuxFileSystem_4', 'LinuxFileSystem_4_stopping', custom_context)
def _get_scaling_group_name_from_node_id(ctx, node_id):
scaling_groups = ctx.deployment.scaling_groups
for group_name, scaling_group in scaling_groups.items():
for member in scaling_group['members']:
if member == node_id:
ctx.logger.info("Node {} found in scaling group {}".format(node_id, group_name))
return group_name
return None
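# Illustrative (assumed) shape of ctx.deployment.scaling_groups, matching
# the lookups above and in a4c_scale below; the group name and values are
# hypothetical:
#
#     {
#         'compute_group': {
#             'members': ['Compute'],
#             'properties': {'current_instances': 1},
#         },
#     }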
@workflow
def a4c_scale(ctx, node_id, delta, scale_compute, **kwargs):
delta = int(delta)
scalable_entity_name = _get_scaling_group_name_from_node_id(ctx, node_id)
scaling_group = ctx.deployment.scaling_groups.get(scalable_entity_name)
if scalable_entity_name:
curr_num_instances = scaling_group['properties']['current_instances']
planned_num_instances = curr_num_instances + delta
scale_id = scalable_entity_name
else:
scalable_entity_name = node_id
scaled_node = ctx.get_node(scalable_entity_name)
if not scaled_node:
raise ValueError("Node {0} doesn't exist".format(scalable_entity_name))
if not is_host_node(scaled_node):
raise ValueError("Node {0} is not a host. This workflow can only scale hosts".format(scalable_entity_name))
if delta == 0:
ctx.logger.info('delta parameter is 0, so no scaling will take place.')
return
curr_num_instances = scaled_node.number_of_instances
planned_num_instances = curr_num_instances + delta
scale_id = scaled_node.id
if planned_num_instances < 1:
raise ValueError('Provided delta: {0} is illegal. Current number of '
'instances of node/group {1} is {2}'
.format(delta, scalable_entity_name, curr_num_instances))
modification = ctx.deployment.start_modification({
scale_id: {
'instances': planned_num_instances
}
})
ctx.logger.info('Deployment modification started. [modification_id={0}]'.format(modification.id))
try:
if delta > 0:
ctx.logger.info('Scaling host/group {0} adding {1} instances'.format(scalable_entity_name, delta))
added_and_related = _get_all_nodes(modification.added)
added = _get_all_modified_node_instances(added_and_related, 'added')
graph = ctx.graph_mode()
ctx.internal.send_workflow_event(event_type='a4c_workflow_started',
message=build_pre_event(WfStartEvent('scale', 'install')))
custom_context = CustomContext(ctx, added, added_and_related)
install_host(ctx, graph, custom_context, node_id)
try:
graph.execute()
except:
ctx.logger.error('Scale failed. Uninstalling node/group {0}'.format(scalable_entity_name))
graph = ctx.internal.task_graph
for task in graph.tasks_iter():
graph.remove_task(task)
try:
custom_context = CustomContext(ctx, added, added_and_related)
uninstall_host(ctx, graph, custom_context, scalable_entity_name)
graph.execute()
except:
ctx.logger.error('Node {0} uninstallation following scale failure has failed'.format(scalable_entity_name))
raise
else:
ctx.logger.info('Unscaling host/group {0} removing {1} instances'.format(scalable_entity_name, delta))
removed_and_related = _get_all_nodes(modification.removed)
removed = _get_all_modified_node_instances(removed_and_related, 'removed')
graph = ctx.graph_mode()
ctx.internal.send_workflow_event(event_type='a4c_workflow_started',
message=build_pre_event(WfStartEvent('scale', 'uninstall')))
custom_context = CustomContext(ctx, removed, removed_and_related)
uninstall_host(ctx, graph, custom_context, node_id)
try:
graph.execute()
except:
ctx.logger.error('Unscale failed.')
raise
except:
ctx.logger.warn('Rolling back deployment modification. [modification_id={0}]'.format(modification.id))
try:
modification.rollback()
except:
ctx.logger.warn('Deployment modification rollback failed. The '
'deployment model is most likely in some corrupted '
'state. [modification_id={0}]'.format(modification.id))
raise
raise
else:
try:
modification.finish()
except:
ctx.logger.warn('Deployment modification finish failed. The '
'deployment model is most likely in some corrupted '
'state. [modification_id={0}]'.format(modification.id))
raise
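# Example (a sketch assuming the Cloudify 3.x CLI): scaling the 'Compute'
# host up by one instance; the parameter names match the workflow signature
# above:
#
#     cfy executions start -w a4c_scale -d <deployment-id> \
#         -p '{"node_id": "Compute", "delta": 1, "scale_compute": true}'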
@workflow
def a4c_heal(
ctx,
node_instance_id,
diagnose_value='Not provided',
**kwargs):
"""Reinstalls the whole subgraph of the system topology
The subgraph consists of all the nodes that are hosted in the
failing node's compute and the compute itself.
Additionally it unlinks and establishes appropriate relationships
:param ctx: cloudify context
:param node_id: failing node's id
:param diagnose_value: diagnosed reason of failure
"""
ctx.logger.info("Starting 'heal' workflow on {0}, Diagnosis: {1}"
.format(node_instance_id, diagnose_value))
failing_node = ctx.get_node_instance(node_instance_id)
host_instance_id = failing_node._node_instance.host_id
failing_node_host = ctx.get_node_instance(host_instance_id)
node_id = failing_node_host.node_id
subgraph_node_instances = failing_node_host.get_contained_subgraph()
added_and_related = _get_all_nodes(ctx)
try:
graph = ctx.graph_mode()
ctx.internal.send_workflow_event(event_type='a4c_workflow_started',
message=build_pre_event(WfStartEvent('heal', 'uninstall')))
custom_context = CustomContext(ctx, subgraph_node_instances, added_and_related)
uninstall_host(ctx, graph, custom_context, node_id)
graph.execute()
except:
ctx.logger.error('Uninstall while healing failed.')
graph = ctx.internal.task_graph
for task in graph.tasks_iter():
graph.remove_task(task)
ctx.internal.send_workflow_event(event_type='a4c_workflow_started',
message=build_pre_event(WfStartEvent('heal', 'install')))
custom_context = CustomContext(ctx, subgraph_node_instances, added_and_related)
install_host(ctx, graph, custom_context, node_id)
graph.execute()
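# Example (a sketch assuming the Cloudify 3.x CLI): healing the subgraph
# around a failed node instance:
#
#     cfy executions start -w a4c_heal -d <deployment-id> \
#         -p '{"node_instance_id": "<node-instance-id>", "diagnose_value": "host unreachable"}'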
# following code can be pasted in src/test/python/workflows/context.py for simulation
#def _build_nodes(ctx):
#types = []
#types.append('alien.cloudify.openstack.nodes.Volume')
#types.append('alien.cloudify.openstack.nodes.DeletableVolume')
#types.append('tosca.nodes.BlockStorage')
#types.append('tosca.nodes.Root')
#node_CBS3 = _build_node(ctx, 'CBS3', types, 1)
#types = []
#types.append('alien.cloudify.openstack.nodes.Volume')
#types.append('alien.cloudify.openstack.nodes.DeletableVolume')
#types.append('tosca.nodes.BlockStorage')
#types.append('tosca.nodes.Root')
#node_CBS2 = _build_node(ctx, 'CBS2', types, 1)
#types = []
#types.append('alien.nodes.LinuxFileSystem')
#types.append('tosca.nodes.SoftwareComponent')
#types.append('tosca.nodes.Root')
#node_LinuxFileSystem_1 = _build_node(ctx, 'LinuxFileSystem_1', types, 1)
#types = []
#types.append('alien.cloudify.openstack.nodes.Volume')
#types.append('alien.cloudify.openstack.nodes.DeletableVolume')
#types.append('tosca.nodes.BlockStorage')
#types.append('tosca.nodes.Root')
#node_CBS1 = _build_node(ctx, 'CBS1', types, 1)
#types = []
#types.append('alien.nodes.LinuxFileSystem')
#types.append('tosca.nodes.SoftwareComponent')
#types.append('tosca.nodes.Root')
#node_LinuxFileSystem_2 = _build_node(ctx, 'LinuxFileSystem_2', types, 1)
#types = []
#types.append('alien.nodes.LinuxFileSystem')
#types.append('tosca.nodes.SoftwareComponent')
#types.append('tosca.nodes.Root')
#node_LinuxFileSystem_3 = _build_node(ctx, 'LinuxFileSystem_3', types, 1)
#types = []
#types.append('alien.nodes.LinuxFileSystem')
#types.append('tosca.nodes.SoftwareComponent')
#types.append('tosca.nodes.Root')
#node_LinuxFileSystem_4 = _build_node(ctx, 'LinuxFileSystem_4', types, 1)
#types = []
#types.append('alien.nodes.openstack.Compute')
#types.append('tosca.nodes.Compute')
#types.append('tosca.nodes.Root')
#node_Compute = _build_node(ctx, 'Compute', types, 1)
#types = []
#types.append('alien.nodes.openstack.PublicNetwork')
#types.append('alien.nodes.PublicNetwork')
#types.append('tosca.nodes.Network')
#types.append('tosca.nodes.Root')
#node_PublicNetwork = _build_node(ctx, 'PublicNetwork', types, 1)
#types = []
#types.append('alien.cloudify.openstack.nodes.Volume')
#types.append('alien.cloudify.openstack.nodes.DeletableVolume')
#types.append('tosca.nodes.BlockStorage')
#types.append('tosca.nodes.Root')
#node_CBS4 = _build_node(ctx, 'CBS4', types, 1)
#_add_relationship(node_CBS3, node_Compute)
#_add_relationship(node_CBS2, node_Compute)
#_add_relationship(node_LinuxFileSystem_1, node_Compute)
#_add_relationship(node_LinuxFileSystem_1, node_CBS1)
#_add_relationship(node_CBS1, node_Compute)
#_add_relationship(node_LinuxFileSystem_2, node_Compute)
#_add_relationship(node_LinuxFileSystem_2, node_CBS2)
#_add_relationship(node_LinuxFileSystem_3, node_Compute)
#_add_relationship(node_LinuxFileSystem_3, node_CBS3)
#_add_relationship(node_LinuxFileSystem_4, node_Compute)
#_add_relationship(node_LinuxFileSystem_4, node_CBS4)
#_add_relationship(node_Compute, node_PublicNetwork)
#_add_relationship(node_CBS4, node_Compute)
# === file: bin/config.py (repo: hchau630/nn-analysis, license: MIT) ===
[
{
"model_names": ["identity"],
"layers": [None],
"metrics": [
["decode", 0],
["dimensionality", 0],
["fact", 0],
["generalize", 0],
["neural_fits", 0],
["rdm", 0],
["sparsity", 0],
["curve", 0],
["curve", 1],
["trajectory", 0],
],
},
{
"model_names": [
"moco_control",
"moco_CD",
"moco_CF",
"moco_CDF",
"barlow_control",
"barlow_CD",
"barlow_CF",
"barlow_CDF",
"barlow_P",
"barlow_PF",
],
"epochs": [49,50],
"layers": [None], # All layers
"metrics": [
["decode", 0],
["dimensionality", 0],
["fact", 0],
["generalize", 0],
["neural_fits", 0],
["rdm", 0],
["sparsity", 0],
["curve", 0],
["curve", 1],
["trajectory", 0],
]
},
{
"model_names": [
"barlow_P",
"barlow_P_projector",
],
"epochs": [82,83], # epoch 82
"layers": [None], # All layers
"metrics": [
# ["decode", 0],
# ["dimensionality", 0],
# ["fact", 0],
# ["generalize", 0],
# ["neural_fits", 0],
# ["rdm", 0],
# ["sparsity", 0],
["curve", 0],
["curve", 1],
# ["trajectory", 0],
]
},
{
"model_names": [
"barlow_control",
"barlow_control_projector",
],
"epochs": [54,55], # epoch 82
"layers": [None], # All layers
"metrics": [
# ["decode", 0],
# ["dimensionality", 0],
# ["fact", 0],
# ["generalize", 0],
# ["neural_fits", 0],
# ["rdm", 0],
# ["sparsity", 0],
["curve", 0],
["curve", 1],
# ["trajectory", 0],
]
},
# {
# "model_names": [
# "moco_control",
# "moco_CD",
# "moco_CF",
# "moco_CDF",
# "barlow_control",
# "barlow_CD",
# "barlow_CF",
# "barlow_CDF",
# "barlow_P",
# "barlow_PF",
# ],
# "epochs": [0,50],
# "layers": [15,None], # 15, 16 layers
# "metrics": [
# ["decode", 0],
# ["dimensionality", 0],
# ["fact", 0],
# ["generalize", 0],
# ["neural_fits", 0],
# ["rdm", 0],
# ["sparsity", 0],
# ]
# }
]
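# A minimal consumption sketch (assumption: this file holds a bare Python
# list literal and is parsed by the experiment runner, e.g. with
# ast.literal_eval, rather than imported as a module); run_metric is a
# hypothetical entry point:
#
#     import ast, itertools
#     with open('bin/config.py') as f:
#         configs = ast.literal_eval(f.read())
#     for entry in configs:
#         for model, (metric, variant) in itertools.product(
#                 entry['model_names'], entry['metrics']):
#             run_metric(model, metric, variant,
#                        epochs=entry.get('epochs'), layers=entry['layers'])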
# === file: boto3_type_annotations_with_docs/boto3_type_annotations/snowball/client.py (repo: cowboygneox/boto3_type_annotations, license: MIT) ===
from typing import Optional
from botocore.client import BaseClient
from botocore.waiter import Waiter
from typing import Union
from typing import Dict
from botocore.paginate import Paginator
class Client(BaseClient):
def can_paginate(self, operation_name: str = None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is ``create_foo``, and you\'d normally invoke the
operation as ``client.create_foo(**kwargs)``, if the
``create_foo`` operation can be paginated, you can use the
call ``client.get_paginator(\"create_foo\")``.
:return: ``True`` if the operation can be paginated,
``False`` otherwise.
"""
pass
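# Example (sketch): checking pagination support before requesting a
# paginator; 'describe_addresses' is one of the paginated Snowball
# operations.
#
#     if client.can_paginate('describe_addresses'):
#         paginator = client.get_paginator('describe_addresses')
#         for page in paginator.paginate():
#             print(page['Addresses'])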
def cancel_cluster(self, ClusterId: str) -> Dict:
"""
Cancels a cluster job. You can only cancel a cluster job while it's in the ``AwaitingQuorum`` status. You'll have at least an hour after creating a cluster job to cancel it.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/CancelCluster>`_
**Request Syntax**
::
response = client.cancel_cluster(
ClusterId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --*
:type ClusterId: string
:param ClusterId: **[REQUIRED]**
The 39-character ID for the cluster that you want to cancel, for example ``CID123e4567-e89b-12d3-a456-426655440000`` .
:rtype: dict
:returns:
"""
pass
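# Example (sketch): a cluster job can be cancelled only while it is in the
# 'AwaitingQuorum' state; the cluster ID below is illustrative.
#
#     client.cancel_cluster(ClusterId='CID123e4567-e89b-12d3-a456-426655440000')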
def cancel_job(self, JobId: str) -> Dict:
"""
Cancels the specified job. You can only cancel a job before its ``JobState`` value changes to ``PreparingAppliance`` . Requesting the ``ListJobs`` or ``DescribeJob`` action returns a job's ``JobState`` as part of the response element data returned.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/CancelJob>`_
**Request Syntax**
::
response = client.cancel_job(
JobId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --*
:type JobId: string
:param JobId: **[REQUIRED]**
The 39-character job ID for the job that you want to cancel, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
:rtype: dict
:returns:
"""
pass
def create_address(self, Address: Dict) -> Dict:
"""
Creates an address for a Snowball to be shipped to. In most regions, addresses are validated at the time of creation. The address you provide must be located within the serviceable area of your region. If the address is invalid or unsupported, then an exception is thrown.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/CreateAddress>`_
**Request Syntax**
::
response = client.create_address(
Address={
'AddressId': 'string',
'Name': 'string',
'Company': 'string',
'Street1': 'string',
'Street2': 'string',
'Street3': 'string',
'City': 'string',
'StateOrProvince': 'string',
'PrefectureOrDistrict': 'string',
'Landmark': 'string',
'Country': 'string',
'PostalCode': 'string',
'PhoneNumber': 'string',
'IsRestricted': True|False
}
)
**Response Syntax**
::
{
'AddressId': 'string'
}
**Response Structure**
- *(dict) --*
- **AddressId** *(string) --*
The automatically generated ID for a specific address. You'll use this ID when you create a job to specify which address you want the Snowball for that job shipped to.
:type Address: dict
:param Address: **[REQUIRED]**
The address that you want the Snowball shipped to.
- **AddressId** *(string) --*
The unique ID for an address.
- **Name** *(string) --*
The name of a person to receive a Snowball at an address.
- **Company** *(string) --*
The name of the company to receive a Snowball at an address.
- **Street1** *(string) --*
The first line in a street address that a Snowball is to be delivered to.
- **Street2** *(string) --*
The second line in a street address that a Snowball is to be delivered to.
- **Street3** *(string) --*
The third line in a street address that a Snowball is to be delivered to.
- **City** *(string) --*
The city in an address that a Snowball is to be delivered to.
- **StateOrProvince** *(string) --*
The state or province in an address that a Snowball is to be delivered to.
- **PrefectureOrDistrict** *(string) --*
This field is no longer used and the value is ignored.
- **Landmark** *(string) --*
This field is no longer used and the value is ignored.
- **Country** *(string) --*
The country in an address that a Snowball is to be delivered to.
- **PostalCode** *(string) --*
The postal code in an address that a Snowball is to be delivered to.
- **PhoneNumber** *(string) --*
The phone number associated with an address that a Snowball is to be delivered to.
- **IsRestricted** *(boolean) --*
If the address you are creating is a primary address, then set this option to true. This field is not supported in most regions.
:rtype: dict
:returns:
"""
pass
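# Example (sketch): constructing a client and registering a shipping
# address; boto3 is assumed to be installed and all address values are
# illustrative.
#
#     import boto3
#     client = boto3.client('snowball')
#     response = client.create_address(Address={
#         'Name': 'Jane Doe',
#         'Street1': '123 Any Street',
#         'City': 'Anytown',
#         'StateOrProvince': 'WA',
#         'Country': 'US',
#         'PostalCode': '98101',
#     })
#     address_id = response['AddressId']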
def create_cluster(self, JobType: str, Resources: Dict, AddressId: str, RoleARN: str, ShippingOption: str, Description: str = None, KmsKeyARN: str = None, SnowballType: str = None, Notification: Dict = None, ForwardingAddressId: str = None) -> Dict:
"""
Creates an empty cluster. Each cluster supports five nodes. You use the CreateJob action separately to create the jobs for each of these nodes. The cluster does not ship until these five node jobs have been created.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/CreateCluster>`_
**Request Syntax**
::
response = client.create_cluster(
JobType='IMPORT'|'EXPORT'|'LOCAL_USE',
Resources={
'S3Resources': [
{
'BucketArn': 'string',
'KeyRange': {
'BeginMarker': 'string',
'EndMarker': 'string'
}
},
],
'LambdaResources': [
{
'LambdaArn': 'string',
'EventTriggers': [
{
'EventResourceARN': 'string'
},
]
},
],
'Ec2AmiResources': [
{
'AmiId': 'string',
'SnowballAmiId': 'string'
},
]
},
Description='string',
AddressId='string',
KmsKeyARN='string',
RoleARN='string',
SnowballType='STANDARD'|'EDGE'|'EDGE_C'|'EDGE_CG',
ShippingOption='SECOND_DAY'|'NEXT_DAY'|'EXPRESS'|'STANDARD',
Notification={
'SnsTopicARN': 'string',
'JobStatesToNotify': [
'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
],
'NotifyAll': True|False
},
ForwardingAddressId='string'
)
**Response Syntax**
::
{
'ClusterId': 'string'
}
**Response Structure**
- *(dict) --*
- **ClusterId** *(string) --*
The automatically generated ID for a cluster.
:type JobType: string
:param JobType: **[REQUIRED]**
The type of job for this cluster. Currently, the only job type supported for clusters is ``LOCAL_USE`` .
:type Resources: dict
:param Resources: **[REQUIRED]**
The resources associated with the cluster job. These resources include Amazon S3 buckets and optional AWS Lambda functions written in the Python language.
- **S3Resources** *(list) --*
An array of ``S3Resource`` objects.
- *(dict) --*
Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into. For export jobs, this object can have an optional ``KeyRange`` value. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BucketArn** *(string) --*
The Amazon Resource Name (ARN) of an Amazon S3 bucket.
- **KeyRange** *(dict) --*
For export jobs, you can provide an optional ``KeyRange`` within a specific Amazon S3 bucket. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BeginMarker** *(string) --*
The key that starts an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **EndMarker** *(string) --*
The key that ends an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **LambdaResources** *(list) --*
The Python-language Lambda functions for this job.
- *(dict) --*
Identifies
- **LambdaArn** *(string) --*
An Amazon Resource Name (ARN) that represents an AWS Lambda function to be triggered by PUT object actions on the associated local Amazon S3 resource.
- **EventTriggers** *(list) --*
The array of ARNs for S3Resource objects to trigger the LambdaResource objects associated with this job.
- *(dict) --*
The container for the EventTriggerDefinition$EventResourceARN .
- **EventResourceARN** *(string) --*
The Amazon Resource Name (ARN) for any local Amazon S3 resource that is an AWS Lambda function\'s event trigger associated with this job.
- **Ec2AmiResources** *(list) --*
The Amazon Machine Images (AMIs) associated with this job.
- *(dict) --*
A JSON-formatted object that contains the IDs for an Amazon Machine Image (AMI), including the Amazon EC2 AMI ID and the Snowball Edge AMI ID. Each AMI has these two IDs to simplify identifying the AMI in both the AWS Cloud and on the device.
- **AmiId** *(string) --* **[REQUIRED]**
The ID of the AMI in Amazon EC2.
- **SnowballAmiId** *(string) --*
The ID of the AMI on the supported device.
:type Description: string
:param Description:
An optional description of this specific cluster, for example ``Environmental Data Cluster-01`` .
:type AddressId: string
:param AddressId: **[REQUIRED]**
The ID for the address that you want the cluster shipped to.
:type KmsKeyARN: string
:param KmsKeyARN:
The ``KmsKeyARN`` value that you want to associate with this cluster. ``KmsKeyARN`` values are created by using the `CreateKey <http://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html>`__ API action in AWS Key Management Service (AWS KMS).
:type RoleARN: string
:param RoleARN: **[REQUIRED]**
The ``RoleARN`` that you want to associate with this cluster. ``RoleArn`` values are created by using the `CreateRole <http://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateRole.html>`__ API action in AWS Identity and Access Management (IAM).
:type SnowballType: string
:param SnowballType:
The type of AWS Snowball device to use for this cluster. The only supported device types for cluster jobs are ``EDGE`` , ``EDGE_C`` , and ``EDGE_CG`` .
:type ShippingOption: string
:param ShippingOption: **[REQUIRED]**
The shipping speed for each node in this cluster. This speed doesn\'t dictate how soon you\'ll get each Snowball Edge device, rather it represents how quickly each device moves to its destination while in transit. Regional shipping speeds are as follows:
* In Australia, you have access to express shipping. Typically, devices shipped express are delivered in about a day.
* In the European Union (EU), you have access to express shipping. Typically, Snowball Edges shipped express are delivered in about a day. In addition, most countries in the EU have access to standard shipping, which typically takes less than a week, one way.
* In India, devices are delivered in one to seven days.
* In the US, you have access to one-day shipping and two-day shipping.
:type Notification: dict
:param Notification:
The Amazon Simple Notification Service (Amazon SNS) notification settings for this cluster.
- **SnsTopicARN** *(string) --*
The new SNS ``TopicArn`` that you want to associate with this job. You can create Amazon Resource Names (ARNs) for topics by using the `CreateTopic <http://docs.aws.amazon.com/sns/latest/api/API_CreateTopic.html>`__ Amazon SNS API action.
You can subscribe email addresses to an Amazon SNS topic through the AWS Management Console, or by using the `Subscribe <http://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html>`__ AWS Simple Notification Service (SNS) API action.
- **JobStatesToNotify** *(list) --*
The list of job states that will trigger a notification for this job.
- *(string) --*
- **NotifyAll** *(boolean) --*
Any change in job state will trigger a notification for this job.
:type ForwardingAddressId: string
:param ForwardingAddressId:
The forwarding address ID for a cluster. This field is not supported in most regions.
:rtype: dict
:returns:
"""
pass
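# Example (sketch): per the docstring above, a cluster ships only after its
# five node jobs exist; a minimal follow-up creates them in a loop, with
# cluster_id taken from the create_cluster response:
#
#     cluster_id = response['ClusterId']
#     for _ in range(5):
#         client.create_job(ClusterId=cluster_id)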
def create_job(self, JobType: str = None, Resources: Dict = None, Description: str = None, AddressId: str = None, KmsKeyARN: str = None, RoleARN: str = None, SnowballCapacityPreference: str = None, ShippingOption: str = None, Notification: Dict = None, ClusterId: str = None, SnowballType: str = None, ForwardingAddressId: str = None) -> Dict:
"""
Creates a job to import or export data between Amazon S3 and your on-premises data center. Your AWS account must have the right trust policies and permissions in place to create a job for Snowball. If you're creating a job for a node in a cluster, you only need to provide the ``clusterId`` value; the other job attributes are inherited from the cluster.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/CreateJob>`_
**Request Syntax**
::
response = client.create_job(
JobType='IMPORT'|'EXPORT'|'LOCAL_USE',
Resources={
'S3Resources': [
{
'BucketArn': 'string',
'KeyRange': {
'BeginMarker': 'string',
'EndMarker': 'string'
}
},
],
'LambdaResources': [
{
'LambdaArn': 'string',
'EventTriggers': [
{
'EventResourceARN': 'string'
},
]
},
],
'Ec2AmiResources': [
{
'AmiId': 'string',
'SnowballAmiId': 'string'
},
]
},
Description='string',
AddressId='string',
KmsKeyARN='string',
RoleARN='string',
SnowballCapacityPreference='T50'|'T80'|'T100'|'T42'|'NoPreference',
ShippingOption='SECOND_DAY'|'NEXT_DAY'|'EXPRESS'|'STANDARD',
Notification={
'SnsTopicARN': 'string',
'JobStatesToNotify': [
'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
],
'NotifyAll': True|False
},
ClusterId='string',
SnowballType='STANDARD'|'EDGE'|'EDGE_C'|'EDGE_CG',
ForwardingAddressId='string'
)
**Response Syntax**
::
{
'JobId': 'string'
}
**Response Structure**
- *(dict) --*
- **JobId** *(string) --*
The automatically generated ID for a job, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
:type JobType: string
:param JobType:
Defines the type of job that you\'re creating.
:type Resources: dict
:param Resources:
Defines the Amazon S3 buckets associated with this job.
With ``IMPORT`` jobs, you specify the bucket or buckets that your transferred data will be imported into.
With ``EXPORT`` jobs, you specify the bucket or buckets that your transferred data will be exported from. Optionally, you can also specify a ``KeyRange`` value. If you choose to export a range, you define the length of the range by providing either an inclusive ``BeginMarker`` value, an inclusive ``EndMarker`` value, or both. Ranges are UTF-8 binary sorted.
- **S3Resources** *(list) --*
An array of ``S3Resource`` objects.
- *(dict) --*
Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into. For export jobs, this object can have an optional ``KeyRange`` value. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BucketArn** *(string) --*
The Amazon Resource Name (ARN) of an Amazon S3 bucket.
- **KeyRange** *(dict) --*
For export jobs, you can provide an optional ``KeyRange`` within a specific Amazon S3 bucket. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BeginMarker** *(string) --*
The key that starts an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **EndMarker** *(string) --*
The key that ends an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **LambdaResources** *(list) --*
The Python-language Lambda functions for this job.
- *(dict) --*
Identifies
- **LambdaArn** *(string) --*
An Amazon Resource Name (ARN) that represents an AWS Lambda function to be triggered by PUT object actions on the associated local Amazon S3 resource.
- **EventTriggers** *(list) --*
The array of ARNs for S3Resource objects to trigger the LambdaResource objects associated with this job.
- *(dict) --*
The container for the EventTriggerDefinition$EventResourceARN .
- **EventResourceARN** *(string) --*
The Amazon Resource Name (ARN) for any local Amazon S3 resource that is an AWS Lambda function\'s event trigger associated with this job.
- **Ec2AmiResources** *(list) --*
The Amazon Machine Images (AMIs) associated with this job.
- *(dict) --*
A JSON-formatted object that contains the IDs for an Amazon Machine Image (AMI), including the Amazon EC2 AMI ID and the Snowball Edge AMI ID. Each AMI has these two IDs to simplify identifying the AMI in both the AWS Cloud and on the device.
- **AmiId** *(string) --* **[REQUIRED]**
The ID of the AMI in Amazon EC2.
- **SnowballAmiId** *(string) --*
The ID of the AMI on the supported device.
:type Description: string
:param Description:
Defines an optional description of this specific job, for example ``Important Photos 2016-08-11`` .
:type AddressId: string
:param AddressId:
The ID for the address that you want the Snowball shipped to.
:type KmsKeyARN: string
:param KmsKeyARN:
The ``KmsKeyARN`` that you want to associate with this job. ``KmsKeyARN`` s are created using the `CreateKey <http://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html>`__ AWS Key Management Service (KMS) API action.
:type RoleARN: string
:param RoleARN:
The ``RoleARN`` that you want to associate with this job. ``RoleArn`` s are created using the `CreateRole <http://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateRole.html>`__ AWS Identity and Access Management (IAM) API action.
:type SnowballCapacityPreference: string
:param SnowballCapacityPreference:
If your job is being created in one of the US regions, you have the option of specifying what size Snowball you\'d like for this job. In all other regions, Snowballs come with 80 TB in storage capacity.
:type ShippingOption: string
:param ShippingOption:
The shipping speed for this job. This speed doesn\'t dictate how soon you\'ll get the Snowball, rather it represents how quickly the Snowball moves to its destination while in transit. Regional shipping speeds are as follows:
* In Australia, you have access to express shipping. Typically, Snowballs shipped express are delivered in about a day.
* In the European Union (EU), you have access to express shipping. Typically, Snowballs shipped express are delivered in about a day. In addition, most countries in the EU have access to standard shipping, which typically takes less than a week, one way.
* In India, Snowballs are delivered in one to seven days.
* In the US, you have access to one-day shipping and two-day shipping.
:type Notification: dict
:param Notification:
Defines the Amazon Simple Notification Service (Amazon SNS) notification settings for this job.
- **SnsTopicARN** *(string) --*
The new SNS ``TopicArn`` that you want to associate with this job. You can create Amazon Resource Names (ARNs) for topics by using the `CreateTopic <http://docs.aws.amazon.com/sns/latest/api/API_CreateTopic.html>`__ Amazon SNS API action.
You can subscribe email addresses to an Amazon SNS topic through the AWS Management Console, or by using the `Subscribe <http://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html>`__ AWS Simple Notification Service (SNS) API action.
- **JobStatesToNotify** *(list) --*
The list of job states that will trigger a notification for this job.
- *(string) --*
- **NotifyAll** *(boolean) --*
Any change in job state will trigger a notification for this job.
:type ClusterId: string
:param ClusterId:
The ID of a cluster. If you\'re creating a job for a node in a cluster, you need to provide only this ``clusterId`` value. The other job attributes are inherited from the cluster.
:type SnowballType: string
:param SnowballType:
The type of AWS Snowball device to use for this job. The only supported device types for cluster jobs are ``EDGE`` , ``EDGE_C`` , and ``EDGE_CG`` .
:type ForwardingAddressId: string
:param ForwardingAddressId:
The forwarding address ID for a job. This field is not supported in most regions.
:rtype: dict
:returns:
"""
pass
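# Example (sketch): when adding a node job to an existing cluster, only the
# ClusterId is required; the remaining job attributes are inherited from the
# cluster, as described in the docstring above.
#
#     response = client.create_job(ClusterId='CID123e4567-e89b-12d3-a456-426655440000')
#     job_id = response['JobId']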
def describe_address(self, AddressId: str) -> Dict:
"""
Takes an ``AddressId`` and returns specific details about that address in the form of an ``Address`` object.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/DescribeAddress>`_
**Request Syntax**
::
response = client.describe_address(
AddressId='string'
)
**Response Syntax**
::
{
'Address': {
'AddressId': 'string',
'Name': 'string',
'Company': 'string',
'Street1': 'string',
'Street2': 'string',
'Street3': 'string',
'City': 'string',
'StateOrProvince': 'string',
'PrefectureOrDistrict': 'string',
'Landmark': 'string',
'Country': 'string',
'PostalCode': 'string',
'PhoneNumber': 'string',
'IsRestricted': True|False
}
}
**Response Structure**
- *(dict) --*
- **Address** *(dict) --*
The address that you want the Snowball or Snowballs associated with a specific job to be shipped to.
- **AddressId** *(string) --*
The unique ID for an address.
- **Name** *(string) --*
The name of a person to receive a Snowball at an address.
- **Company** *(string) --*
The name of the company to receive a Snowball at an address.
- **Street1** *(string) --*
The first line in a street address that a Snowball is to be delivered to.
- **Street2** *(string) --*
The second line in a street address that a Snowball is to be delivered to.
- **Street3** *(string) --*
The third line in a street address that a Snowball is to be delivered to.
- **City** *(string) --*
The city in an address that a Snowball is to be delivered to.
- **StateOrProvince** *(string) --*
The state or province in an address that a Snowball is to be delivered to.
- **PrefectureOrDistrict** *(string) --*
This field is no longer used and the value is ignored.
- **Landmark** *(string) --*
This field is no longer used and the value is ignored.
- **Country** *(string) --*
The country in an address that a Snowball is to be delivered to.
- **PostalCode** *(string) --*
The postal code in an address that a Snowball is to be delivered to.
- **PhoneNumber** *(string) --*
The phone number associated with an address that a Snowball is to be delivered to.
- **IsRestricted** *(boolean) --*
If the address you are creating is a primary address, then set this option to true. This field is not supported in most regions.
:type AddressId: string
:param AddressId: **[REQUIRED]**
The automatically generated ID for a specific address.
:rtype: dict
:returns:
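A minimal usage sketch (the ``AddressId`` below is a hypothetical placeholder; ``client`` is assumed to be a boto3 Snowball client):
::
    response = client.describe_address(
        AddressId='ADID1234ab12-3eec-4eb3-9be6-9374c10eb51b'
    )
    address = response['Address']
    # Print a few commonly used fields from the returned Address object.
    print(address['Name'], address['City'], address['Country'])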
"""
pass
def describe_addresses(self, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns a specified number of ``ADDRESS`` objects. Calling this API in one of the US regions will return addresses from the list of all addresses associated with this account in all US regions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/DescribeAddresses>`_
**Request Syntax**
::
response = client.describe_addresses(
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'Addresses': [
{
'AddressId': 'string',
'Name': 'string',
'Company': 'string',
'Street1': 'string',
'Street2': 'string',
'Street3': 'string',
'City': 'string',
'StateOrProvince': 'string',
'PrefectureOrDistrict': 'string',
'Landmark': 'string',
'Country': 'string',
'PostalCode': 'string',
'PhoneNumber': 'string',
'IsRestricted': True|False
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **Addresses** *(list) --*
The Snowball shipping addresses that were created for this account.
- *(dict) --*
The address that you want the Snowball or Snowballs associated with a specific job to be shipped to. Addresses are validated at the time of creation. The address you provide must be located within the serviceable area of your region. Although no individual elements of the ``Address`` are required, if the address is invalid or unsupported, then an exception is thrown.
- **AddressId** *(string) --*
The unique ID for an address.
- **Name** *(string) --*
The name of a person to receive a Snowball at an address.
- **Company** *(string) --*
The name of the company to receive a Snowball at an address.
- **Street1** *(string) --*
The first line in a street address that a Snowball is to be delivered to.
- **Street2** *(string) --*
The second line in a street address that a Snowball is to be delivered to.
- **Street3** *(string) --*
The third line in a street address that a Snowball is to be delivered to.
- **City** *(string) --*
The city in an address that a Snowball is to be delivered to.
- **StateOrProvince** *(string) --*
The state or province in an address that a Snowball is to be delivered to.
- **PrefectureOrDistrict** *(string) --*
This field is no longer used and the value is ignored.
- **Landmark** *(string) --*
This field is no longer used and the value is ignored.
- **Country** *(string) --*
The country in an address that a Snowball is to be delivered to.
- **PostalCode** *(string) --*
The postal code in an address that a Snowball is to be delivered to.
- **PhoneNumber** *(string) --*
The phone number associated with an address that a Snowball is to be delivered to.
- **IsRestricted** *(boolean) --*
If the address you are creating is a primary address, then set this option to true. This field is not supported in most regions.
- **NextToken** *(string) --*
HTTP requests are stateless. If you use the automatically generated ``NextToken`` value in your next ``DescribeAddresses`` call, your list of returned addresses will start from this point in the array.
:type MaxResults: integer
:param MaxResults:
The number of ``ADDRESS`` objects to return.
:type NextToken: string
:param NextToken:
HTTP requests are stateless. To identify what object comes "next" in the list of ``ADDRESS`` objects, you have the option of specifying a value for ``NextToken`` as the starting point for your list of returned addresses.
:rtype: dict
:returns:
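Because results are returned a page at a time, a typical caller loops on ``NextToken`` until it is absent. A hedged sketch, assuming ``client`` is a boto3 Snowball client:
::
    addresses = []
    kwargs = {'MaxResults': 50}
    while True:
        response = client.describe_addresses(**kwargs)
        addresses.extend(response['Addresses'])
        token = response.get('NextToken')
        if not token:
            break  # no more pages
        kwargs['NextToken'] = token
    print('Total addresses:', len(addresses))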
"""
pass
def describe_cluster(self, ClusterId: str) -> Dict:
"""
Returns information about a specific cluster including shipping information, cluster status, and other important metadata.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/DescribeCluster>`_
**Request Syntax**
::
response = client.describe_cluster(
ClusterId='string'
)
**Response Syntax**
::
{
'ClusterMetadata': {
'ClusterId': 'string',
'Description': 'string',
'KmsKeyARN': 'string',
'RoleARN': 'string',
'ClusterState': 'AwaitingQuorum'|'Pending'|'InUse'|'Complete'|'Cancelled',
'JobType': 'IMPORT'|'EXPORT'|'LOCAL_USE',
'SnowballType': 'STANDARD'|'EDGE'|'EDGE_C'|'EDGE_CG',
'CreationDate': datetime(2015, 1, 1),
'Resources': {
'S3Resources': [
{
'BucketArn': 'string',
'KeyRange': {
'BeginMarker': 'string',
'EndMarker': 'string'
}
},
],
'LambdaResources': [
{
'LambdaArn': 'string',
'EventTriggers': [
{
'EventResourceARN': 'string'
},
]
},
],
'Ec2AmiResources': [
{
'AmiId': 'string',
'SnowballAmiId': 'string'
},
]
},
'AddressId': 'string',
'ShippingOption': 'SECOND_DAY'|'NEXT_DAY'|'EXPRESS'|'STANDARD',
'Notification': {
'SnsTopicARN': 'string',
'JobStatesToNotify': [
'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
],
'NotifyAll': True|False
},
'ForwardingAddressId': 'string'
}
}
**Response Structure**
- *(dict) --*
- **ClusterMetadata** *(dict) --*
Information about a specific cluster, including shipping information, cluster status, and other important metadata.
- **ClusterId** *(string) --*
The automatically generated ID for a cluster.
- **Description** *(string) --*
The optional description of the cluster.
- **KmsKeyARN** *(string) --*
The ``KmsKeyARN`` Amazon Resource Name (ARN) associated with this cluster. This ARN was created using the `CreateKey <http://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html>`__ API action in AWS Key Management Service (AWS KMS).
- **RoleARN** *(string) --*
The role ARN associated with this cluster. This ARN was created using the `CreateRole <http://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateRole.html>`__ API action in AWS Identity and Access Management (IAM).
- **ClusterState** *(string) --*
The current status of the cluster.
- **JobType** *(string) --*
The type of job for this cluster. Currently, the only job type supported for clusters is ``LOCAL_USE`` .
- **SnowballType** *(string) --*
The type of AWS Snowball device to use for this cluster. The only supported device types for cluster jobs are ``EDGE`` , ``EDGE_C`` , and ``EDGE_CG`` .
- **CreationDate** *(datetime) --*
The creation date for this cluster.
- **Resources** *(dict) --*
The arrays of JobResource objects that can include updated S3Resource objects or LambdaResource objects.
- **S3Resources** *(list) --*
An array of ``S3Resource`` objects.
- *(dict) --*
Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into. For export jobs, this object can have an optional ``KeyRange`` value. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BucketArn** *(string) --*
The Amazon Resource Name (ARN) of an Amazon S3 bucket.
- **KeyRange** *(dict) --*
For export jobs, you can provide an optional ``KeyRange`` within a specific Amazon S3 bucket. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BeginMarker** *(string) --*
The key that starts an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **EndMarker** *(string) --*
The key that ends an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **LambdaResources** *(list) --*
The Python-language Lambda functions for this job.
- *(dict) --*
Identifies an AWS Lambda function and its associated event triggers for this job.
- **LambdaArn** *(string) --*
An Amazon Resource Name (ARN) that represents an AWS Lambda function to be triggered by PUT object actions on the associated local Amazon S3 resource.
- **EventTriggers** *(list) --*
The array of ARNs for S3Resource objects to trigger the LambdaResource objects associated with this job.
- *(dict) --*
The container for the EventTriggerDefinition$EventResourceARN .
- **EventResourceARN** *(string) --*
The Amazon Resource Name (ARN) for any local Amazon S3 resource that is an AWS Lambda function's event trigger associated with this job.
- **Ec2AmiResources** *(list) --*
The Amazon Machine Images (AMIs) associated with this job.
- *(dict) --*
A JSON-formatted object that contains the IDs for an Amazon Machine Image (AMI), including the Amazon EC2 AMI ID and the Snowball Edge AMI ID. Each AMI has these two IDs to simplify identifying the AMI in both the AWS Cloud and on the device.
- **AmiId** *(string) --*
The ID of the AMI in Amazon EC2.
- **SnowballAmiId** *(string) --*
The ID of the AMI on the supported device.
- **AddressId** *(string) --*
The automatically generated ID for a specific address.
- **ShippingOption** *(string) --*
The shipping speed for each node in this cluster. This speed doesn't dictate how soon you'll get each device; rather, it represents how quickly each device moves to its destination while in transit. Regional shipping speeds are as follows:
* In Australia, you have access to express shipping. Typically, devices shipped express are delivered in about a day.
* In the European Union (EU), you have access to express shipping. Typically, devices shipped express are delivered in about a day. In addition, most countries in the EU have access to standard shipping, which typically takes less than a week, one way.
* In India, devices are delivered in one to seven days.
* In the US, you have access to one-day shipping and two-day shipping.
- **Notification** *(dict) --*
The Amazon Simple Notification Service (Amazon SNS) notification settings for this cluster.
- **SnsTopicARN** *(string) --*
The new SNS ``TopicArn`` that you want to associate with this job. You can create Amazon Resource Names (ARNs) for topics by using the `CreateTopic <http://docs.aws.amazon.com/sns/latest/api/API_CreateTopic.html>`__ Amazon SNS API action.
You can subscribe email addresses to an Amazon SNS topic through the AWS Management Console, or by using the `Subscribe <http://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html>`__ AWS Simple Notification Service (SNS) API action.
- **JobStatesToNotify** *(list) --*
The list of job states that will trigger a notification for this job.
- *(string) --*
- **NotifyAll** *(boolean) --*
Any change in job state will trigger a notification for this job.
- **ForwardingAddressId** *(string) --*
The ID of the address that you want a cluster shipped to, after it has been shipped to its primary address. This field is not supported in most regions.
:type ClusterId: string
:param ClusterId: **[REQUIRED]**
The automatically generated ID for a cluster.
:rtype: dict
:returns:
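A minimal usage sketch (the cluster ID reuses the example format from this docstring; ``client`` is assumed to be a boto3 Snowball client):
::
    response = client.describe_cluster(
        ClusterId='CID123e4567-e89b-12d3-a456-426655440000'
    )
    metadata = response['ClusterMetadata']
    print(metadata['ClusterState'], metadata['SnowballType'])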
"""
pass
def describe_job(self, JobId: str) -> Dict:
"""
Returns information about a specific job including shipping information, job status, and other important metadata.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/DescribeJob>`_
**Request Syntax**
::
response = client.describe_job(
JobId='string'
)
**Response Syntax**
::
{
'JobMetadata': {
'JobId': 'string',
'JobState': 'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
'JobType': 'IMPORT'|'EXPORT'|'LOCAL_USE',
'SnowballType': 'STANDARD'|'EDGE'|'EDGE_C'|'EDGE_CG',
'CreationDate': datetime(2015, 1, 1),
'Resources': {
'S3Resources': [
{
'BucketArn': 'string',
'KeyRange': {
'BeginMarker': 'string',
'EndMarker': 'string'
}
},
],
'LambdaResources': [
{
'LambdaArn': 'string',
'EventTriggers': [
{
'EventResourceARN': 'string'
},
]
},
],
'Ec2AmiResources': [
{
'AmiId': 'string',
'SnowballAmiId': 'string'
},
]
},
'Description': 'string',
'KmsKeyARN': 'string',
'RoleARN': 'string',
'AddressId': 'string',
'ShippingDetails': {
'ShippingOption': 'SECOND_DAY'|'NEXT_DAY'|'EXPRESS'|'STANDARD',
'InboundShipment': {
'Status': 'string',
'TrackingNumber': 'string'
},
'OutboundShipment': {
'Status': 'string',
'TrackingNumber': 'string'
}
},
'SnowballCapacityPreference': 'T50'|'T80'|'T100'|'T42'|'NoPreference',
'Notification': {
'SnsTopicARN': 'string',
'JobStatesToNotify': [
'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
],
'NotifyAll': True|False
},
'DataTransferProgress': {
'BytesTransferred': 123,
'ObjectsTransferred': 123,
'TotalBytes': 123,
'TotalObjects': 123
},
'JobLogInfo': {
'JobCompletionReportURI': 'string',
'JobSuccessLogURI': 'string',
'JobFailureLogURI': 'string'
},
'ClusterId': 'string',
'ForwardingAddressId': 'string'
},
'SubJobMetadata': [
{
'JobId': 'string',
'JobState': 'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
'JobType': 'IMPORT'|'EXPORT'|'LOCAL_USE',
'SnowballType': 'STANDARD'|'EDGE'|'EDGE_C'|'EDGE_CG',
'CreationDate': datetime(2015, 1, 1),
'Resources': {
'S3Resources': [
{
'BucketArn': 'string',
'KeyRange': {
'BeginMarker': 'string',
'EndMarker': 'string'
}
},
],
'LambdaResources': [
{
'LambdaArn': 'string',
'EventTriggers': [
{
'EventResourceARN': 'string'
},
]
},
],
'Ec2AmiResources': [
{
'AmiId': 'string',
'SnowballAmiId': 'string'
},
]
},
'Description': 'string',
'KmsKeyARN': 'string',
'RoleARN': 'string',
'AddressId': 'string',
'ShippingDetails': {
'ShippingOption': 'SECOND_DAY'|'NEXT_DAY'|'EXPRESS'|'STANDARD',
'InboundShipment': {
'Status': 'string',
'TrackingNumber': 'string'
},
'OutboundShipment': {
'Status': 'string',
'TrackingNumber': 'string'
}
},
'SnowballCapacityPreference': 'T50'|'T80'|'T100'|'T42'|'NoPreference',
'Notification': {
'SnsTopicARN': 'string',
'JobStatesToNotify': [
'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
],
'NotifyAll': True|False
},
'DataTransferProgress': {
'BytesTransferred': 123,
'ObjectsTransferred': 123,
'TotalBytes': 123,
'TotalObjects': 123
},
'JobLogInfo': {
'JobCompletionReportURI': 'string',
'JobSuccessLogURI': 'string',
'JobFailureLogURI': 'string'
},
'ClusterId': 'string',
'ForwardingAddressId': 'string'
},
]
}
**Response Structure**
- *(dict) --*
- **JobMetadata** *(dict) --*
Information about a specific job, including shipping information, job status, and other important metadata.
- **JobId** *(string) --*
The automatically generated ID for a job, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
- **JobState** *(string) --*
The current status of the job.
- **JobType** *(string) --*
The type of job.
- **SnowballType** *(string) --*
The type of device used with this job.
- **CreationDate** *(datetime) --*
The creation date for this job.
- **Resources** *(dict) --*
An array of ``S3Resource`` objects. Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into.
- **S3Resources** *(list) --*
An array of ``S3Resource`` objects.
- *(dict) --*
Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into. For export jobs, this object can have an optional ``KeyRange`` value. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BucketArn** *(string) --*
The Amazon Resource Name (ARN) of an Amazon S3 bucket.
- **KeyRange** *(dict) --*
For export jobs, you can provide an optional ``KeyRange`` within a specific Amazon S3 bucket. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BeginMarker** *(string) --*
The key that starts an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **EndMarker** *(string) --*
The key that ends an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **LambdaResources** *(list) --*
The Python-language Lambda functions for this job.
- *(dict) --*
Identifies an AWS Lambda function and its associated event triggers for this job.
- **LambdaArn** *(string) --*
An Amazon Resource Name (ARN) that represents an AWS Lambda function to be triggered by PUT object actions on the associated local Amazon S3 resource.
- **EventTriggers** *(list) --*
The array of ARNs for S3Resource objects to trigger the LambdaResource objects associated with this job.
- *(dict) --*
The container for the EventTriggerDefinition$EventResourceARN .
- **EventResourceARN** *(string) --*
The Amazon Resource Name (ARN) for any local Amazon S3 resource that is an AWS Lambda function's event trigger associated with this job.
- **Ec2AmiResources** *(list) --*
The Amazon Machine Images (AMIs) associated with this job.
- *(dict) --*
A JSON-formatted object that contains the IDs for an Amazon Machine Image (AMI), including the Amazon EC2 AMI ID and the Snowball Edge AMI ID. Each AMI has these two IDs to simplify identifying the AMI in both the AWS Cloud and on the device.
- **AmiId** *(string) --*
The ID of the AMI in Amazon EC2.
- **SnowballAmiId** *(string) --*
The ID of the AMI on the supported device.
- **Description** *(string) --*
The description of the job, provided at job creation.
- **KmsKeyARN** *(string) --*
The Amazon Resource Name (ARN) for the AWS Key Management Service (AWS KMS) key associated with this job. This ARN was created using the `CreateKey <http://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html>`__ API action in AWS KMS.
- **RoleARN** *(string) --*
The role ARN associated with this job. This ARN was created using the `CreateRole <http://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateRole.html>`__ API action in AWS Identity and Access Management (IAM).
- **AddressId** *(string) --*
The ID for the address that you want the Snowball shipped to.
- **ShippingDetails** *(dict) --*
A job's shipping information, including inbound and outbound tracking numbers and shipping speed options.
- **ShippingOption** *(string) --*
The shipping speed for a particular job. This speed doesn't dictate how soon you'll get the Snowball from the job's creation date. This speed represents how quickly it moves to its destination while in transit. Regional shipping speeds are as follows:
* In Australia, you have access to express shipping. Typically, Snowballs shipped express are delivered in about a day.
* In the European Union (EU), you have access to express shipping. Typically, Snowballs shipped express are delivered in about a day. In addition, most countries in the EU have access to standard shipping, which typically takes less than a week, one way.
* In India, Snowballs are delivered in one to seven days.
* In the United States of America (US), you have access to one-day shipping and two-day shipping.
- **InboundShipment** *(dict) --*
The ``Status`` and ``TrackingNumber`` values for a Snowball being returned to AWS for a particular job.
- **Status** *(string) --*
Status information for a shipment.
- **TrackingNumber** *(string) --*
The tracking number for this job. Using this tracking number with your region's carrier's website, you can track a Snowball as the carrier transports it.
For India, the carrier is Amazon Logistics. For all other regions, UPS is the carrier.
- **OutboundShipment** *(dict) --*
The ``Status`` and ``TrackingNumber`` values for a Snowball being delivered to the address that you specified for a particular job.
- **Status** *(string) --*
Status information for a shipment.
- **TrackingNumber** *(string) --*
The tracking number for this job. Using this tracking number with your region's carrier's website, you can track a Snowball as the carrier transports it.
For India, the carrier is Amazon Logistics. For all other regions, UPS is the carrier.
- **SnowballCapacityPreference** *(string) --*
The Snowball capacity preference for this job, specified at job creation. In US regions, you can choose between 50 TB and 80 TB Snowballs. All other regions use 80 TB capacity Snowballs.
- **Notification** *(dict) --*
The Amazon Simple Notification Service (Amazon SNS) notification settings associated with a specific job. The ``Notification`` object is returned as a part of the response syntax of the ``DescribeJob`` action in the ``JobMetadata`` data type.
- **SnsTopicARN** *(string) --*
The new SNS ``TopicArn`` that you want to associate with this job. You can create Amazon Resource Names (ARNs) for topics by using the `CreateTopic <http://docs.aws.amazon.com/sns/latest/api/API_CreateTopic.html>`__ Amazon SNS API action.
You can subscribe email addresses to an Amazon SNS topic through the AWS Management Console, or by using the `Subscribe <http://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html>`__ AWS Simple Notification Service (SNS) API action.
- **JobStatesToNotify** *(list) --*
The list of job states that will trigger a notification for this job.
- *(string) --*
- **NotifyAll** *(boolean) --*
Any change in job state will trigger a notification for this job.
- **DataTransferProgress** *(dict) --*
A value that defines the real-time status of a Snowball's data transfer while the device is at AWS. This data is only available while a job has a ``JobState`` value of ``InProgress`` , for both import and export jobs.
- **BytesTransferred** *(integer) --*
The number of bytes transferred between a Snowball and Amazon S3.
- **ObjectsTransferred** *(integer) --*
The number of objects transferred between a Snowball and Amazon S3.
- **TotalBytes** *(integer) --*
The total bytes of data for a transfer between a Snowball and Amazon S3. This value is set to 0 (zero) until all the keys that will be transferred have been listed.
- **TotalObjects** *(integer) --*
The total number of objects for a transfer between a Snowball and Amazon S3. This value is set to 0 (zero) until all the keys that will be transferred have been listed.
- **JobLogInfo** *(dict) --*
Links to Amazon S3 presigned URLs for the job report and logs. For import jobs, the PDF job report becomes available at the end of the import process. For export jobs, your job report typically becomes available while the Snowball for your job part is being delivered to you.
- **JobCompletionReportURI** *(string) --*
A link to an Amazon S3 presigned URL where the job completion report is located.
- **JobSuccessLogURI** *(string) --*
A link to an Amazon S3 presigned URL where the job success log is located.
- **JobFailureLogURI** *(string) --*
A link to an Amazon S3 presigned URL where the job failure log is located.
- **ClusterId** *(string) --*
The 39-character ID for the cluster, for example ``CID123e4567-e89b-12d3-a456-426655440000`` .
- **ForwardingAddressId** *(string) --*
The ID of the address that you want a job shipped to, after it has been shipped to its primary address. This field is not supported in most regions.
- **SubJobMetadata** *(list) --*
Information about a specific job part (in the case of an export job), including shipping information, job status, and other important metadata.
- *(dict) --*
Contains information about a specific job including shipping information, job status, and other important metadata. This information is returned as a part of the response syntax of the ``DescribeJob`` action.
- **JobId** *(string) --*
The automatically generated ID for a job, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
- **JobState** *(string) --*
The current status of the job.
- **JobType** *(string) --*
The type of job.
- **SnowballType** *(string) --*
The type of device used with this job.
- **CreationDate** *(datetime) --*
The creation date for this job.
- **Resources** *(dict) --*
An array of ``S3Resource`` objects. Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into.
- **S3Resources** *(list) --*
An array of ``S3Resource`` objects.
- *(dict) --*
Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into. For export jobs, this object can have an optional ``KeyRange`` value. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BucketArn** *(string) --*
The Amazon Resource Name (ARN) of an Amazon S3 bucket.
- **KeyRange** *(dict) --*
For export jobs, you can provide an optional ``KeyRange`` within a specific Amazon S3 bucket. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BeginMarker** *(string) --*
The key that starts an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **EndMarker** *(string) --*
The key that ends an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **LambdaResources** *(list) --*
The Python-language Lambda functions for this job.
- *(dict) --*
Identifies an AWS Lambda function and its associated event triggers for this job.
- **LambdaArn** *(string) --*
An Amazon Resource Name (ARN) that represents an AWS Lambda function to be triggered by PUT object actions on the associated local Amazon S3 resource.
- **EventTriggers** *(list) --*
The array of ARNs for S3Resource objects to trigger the LambdaResource objects associated with this job.
- *(dict) --*
The container for the EventTriggerDefinition$EventResourceARN .
- **EventResourceARN** *(string) --*
The Amazon Resource Name (ARN) for any local Amazon S3 resource that is an AWS Lambda function's event trigger associated with this job.
- **Ec2AmiResources** *(list) --*
The Amazon Machine Images (AMIs) associated with this job.
- *(dict) --*
A JSON-formatted object that contains the IDs for an Amazon Machine Image (AMI), including the Amazon EC2 AMI ID and the Snowball Edge AMI ID. Each AMI has these two IDs to simplify identifying the AMI in both the AWS Cloud and on the device.
- **AmiId** *(string) --*
The ID of the AMI in Amazon EC2.
- **SnowballAmiId** *(string) --*
The ID of the AMI on the supported device.
- **Description** *(string) --*
The description of the job, provided at job creation.
- **KmsKeyARN** *(string) --*
The Amazon Resource Name (ARN) for the AWS Key Management Service (AWS KMS) key associated with this job. This ARN was created using the `CreateKey <http://docs.aws.amazon.com/kms/latest/APIReference/API_CreateKey.html>`__ API action in AWS KMS.
- **RoleARN** *(string) --*
The role ARN associated with this job. This ARN was created using the `CreateRole <http://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateRole.html>`__ API action in AWS Identity and Access Management (IAM).
- **AddressId** *(string) --*
The ID for the address that you want the Snowball shipped to.
- **ShippingDetails** *(dict) --*
A job's shipping information, including inbound and outbound tracking numbers and shipping speed options.
- **ShippingOption** *(string) --*
The shipping speed for a particular job. This speed doesn't dictate how soon you'll get the Snowball from the job's creation date. This speed represents how quickly it moves to its destination while in transit. Regional shipping speeds are as follows:
* In Australia, you have access to express shipping. Typically, Snowballs shipped express are delivered in about a day.
* In the European Union (EU), you have access to express shipping. Typically, Snowballs shipped express are delivered in about a day. In addition, most countries in the EU have access to standard shipping, which typically takes less than a week, one way.
* In India, Snowballs are delivered in one to seven days.
* In the United States of America (US), you have access to one-day shipping and two-day shipping.
- **InboundShipment** *(dict) --*
The ``Status`` and ``TrackingNumber`` values for a Snowball being returned to AWS for a particular job.
- **Status** *(string) --*
Status information for a shipment.
- **TrackingNumber** *(string) --*
The tracking number for this job. Using this tracking number with your region's carrier's website, you can track a Snowball as the carrier transports it.
For India, the carrier is Amazon Logistics. For all other regions, UPS is the carrier.
- **OutboundShipment** *(dict) --*
The ``Status`` and ``TrackingNumber`` values for a Snowball being delivered to the address that you specified for a particular job.
- **Status** *(string) --*
Status information for a shipment.
- **TrackingNumber** *(string) --*
The tracking number for this job. Using this tracking number with your region's carrier's website, you can track a Snowball as the carrier transports it.
For India, the carrier is Amazon Logistics. For all other regions, UPS is the carrier.
- **SnowballCapacityPreference** *(string) --*
The Snowball capacity preference for this job, specified at job creation. In US regions, you can choose between 50 TB and 80 TB Snowballs. All other regions use 80 TB capacity Snowballs.
- **Notification** *(dict) --*
The Amazon Simple Notification Service (Amazon SNS) notification settings associated with a specific job. The ``Notification`` object is returned as a part of the response syntax of the ``DescribeJob`` action in the ``JobMetadata`` data type.
- **SnsTopicARN** *(string) --*
The new SNS ``TopicArn`` that you want to associate with this job. You can create Amazon Resource Names (ARNs) for topics by using the `CreateTopic <http://docs.aws.amazon.com/sns/latest/api/API_CreateTopic.html>`__ Amazon SNS API action.
You can subscribe email addresses to an Amazon SNS topic through the AWS Management Console, or by using the `Subscribe <http://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html>`__ AWS Simple Notification Service (SNS) API action.
- **JobStatesToNotify** *(list) --*
The list of job states that will trigger a notification for this job.
- *(string) --*
- **NotifyAll** *(boolean) --*
Any change in job state will trigger a notification for this job.
- **DataTransferProgress** *(dict) --*
A value that defines the real-time status of a Snowball's data transfer while the device is at AWS. This data is only available while a job has a ``JobState`` value of ``InProgress`` , for both import and export jobs.
- **BytesTransferred** *(integer) --*
The number of bytes transferred between a Snowball and Amazon S3.
- **ObjectsTransferred** *(integer) --*
The number of objects transferred between a Snowball and Amazon S3.
- **TotalBytes** *(integer) --*
The total bytes of data for a transfer between a Snowball and Amazon S3. This value is set to 0 (zero) until all the keys that will be transferred have been listed.
- **TotalObjects** *(integer) --*
The total number of objects for a transfer between a Snowball and Amazon S3. This value is set to 0 (zero) until all the keys that will be transferred have been listed.
- **JobLogInfo** *(dict) --*
Links to Amazon S3 presigned URLs for the job report and logs. For import jobs, the PDF job report becomes available at the end of the import process. For export jobs, your job report typically becomes available while the Snowball for your job part is being delivered to you.
- **JobCompletionReportURI** *(string) --*
A link to an Amazon S3 presigned URL where the job completion report is located.
- **JobSuccessLogURI** *(string) --*
A link to an Amazon S3 presigned URL where the job success log is located.
- **JobFailureLogURI** *(string) --*
A link to an Amazon S3 presigned URL where the job failure log is located.
- **ClusterId** *(string) --*
The 39-character ID for the cluster, for example ``CID123e4567-e89b-12d3-a456-426655440000`` .
- **ForwardingAddressId** *(string) --*
The ID of the address that you want a job shipped to, after it has been shipped to its primary address. This field is not supported in most regions.
:type JobId: string
:param JobId: **[REQUIRED]**
The automatically generated ID for a job, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
:rtype: dict
:returns:
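A hedged usage sketch that also reads the transfer progress, which is only populated while ``JobState`` is ``InProgress`` (the job ID is a placeholder):
::
    response = client.describe_job(
        JobId='JID123e4567-e89b-12d3-a456-426655440000'
    )
    job = response['JobMetadata']
    print(job['JobState'])
    progress = job.get('DataTransferProgress')
    # TotalBytes stays 0 until all keys to transfer have been listed.
    if progress and progress['TotalBytes'] > 0:
        pct = 100.0 * progress['BytesTransferred'] / progress['TotalBytes']
        print('Transfer %.1f%% complete' % pct)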
"""
pass
def generate_presigned_url(self, ClientMethod: str = None, Params: Dict = None, ExpiresIn: int = None, HttpMethod: str = None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to
``ClientMethod``.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid
for. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By
default, the http method is whatever is used in the method's model.
:returns: The presigned url
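A hedged sketch of pre-signing a ``describe_job`` call with the default one-hour expiry (the job ID is a placeholder):
::
    url = client.generate_presigned_url(
        ClientMethod='describe_job',
        Params={'JobId': 'JID123e4567-e89b-12d3-a456-426655440000'},
        ExpiresIn=3600
    )
    print(url)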
"""
pass
def get_job_manifest(self, JobId: str) -> Dict:
"""
Returns a link to an Amazon S3 presigned URL for the manifest file associated with the specified ``JobId`` value. You can access the manifest file for up to 60 minutes after this request has been made. To access the manifest file after 60 minutes have passed, you'll have to make another call to the ``GetJobManifest`` action.
The manifest is an encrypted file that you can download after your job enters the ``WithCustomer`` status. The manifest is decrypted by using the ``UnlockCode`` code value, when you pass both values to the Snowball through the Snowball client when the client is started for the first time.
As a best practice, we recommend that you don't save a copy of an ``UnlockCode`` value in the same location as the manifest file for that job. Saving these separately helps prevent unauthorized parties from gaining access to the Snowball associated with that job.
The credentials of a given job, including its manifest file and unlock code, expire 90 days after the job is created.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/GetJobManifest>`_
**Request Syntax**
::
response = client.get_job_manifest(
JobId='string'
)
**Response Syntax**
::
{
'ManifestURI': 'string'
}
**Response Structure**
- *(dict) --*
- **ManifestURI** *(string) --*
The Amazon S3 presigned URL for the manifest file associated with the specified ``JobId`` value.
:type JobId: string
:param JobId: **[REQUIRED]**
The ID for a job that you want to get the manifest file for, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
:rtype: dict
:returns:
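A minimal usage sketch (the job ID is a placeholder):
::
    response = client.get_job_manifest(
        JobId='JID123e4567-e89b-12d3-a456-426655440000'
    )
    # Fetch the manifest promptly; the presigned URL expires after about 60 minutes.
    print(response['ManifestURI'])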
"""
pass
def get_job_unlock_code(self, JobId: str) -> Dict:
"""
Returns the ``UnlockCode`` code value for the specified job. A particular ``UnlockCode`` value can be accessed for up to 90 days after the associated job has been created.
The ``UnlockCode`` value is a 29-character code with 25 alphanumeric characters and 4 hyphens. This code is used to decrypt the manifest file when it is passed along with the manifest to the Snowball through the Snowball client when the client is started for the first time.
As a best practice, we recommend that you don't save a copy of the ``UnlockCode`` in the same location as the manifest file for that job. Saving these separately helps prevent unauthorized parties from gaining access to the Snowball associated with that job.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/GetJobUnlockCode>`_
**Request Syntax**
::
response = client.get_job_unlock_code(
JobId='string'
)
**Response Syntax**
::
{
'UnlockCode': 'string'
}
**Response Structure**
- *(dict) --*
- **UnlockCode** *(string) --*
The ``UnlockCode`` value for the specified job. The ``UnlockCode`` value can be accessed for up to 90 days after the job has been created.
:type JobId: string
:param JobId: **[REQUIRED]**
The ID for the job that you want to get the ``UnlockCode`` value for, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
:rtype: dict
:returns:
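A minimal usage sketch (the job ID is a placeholder):
::
    response = client.get_job_unlock_code(
        JobId='JID123e4567-e89b-12d3-a456-426655440000'
    )
    # Best practice: store the unlock code apart from the manifest file.
    unlock_code = response['UnlockCode']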
"""
pass
def get_paginator(self, operation_name: str = None) -> Paginator:
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is ``create_foo``, and you'd normally invoke the
operation as ``client.create_foo(**kwargs)``, if the
``create_foo`` operation can be paginated, you can use the
call ``client.get_paginator("create_foo")``.
:raise OperationNotPageableError: Raised if the operation is not
pageable. You can use the ``client.can_paginate`` method to
check if an operation is pageable.
:rtype: L{botocore.paginate.Paginator}
:return: A paginator object.
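A hedged sketch, assuming a paginator is registered for the ``list_jobs`` operation on this client:
::
    paginator = client.get_paginator('list_jobs')
    for page in paginator.paginate():
        for entry in page['JobListEntries']:
            print(entry['JobId'], entry['JobState'])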
"""
pass
def get_snowball_usage(self) -> Dict:
"""
Returns information about the Snowball service limit for your account, and also the number of Snowballs your account has in use.
The default service limit for the number of Snowballs that you can have at one time is 1. If you want to increase your service limit, contact AWS Support.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/GetSnowballUsage>`_
**Request Syntax**
::
response = client.get_snowball_usage()
**Response Syntax**
::
{
'SnowballLimit': 123,
'SnowballsInUse': 123
}
**Response Structure**
- *(dict) --*
- **SnowballLimit** *(integer) --*
The service limit for number of Snowballs this account can have at once. The default service limit is 1 (one).
- **SnowballsInUse** *(integer) --*
The number of Snowballs that this account is currently using.
:rtype: dict
:returns:
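A minimal usage sketch:
::
    usage = client.get_snowball_usage()
    remaining = usage['SnowballLimit'] - usage['SnowballsInUse']
    print('Snowballs still available under the service limit:', remaining)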
"""
pass
def get_waiter(self, waiter_name: str = None) -> Waiter:
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters
section of the service docs for a list of available waiters.
:returns: The specified waiter object.
:rtype: botocore.waiter.Waiter
"""
pass
def list_cluster_jobs(self, ClusterId: str, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns an array of ``JobListEntry`` objects of the specified length. Each ``JobListEntry`` object is for a job in the specified cluster and contains a job's state, a job's ID, and other information.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/ListClusterJobs>`_
**Request Syntax**
::
response = client.list_cluster_jobs(
ClusterId='string',
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'JobListEntries': [
{
'JobId': 'string',
'JobState': 'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
'IsMaster': True|False,
'JobType': 'IMPORT'|'EXPORT'|'LOCAL_USE',
'SnowballType': 'STANDARD'|'EDGE'|'EDGE_C'|'EDGE_CG',
'CreationDate': datetime(2015, 1, 1),
'Description': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **JobListEntries** *(list) --*
Each ``JobListEntry`` object contains a job's state, a job's ID, and a value that indicates whether the job is a job part, in the case of export jobs.
- *(dict) --*
Each ``JobListEntry`` object contains a job's state, a job's ID, and a value that indicates whether the job is a job part, in the case of an export job.
- **JobId** *(string) --*
The automatically generated ID for a job, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
- **JobState** *(string) --*
The current state of this job.
- **IsMaster** *(boolean) --*
A value that indicates that this job is a master job. A master job represents a successful request to create an export job. Master jobs aren't associated with any Snowballs. Instead, each master job will have at least one job part, and each job part is associated with a Snowball. It might take some time before the job parts associated with a particular master job are listed, because they are created after the master job is created.
- **JobType** *(string) --*
The type of job.
- **SnowballType** *(string) --*
The type of device used with this job.
- **CreationDate** *(datetime) --*
The creation date for this job.
- **Description** *(string) --*
The optional description of this specific job, for example ``Important Photos 2016-08-11`` .
- **NextToken** *(string) --*
HTTP requests are stateless. If you use the automatically generated ``NextToken`` value in your next ``ListClusterJobsResult`` call, your list of returned jobs will start from this point in the array.
:type ClusterId: string
:param ClusterId: **[REQUIRED]**
The 39-character ID for the cluster that you want to list, for example ``CID123e4567-e89b-12d3-a456-426655440000`` .
:type MaxResults: integer
:param MaxResults:
The number of ``JobListEntry`` objects to return.
:type NextToken: string
:param NextToken:
HTTP requests are stateless. To identify what object comes "next" in the list of ``JobListEntry`` objects, you have the option of specifying ``NextToken`` as the starting point for your returned list.
:rtype: dict
:returns:
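A minimal usage sketch (the cluster ID reuses the example format from this docstring):
::
    response = client.list_cluster_jobs(
        ClusterId='CID123e4567-e89b-12d3-a456-426655440000',
        MaxResults=25
    )
    for entry in response['JobListEntries']:
        print(entry['JobId'], entry['JobState'])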
"""
pass
def list_clusters(self, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns an array of ``ClusterListEntry`` objects of the specified length. Each ``ClusterListEntry`` object contains a cluster's state, a cluster's ID, and other important status information.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/ListClusters>`_
**Request Syntax**
::
response = client.list_clusters(
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'ClusterListEntries': [
{
'ClusterId': 'string',
'ClusterState': 'AwaitingQuorum'|'Pending'|'InUse'|'Complete'|'Cancelled',
'CreationDate': datetime(2015, 1, 1),
'Description': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **ClusterListEntries** *(list) --*
Each ``ClusterListEntry`` object contains a cluster's state, a cluster's ID, and other important status information.
- *(dict) --*
Contains a cluster's state, a cluster's ID, and other important information.
- **ClusterId** *(string) --*
The 39-character ID for the cluster that you want to list, for example ``CID123e4567-e89b-12d3-a456-426655440000`` .
- **ClusterState** *(string) --*
The current state of this cluster. For information about the state of a specific node, see JobListEntry$JobState .
- **CreationDate** *(datetime) --*
The creation date for this cluster.
- **Description** *(string) --*
Defines an optional description of the cluster, for example ``Environmental Data Cluster-01`` .
- **NextToken** *(string) --*
HTTP requests are stateless. If you use the automatically generated ``NextToken`` value in your next ``ListClusters`` call, your list of returned clusters will start from this point in the array.
:type MaxResults: integer
:param MaxResults:
The number of ``ClusterListEntry`` objects to return.
:type NextToken: string
:param NextToken:
HTTP requests are stateless. To identify what object comes "next" in the list of ``ClusterListEntry`` objects, you have the option of specifying ``NextToken`` as the starting point for your returned list.
:rtype: dict
:returns:
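A minimal usage sketch:
::
    response = client.list_clusters(MaxResults=25)
    for entry in response['ClusterListEntries']:
        print(entry['ClusterId'], entry['ClusterState'])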
"""
pass
def list_compatible_images(self, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
This action returns a list of the different Amazon EC2 Amazon Machine Images (AMIs) that are owned by your AWS account and that are supported for use on ``EDGE`` , ``EDGE_C`` , and ``EDGE_CG`` devices. For more information on compatible AMIs, see `Using Amazon EC2 Compute Instances <http://docs.aws.amazon.com/snowball/latest/developer-guide/using-ec2.html>`__ in the *AWS Snowball Developer Guide* .
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/ListCompatibleImages>`_
**Request Syntax**
::
response = client.list_compatible_images(
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'CompatibleImages': [
{
'AmiId': 'string',
'Name': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **CompatibleImages** *(list) --*
A JSON-formatted object that describes a compatible AMI.
- *(dict) --*
A JSON-formatted object that describes a compatible Amazon Machine Image (AMI). For more information on compatible AMIs, see `Using Amazon EC2 Compute Instances <http://docs.aws.amazon.com/snowball/latest/developer-guide/using-ec2.html>`__ in the *AWS Snowball Developer Guide* .
- **AmiId** *(string) --*
The unique identifier for an individual Snowball Edge AMI.
- **Name** *(string) --*
The optional name of a compatible image.
- **NextToken** *(string) --*
Because HTTP requests are stateless, this is the starting point for your next list of returned images.
:type MaxResults: integer
:param MaxResults:
The maximum number of results for the list of compatible images. Currently, each supported device can store 10 AMIs.
:type NextToken: string
:param NextToken:
HTTP requests are stateless. To identify what object comes "next" in the list of compatible images, you can specify a value for ``NextToken`` as the starting point for your list of returned images.
:rtype: dict
:returns:
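A minimal usage sketch:
::
    response = client.list_compatible_images(MaxResults=10)
    for image in response['CompatibleImages']:
        # Name is optional, so fall back to a placeholder when absent.
        print(image['AmiId'], image.get('Name', '(unnamed)'))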
"""
pass
def list_jobs(self, MaxResults: int = None, NextToken: str = None) -> Dict:
"""
Returns an array of ``JobListEntry`` objects of the specified length. Each ``JobListEntry`` object contains a job's state, a job's ID, and a value that indicates whether the job is a job part, in the case of export jobs. Calling this API action in one of the US regions will return jobs from the list of all jobs associated with this account in all US regions.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/ListJobs>`_
**Request Syntax**
::
response = client.list_jobs(
MaxResults=123,
NextToken='string'
)
**Response Syntax**
::
{
'JobListEntries': [
{
'JobId': 'string',
'JobState': 'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
'IsMaster': True|False,
'JobType': 'IMPORT'|'EXPORT'|'LOCAL_USE',
'SnowballType': 'STANDARD'|'EDGE'|'EDGE_C'|'EDGE_CG',
'CreationDate': datetime(2015, 1, 1),
'Description': 'string'
},
],
'NextToken': 'string'
}
**Response Structure**
- *(dict) --*
- **JobListEntries** *(list) --*
Each ``JobListEntry`` object contains a job's state, a job's ID, and a value that indicates whether the job is a job part, in the case of export jobs.
- *(dict) --*
Each ``JobListEntry`` object contains a job's state, a job's ID, and a value that indicates whether the job is a job part, in the case of an export job.
- **JobId** *(string) --*
The automatically generated ID for a job, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
- **JobState** *(string) --*
The current state of this job.
- **IsMaster** *(boolean) --*
A value that indicates that this job is a master job. A master job represents a successful request to create an export job. Master jobs aren't associated with any Snowballs. Instead, each master job will have at least one job part, and each job part is associated with a Snowball. It might take some time before the job parts associated with a particular master job are listed, because they are created after the master job is created.
- **JobType** *(string) --*
The type of job.
- **SnowballType** *(string) --*
The type of device used with this job.
- **CreationDate** *(datetime) --*
The creation date for this job.
- **Description** *(string) --*
The optional description of this specific job, for example ``Important Photos 2016-08-11`` .
- **NextToken** *(string) --*
HTTP requests are stateless. If you use this automatically generated ``NextToken`` value in your next ``ListJobs`` call, your returned ``JobListEntry`` objects will start from this point in the array.
:type MaxResults: integer
:param MaxResults:
The number of ``JobListEntry`` objects to return.
:type NextToken: string
:param NextToken:
HTTP requests are stateless. To identify what object comes "next" in the list of ``JobListEntry`` objects, you have the option of specifying ``NextToken`` as the starting point for your returned list.
:rtype: dict
:returns:
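A hedged sketch that separates master jobs (export requests) from their job parts:
::
    response = client.list_jobs(MaxResults=50)
    # Master jobs aren't associated with Snowballs; their job parts are.
    masters = [e for e in response['JobListEntries'] if e['IsMaster']]
    print('Master job IDs:', [e['JobId'] for e in masters])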
"""
pass
def update_cluster(self, ClusterId: str, RoleARN: str = None, Description: str = None, Resources: Dict = None, AddressId: str = None, ShippingOption: str = None, Notification: Dict = None, ForwardingAddressId: str = None) -> Dict:
"""
While a cluster's ``ClusterState`` value is in the ``AwaitingQuorum`` state, you can update some of the information associated with a cluster. Once the cluster changes to a different state, usually within 60 minutes of the cluster being created, this action is no longer available.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/UpdateCluster>`_
**Request Syntax**
::
response = client.update_cluster(
ClusterId='string',
RoleARN='string',
Description='string',
Resources={
'S3Resources': [
{
'BucketArn': 'string',
'KeyRange': {
'BeginMarker': 'string',
'EndMarker': 'string'
}
},
],
'LambdaResources': [
{
'LambdaArn': 'string',
'EventTriggers': [
{
'EventResourceARN': 'string'
},
]
},
],
'Ec2AmiResources': [
{
'AmiId': 'string',
'SnowballAmiId': 'string'
},
]
},
AddressId='string',
ShippingOption='SECOND_DAY'|'NEXT_DAY'|'EXPRESS'|'STANDARD',
Notification={
'SnsTopicARN': 'string',
'JobStatesToNotify': [
'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
],
'NotifyAll': True|False
},
ForwardingAddressId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --*
:type ClusterId: string
:param ClusterId: **[REQUIRED]**
The cluster ID of the cluster that you want to update, for example ``CID123e4567-e89b-12d3-a456-426655440000`` .
:type RoleARN: string
:param RoleARN:
The new role Amazon Resource Name (ARN) that you want to associate with this cluster. To create a role ARN, use the `CreateRole <http://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateRole.html>`__ API action in AWS Identity and Access Management (IAM).
:type Description: string
:param Description:
The updated description of this cluster.
:type Resources: dict
:param Resources:
The updated arrays of JobResource objects that can include updated S3Resource objects or LambdaResource objects.
- **S3Resources** *(list) --*
An array of ``S3Resource`` objects.
- *(dict) --*
Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into. For export jobs, this object can have an optional ``KeyRange`` value. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BucketArn** *(string) --*
The Amazon Resource Name (ARN) of an Amazon S3 bucket.
- **KeyRange** *(dict) --*
For export jobs, you can provide an optional ``KeyRange`` within a specific Amazon S3 bucket. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BeginMarker** *(string) --*
The key that starts an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **EndMarker** *(string) --*
The key that ends an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **LambdaResources** *(list) --*
The Python-language Lambda functions for this job.
- *(dict) --*
Identifies an AWS Lambda function and its associated event triggers for this job.
- **LambdaArn** *(string) --*
An Amazon Resource Name (ARN) that represents an AWS Lambda function to be triggered by PUT object actions on the associated local Amazon S3 resource.
- **EventTriggers** *(list) --*
The array of ARNs for S3Resource objects to trigger the LambdaResource objects associated with this job.
- *(dict) --*
The container for the EventTriggerDefinition$EventResourceARN .
- **EventResourceARN** *(string) --*
The Amazon Resource Name (ARN) for any local Amazon S3 resource that is an AWS Lambda function's event trigger associated with this job.
- **Ec2AmiResources** *(list) --*
The Amazon Machine Images (AMIs) associated with this job.
- *(dict) --*
A JSON-formatted object that contains the IDs for an Amazon Machine Image (AMI), including the Amazon EC2 AMI ID and the Snowball Edge AMI ID. Each AMI has these two IDs to simplify identifying the AMI in both the AWS Cloud and on the device.
- **AmiId** *(string) --* **[REQUIRED]**
The ID of the AMI in Amazon EC2.
- **SnowballAmiId** *(string) --*
The ID of the AMI on the supported device.
:type AddressId: string
:param AddressId:
The ID of the updated Address object.
:type ShippingOption: string
:param ShippingOption:
The updated shipping option value of this cluster's ShippingDetails object.
:type Notification: dict
:param Notification:
The new or updated Notification object.
- **SnsTopicARN** *(string) --*
The new SNS ``TopicArn`` that you want to associate with this job. You can create Amazon Resource Names (ARNs) for topics by using the `CreateTopic <http://docs.aws.amazon.com/sns/latest/api/API_CreateTopic.html>`__ Amazon SNS API action.
You can subscribe email addresses to an Amazon SNS topic through the AWS Management Console, or by using the `Subscribe <http://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html>`__ AWS Simple Notification Service (SNS) API action.
- **JobStatesToNotify** *(list) --*
The list of job states that will trigger a notification for this job.
- *(string) --*
- **NotifyAll** *(boolean) --*
Any change in job state will trigger a notification for this job.
:type ForwardingAddressId: string
:param ForwardingAddressId:
The updated ID for the forwarding address for a cluster. This field is not supported in most regions.
:rtype: dict
:returns:
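A hedged usage sketch; updates only succeed while the cluster is still in the ``AwaitingQuorum`` state, and the ID shown is a placeholder:
::
    client.update_cluster(
        ClusterId='CID123e4567-e89b-12d3-a456-426655440000',
        Description='Environmental Data Cluster-01 (revised)',
        ShippingOption='EXPRESS'
    )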
"""
pass
def update_job(self, JobId: str, RoleARN: str = None, Notification: Dict = None, Resources: Dict = None, AddressId: str = None, ShippingOption: str = None, Description: str = None, SnowballCapacityPreference: str = None, ForwardingAddressId: str = None) -> Dict:
"""
While a job's ``JobState`` value is ``New`` , you can update some of the information associated with a job. Once the job changes to a different job state, usually within 60 minutes of the job being created, this action is no longer available.
See also: `AWS API Documentation <https://docs.aws.amazon.com/goto/WebAPI/snowball-2016-06-30/UpdateJob>`_
**Request Syntax**
::
response = client.update_job(
JobId='string',
RoleARN='string',
Notification={
'SnsTopicARN': 'string',
'JobStatesToNotify': [
'New'|'PreparingAppliance'|'PreparingShipment'|'InTransitToCustomer'|'WithCustomer'|'InTransitToAWS'|'WithAWSSortingFacility'|'WithAWS'|'InProgress'|'Complete'|'Cancelled'|'Listing'|'Pending',
],
'NotifyAll': True|False
},
Resources={
'S3Resources': [
{
'BucketArn': 'string',
'KeyRange': {
'BeginMarker': 'string',
'EndMarker': 'string'
}
},
],
'LambdaResources': [
{
'LambdaArn': 'string',
'EventTriggers': [
{
'EventResourceARN': 'string'
},
]
},
],
'Ec2AmiResources': [
{
'AmiId': 'string',
'SnowballAmiId': 'string'
},
]
},
AddressId='string',
ShippingOption='SECOND_DAY'|'NEXT_DAY'|'EXPRESS'|'STANDARD',
Description='string',
SnowballCapacityPreference='T50'|'T80'|'T100'|'T42'|'NoPreference',
ForwardingAddressId='string'
)
**Response Syntax**
::
{}
**Response Structure**
- *(dict) --*
:type JobId: string
:param JobId: **[REQUIRED]**
The job ID of the job that you want to update, for example ``JID123e4567-e89b-12d3-a456-426655440000`` .
:type RoleARN: string
:param RoleARN:
The new role Amazon Resource Name (ARN) that you want to associate with this job. To create a role ARN, use the `CreateRole <http://docs.aws.amazon.com/IAM/latest/APIReference/API_CreateRole.html>`__ AWS Identity and Access Management (IAM) API action.
:type Notification: dict
:param Notification:
The new or updated Notification object.
- **SnsTopicARN** *(string) --*
The new SNS ``TopicArn`` that you want to associate with this job. You can create Amazon Resource Names (ARNs) for topics by using the `CreateTopic <http://docs.aws.amazon.com/sns/latest/api/API_CreateTopic.html>`__ Amazon SNS API action.
You can subscribe email addresses to an Amazon SNS topic through the AWS Management Console, or by using the `Subscribe <http://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html>`__ AWS Simple Notification Service (SNS) API action.
- **JobStatesToNotify** *(list) --*
The list of job states that will trigger a notification for this job.
- *(string) --*
- **NotifyAll** *(boolean) --*
Any change in job state will trigger a notification for this job.
:type Resources: dict
:param Resources:
The updated ``JobResource`` object.
- **S3Resources** *(list) --*
An array of ``S3Resource`` objects.
- *(dict) --*
Each ``S3Resource`` object represents an Amazon S3 bucket that your transferred data will be exported from or imported into. For export jobs, this object can have an optional ``KeyRange`` value. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BucketArn** *(string) --*
The Amazon Resource Name (ARN) of an Amazon S3 bucket.
- **KeyRange** *(dict) --*
For export jobs, you can provide an optional ``KeyRange`` within a specific Amazon S3 bucket. The length of the range is defined at job creation, and has either an inclusive ``BeginMarker`` , an inclusive ``EndMarker`` , or both. Ranges are UTF-8 binary sorted.
- **BeginMarker** *(string) --*
The key that starts an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **EndMarker** *(string) --*
The key that ends an optional key range for an export job. Ranges are inclusive and UTF-8 binary sorted.
- **LambdaResources** *(list) --*
The Python-language Lambda functions for this job.
- *(dict) --*
Identifies a Lambda function and its associated event triggers.
- **LambdaArn** *(string) --*
An Amazon Resource Name (ARN) that represents an AWS Lambda function to be triggered by PUT object actions on the associated local Amazon S3 resource.
- **EventTriggers** *(list) --*
The array of ARNs for S3Resource objects to trigger the LambdaResource objects associated with this job.
- *(dict) --*
The container for the ``EventTriggerDefinition$EventResourceARN``.
- **EventResourceARN** *(string) --*
The Amazon Resource Name (ARN) for any local Amazon S3 resource that is an AWS Lambda function's event trigger associated with this job.
- **Ec2AmiResources** *(list) --*
The Amazon Machine Images (AMIs) associated with this job.
- *(dict) --*
A JSON-formatted object that contains the IDs for an Amazon Machine Image (AMI), including the Amazon EC2 AMI ID and the Snowball Edge AMI ID. Each AMI has these two IDs to simplify identifying the AMI in both the AWS Cloud and on the device.
- **AmiId** *(string) --* **[REQUIRED]**
The ID of the AMI in Amazon EC2.
- **SnowballAmiId** *(string) --*
The ID of the AMI on the supported device.
:type AddressId: string
:param AddressId:
The ID of the updated Address object.
:type ShippingOption: string
:param ShippingOption:
The updated shipping option value of this job's ShippingDetails object.
:type Description: string
:param Description:
The updated description of this job's JobMetadata object.
:type SnowballCapacityPreference: string
:param SnowballCapacityPreference:
The updated ``SnowballCapacityPreference`` of this job's JobMetadata object. The 50 TB Snowballs are only available in the US regions.
:type ForwardingAddressId: string
:param ForwardingAddressId:
The updated ID for the forwarding address for a job. This field is not supported in most regions.
:rtype: dict
:returns:
"""
pass
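# --- Hedged usage sketch (not part of the generated stub) ---
# Assuming a configured Snowball client and a job still in the "New" state;
# the job ID, topic ARN and shipping option below are illustrative placeholders.
#
# import boto3
# client = boto3.client('snowball')
# response = client.update_job(
#     JobId='JID123e4567-e89b-12d3-a456-426655440000',
#     Notification={
#         'SnsTopicARN': 'arn:aws:sns:us-east-1:123456789012:job-updates',
#         'JobStatesToNotify': ['InTransitToCustomer', 'Complete'],
#         'NotifyAll': False
#     },
#     ShippingOption='EXPRESS'
# )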
| 62.296911 | 453 | 0.559218 | 11,503 | 108,895 | 5.274972 | 0.056768 | 0.020024 | 0.009855 | 0.01213 | 0.886746 | 0.855928 | 0.83986 | 0.823775 | 0.812156 | 0.800455 | 0 | 0.013523 | 0.348776 | 108,895 | 1,747 | 454 | 62.33257 | 0.842121 | 0.8489 | 0 | 0.431373 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.431373 | false | 0.431373 | 0.117647 | 0 | 0.568627 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 10 |
c0a8480c5d61186f757e996dd241ce5a604e2c99 | 3,005 | py | Python | convrnn/custom_transforms.py | esizikova/anytime-prediction | 5c2672d6454a91873ca2b40796a29c6f5db5ec99 | [
"MIT"
] | 3 | 2021-06-08T10:43:42.000Z | 2022-02-17T02:20:47.000Z | convrnn/custom_transforms.py | esizikova/anytime-prediction | 5c2672d6454a91873ca2b40796a29c6f5db5ec99 | [
"MIT"
] | null | null | null | convrnn/custom_transforms.py | esizikova/anytime-prediction | 5c2672d6454a91873ca2b40796a29c6f5db5ec99 | [
"MIT"
] | 2 | 2021-06-16T17:15:42.000Z | 2021-08-28T06:04:41.000Z | import numpy as np
from torchvision import transforms
import torch
def all_random_blur(image, kernel=7, std=0.9):
all_devs = np.arange(0.0, std+0.1, 0.1)
std = np.random.choice(all_devs)
if std != 0.0:
image = np.transpose(image, (2,0,1))
image = transforms.functional.gaussian_blur(torch.Tensor(image), kernel, sigma=std)
image = np.transpose(image.numpy(), (1,2,0))
return image
def add_gaussian_blur(image, kernel=7, std=0.9):
if std != 0.0:
image = np.transpose(image, (2,0,1))
image = transforms.functional.gaussian_blur(torch.Tensor(image), kernel, sigma=std)
image = np.transpose(image.numpy(), (1,2,0))
return image
def all_random_noise(image, std=0.04, mean=0, contrast=0.1):
std = np.random.choice(np.arange(0.0, std+0.01, 0.01))
noise = np.array([])
n = image.shape[0] * image.shape[1]
sd2 = std * 2
while len(noise) < n:
# more samples than we require
m = 2 * (n - len(noise))
new = np.random.randn(m) * std
# remove out-of-range samples
new = new[new >= -sd2]
new = new[new <= sd2]
# append to noise tensor
noise = np.concatenate((noise, new))
# pick first n samples and reshape to 2D
noise = np.reshape(noise[:n], (image.shape[0], image.shape[1]))
# stack the noise across the three channels and shift it by the mean
newnoise = np.stack([noise, noise, noise], axis=2) + mean
# shift image hist to mean = 0.5
image = image + (0.5 - np.mean(image))
# self.contrast = 1.0 / (5. * max(1.0, tensor.max() + sd2, 1.0 + (0 - tensor.min() - sd2)))
# print(self.contrast)
image = np.transpose(image, (2,0,1))
image = transforms.functional.adjust_contrast(torch.Tensor(image), contrast)
image = np.transpose(image.numpy(), (1,2,0))
return image + newnoise + mean
def add_gaussian_noise(image, std=0.04, mean=0, contrast=0.1):
noise = np.array([])
n = image.shape[0] * image.shape[1]
sd2 = std * 2
while len(noise) < n:
# more samples than we require
m = 2 * (n - len(noise))
new = np.random.randn(m) * std
# remove out-of-range samples
new = new[new >= -sd2]
new = new[new <= sd2]
# append to noise tensor
noise = np.concatenate((noise, new))
# pick first n samples and reshape to 2D
noise = np.reshape(noise[:n], (image.shape[0], image.shape[1]))
# stack the noise across the three channels and shift it by the mean
newnoise = np.stack([noise, noise, noise], axis=2) + mean
# shift image hist to mean = 0.5
image = image + (0.5 - np.mean(image))
# self.contrast = 1.0 / (5. * max(1.0, tensor.max() + sd2, 1.0 + (0 - tensor.min() - sd2)))
# print(self.contrast)
image = np.transpose(image, (2,0,1))
image = transforms.functional.adjust_contrast(torch.Tensor(image), contrast)
image = np.transpose(image.numpy(), (1,2,0))
return image + newnoise + mean | 32.311828 | 95 | 0.596672 | 458 | 3,005 | 3.884279 | 0.165939 | 0.017988 | 0.071951 | 0.094435 | 0.93086 | 0.93086 | 0.894885 | 0.871276 | 0.871276 | 0.871276 | 0 | 0.048759 | 0.249251 | 3,005 | 93 | 96 | 32.311828 | 0.739805 | 0.207987 | 0 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.057692 | 0 | 0.211538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
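# --- Hedged usage sketch (not from the original file) ---
# Applying the blur and noise transforms to a random HxWxC float image;
# the image shape and value range are assumptions.
#
# img = np.random.rand(224, 224, 3).astype(np.float32)
# img = add_gaussian_blur(img, kernel=7, std=0.9)
# img = add_gaussian_noise(img, std=0.04, mean=0, contrast=0.1)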
c0d400eecd04e5e28ca22b1f0dd5a5af3a252172 | 245 | py | Python | tests/parser/aggregates.ordering.1.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/aggregates.ordering.1.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/aggregates.ordering.1.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | input = """
ok:-q(Y,X), not #count{V: a(V)}<1, not r(Y), #count{V:b(V)}>=1.
a(1) | a(2).
b(1) | b(2).
r(2).
q(1,2).
"""
output = """
ok:-q(Y,X), not #count{V: a(V)}<1, not r(Y), #count{V:b(V)}>=1.
a(1) | a(2).
b(1) | b(2).
r(2).
q(1,2).
"""
| 12.894737 | 63 | 0.404082 | 64 | 245 | 1.546875 | 0.21875 | 0.242424 | 0.080808 | 0.10101 | 0.888889 | 0.888889 | 0.888889 | 0.888889 | 0.888889 | 0.888889 | 0 | 0.08867 | 0.171429 | 245 | 18 | 64 | 13.611111 | 0.399015 | 0 | 0 | 0.857143 | 0 | 0.142857 | 0.873469 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
c0dfea0021342c4249edde07f00bd5c73596d555 | 44,747 | py | Python | timeseria/transformations.py | sarusso/Timeseria | fdf9990ab68e20f75a64f090a2c43979266dcba9 | [
"Apache-2.0"
] | 8 | 2021-01-02T17:43:13.000Z | 2022-02-22T09:07:22.000Z | timeseria/transformations.py | sarusso/Timeseria | fdf9990ab68e20f75a64f090a2c43979266dcba9 | [
"Apache-2.0"
] | 20 | 2020-07-15T11:29:41.000Z | 2022-03-29T22:51:52.000Z | timeseria/transformations.py | sarusso/Timeseria | fdf9990ab68e20f75a64f090a2c43979266dcba9 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""Series transformations as slotting and resampling."""
from .time import dt_from_s, s_from_dt
from .datastructures import DataTimeSlot, DataTimeSlotSeries, TimePoint, DataTimePointSeries, DataTimePoint
from .utilities import compute_data_loss
from .operations import avg
from .units import TimeUnit
# Setup logging
import logging
logger = logging.getLogger(__name__)
#==========================
# Base Transformation
#==========================
class Transformation(object):
"""Base transformation class."""
@classmethod
def __str__(cls):
return '{} transformation'.format(cls.__name__.replace('Operator',''))
def process(self, *args, **kwargs):
raise NotImplementedError('This transformation is not implemented.')
#==========================
# Resampler Transformation
#==========================
class Resampler(Transformation):
"""Resampler transformation."""
def __init__(self, unit, interpolation_method='linear'):
if isinstance(unit, TimeUnit):
self.time_unit = unit
else:
self.time_unit = TimeUnit(unit)
if self.time_unit.is_calendar():
raise ValueError('Sorry, calendar time units are not supported by the Resampler (got "{}"). Use the Slotter instead.'.format(self.time_unit))
self.interpolation_method=interpolation_method
def _compute_resampled_point(self, data_time_point_series, unit, start_t, end_t, validity, timezone, fill_with, force_data_loss, fill_gaps, series_indexes, series_resolution, first_last):
# Compute data_loss
point_data_loss = compute_data_loss(data_time_point_series,
from_t = start_t,
to_t = end_t,
series_resolution = series_resolution,
validity = validity,
first_last = first_last)
# TODO: unroll the following before the compute resampled point call
interval_timeseries = DataTimePointSeries()
prev_point = None
next_point = None
for data_time_point in data_time_point_series:
# TODO: better check the math here. We use >= and <= because, since we are basically comparing intervals,
# the "right" excluded rule is excluded by comparing right with left and then left with right.
if (data_time_point.t+(validity/2)) <= start_t:
prev_point = data_time_point
continue
if (data_time_point.t-(validity/2)) >= end_t:
next_point = data_time_point
continue
interval_timeseries.append(data_time_point)
# If we have to fully reconstruct data
if not interval_timeseries: #point_data_loss == 1:
# Reconstruct (fill_gaps)
slot_data = {}
for key in data_time_point_series[0].data.keys():
# Special case with only one datapoint (i.e. beginning or end)
if len(data_time_point_series) == 1:
slot_data[key] = data_time_point_series[0].data[key]
else:
prev_point = data_time_point_series[0]
next_point = data_time_point_series[1]
if self.interpolation_method == 'linear':
diff = next_point.data[key] - prev_point.data[key]
delta_t = next_point.t - prev_point.t
ratio = diff / delta_t
point_t = start_t + (unit.duration_s() /2)
slot_data[key] = prev_point.data[key] + ((point_t - prev_point.t) * ratio)
elif self.interpolation_method == 'uniform':
slot_data[key] = (prev_point.data[key] + next_point.data[key]) /2
else:
raise Exception('Unknown interpolation method "{}"'.format(self.interpolation_method))
else:
# Keys shortcut
keys = data_time_point_series.data_keys()
logger.debug('Slot timeseries: %s', interval_timeseries)
# Compute the sample averages
avgs = avg(interval_timeseries, prev_point=prev_point, next_point=next_point)
# Do we have a 100% and a fill_with?
if fill_with is not None and point_data_loss == 1:
slot_data = {key:fill_with for key in keys}
else:
if isinstance(avgs, dict):
slot_data = {key:avgs[key] for key in keys}
else:
slot_data = {keys[0]: avgs}
# Do we have a forced data_loss? # TODO: do not compute data_loss if fill_with is not present and force_data_loss is set
if force_data_loss is not None:
point_data_loss = force_data_loss
# # Create the DataTimePoint if we have data_loss
# if not point_data_loss:
# data_time_point = None
# else:
# pass
# Create the data time point
data_time_point = DataTimePoint(t = (start_t+((end_t-start_t)/2)),
tz = timezone,
data = slot_data,
data_loss = point_data_loss)
# Now handle indexes
for index in series_indexes:
if interval_timeseries:
if index == 'data_loss':
continue
index_sum = 0
index_count = 0
for item in interval_timeseries:
# Get index value
try:
index_value = getattr(item, index)
except:
new_point_index_value = None
else:
if index_value is not None:
index_count += 1
index_sum += index_value
# Compute the new index value (if there were indexes not None)
if index_count > 0:
new_point_index_value = index_sum/index_count
else:
new_point_index_value = None
else:
new_point_index_value = None
# Set the index. Handle special case for data_reconstructed
if index == 'data_reconstructed':
setattr(data_time_point, '_data_reconstructed', new_point_index_value)
else:
setattr(data_time_point, index, new_point_index_value)
# Return
return data_time_point
def process(self, data_time_point_series, from_t=None, to_t=None, from_dt=None, to_dt=None,
validity=None, force_close_last=True, include_extremes=False, fill_with=None,
force_data_loss=None, fill_gaps=True, force=False):
"""Start the resampling process. If start and/or end are not set, they are set automatically
based on the first and last points of the series."""
if not isinstance(data_time_point_series, DataTimePointSeries):
raise TypeError('Can process only DataTimePointSeries, got "{}"'.format(data_time_point_series.__class__.__name__))
if not data_time_point_series:
raise ValueError('Cannot process empty data_time_point_series')
if include_extremes:
if from_t is not None or to_t is not None:
raise ValueError('Setting "include_extremes" is not compatible with giving a from_t or a to_t')
from_rounding_method = 'floor'
to_rounding_method = 'ceil'
else:
from_rounding_method = 'ceil'
to_rounding_method = 'floor'
# Move fromt_dt and to_dt to epoch to simplify the following
if from_dt is not None:
from_t = s_from_dt(from_dt)
if to_dt is not None:
to_t = s_from_dt(to_dt)
# Also force close if we have explicitly set an end
force_close_last = True
# Set "from" if not set, otherwise check for consistency # TODO: move to steaming
if from_t is None:
from_t = data_time_point_series[0].t
from_dt = dt_from_s(from_t, tz=data_time_point_series.tz)
# Is the point already rounded to the time unit or do we have to round it ourselves?
if not from_dt == self.time_unit.round_dt(from_dt):
from_dt = self.time_unit.round_dt(from_dt, how=from_rounding_method)
from_t = s_from_dt(from_dt)
else:
from_dt = dt_from_s(from_t, tz=data_time_point_series.tz)
if from_dt != self.time_unit.round_dt(from_dt):
raise ValueError('Sorry, provided from_t is not consistent with the self.time_unit of "{}" (Got "{}")'.format(self.time_unit, from_t))
# Set "to" if not set, otherwise check for consistency # TODO: move to streaming
if to_t is None:
to_t = data_time_point_series[-1].t
to_dt = dt_from_s(to_t, tz=data_time_point_series.tz)
# Is the point already rounded to the time unit or do we have to round it ourselves?
if not to_dt == self.time_unit.round_dt(to_dt):
to_dt = self.time_unit.round_dt(to_dt, how=to_rounding_method)
to_t = s_from_dt(to_dt)
else:
to_dt = dt_from_s(to_t, tz=data_time_point_series.tz)
if to_dt != self.time_unit.round_dt(to_dt):
raise ValueError('Sorry, provided to_t is not consistent with the self.time_unit of "{}" (Got "{}")'.format(self.time_unit, to_t))
# Move the start back by half a unit and the end forward by half
# a unit as well, as the point will be in the center
from_t = from_t - (self.time_unit.duration_s() /2)
from_dt = dt_from_s(from_t, data_time_point_series.tz)
to_t = to_t + (self.time_unit.duration_s() /2)
to_dt = dt_from_s(to_t, data_time_point_series.tz)
# Automatically detect validity if not set
if validity is None:
validity = data_time_point_series.autodetected_sampling_interval
logger.info('Using auto-detected sampling interval: %ss', validity)
# Check that we are not upsampling (with some tolerance):
if not force:
if validity > (self.time_unit.duration_s() * 1.10):
raise ValueError('Upsampling not supported yet (resampler unit: {}; detected time series sampling interval: {})'.format(self.time_unit, validity))
logger.debug('Started resampler from "%s" (%s) to "%s" (%s)', from_dt, from_t, to_dt, to_t)
if from_dt >= to_dt:
raise ValueError('Sorry, from is >= to! (from_t={}, to_t={})'.format(from_t, to_t))
# Set some support vars
slot_start_t = None
slot_end_t = None
prev_data_time_point = None
working_serie = DataTimePointSeries()
process_ended = False
resampled_data_time_point_series = DataTimePointSeries()
# Set timezone
timezone = data_time_point_series.tz
logger.debug('Using timezone "%s"', timezone)
# Counters
count = 0
first = True
# Indexes
series_indexes = data_time_point_series.indexes
series_resolution = data_time_point_series.resolution
# Now go through all the data in the time series
for data_time_point in data_time_point_series:
logger.debug('Processing %s', data_time_point)
# Increase counter
count += 1
# Set start_dt if not already done TODO: implement it correctly
#if not from_dt:
# from_dt = self.time_unit.timeInterval.round_dt(data_time_point.dt) if rounded else data_time_point.dt
# Pretend there was a slot before if we are at the beginning. TODO: improve me.
if slot_end_t is None:
slot_end_t = from_t
# First, check if we have some points to discard at the beginning
if data_time_point.t < from_t:
# If we are here it means we are handling data belonging to a previous slot
# (probably just spare data loaded to have access to the prev_datapoint)
prev_data_time_point = data_time_point
continue
# Similar concept for the end
# TODO: what if we are in streaming mode? add if to_t is not None?
if data_time_point.t >= to_t:
if process_ended:
continue
# The following procedure works in general for slots at the beginning and in the middle.
# The approach is to detect if the current slot is "outdated" and spin a new one if so.
if data_time_point.t > slot_end_t:
# If the current slot is outdated:
# 1) Add this last point to the data_time_point_series:
working_serie.append(data_time_point)
# 2) Keep spinning new slots until the current data point falls in one of them.
# NOTE: Read the following "while" more as an "if" which can also lead to spin multiple
# slot if there are empty slots between the one being closed and the data_time_point.dt.
# TODO: leave or remove the above if for code readability?
while slot_end_t < data_time_point.t:
logger.debug('Checking for end {} with point {}'.format(slot_end_t, data_time_point.t))
# If we are in the pre-first slot, just silently spin a new slot:
if slot_start_t is not None:
# Append last point. Can be appended to multiple slots, this is normal since
# the empty slots in the middle will have only a far prev and a far next.
# can also be appended several times if working_serie is not reset (looping in the while)
if data_time_point not in working_serie:
working_serie.append(data_time_point)
logger.debug('This slot (start={}, end={}) is closed, now aggregating it..'.format(slot_start_t, slot_end_t))
logger.debug('working_serie len: %s', len(working_serie))
logger.debug('working_serie first point dt: %s', working_serie[0].dt)
logger.debug('working_serie last point dt: %s', working_serie[-1].dt)
# Compute slot...
dataTimePoint = self._compute_resampled_point(working_serie,
unit = self.time_unit,
start_t = slot_start_t,
end_t = slot_end_t,
validity = validity,
timezone = timezone,
fill_with = fill_with,
force_data_loss = force_data_loss,
fill_gaps = fill_gaps,
series_indexes = series_indexes,
series_resolution = series_resolution,
first_last = first)
# Set first to false
if first:
first = False
# .. and append results
if dataTimePoint:
logger.debug('Computed datapoint: %s',dataTimePoint )
resampled_data_time_point_series.append(dataTimePoint)
# Create a new slot. This is where all the "conventional" time logic kicks-in, and where the time zone is required.
slot_start_t = slot_end_t
slot_end_t = s_from_dt(dt_from_s(slot_start_t, tz=timezone) + self.time_unit)
# Create a new working_serie as part of the "create a new slot" procedure
working_serie = DataTimePointSeries()
# Append the previous prev_data_time_point to the new DataTimePointSeries
if prev_data_time_point:
working_serie.append(prev_data_time_point)
logger.debug('Spun a new slot (start={}, end={})'.format(slot_start_t, slot_end_t))
# If last slot mark process as completed (and aggregate last slot if necessary)
if data_time_point.dt >= to_dt:
# Edge case where we would otherwise miss the last slot
if data_time_point.dt == to_dt:
# Compute slot...
dataTimePoint = self._compute_resampled_point(working_serie,
unit = self.time_unit,
start_t = slot_start_t,
end_t = slot_end_t,
validity = validity,
timezone = timezone,
fill_with = fill_with,
force_data_loss = force_data_loss,
fill_gaps = fill_gaps,
series_indexes = series_indexes,
series_resolution = series_resolution,
first_last = True)
# .. and append results
if dataTimePoint:
resampled_data_time_point_series.append(dataTimePoint)
process_ended = True
# Append this point to the working serie
working_serie.append(data_time_point)
# ..and save as previous point
prev_data_time_point = data_time_point
# Last slots
if force_close_last:
# 1) Close the last slot and aggregate it. You should never do this unless you know what you are doing
if working_serie:
logger.debug('This resampled point (start={}, end={}) is done, now computing it..'.format(slot_start_t, slot_end_t))
# Compute slot...
dataTimePoint = self._compute_resampled_point(working_serie,
unit = self.time_unit,
start_t = slot_start_t,
end_t = slot_end_t,
validity = validity,
timezone = timezone,
fill_with = fill_with,
force_data_loss = force_data_loss,
fill_gaps = fill_gaps,
series_indexes = series_indexes,
series_resolution = series_resolution,
first_last = True)
# .. and append results
if dataTimePoint:
resampled_data_time_point_series.append(dataTimePoint)
# 2) Handle missing slots until the requested end (end_dt)
# TODO: Implement it. Sure?
logger.info('Resampled %s DataTimePoints into %s DataTimePoints', count, len(resampled_data_time_point_series))
return resampled_data_time_point_series
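# --- Hedged usage sketch (illustrative, not part of the module) ---
# Resampling an irregular DataTimePointSeries onto a uniform 600-second grid;
# the series below is a made-up example.
#
# from timeseria.datastructures import DataTimePoint, DataTimePointSeries
# series = DataTimePointSeries(DataTimePoint(t=60, data={'value': 1.0}),
#                              DataTimePoint(t=400, data={'value': 2.0}),
#                              DataTimePoint(t=1100, data={'value': 3.0}))
# resampled = Resampler('600s').process(series)
# for point in resampled:
#     print(point.dt, point.data, point.data_loss)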
#==========================
# Slotter Transformation
#==========================
class Slotter(Transformation):
"""Slotter transformation."""
def __init__(self, unit, default_operation=avg, extra_operations=None, interpolation_method='linear'):
if isinstance(unit, TimeUnit):
self.time_unit = unit
else:
self.time_unit = TimeUnit(unit)
self.default_operation=default_operation
self.extra_operations=extra_operations
self.interpolation_method=interpolation_method
def _compute_slot(self, data_time_point_series, unit, start_t, end_t, validity, timezone, fill_with, force_data_loss, fill_gaps, series_indexes, series_resolution, first_last):
# Compute data_loss
slot_data_loss = compute_data_loss(data_time_point_series,
from_t = start_t,
to_t = end_t,
series_resolution = series_resolution,
validity = validity,
first_last = first_last)
# Initialize slot data
slot_data = {}
# TODO: unroll the following before the compute slot call
slot_timeseries = DataTimePointSeries()
prev_point = None
next_point = None
for data_time_point in data_time_point_series:
if (data_time_point.t+(validity/2)) < start_t:
prev_point = data_time_point
continue
if (data_time_point.t-(validity/2)) >= end_t:
next_point = data_time_point
continue
slot_timeseries.append(data_time_point)
# If we have to fully reconstruct data
# TODO: we have a huge conceptual problem here if checking on the slot_data_loss:
# if data_loss is 1 but due to point data losses (maybe even reconstructed)
# and not entirely missing points, we should actually compute as if not a full data loss.
if not slot_timeseries: # slot_data_loss == 1:
# Reconstruct (fill gaps)
for key in data_time_point_series[0].data.keys():
# Set data as None, will be interpolated afterwards
slot_data['{}_{}'.format(key, self.default_operation.__name__)] = None
# Handle also extra ops
if self.extra_operations:
for extra_operation in self.extra_operations:
slot_data['{}_{}'.format(key, extra_operation.__name__)] = None
else:
# Keys shortcut
keys = data_time_point_series.data_keys()
# Compute the default operation (in some cases it might not be defined, hence the "if")
if self.default_operation:
default_operation_data = self.default_operation(slot_timeseries, prev_point=prev_point, next_point=next_point)
# Do we have a 100% and a fill_with?
if fill_with is not None and slot_data_loss == 1:
for key in keys:
slot_data['{}_{}'.format(key, self.default_operation.__name__)] = fill_with
else:
#if isinstance(default_operation_data, dict):
# slot_data = {key:default_operation_data[key] for key in keys}
#else:
# slot_data = {keys[0]: default_operation_data}
if isinstance(default_operation_data, dict):
for key in keys:
slot_data['{}_{}'.format(key, self.default_operation.__name__)] = default_operation_data[key]
else:
slot_data['{}_{}'.format(keys[0], self.default_operation.__name__)] = default_operation_data
# Handle extra operations
if self.extra_operations:
for extra_operation in self.extra_operations:
extra_operation_data = extra_operation(slot_timeseries, prev_point=prev_point, next_point=next_point)
if isinstance(extra_operation_data, dict):
for result_key in extra_operation_data:
slot_data['{}_{}'.format(result_key, extra_operation.__name__)] = extra_operation_data[result_key]
else:
slot_data['{}_{}'.format(keys[0], extra_operation.__name__)] = extra_operation_data
# Do we have a forced data_loss? # TODO: do not compute data_loss if fill_with is not present and force_data_loss is set
if force_data_loss is not None:
slot_data_loss = force_data_loss
# Create the DataTimeSlot
data_time_slot = DataTimeSlot(start = TimePoint(t=start_t, tz=timezone),
end = TimePoint(t=end_t, tz=timezone),
unit = unit,
data = slot_data,
data_loss = slot_data_loss)
# Now handle indexes
for index in series_indexes:
if slot_timeseries:
if index == 'data_loss':
continue
index_sum = 0
index_count = 0
for item in slot_timeseries:
# Get index value
try:
index_value = getattr(item, index)
except:
slotted_index_value = None
else:
if index_value is not None:
index_count += 1
index_sum += index_value
# Compute the slotted index value (if there were indexes not None)
if index_count > 0:
slotted_index_value = index_sum/index_count
else:
slotted_index_value = None
else:
slotted_index_value = None
# Set the index. Handle special case for data_reconstructed
if index == 'data_reconstructed':
setattr(data_time_slot, '_data_reconstructed', slotted_index_value)
else:
setattr(data_time_slot, index, slotted_index_value)
# Return the slot
return data_time_slot
def process(self, data_time_point_series, from_t=None, from_dt=None, to_t=None, to_dt=None, validity=None, force_close_last=False,
include_extremes=False, fill_with=None, force_data_loss=None, fill_gaps=True, force=False):
"""Start the slotting process. If start and/or end are not set, they are set automatically based on first and last points of the series."""
if not isinstance(data_time_point_series, DataTimePointSeries):
raise TypeError('Can process only DataTimePointSeries, got "{}"'.format(data_time_point_series.__class__.__name__))
if not data_time_point_series:
raise ValueError('Cannot process empty data_time_point_series')
if include_extremes:
if from_t is not None or to_t is not None:
raise ValueError('Setting "include_extremes" is not compatible with giving a from_t or a to_t')
from_rounding_method = 'floor'
to_rounding_method = 'ceil'
force_close_last = True
else:
from_rounding_method = 'ceil'
to_rounding_method = 'floor'
# Move fromt_dt and to_dt to epoch to simplify the following
if from_dt is not None:
from_t = s_from_dt(from_dt)
if to_dt is not None:
to_t = s_from_dt(to_dt)
# Also force close if we have explicitly set an end
force_close_last = True
# Set "from" if not set, otherwise check for consistency # TODO: move to steaming
if from_t is None:
from_t = data_time_point_series[0].t
from_dt = dt_from_s(from_t, data_time_point_series.tz)
# Is the point already rounded to the time unit or do we have to round it ourselves?
if not from_dt == self.time_unit.round_dt(from_dt):
from_dt = self.time_unit.round_dt(from_dt, how=from_rounding_method)
from_t = s_from_dt(from_dt)
else:
from_dt = dt_from_s(from_t, data_time_point_series.tz)
if from_dt != self.time_unit.round_dt(from_dt):
raise ValueError('Sorry, provided from_t is not consistent with the self.time_unit of "{}" (Got "{}")'.format(self.time_unit, from_t))
# Set "to" if not set, otherwise check for consistency # TODO: move to streaming
if to_t is None:
to_t = data_time_point_series[-1].t
to_dt = dt_from_s(to_t, data_time_point_series.tz)
# Is the point already rounded to the time unit or do we have to round it ourselves?
if not to_dt == self.time_unit.round_dt(to_dt):
to_dt = self.time_unit.round_dt(to_dt, how=to_rounding_method)
to_t = s_from_dt(to_dt)
else:
to_dt = dt_from_s(to_t, data_time_point_series.tz)
if to_dt != self.time_unit.round_dt(to_dt):
raise ValueError('Sorry, provided to_t is not consistent with the self.time_unit of "{}" (Got "{}")'.format(self.time_unit, to_t))
# Also force close if we have explicitly set an end
force_close_last = True
# Automatically detect validity if not set
if validity is None:
validity = data_time_point_series.autodetected_sampling_interval
logger.info('Using auto-detected sampling interval: %ss', validity)
# Check if not upslotting (with some tolerance)
if not force:
# TODO: this check is super-weak. Will fail in loads of edge cases, e.g. months slotted in 30 days.
unit_duration_s = self.time_unit.duration_s(data_time_point_series[0].dt)
if validity > (unit_duration_s * 1.1):
raise ValueError('Upslotting not supported yet (slotter unit: {}; detected time series sampling interval: {})'.format(unit_duration_s, validity))
# Log
logger.debug('Started slotter from "%s" (%s) to "%s" (%s)', from_dt, from_t, to_dt, to_t)
if from_dt >= to_dt:
raise ValueError('Sorry, from is >= to! (from_t={}, to_t={})'.format(from_t, to_t))
# Set some support vars
slot_start_t = None
slot_end_t = None
prev_data_time_point = None
working_serie = DataTimePointSeries()
process_ended = False
data_time_slot_series = DataTimeSlotSeries()
slots_to_be_interpolated = []
last_no_full_data_loss_slot = None
# Set timezone
timezone = data_time_point_series.tz
logger.debug('Using timezone "%s"', timezone)
# Counters
count = 0
first = True
# Indexes
series_indexes = data_time_point_series.indexes
series_resolution = data_time_point_series.resolution
# Now go through all the data in the time series
for data_time_point in data_time_point_series:
logger.debug('Processing %s', data_time_point)
# Increase counter
count += 1
# Set start_dt if not already done TODO: implement it correctly
#if not from_dt:
# from_dt = self.time_unit.timeInterval.round_dt(data_time_point.dt) if rounded else data_time_point.dt
# Pretend there was a slot before if we are at the beginning. TODO: improve me.
if slot_end_t is None:
slot_end_t = from_t
# First, check if we have some points to discard at the beginning
if data_time_point.t < from_t:
# If we are here it means we are handling data belonging to a previous slot
# (probably just spare data loaded to have access to the prev_datapoint)
prev_data_time_point = data_time_point
continue
# Similar concept for the end
# TODO: what if we are in streaming mode? add if to_t is not None?
if data_time_point.t >= to_t:
if process_ended:
break
# The following procedure works in general for slots at the beginning and in the middle.
# The approach is to detect if the current slot is "outdated" and spin a new one if so.
if data_time_point.t > slot_end_t:
# If the current slot is outdated:
# 1) Add this last point to the data_time_point_series:
working_serie.append(data_time_point)
# 2) Keep spinning new slots until the current data point falls in one of them.
# NOTE: Read the following "while" more as an "if" which can also lead to spin multiple
# slot if there are empty slots between the one being closed and the data_time_point.dt.
# TODO: leave or remove the above if for code readability?
while slot_end_t < data_time_point.t:
logger.debug('Checking for end {} with point {}'.format(slot_end_t, data_time_point.t))
# If we are in the pre-first slot, just silently spin a new slot:
if slot_start_t is not None:
# Append last point. Can be appended to multiple slots, this is normal since
# the empty slots in the middle will have only a far prev and a far next.
# can also be appended several times if working_serie is not reset (looping in the while)
if data_time_point not in working_serie:
working_serie.append(data_time_point)
logger.debug('This slot (start={}, end={}) is closed, now aggregating it..'.format(slot_start_t, slot_end_t))
logger.debug('working_serie first point dt: %s', working_serie[0].dt)
logger.debug('working_serie last point dt: %s', working_serie[-1].dt)
# Compute slot...
data_time_slot = self._compute_slot(working_serie,
unit = self.time_unit,
start_t = slot_start_t,
end_t = slot_end_t,
validity = validity,
timezone = timezone,
fill_with = fill_with,
force_data_loss = force_data_loss,
fill_gaps = fill_gaps,
series_indexes = series_indexes,
series_resolution = series_resolution,
first_last = first)
# Set first to false
if first:
first = False
# .. and append results (unless we are before the first timeseries start point)
if slot_end_t > data_time_point_series[0].t:
if data_time_slot.data_loss == 1.0:
# if data loss is full, append the slot to the slots to be interpolated
slots_to_be_interpolated.append(data_time_slot)
else:
# If we have slots to be interpolated
if slots_to_be_interpolated:
for i, slot_to_be_interpolated in enumerate(slots_to_be_interpolated):
# Prepare for interpolated data
interpolated_data = {}
# Compute the interpolated data
if self.interpolation_method == 'linear':
for data_key in data_time_slot_series.data_keys():
interpolated_data[data_key] = ((((data_time_slot.data[data_key] - last_no_full_data_loss_slot.data[data_key]) /
(len(slots_to_be_interpolated) + 1) ) * (i+1)) + last_no_full_data_loss_slot.data[data_key])
elif self.interpolation_method == 'uniform':
for data_key in data_time_slot_series.data_keys():
interpolated_data[data_key] = (((data_time_slot.data[data_key] - last_no_full_data_loss_slot.data[data_key]) / 2)
+ last_no_full_data_loss_slot.data[data_key])
else:
raise Exception('Unknown interpolation method "{}"'.format(self.interpolation_method))
# Add interpolated data
slot_to_be_interpolated._data = interpolated_data
data_time_slot_series.append(slot_to_be_interpolated)
# Reset the "buffer"
slots_to_be_interpolated = []
# Append this slot to the time series
data_time_slot_series.append(data_time_slot)
# ... and set this slot as the last with no full data loss
last_no_full_data_loss_slot = data_time_slot
# Create a new slot. This is where all the "calendar" time unit logic kicks-in, and where the time zone is required.
slot_start_t = slot_end_t
slot_start_dt = dt_from_s(slot_start_t, tz=timezone)
slot_end_t = s_from_dt(dt_from_s(slot_start_t, tz=timezone) + self.time_unit)
slot_end_dt = dt_from_s(slot_end_t, tz=timezone)
# Create a new working_serie as part of the "create a new slot" procedure
working_serie = DataTimePointSeries()
# Append the previous prev_data_time_point to the new DataTimePointSeries
if prev_data_time_point:
working_serie.append(prev_data_time_point)
logger.debug('Spun a new slot (start={}, end={})'.format(slot_start_dt, slot_end_dt))
# If last slot mark process as completed (and aggregate last slot if necessary)
if data_time_point.dt >= to_dt:
# Edge case where we would otherwise miss the last slot
if data_time_point.dt == to_dt:
# Compute slot...
data_time_slot = self._compute_slot(working_serie,
unit = self.time_unit,
start_t = slot_start_t,
end_t = slot_end_t,
validity = validity,
timezone = timezone,
fill_with = fill_with,
force_data_loss = force_data_loss,
fill_gaps = fill_gaps,
series_indexes = series_indexes,
series_resolution = series_resolution,
first_last = True)
# .. and append results
data_time_slot_series.append(data_time_slot)
process_ended = True
# Append this point to the working serie
working_serie.append(data_time_point)
# ..and save as previous point
prev_data_time_point = data_time_point
# Last slots
if force_close_last:
# 1) Close the last slot and aggregate it. You should never do this unless you know what you are doing
if working_serie:
logger.debug('This slot (start={}, end={}) is closed, now aggregating it..'.format(slot_start_t, slot_end_t))
# Compute slot...
data_time_slot = self._compute_slot(working_serie,
unit = self.time_unit,
start_t = slot_start_t,
end_t = slot_end_t,
validity = validity,
timezone = timezone,
fill_with = fill_with,
force_data_loss = force_data_loss,
fill_gaps = fill_gaps,
series_indexes = series_indexes,
series_resolution = series_resolution,
first_last = True)
# .. and append results
data_time_slot_series.append(data_time_slot)
# 2) Handle missing slots until the requested end (end_dt)
# TODO: Implement it. Sure? Clashes with the idea of reconstructors.
logger.info('Slotted %s DataTimePoints into %s DataTimeSlots', count, len(data_time_slot_series))
return data_time_slot_series
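# --- Hedged usage sketch (illustrative, not part of the module) ---
# Slotting a point series into one-hour calendar slots, assuming the
# operations module also exposes min/max operations alongside avg.
#
# from timeseria.operations import avg, min as min_op, max as max_op
# slotter = Slotter('1h', default_operation=avg,
#                   extra_operations=[min_op, max_op])
# slot_series = slotter.process(data_time_point_series)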
| 50.504515 | 191 | 0.511364 | 4,874 | 44,747 | 4.406032 | 0.080632 | 0.055134 | 0.075064 | 0.051315 | 0.819604 | 0.784447 | 0.762701 | 0.731735 | 0.725355 | 0.714505 | 0 | 0.00285 | 0.427671 | 44,747 | 885 | 192 | 50.561582 | 0.835689 | 0.203969 | 0 | 0.751491 | 0 | 0 | 0.06374 | 0.001243 | 0 | 0 | 0 | 0.00226 | 0 | 1 | 0.015905 | false | 0 | 0.011928 | 0.001988 | 0.043738 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
23c6e962bec5a274ba6f74f5aefb08a30d327536 | 145 | py | Python | django_bouncy/tests/__init__.py | jobtender24/django-bouncy | 203250bf2d8219b69c36562512e3152699b0fdc2 | [
"MIT"
] | 31 | 2015-05-22T11:08:15.000Z | 2019-06-21T12:08:17.000Z | django_bouncy/tests/__init__.py | jobtender24/django-bouncy | 203250bf2d8219b69c36562512e3152699b0fdc2 | [
"MIT"
] | 26 | 2015-02-01T02:37:58.000Z | 2019-07-02T00:41:34.000Z | django_bouncy/tests/__init__.py | careergroup24/django-bouncy | 203250bf2d8219b69c36562512e3152699b0fdc2 | [
"MIT"
] | 30 | 2015-02-01T02:25:01.000Z | 2019-07-02T08:18:21.000Z | """Tests for django-bouncy"""
# pylint: disable=wildcard-import
from django_bouncy.tests.views import *
from django_bouncy.tests.utils import *
| 24.166667 | 39 | 0.77931 | 20 | 145 | 5.55 | 0.55 | 0.324324 | 0.288288 | 0.396396 | 0.486486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 145 | 5 | 40 | 29 | 0.853846 | 0.386207 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
23db6165a8d2ccdeef4288efbd58a77fe72d3bd2 | 157 | py | Python | shamanai/agents/__init__.py | adaptationio/Shaman-RL | 548fa847e6ba2105cc0a876b02db3f3d7c179c54 | [
"MIT"
] | 2 | 2020-06-13T04:38:08.000Z | 2022-03-22T08:38:10.000Z | shamanai/agents/__init__.py | adaptationio/Shaman-RL | 548fa847e6ba2105cc0a876b02db3f3d7c179c54 | [
"MIT"
] | 1 | 2020-11-13T17:46:38.000Z | 2020-11-13T17:46:38.000Z | shamanai/agents/__init__.py | adaptationio/Shaman-AI | 548fa847e6ba2105cc0a876b02db3f3d7c179c54 | [
"MIT"
] | null | null | null | #from .ppo_sb2 import *
#from .test_agent import *
from .config import *
from .human_agent_RT import *
from .human_agent_TB import *
#from .optimize import * | 26.166667 | 29 | 0.757962 | 24 | 157 | 4.708333 | 0.458333 | 0.442478 | 0.265487 | 0.353982 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007463 | 0.146497 | 157 | 6 | 30 | 26.166667 | 0.835821 | 0.44586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
7b0a35574b30eb0689b2a32100f33b7b7f8650f4 | 1,194 | py | Python | ci-scripts/dlrnapi_promoter/test_stage_qcows_unit.py | fuzzball81/ci-config | b3a8ff6be780bfae0ae9e3e0511dfa61010695eb | [
"Apache-2.0"
] | 8 | 2016-10-06T13:24:04.000Z | 2021-11-04T20:51:23.000Z | ci-scripts/dlrnapi_promoter/test_stage_qcows_unit.py | fuzzball81/ci-config | b3a8ff6be780bfae0ae9e3e0511dfa61010695eb | [
"Apache-2.0"
] | 8 | 2020-02-26T20:11:29.000Z | 2021-09-23T23:23:47.000Z | ci-scripts/dlrnapi_promoter/test_stage_qcows_unit.py | fuzzball81/ci-config | b3a8ff6be780bfae0ae9e3e0511dfa61010695eb | [
"Apache-2.0"
] | 9 | 2016-04-08T14:38:06.000Z | 2021-11-01T18:43:30.000Z | import unittest
import pytest
class TestQcowStagingServer(unittest.TestCase):
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_create_hierarchy_dir_exist(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_create_hierarchy_success(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_teardown_success(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_teardown_no_dir(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_stage_info(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_promote_overcloud_images_success(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_promote_overcloud_images_link_exist(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_setup_dir_exist(self):
assert False
@pytest.mark.xfail(reason="Not Implemented", run=False)
def test_setup_success(self):
assert False
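# Illustrative sketch (not from the source) of how one of these xfail stubs
# might be filled in once implemented; the QcowStagingServer API used here is
# hypothetical.
#
# def test_setup_success(self):
#     server = QcowStagingServer(root="/tmp/staging")
#     server.setup()
#     self.assertTrue(os.path.isdir("/tmp/staging"))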
| 27.767442 | 59 | 0.710218 | 153 | 1,194 | 5.359477 | 0.202614 | 0.109756 | 0.164634 | 0.230488 | 0.868293 | 0.868293 | 0.868293 | 0.868293 | 0.868293 | 0.868293 | 0 | 0 | 0.188442 | 1,194 | 42 | 60 | 28.428571 | 0.846233 | 0 | 0 | 0.6 | 0 | 0 | 0.113065 | 0 | 0 | 0 | 0 | 0 | 0.3 | 1 | 0.3 | false | 0 | 0.066667 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
9ea63813fc7239d471c6fb045f2469ac03960f50 | 1,604 | py | Python | MolecularRepresentation/aobw.py | MooersLab/jupyterlabpymolpysnipsplus | b886750d63372434df53d4d6d7cdad6cb02ae4e7 | [
"MIT"
] | null | null | null | MolecularRepresentation/aobw.py | MooersLab/jupyterlabpymolpysnipsplus | b886750d63372434df53d4d6d7cdad6cb02ae4e7 | [
"MIT"
] | null | null | null | MolecularRepresentation/aobw.py | MooersLab/jupyterlabpymolpysnipsplus | b886750d63372434df53d4d6d7cdad6cb02ae4e7 | [
"MIT"
] | null | null | null | # Description: Ambient occlussion in grayscale.
# Source: placeHolder
"""
cmd.do('# Note: requires the gscale() function from pymolshortcuts.py.')
cmd.do('# Download this script from http://GitHub.com/MooersLab/pymolshortcuts.')
cmd.do('# Load the functions from this script with the command "run pymolshortcuts.py"')
cmd.do('set_color oxygen, [1.0,0.4,0.4];')
cmd.do('set_color nitrogen, [0.5,0.5,1.0];')
cmd.do('remove solvent;')
cmd.do('as spheres;')
cmd.do('util.cbaw;')
cmd.do('bg white;')
cmd.do('gscale();')
cmd.do('set light_count,10;')
cmd.do('set spec_count,1;')
cmd.do('set shininess, 10;')
cmd.do('set specular,0.25;')
cmd.do('set ambient,0;')
cmd.do('set direct,0;')
cmd.do('set reflect,1.5;')
cmd.do('set ray_shadow_decay_factor, 0.1;')
cmd.do('set ray_shadow_decay_range, 2;')
cmd.do('set depth_cue, 0;')
cmd.do('ray;')
"""
cmd.do('# Note: requires the gscale() function from pymolshortcuts.py.')
cmd.do('# Download this script from http://GitHub.com/MooersLab/pymolshortcuts.')
cmd.do('# Load the functions from this script with the command "run pymolshortcuts.py"')
cmd.do('set_color oxygen, [1.0,0.4,0.4];')
cmd.do('set_color nitrogen, [0.5,0.5,1.0];')
cmd.do('remove solvent;')
cmd.do('as spheres;')
cmd.do('util.cbaw;')
cmd.do('bg white;')
cmd.do('gscale();')
cmd.do('set light_count,10;')
cmd.do('set spec_count,1;')
cmd.do('set shininess, 10;')
cmd.do('set specular,0.25;')
cmd.do('set ambient,0;')
cmd.do('set direct,0;')
cmd.do('set reflect,1.5;')
cmd.do('set ray_shadow_decay_factor, 0.1;')
cmd.do('set ray_shadow_decay_range, 2;')
cmd.do('set depth_cue, 0;')
cmd.do('ray;')
| 32.734694 | 88 | 0.687032 | 289 | 1,604 | 3.737024 | 0.214533 | 0.194444 | 0.177778 | 0.077778 | 0.948148 | 0.948148 | 0.948148 | 0.948148 | 0.948148 | 0.948148 | 0 | 0.038382 | 0.090399 | 1,604 | 48 | 89 | 33.416667 | 0.701851 | 0.516833 | 0 | 0 | 0 | 0 | 0.691906 | 0.061358 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7b5989b49be31c8a9819518738d3825e68bce973 | 97 | py | Python | src/gtfspy/__init__.py | WYishai/gtfs.py | 129cbe5cfd77d92d435db9502b4bb70dd067365c | [
"Apache-2.0"
] | 8 | 2018-07-05T08:16:07.000Z | 2020-09-05T20:13:41.000Z | src/gtfspy/__init__.py | WYishai/gtfs.py | 129cbe5cfd77d92d435db9502b4bb70dd067365c | [
"Apache-2.0"
] | 1 | 2019-04-14T07:01:32.000Z | 2019-04-14T07:01:32.000Z | src/gtfspy/__init__.py | WYishai/gtfs.py | 129cbe5cfd77d92d435db9502b4bb70dd067365c | [
"Apache-2.0"
] | 2 | 2019-04-12T14:44:41.000Z | 2021-06-13T18:46:37.000Z | import data_objects
from transit_data_object import TransitData
from transit_data_utils import *
| 24.25 | 43 | 0.886598 | 14 | 97 | 5.785714 | 0.571429 | 0.271605 | 0.37037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103093 | 97 | 3 | 44 | 32.333333 | 0.931034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
7baddcff10b699011500a17ef88fca067b82a51b | 4,846 | py | Python | src/outpost/django/campusonline/migrations/0046_student_username.py | medunigraz/outpost.django.campusonline | 06776bce7556e438c1e00a96aaa9271a7aac8fe4 | [
"BSD-2-Clause"
] | null | null | null | src/outpost/django/campusonline/migrations/0046_student_username.py | medunigraz/outpost.django.campusonline | 06776bce7556e438c1e00a96aaa9271a7aac8fe4 | [
"BSD-2-Clause"
] | null | null | null | src/outpost/django/campusonline/migrations/0046_student_username.py | medunigraz/outpost.django.campusonline | 06776bce7556e438c1e00a96aaa9271a7aac8fe4 | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.20 on 2019-03-27 13:29
from __future__ import unicode_literals
from django.db import migrations
from django.conf import settings
class Migration(migrations.Migration):
ops = [
(
"""
DROP INDEX IF EXISTS campusonline_student_cardid_idx;
""",
"""
CREATE INDEX campusonline_student_cardid_idx ON "public"."campusonline_student" ("cardid");
""",
),
(
"""
DROP INDEX IF EXISTS campusonline_student_matriculation_idx;
""",
"""
CREATE INDEX campusonline_student_matriculation_idx ON "public"."campusonline_student" ("matriculation");
""",
),
(
"""
DROP INDEX IF EXISTS campusonline_student_id_idx;
""",
"""
CREATE INDEX campusonline_student_id_idx ON "public"."campusonline_student" ("id");
""",
),
(
"""
DROP MATERIALIZED VIEW IF EXISTS "public"."campusonline_student";
""",
"""
CREATE MATERIALIZED VIEW "public"."campusonline_student" AS SELECT
stud_nr::integer AS id,
stud_mnr AS matriculation,
stud_famnam AS last_name,
stud_vorname AS first_name,
stud_akadgrad AS title,
stud_mifare AS cardid
FROM "campusonline"."stud"
WITH DATA;
""",
),
(
"""
DROP FOREIGN TABLE IF EXISTS "campusonline"."stud";
""",
"""
CREATE FOREIGN TABLE "campusonline"."stud" (
STUD_NR numeric,
STUD_MNR varchar,
STUD_FAMNAM varchar,
STUD_VORNAME varchar,
STUD_AKADGRAD varchar,
STUD_SEX varchar,
STUD_MIFARE varchar
)
SERVER sqlalchemy OPTIONS (
tablename 'STUD_V',
db_url '{}'
);
""".format(
settings.MULTICORN.get("campusonline")
),
),
(
"""
CREATE FOREIGN TABLE "campusonline"."stud" (
STUD_NR numeric,
STUD_MNR varchar,
STUD_FAMNAM varchar,
STUD_VORNAME varchar,
STUD_AKADGRAD varchar,
STUD_SEX varchar,
STUD_MIFARE varchar,
STUD_BENUTZERNAME varchar
)
SERVER sqlalchemy OPTIONS (
tablename 'STUD_V',
db_url '{}'
);
""".format(
settings.MULTICORN.get("campusonline")
),
"""
DROP FOREIGN TABLE IF EXISTS "campusonline"."stud";
""",
),
(
"""
CREATE MATERIALIZED VIEW "public"."campusonline_student" AS SELECT
stud_nr::integer AS id,
stud_mnr AS matriculation,
stud_famnam AS last_name,
stud_vorname AS first_name,
stud_akadgrad AS title,
stud_mifare AS cardid,
stud_benutzername as username
FROM "campusonline"."stud"
WITH DATA;
""",
"""
DROP MATERIALIZED VIEW IF EXISTS "public"."campusonline_student";
""",
),
(
"""
CREATE INDEX campusonline_student_id_idx ON "public"."campusonline_student" ("id");
""",
"""
DROP INDEX IF EXISTS campusonline_student_id_idx;
""",
),
(
"""
CREATE INDEX campusonline_student_matriculation_idx ON "public"."campusonline_student" ("matriculation");
""",
"""
DROP INDEX IF EXISTS campusonline_student_matriculation_idx;
""",
),
(
"""
CREATE INDEX campusonline_student_cardid_idx ON "public"."campusonline_student" ("cardid");
""",
"""
DROP INDEX IF EXISTS campusonline_student_cardid_idx;
""",
),
(
"""
CREATE INDEX campusonline_student_username_idx ON "public"."campusonline_student" ("username");
""",
"""
DROP INDEX IF EXISTS campusonline_student_username_idx;
""",
),
]
dependencies = [("campusonline", "0045_course_group_term_filterd")]
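# Each entry in "ops" is a (forward, reverse) SQL pair: RunSQL receives the
# forward statements in order and the reverse statements in reversed order,
# so applying and unapplying the migration stay symmetric.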
operations = [
migrations.RunSQL(
[forward for forward, reverse in ops],
[reverse for forward, reverse in reversed(ops)],
)
]
| 31.064103 | 117 | 0.483079 | 386 | 4,846 | 5.810881 | 0.222798 | 0.21177 | 0.122604 | 0.053054 | 0.830584 | 0.817209 | 0.776193 | 0.776193 | 0.740526 | 0.695051 | 0 | 0.007911 | 0.426125 | 4,846 | 155 | 118 | 31.264516 | 0.798634 | 0.014239 | 0 | 0.278689 | 1 | 0 | 0.059459 | 0.027027 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04918 | 0 | 0.114754 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c8b0e5b71d22867ae35c0b9ba26fdd3f034fdea3 | 2,945 | py | Python | Bot/__init__.py | FckVanessa/NekoBot | 2eed89c69a78ff3447c99a934c83347904e39fa5 | [
"Apache-2.0"
] | 2 | 2021-09-17T12:34:00.000Z | 2022-02-24T07:41:26.000Z | Bot/__init__.py | FckVanessa/NekoBot | 2eed89c69a78ff3447c99a934c83347904e39fa5 | [
"Apache-2.0"
] | null | null | null | Bot/__init__.py | FckVanessa/NekoBot | 2eed89c69a78ff3447c99a934c83347904e39fa5 | [
"Apache-2.0"
] | 2 | 2021-05-14T04:05:30.000Z | 2021-09-11T11:38:33.000Z | from Bot.config import Development as Config
import logging
import sys
import telegram.ext as tg
import pymongo
import time
import os
BLACK='\033[0;30m'
BLUE='\033[0;34m'
GREEN='\033[0;32m'
CYAN='\033[0;36m'
RED='\033[0;31m'
PURPLE='\033[0;35m'
BROWN='\033[0;33m'
LGRAY='\033[0;37m'
DGRAY='\033[1;30m'
LBLUE='\033[1;34m'
LGREEN='\033[1;32m'
LCYAN='\033[1;36m'
LRED='\033[1;31m'
LPURPLE='\033[1;35m'
YELLOW='\033[1;33m'
WHITE='\033[1;37m'
NC='\033[0m'
os.system("clear")
StartTime = time.time()
# Enable logging / history
logging.basicConfig(
format="\033[1;31%(asctime)s - %(name)s - %(levelname)s - %(message)s", level=logging.INFO
)
LOGGER = logging.getLogger(__name__)
LOGGER.info("\033[0;34mNeko \033[0;32mis \033[0;36mnow \033[0;31monline\n\033[1;36m[\033[0;33m+\033[1;36m]======================================================\033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;33m _ _ \033[1;35m \033[1;31m _\033[1;36m \033[1;34m____ \033[1;37m _ \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;33m | \ | |\033[1;35m ___\033[1;31m| | __\033[1;36m___\033[1;34m | __ ) \033[1;32m ___\033[1;37m | |_ \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;33m | \| |\033[1;35m/ _ \ \033[1;31m|/ / \033[1;36m_ \ \033[1;34m| _ \ \033[1;32m/ _ \\\033[1;37m| __| \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;33m | |\ | \033[1;35m __/ \033[1;31m <\033[1;36m (_) |\033[1;34m| |_) |\033[1;32m (_) \033[1;37m| |_ \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;33m |_| \_|\033[1;35m\___|\033[1;31m_|\_\\\033[1;36m___\033[0;37m(_)\033[1;34m____/ \033[1;32m\___/\033[1;37m \__| \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]======================================================\033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;37m Created \033[1;32m :\033[1;37m NekoId \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;37m Github \033[1;32m :\033[1;37m https://github.com/NekoId \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]\033[1;37m Telegram\033[1;32m :\033[1;37m https://t.me/Nekoid \033[1;36m[\033[0;33m+\033[1;36m]\n\033[1;36m[\033[0;33m+\033[1;36m]======================================================\033[1;36m[\033[0;33m+\033[1;36m]")
# Python version must be at least 3.6, or the bot will stop.
if sys.version_info < (3, 6):
LOGGER.error(
"Anda harus memiliki versi python setidaknya 3.6! Beberapa fitur bergantung pada ini. Bot berhenti."
)
quit(1)
TOKEN = Config.TOKEN
MONGO_CLIENT = pymongo.MongoClient(Config.MONGO_URI)
updater = tg.Updater(TOKEN, use_context=True)
dispatcher = updater.dispatcher
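# --- Hedged sketch (not in the original file) ---
# Handlers would typically be registered on this dispatcher elsewhere in the
# package; the command below is a made-up example.
#
# def start(update, context):
#     update.message.reply_text("Neko is online!")
#
# dispatcher.add_handler(tg.CommandHandler("start", start))
# updater.start_polling()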
| 58.9 | 1,857 | 0.582683 | 545 | 2,945 | 3.033028 | 0.188991 | 0.232305 | 0.211736 | 0.229885 | 0.487598 | 0.486993 | 0.478524 | 0.455535 | 0.455535 | 0.455535 | 0 | 0.311597 | 0.13039 | 2,945 | 49 | 1,858 | 60.102041 | 0.333854 | 0.025127 | 0 | 0 | 0 | 0.05 | 0.757671 | 0.394351 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.175 | 0 | 0.175 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c8c0d9ae6dc0df92fb8d177644f25fae4c799f49 | 7,008 | py | Python | geokey/contributions/tests/observations/test_renderers.py | universityofsussex/geokey | 25e161dbc81841c57c148053dbe99facc81e84b8 | [
"Apache-2.0"
] | null | null | null | geokey/contributions/tests/observations/test_renderers.py | universityofsussex/geokey | 25e161dbc81841c57c148053dbe99facc81e84b8 | [
"Apache-2.0"
] | null | null | null | geokey/contributions/tests/observations/test_renderers.py | universityofsussex/geokey | 25e161dbc81841c57c148053dbe99facc81e84b8 | [
"Apache-2.0"
] | null | null | null | """Tests for renderers of contributions (observations)."""
import json

from django.test import TestCase
from django.template.loader import render_to_string

from geokey.contributions.renderers.geojson import GeoJsonRenderer
from geokey.contributions.renderers.kml import KmlRenderer


class KmlRendererTest(TestCase):
    def setUp(self):
        self.contrib = {
            'id': 1,
            'location': {
                'id': 1,
                'name': 'Location',
                'description': None,
                'geometry': '{"type": "Point","coordinates": [ '
                            '-0.144415497779846, 51.54671869005856]}'
            },
            "properties": {
                "child_friendly": False,
                "name": "The Grafton",
                "address": "20 Prince of Wales Rd, London NW5 3LG"
            },
            "display_field": {
                "key": "name",
                "value": "The Grafton"
            },
            "meta": {
                "creator": {
                    "id": 2,
                    "display_name": "Oliver"
                },
                "isowner": True,
                "updator": None,
                "status": "active",
                "created_at": "2014-09-19T15:51:32.804Z",
                "updated_at": "2014-09-21T15:51:32.804Z",
                "version": 1,
                "category": {
                    "id": 40,
                    "name": "Pubs",
                    "description": "",
                    "status": "active",
                    "fields": [
                        {
                            "id": 117,
                            "name": "Name",
                            "key": "name",
                            "fieldtype": "TextField",
                            "description": "",
                            "status": "active",
                            "required": True
                        },
                        {
                            "id": 118,
                            "name": "Address",
                            "key": "address",
                            "fieldtype": "TextField",
                            "description": "",
                            "status": "active",
                            "required": False
                        },
                        {
                            "id": 119,
                            "name": "Child friedly",
                            "key": "child_friedly",
                            "fieldtype": "TrueFalseField",
                            "description": "Would your take your kids?",
                            "status": "active",
                            "required": False
                        }
                    ],
                    "colour": "#0033ff",
                    "created_at": "2014-09-17T00:00:00Z"
                }
            },
            "comments": [],
            "review_comments": [],
            "media": []
        }

    def test_render(self):
        renderer = KmlRenderer()
        result = renderer.render([self.contrib])
        rendered = render_to_string(
            'geometries/placemarks.kml',
            {'data': [self.contrib]}
        )
        self.assertEqual(result, rendered)


class GeoJsonRendererTest(TestCase):
    def setUp(self):
        self.contrib = {
            'id': 1,
            'location': {
                'id': 1,
                'name': 'Location',
                'description': None,
                'geometry': '{"type": "Point","coordinates": [ '
                            '-0.144415497779846, 51.54671869005856]}'
            },
            "properties": {
                "child_friendly": False,
                "name": "The Grafton",
                "address": "20 Prince of Wales Rd, London NW5 3LG"
            },
            "meta": {
                "creator": {
                    "id": 2,
                    "display_name": "Oliver"
                },
                "isowner": True,
                "updator": None,
                "status": "active",
                "created_at": "2014-09-19T15:51:32.804Z",
                "updated_at": "2014-09-21T15:51:32.804Z",
                "version": 1,
                "category": {
                    "id": 40,
                    "name": "Pubs",
                    "description": "",
                    "status": "active",
                    "fields": [
                        {
                            "id": 117,
                            "name": "Name",
                            "key": "name",
                            "fieldtype": "TextField",
                            "description": "",
                            "status": "active",
                            "required": True
                        },
                        {
                            "id": 118,
                            "name": "Address",
                            "key": "address",
                            "fieldtype": "TextField",
                            "description": "",
                            "status": "active",
                            "required": False
                        },
                        {
                            "id": 119,
                            "name": "Child friedly",
                            "key": "child_friedly",
                            "fieldtype": "TrueFalseField",
                            "description": "Would your take your kids?",
                            "status": "active",
                            "required": False
                        }
                    ],
                    "colour": "#0033ff",
                    "created_at": "2014-09-17T00:00:00Z"
                }
            },
            "comments": [],
            "review_comments": [],
            "media": []
        }

    def test_render_single(self):
        renderer = GeoJsonRenderer()
        result = renderer.render_single(self.contrib)
        self.assertTrue('geometry' in result)
        self.assertFalse('geometry' in result.get('location'))

    def test_render_many(self):
        renderer = GeoJsonRenderer()
        result = renderer.render_many([self.contrib])
        self.assertEqual(result.get('type'), 'FeatureCollection')
        self.assertEqual(len(result.get('features')), 1)

    def test_render_with_none(self):
        renderer = GeoJsonRenderer()
        result = renderer.render(None)
        self.assertEqual(result, '')

    def test_render_with_single(self):
        renderer = GeoJsonRenderer()
        result = json.loads(renderer.render(self.contrib))
        self.assertTrue('geometry' in result)
        self.assertFalse('geometry' in result.get('location'))

    def test_render_with_many(self):
        renderer = GeoJsonRenderer()
        result = json.loads(renderer.render([self.contrib]))
        self.assertEqual(result.get('type'), 'FeatureCollection')
        self.assertEqual(len(result.get('features')), 1)
| 35.04 | 72 | 0.383562 | 468 | 7,008 | 5.668803 | 0.260684 | 0.045232 | 0.018093 | 0.062194 | 0.810026 | 0.790426 | 0.737279 | 0.737279 | 0.737279 | 0.737279 | 0 | 0.058526 | 0.492865 | 7,008 | 199 | 73 | 35.21608 | 0.687957 | 0.00742 | 0 | 0.703911 | 0 | 0 | 0.226619 | 0.023741 | 0 | 0 | 0 | 0 | 0.055866 | 1 | 0.044693 | false | 0 | 0.027933 | 0 | 0.083799 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
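The assertions above imply the shape of the renderer's output without showing it; as a rough illustration (an assumption about that shape, not the renderer's actual code), a contribution dict like the fixture maps onto a GeoJSON feature collection along these lines:

import json

# 'contrib' stands in for the fixture dict built in setUp() above.
contrib = {
    "location": {"geometry": '{"type": "Point", "coordinates": [-0.14, 51.55]}'},
    "properties": {"name": "The Grafton"},
}
feature = {
    "type": "Feature",
    "geometry": json.loads(contrib["location"]["geometry"]),  # lifted out of 'location'
    "properties": contrib["properties"],
}
collection = {"type": "FeatureCollection", "features": [feature]}
assert collection["type"] == "FeatureCollection" and len(collection["features"]) == 1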
c8c3007c48282ae8265971a962ad0bd25b5a2fd4 | 6,386 | py | Python | mimo/distributions/gamma.py | hanyas/mimo | 6f9b327a1a202a88b33a419520474ef4f10749e8 | [
"MIT"
] | 16 | 2019-11-27T14:07:47.000Z | 2021-03-12T19:50:01.000Z | mimo/distributions/gamma.py | pnickl/mimo | 81c4bbd2594e2136445009eae752ab8a1602a1cf | [
"MIT"
] | null | null | null | mimo/distributions/gamma.py | pnickl/mimo | 81c4bbd2594e2136445009eae752ab8a1602a1cf | [
"MIT"
] | 6 | 2019-10-28T23:24:28.000Z | 2022-03-01T12:38:53.000Z | import numpy as np
import numpy.random as npr

from scipy.special import gammaln, digamma

from mimo.abstraction import Distribution
from mimo.abstraction import Statistics as Stats


class Gamma(Distribution):
    # In comparison to a Wishart distribution
    # alpha = nu / 2.
    # beta = 1. / (2. * psi)

    def __init__(self, alphas, betas):
        self.alphas = alphas  # shape
        self.betas = betas    # rate

    @property
    def params(self):
        return self.alphas, self.betas

    @params.setter
    def params(self, values):
        self.alphas, self.betas = values

    @property
    def dim(self):
        return len(self.alphas)

    def rvs(self, size=1):
        # numpy uses a different parameterization
        return npr.gamma(self.alphas, 1. / self.betas)

    def mean(self):
        return self.alphas / self.betas

    def mode(self):
        assert np.all(self.alphas >= 1.)
        return (self.alphas - 1.) / self.betas

    def log_likelihood(self, x):
        # not vectorized
        log_lik = np.sum((self.alphas - 1.) * np.log(x) - self.betas * x)
        return - self.log_partition() + self.log_base() + log_lik

    def statistics(self, data):
        if isinstance(data, np.ndarray):
            idx = ~np.isnan(data).any(axis=1)
            data = data[idx]

            logx = np.log(data)
            x = data
            n = np.ones((data.shape[0], ))
            return Stats([logx, n, x, n])
        else:
            return list(map(self.statistics, data))

    def weighted_statistics(self, data, weights):
        if isinstance(data, np.ndarray):
            idx = ~np.isnan(data).any(axis=1)
            data = data[idx]
            weights = weights[idx]

            logx = np.einsum('n,nk->nk', weights, np.log(data))
            x = np.einsum('n,nk->nk', weights, data)
            n = weights
            return Stats([logx, n, x, n])
        else:
            return list(map(self.weighted_statistics, data, weights))

    @property
    def base(self):
        return 1.

    def log_base(self):
        return np.log(self.base)

    @property
    def nat_param(self):
        return self.std_to_nat(self.params)

    @nat_param.setter
    def nat_param(self, natparam):
        self.params = self.nat_to_std(natparam)

    @staticmethod
    def std_to_nat(params):
        alphas = params[0] - 1
        betas = - params[1]
        return Stats([alphas, betas])

    @staticmethod
    def nat_to_std(natparam):
        alphas = natparam[0] + 1
        betas = - natparam[1]
        return alphas, betas

    def log_partition(self):
        return np.sum(gammaln(self.alphas) - self.alphas * np.log(self.betas))

    def expected_statistics(self):
        E_log_x = digamma(self.alphas) - np.log(self.betas)
        E_x = self.alphas / self.betas
        return E_log_x, E_x

    def entropy(self):
        nat_param, stats = self.nat_param, self.expected_statistics()
        return self.log_partition() - self.log_base()\
            - (nat_param[0].dot(stats[0]) + np.dot(nat_param[1], stats[1]))

    def cross_entropy(self, dist):
        nat_param, stats = dist.nat_param, self.expected_statistics()
        return dist.log_partition() - dist.log_base() \
            - (nat_param[0].dot(stats[0]) + np.dot(nat_param[1], stats[1]))


class InverseGamma(Distribution):

    def __init__(self, alphas, betas):
        self.alphas = alphas  # shape
        self.betas = betas    # rate

    @property
    def params(self):
        return self.alphas, self.betas

    @params.setter
    def params(self, values):
        self.alphas, self.betas = values

    @property
    def dim(self):
        return len(self.alphas)

    def rvs(self, size=1):
        # numpy uses a different parameterization
        return 1. / npr.gamma(self.alphas, 1. / self.betas)

    def mean(self):
        assert np.all(self.alphas >= 1.)
        return self.betas / (self.alphas - 1)

    def mode(self):
        return self.betas / (self.alphas + 1.)

    def log_likelihood(self, x):
        # not vectorized
        log_lik = np.sum((- self.alphas - 1.) * np.log(x) - self.betas / x)
        return - self.log_partition() + self.log_base() + log_lik

    def statistics(self, data):
        if isinstance(data, np.ndarray):
            idx = ~np.isnan(data).any(axis=1)
            data = data[idx]

            logx = np.log(data)
            x = 1. / data
            n = np.ones((data.shape[0], ))
            return Stats([logx, n, x, n])
        else:
            return list(map(self.statistics, data))

    def weighted_statistics(self, data, weights):
        if isinstance(data, np.ndarray):
            idx = ~np.isnan(data).any(axis=1)
            data = data[idx]
            weights = weights[idx]

            logx = np.einsum('n,nk->nk', weights, np.log(data))
            x = np.einsum('n,nk->nk', weights, 1. / data)
            n = weights
            return Stats([logx, n, x, n])
        else:
            return list(map(self.weighted_statistics, data, weights))

    @property
    def base(self):
        return 1.

    def log_base(self):
        return np.log(self.base)

    @property
    def nat_param(self):
        return self.std_to_nat(self.params)

    @nat_param.setter
    def nat_param(self, natparam):
        self.params = self.nat_to_std(natparam)

    @staticmethod
    def std_to_nat(params):
        alphas = - params[0] - 1
        betas = - params[1]
        return Stats([alphas, betas])

    @staticmethod
    def nat_to_std(natparam):
        alphas = - natparam[0] - 1
        betas = - natparam[1]
        return alphas, betas

    def log_partition(self):
        return np.sum(gammaln(self.alphas) - self.alphas * np.log(self.betas))

    def expected_statistics(self):
        E_log_x = np.log(self.betas) - digamma(self.alphas)
        E_x_inv = self.alphas / self.betas
        return E_log_x, E_x_inv

    def entropy(self):
        nat_param, stats = self.nat_param, self.expected_statistics()
        return self.log_partition() - self.log_base()\
            - (nat_param[0].dot(stats[0]) + np.dot(nat_param[1], stats[1]))

    def cross_entropy(self, dist):
        nat_param, stats = dist.nat_param, self.expected_statistics()
        return dist.log_partition() - dist.log_base() \
            - (nat_param[0].dot(stats[0]) + np.dot(nat_param[1], stats[1]))
| 28.382222 | 78 | 0.580645 | 847 | 6,386 | 4.266824 | 0.109799 | 0.077476 | 0.034864 | 0.036801 | 0.904261 | 0.904261 | 0.886276 | 0.873824 | 0.873824 | 0.855008 | 0 | 0.011728 | 0.292358 | 6,386 | 224 | 79 | 28.508929 | 0.788006 | 0.032884 | 0 | 0.809816 | 0 | 0 | 0.005191 | 0 | 0 | 0 | 0 | 0 | 0.01227 | 1 | 0.245399 | false | 0 | 0.030675 | 0.09816 | 0.521472 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 |
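As a rough usage sketch for the distributions above (parameter values are illustrative, and the import path assumes the mimo package layout shown in the row metadata):

import numpy as np
from mimo.distributions.gamma import Gamma

dist = Gamma(alphas=np.array([2., 3.]), betas=np.array([1., 0.5]))
x = dist.rvs()                   # one positive draw per dimension
print(dist.mean())               # alphas / betas -> [2., 6.]
print(dist.log_likelihood(x))    # scalar log density of the draw
print(dist.entropy())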
b544cf55b869d26a9d063781c6a8c2a53084ef20 | 106 | py | Python | rajksimple/tests/test_nothing.py | rajk-apps/rajksimple | e3f283eded85927509e8ff2f9774caf9501910fd | [
"MIT"
] | 1 | 2020-04-09T06:42:11.000Z | 2020-04-09T06:42:11.000Z | rajksimple/tests/test_nothing.py | rajk-apps/rajksimple | e3f283eded85927509e8ff2f9774caf9501910fd | [
"MIT"
] | null | null | null | rajksimple/tests/test_nothing.py | rajk-apps/rajksimple | e3f283eded85927509e8ff2f9774caf9501910fd | [
"MIT"
] | 1 | 2020-04-12T11:54:38.000Z | 2020-04-12T11:54:38.000Z | # from django.test import TestCase
def test_import():
    import rajksimple

    rajksimple.__version__
| 13.25 | 34 | 0.745283 | 12 | 106 | 6.166667 | 0.666667 | 0.27027 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.198113 | 106 | 7 | 35 | 15.142857 | 0.870588 | 0.301887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.666667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b590bf5d8d7c91c05970033221844edada1954fb | 29,454 | py | Python | mc2/controllers/docker/tests/test_docker_controller.py | praekeltfoundation/mc2 | 5367a8aed309fade0f17bc72efa099b0afc76aa7 | [
"BSD-2-Clause"
] | 4 | 2016-03-09T00:51:17.000Z | 2017-10-05T23:54:00.000Z | mc2/controllers/docker/tests/test_docker_controller.py | praekeltfoundation/mc2 | 5367a8aed309fade0f17bc72efa099b0afc76aa7 | [
"BSD-2-Clause"
] | 131 | 2015-11-19T16:45:23.000Z | 2018-07-24T09:36:08.000Z | mc2/controllers/docker/tests/test_docker_controller.py | praekeltfoundation/mc2 | 5367a8aed309fade0f17bc72efa099b0afc76aa7 | [
"BSD-2-Clause"
] | 2 | 2016-07-30T15:36:23.000Z | 2017-09-18T12:40:11.000Z | import pytest
import responses

from django.conf import settings
from django.contrib.auth.models import User
from django.core.urlresolvers import reverse

from mc2.controllers.base.tests.base import ControllerBaseTestCase
from mc2.controllers.docker.models import (
    DockerController, marathon_lb_domains, traefik_domains)
from mc2.organizations.models import Organization, OrganizationUserRelation


@pytest.mark.django_db
class DockerControllerTestCase(ControllerBaseTestCase):
    fixtures = ['test_users.json', 'test_social_auth.json']

    def setUp(self):
        self.user = User.objects.get(username='testuser')
        self.maxDiff = None

    def test_get_marathon_app_data(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            marathon_cmd='ping',
            docker_image='docker/image',
        )
        org = Organization.objects.create(name="Test Org", slug="test-org")
        OrganizationUserRelation.objects.create(
            user=self.user, organization=org)
        controller.organization = org
        controller.save()

        custom_urls = "testing.com url.com"
        controller.domain_urls += custom_urls

        domain_label = "{}.{} {}".format(
            controller.app_id, settings.HUB_DOMAIN, custom_urls)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "test-org",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                },
            },
        })

        controller.port = 1234
        controller.save()

        domain_label = "{}.{} {}".format(
            controller.app_id, settings.HUB_DOMAIN, custom_urls)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "test-org",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "portMappings": [{"containerPort": 1234, "hostPort": 0}],
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                },
            },
        })

        controller.marathon_health_check_path = '/health/path/'
        controller.save()

        domain_label = "{}.{} {}".format(
            controller.app_id, settings.HUB_DOMAIN, custom_urls)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "test-org",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "portMappings": [{"containerPort": 1234, "hostPort": 0}],
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                }
            },
            "ports": [0],
            "healthChecks": [{
                "gracePeriodSeconds": 60,
                "intervalSeconds": 10,
                "maxConsecutiveFailures": 3,
                "path": '/health/path/',
                "portIndex": 0,
                "protocol": "HTTP",
                "timeoutSeconds": 20
            }]
        })

        controller.volume_needed = True
        controller.volume_path = "/deploy/media/"
        controller.save()

        domain_label = "{}.{} {}".format(
            controller.app_id, settings.HUB_DOMAIN, custom_urls)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "test-org",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "portMappings": [{"containerPort": 1234, "hostPort": 0}],
                    "parameters": [
                        {"key": "memory-swappiness", "value": "0"},
                        {"key": "volume-driver", "value": "xylem"},
                        {
                            "key": "volume",
                            "value":
                                "%s_media:/deploy/media/" % controller.app_id
                        }]
                }
            },
            "ports": [0],
            "healthChecks": [{
                "gracePeriodSeconds": 60,
                "intervalSeconds": 10,
                "maxConsecutiveFailures": 3,
                "path": '/health/path/',
                "portIndex": 0,
                "protocol": "HTTP",
                "timeoutSeconds": 20
            }]
        })

        controller.volume_path = ""
        controller.save()

        domain_label = "{}.{} {}".format(
            controller.app_id, settings.HUB_DOMAIN, custom_urls)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "test-org",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "portMappings": [{"containerPort": 1234, "hostPort": 0}],
                    "parameters": [
                        {"key": "memory-swappiness", "value": "0"},
                        {"key": "volume-driver", "value": "xylem"},
                        {
                            "key": "volume",
                            "value":
                                "%s_media:%s" % (
                                    controller.app_id,
                                    settings.MARATHON_DEFAULT_VOLUME_PATH)
                        }]
                }
            },
            "ports": [0],
            "healthChecks": [{
                "gracePeriodSeconds": 60,
                "intervalSeconds": 10,
                "maxConsecutiveFailures": 3,
                "path": '/health/path/',
                "portIndex": 0,
                "protocol": "HTTP",
                "timeoutSeconds": 20
            }]
        })

    def test_get_marathon_app_data_with_env(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            marathon_cmd='ping',
            docker_image='docker/image',
        )
        self.mk_env_variable(controller)

        domain_label = "{}.{}".format(controller.app_id, settings.HUB_DOMAIN)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "env": {"TEST_KEY": "a test value"},
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                },
            },
        })

    def test_get_marathon_app_data_with_app_labels(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            marathon_cmd='ping',
            docker_image='docker/image',
        )
        self.mk_env_variable(controller)
        self.mk_labels_variable(controller)

        domain_label = "{}.{}".format(controller.app_id, settings.HUB_DOMAIN)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "env": {"TEST_KEY": "a test value"},
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "TEST_LABELS_NAME": 'a test label value',
                "org": "",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                },
            },
        })

    @responses.activate
    def test_get_marathon_app_data_with_postgres_db_needed(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            marathon_cmd='ping',
            docker_image='docker/image',
            postgres_db_needed=True,
        )
        self.mock_create_postgres_db(200, {
            'result': {
                'name': 'joes_db',
                'user': 'joe',
                'password': '1234',
                'host': 'localhost'}})

        domain_label = "{}.{}".format(controller.app_id, settings.HUB_DOMAIN)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "cmd": "ping",
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "env": {
                "DATABASE_URL": u"postgres://joe:1234@localhost/joes_db"},
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                },
            },
        })

    @responses.activate
    def test_to_dict(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            marathon_cmd='ping',
            docker_image='docker/image',
            port=1234,
            marathon_health_check_path='/health/path/',
            marathon_health_check_cmd='cmd ping',
        )
        self.assertEquals(controller.to_dict(), {
            'id': controller.id,
            'name': 'Test App',
            'app_id': controller.app_id,
            'state': 'initial',
            'state_display': 'Initial',
            'marathon_cmd': 'ping',
            'marathon_args': '',
            'port': 1234,
            'marathon_health_check_path': '/health/path/',
            'marathon_health_check_cmd': 'cmd ping',
        })

    @responses.activate
    def test_marathon_cmd_optional(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            docker_image='docker/image',
        )
        domain_label = "{}.{}".format(controller.app_id, settings.HUB_DOMAIN)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                },
            },
        })

    @responses.activate
    def test_marathon_with_args(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            docker_image='docker/image',
            marathon_args='celery worker'
        )
        domain_label = "{}.{}".format(controller.app_id, settings.HUB_DOMAIN)
        self.assertEquals(controller.get_marathon_app_data(), {
            "id": controller.app_id,
            "cpus": 0.1,
            "mem": 128.0,
            "instances": 1,
            "args": ["celery", "worker"],
            "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
            "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
            "labels": {
                "domain": domain_label,
                "HAPROXY_GROUP": "external",
                "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                "traefik.frontend.rule": traefik_domains(domain_label),
                "name": "Test App",
                "org": "",
            },
            "container": {
                "type": "DOCKER",
                "docker": {
                    "image": "docker/image",
                    "forcePullImage": True,
                    "network": "BRIDGE",
                    "parameters": [{"key": "memory-swappiness", "value": "0"}],
                },
            },
        })

    @responses.activate
    def test_get_marathon_app_data_using_health_timeout_strings(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            marathon_cmd='ping',
            docker_image='docker/image',
            marathon_health_check_path='/health/path/',
            port=1234,
        )
        custom_urls = "testing.com url.com"
        controller.domain_urls += custom_urls

        with self.settings(
                MESOS_DEFAULT_GRACE_PERIOD_SECONDS='600',
                MESOS_DEFAULT_INTERVAL_SECONDS='100',
                MESOS_DEFAULT_TIMEOUT_SECONDS='200'):
            domain_label = "{}.{} {}".format(
                controller.app_id, settings.HUB_DOMAIN, custom_urls)
            self.assertEquals(controller.get_marathon_app_data(), {
                "id": controller.app_id,
                "cpus": 0.1,
                "mem": 128.0,
                "instances": 1,
                "cmd": "ping",
                "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
                "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
                "labels": {
                    "domain": domain_label,
                    "HAPROXY_GROUP": "external",
                    "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                    "traefik.frontend.rule": traefik_domains(domain_label),
                    "name": "Test App",
                    "org": "",
                },
                "container": {
                    "type": "DOCKER",
                    "docker": {
                        "image": "docker/image",
                        "forcePullImage": True,
                        "network": "BRIDGE",
                        "portMappings": [
                            {"containerPort": 1234, "hostPort": 0}],
                        "parameters": [
                            {"key": "memory-swappiness", "value": "0"}],
                    },
                },
                "ports": [0],
                "healthChecks": [{
                    "gracePeriodSeconds": 600,
                    "intervalSeconds": 100,
                    "maxConsecutiveFailures": 3,
                    "path": '/health/path/',
                    "portIndex": 0,
                    "protocol": "HTTP",
                    "timeoutSeconds": 200
                }]
            })

    @responses.activate
    def test_get_marathon_app_data_using_health_cmd(self):
        controller = DockerController.objects.create(
            name='Test App',
            owner=self.user,
            marathon_cmd='ping',
            docker_image='docker/image',
            marathon_health_check_cmd='cmd ping',
            port=1234,
        )
        custom_urls = "testing.com url.com"
        controller.domain_urls += custom_urls

        with self.settings(
                MESOS_DEFAULT_GRACE_PERIOD_SECONDS='600',
                MESOS_DEFAULT_INTERVAL_SECONDS='100',
                MESOS_DEFAULT_TIMEOUT_SECONDS='200'):
            domain_label = "{}.{} {}".format(
                controller.app_id, settings.HUB_DOMAIN, custom_urls)
            self.assertEquals(controller.get_marathon_app_data(), {
                "id": controller.app_id,
                "cpus": 0.1,
                "mem": 128.0,
                "instances": 1,
                "cmd": "ping",
                "backoffFactor": settings.MESOS_DEFAULT_BACKOFF_FACTOR,
                "backoffSeconds": settings.MESOS_DEFAULT_BACKOFF_SECONDS,
                "labels": {
                    "domain": domain_label,
                    "HAPROXY_GROUP": "external",
                    "HAPROXY_0_VHOST": marathon_lb_domains(domain_label),
                    "traefik.frontend.rule": traefik_domains(domain_label),
                    "name": "Test App",
                    "org": "",
                },
                "container": {
                    "type": "DOCKER",
                    "docker": {
                        "image": "docker/image",
                        "forcePullImage": True,
                        "network": "BRIDGE",
                        "portMappings": [
                            {"containerPort": 1234, "hostPort": 0}],
                        "parameters": [
                            {"key": "memory-swappiness", "value": "0"}],
                    },
                },
                "healthChecks": [{
                    "gracePeriodSeconds": 600,
                    "intervalSeconds": 100,
                    "maxConsecutiveFailures": 3,
                    "command": {'value': 'cmd ping'},
                    "protocol": "COMMAND",
                    "timeoutSeconds": 200
                }]
            })

    @responses.activate
    def test_create_new_controller_with_no_port(self):
        org = Organization.objects.create(name="Foo Org", slug="foo-org")
        OrganizationUserRelation.objects.create(
            user=self.user, organization=org)
        self.client.login(username='testuser2', password='test')
        self.client.get(
            reverse('organizations:select-active', args=('foo-org',)))

        self.mock_create_marathon_app()
        self.mock_create_postgres_db(200, {
            'result': {
                'name': 'joes_db',
                'user': 'joe',
                'password': '1234',
                'host': 'localhost'}})

        data = {
            'name': 'Another test app',
            'docker_image': 'test/image',
            'postgres_db_needed': True,
            'env-TOTAL_FORMS': 0,
            'env-INITIAL_FORMS': 0,
            'env-MIN_NUM_FORMS': 0,
            'env-MAX_NUM_FORMS': 100,
            'label-TOTAL_FORMS': 0,
            'label-INITIAL_FORMS': 0,
            'label-MIN_NUM_FORMS': 0,
            'label-MAX_NUM_FORMS': 100,
            'link-TOTAL_FORMS': 0,
            'link-INITIAL_FORMS': 0,
            'link-MIN_NUM_FORMS': 0,
            'link-MAX_NUM_FORMS': 100,
        }

        response = self.client.post(reverse('controllers.docker:add'), data)
        self.assertEqual(response.status_code, 302)

        controller = DockerController.objects.all().last()
        self.assertEqual(controller.state, 'done')
        self.assertEqual(controller.name, 'Another test app')
        self.assertEqual(controller.organization.slug, 'foo-org')
        self.assertIsNone(controller.port)

    @responses.activate
    def test_create_new_controller_with_health_path(self):
        org = Organization.objects.create(name="Foo Org", slug="foo-org")
        OrganizationUserRelation.objects.create(
            user=self.user, organization=org)
        self.client.login(username='testuser2', password='test')
        self.client.get(
            reverse('organizations:select-active', args=('foo-org',)))

        self.mock_create_marathon_app()
        self.mock_create_postgres_db(200, {
            'result': {
                'name': 'joes_db',
                'user': 'joe',
                'password': '1234',
                'host': 'localhost'}})

        data = {
            'name': 'Another test app',
            'docker_image': 'test/image',
            'postgres_db_needed': True,
            'port': 8000,
            'env-TOTAL_FORMS': 0,
            'env-INITIAL_FORMS': 0,
            'env-MIN_NUM_FORMS': 0,
            'env-MAX_NUM_FORMS': 100,
            'label-TOTAL_FORMS': 0,
            'label-INITIAL_FORMS': 0,
            'label-MIN_NUM_FORMS': 0,
            'label-MAX_NUM_FORMS': 100,
            'link-TOTAL_FORMS': 0,
            'link-INITIAL_FORMS': 0,
            'link-MIN_NUM_FORMS': 0,
            'link-MAX_NUM_FORMS': 100,
            'marathon_health_check_path': '/health/path/',
        }

        response = self.client.post(reverse('controllers.docker:add'), data)
        self.assertEqual(response.status_code, 302)

        controller = DockerController.objects.all().last()
        self.assertEqual(controller.state, 'done')
        self.assertEqual(controller.name, 'Another test app')
        self.assertEqual(controller.organization.slug, 'foo-org')
        self.assertEqual(
            controller.marathon_health_check_path,
            '/health/path/')

    @responses.activate
    def test_create_new_controller_with_health_cmd(self):
        org = Organization.objects.create(name="Foo Org", slug="foo-org")
        OrganizationUserRelation.objects.create(
            user=self.user, organization=org)
        self.client.login(username='testuser2', password='test')
        self.client.get(
            reverse('organizations:select-active', args=('foo-org',)))

        self.mock_create_marathon_app()
        self.mock_create_postgres_db(200, {
            'result': {
                'name': 'joes_db',
                'user': 'joe',
                'password': '1234',
                'host': 'localhost'}})

        data = {
            'name': 'Another test app',
            'docker_image': 'test/image',
            'postgres_db_needed': True,
            'env-TOTAL_FORMS': 0,
            'env-INITIAL_FORMS': 0,
            'env-MIN_NUM_FORMS': 0,
            'env-MAX_NUM_FORMS': 100,
            'label-TOTAL_FORMS': 0,
            'label-INITIAL_FORMS': 0,
            'label-MIN_NUM_FORMS': 0,
            'label-MAX_NUM_FORMS': 100,
            'link-TOTAL_FORMS': 0,
            'link-INITIAL_FORMS': 0,
            'link-MIN_NUM_FORMS': 0,
            'link-MAX_NUM_FORMS': 100,
            'marathon_health_check_cmd': 'cmd ping',
        }

        response = self.client.post(reverse('controllers.docker:add'), data)
        self.assertEqual(response.status_code, 302)

        controller = DockerController.objects.all().last()
        self.assertEqual(controller.state, 'done')
        self.assertEqual(controller.name, 'Another test app')
        self.assertEqual(controller.organization.slug, 'foo-org')
        self.assertEqual(
            controller.marathon_health_check_cmd,
            'cmd ping')

    @responses.activate
    def test_create_new_controller_with_health_multiple_checks(self):
        org = Organization.objects.create(name="Foo Org", slug="foo-org")
        OrganizationUserRelation.objects.create(
            user=self.user, organization=org)
        self.client.login(username='testuser2', password='test')
        self.client.get(
            reverse('organizations:select-active', args=('foo-org',)))

        self.mock_create_marathon_app()
        self.mock_create_postgres_db(200, {
            'result': {
                'name': 'joes_db',
                'user': 'joe',
                'password': '1234',
                'host': 'localhost'}})

        data = {
            'name': 'Another test app',
            'docker_image': 'test/image',
            'postgres_db_needed': True,
            'env-TOTAL_FORMS': 0,
            'env-INITIAL_FORMS': 0,
            'env-MIN_NUM_FORMS': 0,
            'env-MAX_NUM_FORMS': 100,
            'label-TOTAL_FORMS': 0,
            'label-INITIAL_FORMS': 0,
            'label-MIN_NUM_FORMS': 0,
            'label-MAX_NUM_FORMS': 100,
            'link-TOTAL_FORMS': 0,
            'link-INITIAL_FORMS': 0,
            'link-MIN_NUM_FORMS': 0,
            'link-MAX_NUM_FORMS': 100,
            'marathon_health_check_path': '/health/path/',
            'marathon_health_check_cmd': 'cmd ping',
        }

        response = self.client.post(reverse('controllers.docker:add'), data)
        self.assertEqual(response.status_code, 302)

        controller = DockerController.objects.all().last()
        self.assertEqual(controller.state, 'done')
        self.assertEqual(controller.name, 'Another test app')
        self.assertEqual(controller.organization.slug, 'foo-org')
        self.assertEqual(
            controller.marathon_health_check_path,
            '/health/path/')
        self.assertEqual(
            controller.marathon_health_check_cmd,
            'cmd ping')
| 37.956186 | 79 | 0.498438 | 2,526 | 29,454 | 5.577593 | 0.07601 | 0.037476 | 0.028746 | 0.045993 | 0.921428 | 0.916034 | 0.911136 | 0.903258 | 0.893179 | 0.875009 | 0 | 0.01943 | 0.372683 | 29,454 | 775 | 80 | 38.005161 | 0.743086 | 0 | 0 | 0.84701 | 0 | 0 | 0.22435 | 0.026889 | 0 | 0 | 0 | 0 | 0.047288 | 1 | 0.019471 | false | 0.012517 | 0.011127 | 0 | 0.03338 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
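The test class above leans on the `responses` library (via the `@responses.activate` decorator and helpers such as `mock_create_postgres_db`) to stub out HTTP calls. A standalone sketch of that pattern, with an invented URL and payload for illustration:

import requests
import responses

@responses.activate
def test_stubbed_endpoint():
    # Register a canned reply; the URL and JSON body here are hypothetical.
    responses.add(
        responses.POST,
        'http://xylem.example/createdatabase',
        json={'result': {'name': 'joes_db', 'user': 'joe'}},
        status=200)
    resp = requests.post('http://xylem.example/createdatabase')
    assert resp.json()['result']['user'] == 'joe'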
a9113de7c31758b84e7a603f5c7477130a5a1dd1 | 5,272 | py | Python | Develop-Files/title.py | Zfauser/Heros-Of-Gitward | 6baf9005479c969cabdb3c560976f72ff56db664 | [
"MIT"
] | 1 | 2021-03-06T23:40:21.000Z | 2021-03-06T23:40:21.000Z | Develop-Files/title.py | Zfauser/Python-Game-Gr-9 | 6baf9005479c969cabdb3c560976f72ff56db664 | [
"MIT"
] | null | null | null | Develop-Files/title.py | Zfauser/Python-Game-Gr-9 | 6baf9005479c969cabdb3c560976f72ff56db664 | [
"MIT"
] | null | null | null | import time
print("db d8b db d88888b db .o88b. .d88b. .88b d88. d88888b ")
print("88 I8I 88 88' 88 d8P Y8 .8P Y8. 88'YbdP`88 88' ")
print("88 I8I 88 88ooooo 88 8P 88 88 88 88 88 88ooooo ")
print("Y8 I8I 88 88~~~~~ 88 8b 88 88 88 88 88 88~~~~~ ")
print("`8b d8'8b d8' 88. 88booo. Y8b d8 `8b d8' 88 88 88 88. ")
print(" `8b8' `8d8' Y88888P Y88888P `Y88P' `Y88P' YP YP YP Y88888P ")
print(" ")
time.sleep(.7)
print("d888888b .d88b. ")
print("`~~88~~' .8P Y8. ")
print(" 88 88 88 ")
print(" 88 88 88 ")
print(" 88 `8b d8' ")
print(" YP `Y88P' ")
time.sleep(.7)
print(" ")
print(" ___ ___ ___ ___ ___ ")
time.sleep(.1)
print(" /\__\ /\ \ /\__\ /\ \ /\__\ ")
time.sleep(.1)
print(" /::| | /::\ \ /:/ / \:\ \ /:/ _/_ ")
time.sleep(.1)
print(" /:/:| | /:/\:\ \ /:/ / \:\ \ /:/ /\ \ ")
time.sleep(.1)
print(" /:/|:| |__ /:/ /::\ \ /:/ / ___ ___ /::\ \ /:/ /::\ \ ")
time.sleep(.1)
print(" /:/ |:| /\__\ /:/_/:/\:\__\ /:/__/ /\__\ /\ /:/\:\__\ /:/_/:/\:\__\ ")
time.sleep(.1)
print(" \/__|:|/:/ / \:\/:/ \/__/ \:\ \ /:/ / \:\/:/ \/__/ \:\/:/ /:/ / ")
time.sleep(.1)
print(" |:/:/ / \::/__/ \:\ /:/ / \::/__/ \::/ /:/ / ")
time.sleep(.1)
print(" |::/ / \:\ \ \:\/:/ / \:\ \ \/_/:/ / ")
time.sleep(.1)
print(" |:/ / \:\__\ \::/ / \:\__\ /:/ / ")
time.sleep(.1)
print(" |/__/ \/__/ \/__/ \/__/ \/__/ ")
time.sleep(.7)
print(" ")
print(" ***** ** *** * *** ** ")
time.sleep(.1)
print(" ****** * **** * ** *** * **** * * * ** ")
time.sleep(.1)
print(" ** * * ***** ** *** * * **** *** ** ** ** ")
time.sleep(.1)
print("* * * * * ** * ** ** * ** ** ** ")
time.sleep(.1)
print(" * * * *** **** **** **** **** ** * *** ******** ** *** **** *** **** ** ")
time.sleep(.1)
print(" ** ** * *** **** **** * * *** * * **** * * *** * ****** ** ** *** ******** ** *** *** * **** **** **** * *** ** ")
time.sleep(.1)
print(" ** ** * * *** ** **** * **** ** **** * **** ***** ** ** *** *** ** ** *** **** * *** * ** **** ********* ")
time.sleep(.1)
print(" ** ******** * *** ** ** ** **** ** ** ** ** ** **** * ** ** ** ** ** * **** ** ** **** ")
time.sleep(.1)
print(" ** ** * ** *** ** ** ** *** ** ** ** ** ** * **** ** ** ** ** ** ** ** ** ** ** ")
time.sleep(.1)
print(" ** ** ** ******** ** ** ** *** ** ** ** ** *** ** ** ** ** ** ** ** ** ** ** ** ")
time.sleep(.1)
print(" * ** ** ******* ** ** ** *** ** ** ** ** ** * ** ** ** ** ** ** ** ** ** ** ")
time.sleep(.1)
print(" * ** ** ** ** ** **** ** ** ** ** ** * * ** ** ** ** * ** ** ** ** ** ")
time.sleep(.1)
print(" **** ** **** * *** ****** * **** * ****** ** *** * ** ** ******* ******* ** ** *** ** ** ")
time.sleep(.1)
print(" * ***** ** ******* *** **** **** **** ** ******* **** ** ***** ***** ***** ** *** ****** ")
time.sleep(.1)
print("* ** ***** ** *** *** *** ** *** ")
time.sleep(.1)
print("* ")
time.sleep(.1)
print(" ** ")
time.sleep(3) | 71.243243 | 191 | 0.158953 | 222 | 5,272 | 3.459459 | 0.13964 | 0.351563 | 0.528646 | 0.507813 | 0.6875 | 0.609375 | 0.571615 | 0.571615 | 0.571615 | 0.571615 | 0 | 0.083712 | 0.58308 | 5,272 | 74 | 192 | 71.243243 | 0.265696 | 0 | 0 | 0.486486 | 0 | 0.135135 | 0.831026 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.013514 | 0 | 0.013514 | 0.581081 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
a91396127f0162ad5a955c615a8eb0fb8c9e87dd | 18 | py | Python | test/run/t110.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 2,671 | 2015-01-03T08:23:25.000Z | 2022-03-31T06:15:48.000Z | test/run/t110.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 972 | 2015-01-05T08:11:00.000Z | 2022-03-29T13:47:15.000Z | test/run/t110.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 845 | 2015-01-03T19:53:36.000Z | 2022-03-29T18:34:22.000Z | a = 1,2,3
print a
| 6 | 9 | 0.555556 | 6 | 18 | 1.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 0.277778 | 18 | 2 | 10 | 9 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.5 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
a97910f1410de7072f08b4ce88b3b9f7d409da01 | 258 | py | Python | lldb/test/API/repl/subclassing/TestREPLSubclassing.py | LaudateCorpus1/llvm-project | ff2e0f0c1112558b3f30d8afec7c9882c33c79e3 | [
"Apache-2.0"
] | 765 | 2015-12-03T16:44:59.000Z | 2022-03-07T12:41:10.000Z | lldb/test/API/repl/subclassing/TestREPLSubclassing.py | LaudateCorpus1/llvm-project | ff2e0f0c1112558b3f30d8afec7c9882c33c79e3 | [
"Apache-2.0"
] | 3,180 | 2019-10-18T01:21:21.000Z | 2022-03-31T23:25:41.000Z | lldb/test/API/repl/subclassing/TestREPLSubclassing.py | LaudateCorpus1/llvm-project | ff2e0f0c1112558b3f30d8afec7c9882c33c79e3 | [
"Apache-2.0"
] | 284 | 2015-12-03T16:47:25.000Z | 2022-03-12T05:39:48.000Z | import lldbsuite.test.lldbinrepl as lldbinrepl
import lldbsuite.test.lldbtest as lldbtest
import lldbsuite.test.decorators as decorators
lldbinrepl.MakeREPLTest(__file__, globals(),
                        decorators=[decorators.skipIfDarwin, decorators.skipUnlessDarwin])
| 36.857143 | 74 | 0.829457 | 27 | 258 | 7.777778 | 0.444444 | 0.214286 | 0.271429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096899 | 258 | 6 | 75 | 43 | 0.901288 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a97e32f9d31c8335d58ddd695e2ac22857d44456 | 117 | py | Python | gym_yugioh/envs/__init__.py | Synapt1x/gym-yugioh | e413aa29597dd744ef76cc99c038690c6f8810e7 | [
"MIT"
] | null | null | null | gym_yugioh/envs/__init__.py | Synapt1x/gym-yugioh | e413aa29597dd744ef76cc99c038690c6f8810e7 | [
"MIT"
] | null | null | null | gym_yugioh/envs/__init__.py | Synapt1x/gym-yugioh | e413aa29597dd744ef76cc99c038690c6f8810e7 | [
"MIT"
] | null | null | null | from gym_yugioh.envs.yugioh_env import YugiohEnv
from gym_yugioh.envs.yugioh_extrahard_env import YugiohExtraHardEnv
| 39 | 67 | 0.897436 | 17 | 117 | 5.882353 | 0.529412 | 0.14 | 0.26 | 0.34 | 0.46 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068376 | 117 | 2 | 68 | 58.5 | 0.917431 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a9925a1e773433b1cc5211eadd6c7a5a442941b8 | 116 | py | Python | worker/tasks/__init__.py | zhijiahu/gopigo-car | 9d03778c3a074426c7fa5877b1f14f5798b70364 | [
"MIT"
] | 2 | 2020-07-28T12:09:48.000Z | 2020-07-31T23:58:10.000Z | worker/tasks/__init__.py | zhijiahu/gopigo-car | 9d03778c3a074426c7fa5877b1f14f5798b70364 | [
"MIT"
] | null | null | null | worker/tasks/__init__.py | zhijiahu/gopigo-car | 9d03778c3a074426c7fa5877b1f14f5798b70364 | [
"MIT"
] | null | null | null |
from .detect_object_yolo import detect as detect_person
from .detect_face_haarcascade import detect as detect_face
| 29 | 58 | 0.87069 | 18 | 116 | 5.277778 | 0.5 | 0.210526 | 0.294737 | 0.421053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112069 | 116 | 3 | 59 | 38.666667 | 0.92233 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
8d137ac9e075b1ddcd5cb55504189875c48e4077 | 11,770 | py | Python | venv/lib/python3.8/site-packages/spaceone/api/config/v1/domain_config_pb2_grpc.py | choonho/plugin-prometheus-mon-webhook | afa7d65d12715fd0480fb4f92a9c62da2d6128e0 | [
"Apache-2.0"
] | null | null | null | venv/lib/python3.8/site-packages/spaceone/api/config/v1/domain_config_pb2_grpc.py | choonho/plugin-prometheus-mon-webhook | afa7d65d12715fd0480fb4f92a9c62da2d6128e0 | [
"Apache-2.0"
] | null | null | null | venv/lib/python3.8/site-packages/spaceone/api/config/v1/domain_config_pb2_grpc.py | choonho/plugin-prometheus-mon-webhook | afa7d65d12715fd0480fb4f92a9c62da2d6128e0 | [
"Apache-2.0"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc

from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
from google.protobuf import struct_pb2 as google_dot_protobuf_dot_struct__pb2
from spaceone.api.config.v1 import domain_config_pb2 as spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2


class DomainConfigStub(object):
    """Missing associated documentation comment in .proto file."""

    def __init__(self, channel):
        """Constructor.

        Args:
            channel: A grpc.Channel.
        """
        self.create = channel.unary_unary(
            '/spaceone.api.config.v1.DomainConfig/create',
            request_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.CreateDomainConfigRequest.SerializeToString,
            response_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.FromString,
        )
        self.update = channel.unary_unary(
            '/spaceone.api.config.v1.DomainConfig/update',
            request_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.UpdateDomainConfigRequest.SerializeToString,
            response_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.FromString,
        )
        self.delete = channel.unary_unary(
            '/spaceone.api.config.v1.DomainConfig/delete',
            request_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigRequest.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
        self.get = channel.unary_unary(
            '/spaceone.api.config.v1.DomainConfig/get',
            request_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.GetDomainConfigRequest.SerializeToString,
            response_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.FromString,
        )
        self.list = channel.unary_unary(
            '/spaceone.api.config.v1.DomainConfig/list',
            request_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigQuery.SerializeToString,
            response_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigsInfo.FromString,
        )
        self.stat = channel.unary_unary(
            '/spaceone.api.config.v1.DomainConfig/stat',
            request_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigStatQuery.SerializeToString,
            response_deserializer=google_dot_protobuf_dot_struct__pb2.Struct.FromString,
        )


class DomainConfigServicer(object):
    """Missing associated documentation comment in .proto file."""

    def create(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def update(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def delete(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def get(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def list(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

    def stat(self, request, context):
        """Missing associated documentation comment in .proto file."""
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')


def add_DomainConfigServicer_to_server(servicer, server):
    rpc_method_handlers = {
        'create': grpc.unary_unary_rpc_method_handler(
            servicer.create,
            request_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.CreateDomainConfigRequest.FromString,
            response_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.SerializeToString,
        ),
        'update': grpc.unary_unary_rpc_method_handler(
            servicer.update,
            request_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.UpdateDomainConfigRequest.FromString,
            response_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.SerializeToString,
        ),
        'delete': grpc.unary_unary_rpc_method_handler(
            servicer.delete,
            request_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigRequest.FromString,
            response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
        ),
        'get': grpc.unary_unary_rpc_method_handler(
            servicer.get,
            request_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.GetDomainConfigRequest.FromString,
            response_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.SerializeToString,
        ),
        'list': grpc.unary_unary_rpc_method_handler(
            servicer.list,
            request_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigQuery.FromString,
            response_serializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigsInfo.SerializeToString,
        ),
        'stat': grpc.unary_unary_rpc_method_handler(
            servicer.stat,
            request_deserializer=spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigStatQuery.FromString,
            response_serializer=google_dot_protobuf_dot_struct__pb2.Struct.SerializeToString,
        ),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        'spaceone.api.config.v1.DomainConfig', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))


# This class is part of an EXPERIMENTAL API.
class DomainConfig(object):
    """Missing associated documentation comment in .proto file."""

    @staticmethod
    def create(request,
               target,
               options=(),
               channel_credentials=None,
               call_credentials=None,
               insecure=False,
               compression=None,
               wait_for_ready=None,
               timeout=None,
               metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/spaceone.api.config.v1.DomainConfig/create',
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.CreateDomainConfigRequest.SerializeToString,
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout, metadata)

    @staticmethod
    def update(request,
               target,
               options=(),
               channel_credentials=None,
               call_credentials=None,
               insecure=False,
               compression=None,
               wait_for_ready=None,
               timeout=None,
               metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/spaceone.api.config.v1.DomainConfig/update',
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.UpdateDomainConfigRequest.SerializeToString,
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout, metadata)

    @staticmethod
    def delete(request,
               target,
               options=(),
               channel_credentials=None,
               call_credentials=None,
               insecure=False,
               compression=None,
               wait_for_ready=None,
               timeout=None,
               metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/spaceone.api.config.v1.DomainConfig/delete',
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigRequest.SerializeToString,
            google_dot_protobuf_dot_empty__pb2.Empty.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout, metadata)

    @staticmethod
    def get(request,
            target,
            options=(),
            channel_credentials=None,
            call_credentials=None,
            insecure=False,
            compression=None,
            wait_for_ready=None,
            timeout=None,
            metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/spaceone.api.config.v1.DomainConfig/get',
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.GetDomainConfigRequest.SerializeToString,
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigInfo.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout, metadata)

    @staticmethod
    def list(request,
             target,
             options=(),
             channel_credentials=None,
             call_credentials=None,
             insecure=False,
             compression=None,
             wait_for_ready=None,
             timeout=None,
             metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/spaceone.api.config.v1.DomainConfig/list',
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigQuery.SerializeToString,
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigsInfo.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout, metadata)

    @staticmethod
    def stat(request,
             target,
             options=(),
             channel_credentials=None,
             call_credentials=None,
             insecure=False,
             compression=None,
             wait_for_ready=None,
             timeout=None,
             metadata=None):
        return grpc.experimental.unary_unary(
            request, target, '/spaceone.api.config.v1.DomainConfig/stat',
            spaceone_dot_api_dot_config_dot_v1_dot_domain__config__pb2.DomainConfigStatQuery.SerializeToString,
            google_dot_protobuf_dot_struct__pb2.Struct.FromString,
            options, channel_credentials,
            insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
8d2a8930b6afd95c77a5fc2a68fb47f1294066dc | 8,906 | py | Python | mrl/configs/make_continuous_agents.py | ag8/mrl | f05b00347f88020cbeb216c7e4764a4d2523b67e | [
"MIT"
] | null | null | null | mrl/configs/make_continuous_agents.py | ag8/mrl | f05b00347f88020cbeb216c7e4764a4d2523b67e | [
"MIT"
] | null | null | null | mrl/configs/make_continuous_agents.py | ag8/mrl | f05b00347f88020cbeb216c7e4764a4d2523b67e | [
"MIT"
] | 1 | 2021-08-12T23:13:03.000Z | 2021-08-12T23:13:03.000Z | from mrl.import_all import *
from argparse import Namespace
import gym
import time
def make_ddpg_agent(base_config=default_ddpg_config,
args=Namespace(env='InvertedPendulum-v2',
tb='',
parent_folder='/tmp/mrl',
layers=(256, 256),
num_envs=None),
agent_name_attrs=['env', 'seed', 'tb'],
**kwargs):
if callable(base_config): # If the base_config parameter is a function, make sure to call it
base_config = base_config()
config = base_config
# Train on as many environments as the CPU allows,
# unless specified otherwise.
if hasattr(args, 'num_envs') and args.num_envs is None:
import multiprocessing as mp
args.num_envs = max(mp.cpu_count() - 1, 1)
# set the prefix (todo: why?)
if not hasattr(args, 'prefix'):
args.prefix = 'ddpg'
# set whatever this is (todo: why?)
if not args.tb:
args.tb = str(time.time())
merge_args_into_config(args, config)
config.agent_name = make_agent_name(config, agent_name_attrs, prefix=args.prefix)
base_modules = {
k: v
for k, v in dict(module_train=StandardTrain(),
module_eval=EpisodicEval(),
module_policy=ActorPolicy(),
module_logger=Logger(),
module_state_normalizer=Normalizer(MeanStdNormalizer()),
module_replay=OnlineHERBuffer(),
module_action_noise=ContinuousActionNoise(GaussianProcess,
std=ConstantSchedule(config.action_noise)),
module_algorithm=DDPG()).items() if not k in config
}
config.update(base_modules)
if type(args.env) is str:
env = lambda: gym.make(args.env)
eval_env = env
else:
env = args.env
eval_env = env
if hasattr(args, 'eval_env') and args.eval_env is not None:
if type(args.eval_env) is str:
eval_env = lambda: gym.make(args.eval_env)
else:
eval_env = args.eval_env
config.module_train_env = EnvModule(env, num_envs=config.num_envs, seed=config.seed)
config.module_eval_env = EnvModule(eval_env, num_envs=config.num_eval_envs, name='eval_env',
seed=config.seed + 1138)
layer_norm = nn.LayerNorm if (hasattr(args, 'layer_norm') and args.layer_norm) else nn.Identity
e = config.module_eval_env
config.module_actor = PytorchModel(
'actor',
lambda: Actor(FCBody(e.state_dim + e.goal_dim, args.layers, layer_norm, make_activ(config.activ)), e.action_dim,
e.max_action))
config.module_critic = PytorchModel(
'critic', lambda: Critic(
FCBody(e.state_dim + e.goal_dim + e.action_dim, args.layers, layer_norm, make_activ(config.activ)), 1))
if e.goal_env:
config.never_done = True # important for standard Gym goal environments, which are never done
return config
def make_random_agent(base_config=default_ddpg_config,
args=Namespace(env='InvertedPendulum-v2',
tb='',
parent_folder='/tmp/mrl',
layers=(256, 256),
num_envs=None),
agent_name_attrs=['env', 'seed', 'tb'],
**kwargs):
if callable(base_config): # If the base_config parameter is a function, make sure to call it
base_config = base_config()
config = base_config
# Train on as many environments as the CPU allows,
# unless specified otherwise.
if hasattr(args, 'num_envs') and args.num_envs is None:
import multiprocessing as mp
args.num_envs = max(mp.cpu_count() - 1, 1)
# set the prefix (todo: why?)
if not hasattr(args, 'prefix'):
args.prefix = 'random'
# set whatever this is (todo: why?)
if not args.tb:
args.tb = str(time.time())
merge_args_into_config(args, config)
config.agent_name = make_agent_name(config, agent_name_attrs, prefix=args.prefix)
base_modules = {
k: v
for k, v in dict(module_train=StandardTrain(),
module_eval=EpisodicEval(),
module_policy=ActorPolicy(),
module_logger=Logger(),
module_state_normalizer=Normalizer(MeanStdNormalizer()),
module_replay=OnlineHERBuffer(),
module_action_noise=ContinuousActionNoise(GaussianProcess,
std=ConstantSchedule(config.action_noise)),
module_algorithm=DDPG()).items() if not k in config
}
config.update(base_modules)
if type(args.env) is str:
env = lambda: gym.make(args.env)
eval_env = env
else:
env = args.env
eval_env = env
if hasattr(args, 'eval_env') and args.eval_env is not None:
if type(args.eval_env) is str:
eval_env = lambda: gym.make(args.eval_env)
else:
eval_env = args.eval_env
config.module_train_env = EnvModule(env, num_envs=config.num_envs, seed=config.seed)
config.module_eval_env = EnvModule(eval_env, num_envs=config.num_eval_envs, name='eval_env',
seed=config.seed + 1138)
layer_norm = nn.LayerNorm if (hasattr(args, 'layer_norm') and args.layer_norm) else nn.Identity
e = config.module_eval_env
config.module_actor = PytorchModel(
'actor',
lambda: Actor(FCBody(e.state_dim + e.goal_dim, args.layers, layer_norm, make_activ(config.activ)), e.action_dim,
e.max_action))
config.module_critic = PytorchModel(
'critic', lambda: Critic(
FCBody(e.state_dim + e.goal_dim + e.action_dim, args.layers, layer_norm, make_activ(config.activ)), 1))
if e.goal_env:
config.never_done = True # important for standard Gym goal environments, which are never done
return config
def make_td3_agent(base_config=spinning_up_td3_config,
args=Namespace(env='InvertedPendulum-v2',
tb='',
prefix='td3',
parenFt_folder='/tmp/mrl',
layers=(256, 256),
num_envs=None),
agent_name_attrs=['env', 'seed', 'tb'],
**kwargs):
config = __ddpg_agent(base_config, args, agent_name_attrs, **kwargs)
del config.module_algorithm
config.module_algorithm = TD3()
layer_norm = nn.LayerNorm if (hasattr(args, 'layer_norm') and args.layer_norm) else nn.Identity
e = config.module_eval_env
config.module_critic2 = PytorchModel('critic2',
lambda: Critic(
FCBody(e.state_dim + e.goal_dim + e.action_dim, args.layers, layer_norm,
make_activ(config.activ), False), 1, False))
return config
def make_sac_agent(base_config=spinning_up_sac_config,
args=Namespace(env='InvertedPendulum-v2',
tb='',
prefix='sac',
parent_folder='/tmp/mrl',
layers=(256, 256),
num_envs=None),
agent_name_attrs=['env', 'seed', 'tb'],
**kwargs):
config = make_ddpg_agent(base_config, args, agent_name_attrs, **kwargs)
e = config.module_eval_env
layer_norm = nn.LayerNorm if (hasattr(args, 'layer_norm') and args.layer_norm) else nn.Identity
del config.module_actor
del config.module_action_noise
del config.module_policy
config.module_policy = StochasticActorPolicy()
del config.module_algorithm
config.module_algorithm = SAC()
config.module_actor = PytorchModel(
'actor',
lambda: StochasticActor(FCBody(e.state_dim + e.goal_dim, args.layers, layer_norm, make_activ(config.activ)),
e.action_dim, e.max_action, log_std_bounds=(-20, 2)))
config.module_critic2 = PytorchModel('critic2',
lambda: Critic(
FCBody(e.state_dim + e.goal_dim + e.action_dim, args.layers, layer_norm,
make_activ(config.activ), False), 1, False))
return config
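# --- hedged usage sketch ------------------------------------------------------
# Builds a SAC config with the factory above. `mrl.config_to_agent` is an
# assumption about the surrounding framework (it is not defined in this file);
# treat it as illustrative only.
def _demo_make_sac_config():
    from argparse import Namespace
    config = make_sac_agent(args=Namespace(env='InvertedPendulum-v2',
                                           tb='',
                                           prefix='sac',
                                           parent_folder='/tmp/mrl',
                                           layers=(256, 256),
                                           num_envs=1))
    # agent = mrl.config_to_agent(config)  # hypothetical entry point
    return config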
| 39.93722 | 120 | 0.562654 | 1,022 | 8,906 | 4.678082 | 0.135029 | 0.040996 | 0.023008 | 0.021962 | 0.945409 | 0.928258 | 0.919891 | 0.901067 | 0.880987 | 0.862999 | 0 | 0.009415 | 0.344038 | 8,906 | 222 | 121 | 40.117117 | 0.80897 | 0.060746 | 0 | 0.85119 | 0 | 0 | 0.036039 | 0 | 0 | 0 | 0 | 0.004505 | 0 | 1 | 0.02381 | false | 0 | 0.035714 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8d4ecae2f759e4fcb8026c3563df4b0b1689761f | 226 | py | Python | Drawing Shapes using Code (Python Turtle)/hexagon.py | IlmopediaKids/Ilmopedia-Kids-Coding-Courses | dc51baf9894a540fbc858f5f71e178190913712c | [
"BSD-3-Clause"
] | 1 | 2022-01-03T13:25:30.000Z | 2022-01-03T13:25:30.000Z | Drawing Shapes using Code (Python Turtle)/hexagon.py | IlmopediaKids/Ilmopedia-Kids-Coding-Courses | dc51baf9894a540fbc858f5f71e178190913712c | [
"BSD-3-Clause"
] | null | null | null | Drawing Shapes using Code (Python Turtle)/hexagon.py | IlmopediaKids/Ilmopedia-Kids-Coding-Courses | dc51baf9894a540fbc858f5f71e178190913712c | [
"BSD-3-Clause"
] | null | null | null | import turtle
turtle.forward(70)
turtle.left(60)
turtle.forward(70)
turtle.left(60)
turtle.forward(70)
turtle.left(60)
turtle.forward(70)
turtle.left(60)
turtle.forward(70)
turtle.left(60)
turtle.forward(70)
turtle.left(60)
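# A loop-based equivalent of the six repeated forward/left pairs above
# (a sketch; nothing calls it, so the script still draws exactly one hexagon):
def hexagon(side=70):
    for _ in range(6):
        turtle.forward(side)
        turtle.left(60)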
| 15.066667 | 18 | 0.769912 | 38 | 226 | 4.578947 | 0.157895 | 0.448276 | 0.517241 | 0.724138 | 0.931034 | 0.931034 | 0.931034 | 0.931034 | 0.931034 | 0.931034 | 0 | 0.114286 | 0.070796 | 226 | 14 | 19 | 16.142857 | 0.714286 | 0 | 0 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.076923 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 11 |
8da673e1a390ec7f8ade5b4705500532a68c4e3f | 38,376 | py | Python | src/webpubsub/azext_webpubsub/vendored_sdks/azure_messaging_webpubsubservice/rest.py | haroonf/azure-cli-extensions | 61c044d34c224372f186934fa7c9313f1cd3a525 | [
"MIT"
] | 207 | 2017-11-29T06:59:41.000Z | 2022-03-31T10:00:53.000Z | src/webpubsub/azext_webpubsub/vendored_sdks/azure_messaging_webpubsubservice/rest.py | haroonf/azure-cli-extensions | 61c044d34c224372f186934fa7c9313f1cd3a525 | [
"MIT"
] | 4,061 | 2017-10-27T23:19:56.000Z | 2022-03-31T23:18:30.000Z | src/webpubsub/azext_webpubsub/vendored_sdks/azure_messaging_webpubsubservice/rest.py | haroonf/azure-cli-extensions | 61c044d34c224372f186934fa7c9313f1cd3a525 | [
"MIT"
] | 802 | 2017-10-11T17:36:26.000Z | 2022-03-31T22:24:32.000Z | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
# pylint: disable=line-too-long
__all__ = [
'build_add_connection_to_group_request',
'build_add_user_to_group_request',
'build_connection_exists_request',
'build_group_exists_request',
'build_check_permission_request',
'build_user_exists_request',
'build_close_client_connection_request',
'build_grant_permission_request',
'build_healthapi_get_health_status_request',
'build_remove_connection_from_group_request',
'build_remove_user_from_all_groups_request',
'build_remove_user_from_group_request',
'build_revoke_permission_request',
'build_send_to_all_request',
'build_send_to_connection_request',
'build_send_to_group_request',
'build_send_to_user_request'
]
from typing import TYPE_CHECKING
from msrest import Serializer
from azure.core.pipeline.transport._base import _format_url_section
from .core.rest import HttpRequest
if TYPE_CHECKING:
# pylint: disable=unused-import,ungrouped-imports
from typing import Any, IO, List, Optional, Union, Dict
from typing_extensions import Literal
Permissions = Union[Literal['joinLeaveGroup'], Literal['sendToGroup']] # pylint: disable=unsubscriptable-object
_SERIALIZER = Serializer()
def build_healthapi_get_health_status_request(
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Get service health status.
Get service health status.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/health')
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="HEAD",
url=url,
params=query_parameters,
**kwargs
)
def build_send_to_all_request(
hub, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Broadcast content inside request body to all the connected client connections.
Broadcast content inside request body to all the connected client connections.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:keyword json: The payload body.
:paramtype json: Any
:keyword content: The payload body.
:paramtype content: IO
:keyword excluded: Excluded connection Ids.
:paramtype excluded: list[str]
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
Example:
.. code-block:: python
# JSON input template you can fill out and use as your `json` input.
json = "Any (optional)"
"""
excluded = kwargs.pop('excluded', None) # type: Optional[List[str]]
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
content_type = kwargs.pop("content_type", None)
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/:send')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if excluded is not None:
query_parameters['excluded'] = [_SERIALIZER.query("excluded", q, 'str') if q is not None else '' for q in excluded]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
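# --- hedged usage sketch ------------------------------------------------------
# Builds (without sending) a broadcast request with the builder above. Sending
# it requires a client exposing `send_request`, per the llcwiki link in the
# docstring; the hub name and payload here are illustrative.
def _demo_build_send_to_all():
    request = build_send_to_all_request(
        'chat',                              # hub name (must match the pattern)
        json={'message': 'hello'},           # payload body, passed via kwargs
        excluded=['connection-1'],           # connection Ids to skip
        content_type='application/json',
    )
    return request.method, request.url      # ('POST', '/api/hubs/chat/:send?...')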
def build_connection_exists_request(
hub, # type: str
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Check if the connection with the given connectionId exists.
Check if the connection with the given connectionId exists.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param connection_id: The connection Id.
:type connection_id: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/connections/{connectionId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="HEAD",
url=url,
params=query_parameters,
**kwargs
)
def build_close_client_connection_request(
hub, # type: str
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Close the client connection.
Close the client connection.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param connection_id: Target connection Id.
:type connection_id: str
:keyword reason: The reason closing the client connection.
:paramtype reason: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
reason = kwargs.pop('reason', None) # type: Optional[str]
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/connections/{connectionId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if reason is not None:
query_parameters['reason'] = _SERIALIZER.query("reason", reason, 'str')
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="DELETE",
url=url,
params=query_parameters,
**kwargs
)
def build_send_to_connection_request(
hub, # type: str
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Send content inside request body to the specific connection.
Send content inside request body to the specific connection.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param connection_id: The connection Id.
:type connection_id: str
:keyword json: The payload body.
:paramtype json: Any
:keyword content: The payload body.
:paramtype content: IO
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
Example:
.. code-block:: python
# JSON input template you can fill out and use as your `json` input.
json = "Any (optional)"
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
content_type = kwargs.pop("content_type", None)
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/connections/{connectionId}/:send')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_group_exists_request(
hub, # type: str
group, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Check if there are any client connections inside the given group.
Check if there are any client connections inside the given group.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param group: Target group name, whose length must be greater than 0 and less than 1025.
:type group: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/groups/{group}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'group': _SERIALIZER.url("group", group, 'str', max_length=1024, min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="HEAD",
url=url,
params=query_parameters,
**kwargs
)
def build_send_to_group_request(
hub, # type: str
group, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Send content inside request body to a group of connections.
Send content inside request body to a group of connections.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param group: Target group name, whose length must be greater than 0 and less than 1025.
:type group: str
:keyword json: The payload body.
:paramtype json: Any
:keyword content: The payload body.
:paramtype content: IO
:keyword excluded: Excluded connection Ids.
:paramtype excluded: list[str]
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
Example:
.. code-block:: python
# JSON input template you can fill out and use as your `json` input.
json = "Any (optional)"
"""
excluded = kwargs.pop('excluded', None) # type: Optional[List[str]]
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
content_type = kwargs.pop("content_type", None)
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/groups/{group}/:send')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'group': _SERIALIZER.url("group", group, 'str', max_length=1024, min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if excluded is not None:
query_parameters['excluded'] = [_SERIALIZER.query("excluded", q, 'str') if q is not None else '' for q in excluded]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_add_connection_to_group_request(
hub, # type: str
group, # type: str
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Add a connection to the target group.
Add a connection to the target group.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param group: Target group name, whose length must be greater than 0 and less than 1025.
:type group: str
:param connection_id: Target connection Id.
:type connection_id: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/groups/{group}/connections/{connectionId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'group': _SERIALIZER.url("group", group, 'str', max_length=1024, min_length=1),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="PUT",
url=url,
params=query_parameters,
**kwargs
)
def build_remove_connection_from_group_request(
hub, # type: str
group, # type: str
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Remove a connection from the target group.
Remove a connection from the target group.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param group: Target group name, whose length must be greater than 0 and less than 1025.
:type group: str
:param connection_id: Target connection Id.
:type connection_id: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/groups/{group}/connections/{connectionId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'group': _SERIALIZER.url("group", group, 'str', max_length=1024, min_length=1),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="DELETE",
url=url,
params=query_parameters,
**kwargs
)
def build_user_exists_request(
hub, # type: str
user_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Check if there are any client connections connected for the given user.
Check if there are any client connections connected for the given user.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param user_id: Target user Id.
:type user_id: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/users/{userId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'userId': _SERIALIZER.url("user_id", user_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="HEAD",
url=url,
params=query_parameters,
**kwargs
)
def build_send_to_user_request(
hub, # type: str
user_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Send content inside request body to the specific user.
Send content inside request body to the specific user.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param user_id: The user Id.
:type user_id: str
:keyword json: The payload body.
:paramtype json: Any
:keyword content: The payload body.
:paramtype content: IO
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
Example:
.. code-block:: python
# JSON input template you can fill out and use as your `json` input.
json = "Any (optional)"
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
content_type = kwargs.pop("content_type", None)
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/users/{userId}/:send')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'userId': _SERIALIZER.url("user_id", user_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_add_user_to_group_request(
hub, # type: str
group, # type: str
user_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Add a user to the target group.
Add a user to the target group.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param group: Target group name, whose length must be greater than 0 and less than 1025.
:type group: str
:param user_id: Target user Id.
:type user_id: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/users/{userId}/groups/{group}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'group': _SERIALIZER.url("group", group, 'str', max_length=1024, min_length=1),
'userId': _SERIALIZER.url("user_id", user_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="PUT",
url=url,
params=query_parameters,
**kwargs
)
def build_remove_user_from_group_request(
hub, # type: str
group, # type: str
user_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Remove a user from the target group.
Remove a user from the target group.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param group: Target group name, whose length must be greater than 0 and less than 1025.
:type group: str
:param user_id: Target user Id.
:type user_id: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/users/{userId}/groups/{group}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'group': _SERIALIZER.url("group", group, 'str', max_length=1024, min_length=1),
'userId': _SERIALIZER.url("user_id", user_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="DELETE",
url=url,
params=query_parameters,
**kwargs
)
def build_remove_user_from_all_groups_request(
hub, # type: str
user_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Remove a user from all groups.
Remove a user from all groups.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param user_id: Target user Id.
:type user_id: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/users/{userId}/groups')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'userId': _SERIALIZER.url("user_id", user_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="DELETE",
url=url,
params=query_parameters,
**kwargs
)
def build_grant_permission_request(
hub, # type: str
permission, # type: Permissions
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Grant permission to the connection.
Grant permission to the connection.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param permission: The permission: currently supported actions are joinLeaveGroup and
sendToGroup.
:type permission: str or ~Permissions
:param connection_id: Target connection Id.
:type connection_id: str
:keyword target_name: Optional. If not set, grant the permission to all the targets. If set,
grant the permission to the specific target. The meaning of the target depends on the specific
permission.
:paramtype target_name: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
target_name = kwargs.pop('target_name', None) # type: Optional[str]
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/permissions/{permission}/connections/{connectionId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'permission': _SERIALIZER.url("permission", permission, 'str'),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if target_name is not None:
query_parameters['targetName'] = _SERIALIZER.query("target_name", target_name, 'str')
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="PUT",
url=url,
params=query_parameters,
**kwargs
)
def build_revoke_permission_request(
hub, # type: str
permission, # type: Permissions
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Revoke permission for the connection.
Revoke permission for the connection.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param permission: The permission: currently supported actions are joinLeaveGroup and
sendToGroup.
:type permission: str or ~Permissions
:param connection_id: Target connection Id.
:type connection_id: str
:keyword target_name: Optional. If not set, revoke the permission for all targets. If set,
revoke the permission for the specific target. The meaning of the target depends on the
specific permission.
:paramtype target_name: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
target_name = kwargs.pop('target_name', None) # type: Optional[str]
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/permissions/{permission}/connections/{connectionId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'permission': _SERIALIZER.url("permission", permission, 'str'),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if target_name is not None:
query_parameters['targetName'] = _SERIALIZER.query("target_name", target_name, 'str')
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="DELETE",
url=url,
params=query_parameters,
**kwargs
)
def build_check_permission_request(
hub, # type: str
permission, # type: Permissions
connection_id, # type: str
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Check if a connection has permission to the specified action.
Check if a connection has permission to the specified action.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this request builder into your code flow.
:param hub: Target hub name, which should start with alphabetic characters and only contain
alpha-numeric characters or underscore.
:type hub: str
:param permission: The permission: currently supported actions are joinLeaveGroup and
sendToGroup.
:type permission: ~Permissions
:param connection_id: Target connection Id.
:type connection_id: str
:keyword target_name: Optional. If not set, get the permission for all targets. If set, get the
permission for the specific target. The meaning of the target depends on the specific
permission.
:paramtype target_name: str
:keyword api_version: Api Version.
:paramtype api_version: str
:return: Returns an :class:`~azure.messaging.webpubsubservice.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/llcwiki for how to incorporate this response into your code flow.
:rtype: ~azure.messaging.webpubsubservice.core.rest.HttpRequest
"""
target_name = kwargs.pop('target_name', None) # type: Optional[str]
api_version = kwargs.pop('api_version', "2021-05-01-preview") # type: Optional[str]
# Construct URL
url = kwargs.pop("template_url", '/api/hubs/{hub}/permissions/{permission}/connections/{connectionId}')
path_format_arguments = {
'hub': _SERIALIZER.url("hub", hub, 'str', pattern=r'^[A-Za-z][A-Za-z0-9_`,.[\]]{0,127}$'),
'permission': _SERIALIZER.url("permission", permission, 'str'),
'connectionId': _SERIALIZER.url("connection_id", connection_id, 'str', min_length=1),
}
url = _format_url_section(url, **path_format_arguments)
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
if target_name is not None:
query_parameters['targetName'] = _SERIALIZER.query("target_name", target_name, 'str')
if api_version is not None:
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
return HttpRequest(
method="HEAD",
url=url,
params=query_parameters,
**kwargs
)
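# --- hedged usage sketch ------------------------------------------------------
# Grant-then-check pattern using the permission builders above; the ids are
# illustrative, and the requests still need a client's `send_request` to run.
def _demo_permission_requests():
    grant = build_grant_permission_request(
        'chat', 'sendToGroup', 'conn-1', target_name='group-a')
    check = build_check_permission_request(
        'chat', 'sendToGroup', 'conn-1', target_name='group-a')
    return grant.method, check.method        # ('PUT', 'HEAD')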
| 40.695652 | 146 | 0.680269 | 5,008 | 38,376 | 5.073882 | 0.045927 | 0.060213 | 0.014719 | 0.017395 | 0.95793 | 0.942936 | 0.9329 | 0.92137 | 0.92137 | 0.91342 | 0 | 0.009938 | 0.19765 | 38,376 | 942 | 147 | 40.738854 | 0.815303 | 0.498098 | 0 | 0.792056 | 0 | 0 | 0.231045 | 0.10268 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03972 | false | 0 | 0.014019 | 0 | 0.093458 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a5c350d349622433343df766debb1e1d71b98fe4 | 2,538 | py | Python | 008.py | zlsun/ProjectEuler | 813ec545484924a052f1bd7fd90a4c676eea3bba | [
"MIT"
] | null | null | null | 008.py | zlsun/ProjectEuler | 813ec545484924a052f1bd7fd90a4c676eea3bba | [
"MIT"
] | null | null | null | 008.py | zlsun/ProjectEuler | 813ec545484924a052f1bd7fd90a4c676eea3bba | [
"MIT"
] | null | null | null | #-*- encoding: utf-8 -*-
"""
Largest product in a series
The four adjacent digits in the 1000-digit number that have the greatest product are 9 × 9 × 8 × 9 = 5832.
73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450
Find the thirteen adjacent digits in the 1000-digit number that have the greatest product. What is the value of this product?
"""
N = """
73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450
""".replace('\n', '')
from utils import *
l = len(N)
print(max(product(map(int, N[i:i + 13])) for i in range(l - 12)))  # l - 12, not l - 13, so the final 13-digit window is included
# 23514624000
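# A self-contained Python 3 equivalent (a sketch that avoids the star import
# by computing the window product inline):
def largest_product(digits=N, width=13):
    from functools import reduce
    return max(
        reduce(lambda a, b: a * b, (int(c) for c in digits[i:i + width]), 1)
        for i in range(len(digits) - width + 1)
    )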
| 43.016949 | 126 | 0.904255 | 118 | 2,538 | 19.474576 | 0.550847 | 0.012185 | 0.013925 | 0.016536 | 0.922541 | 0.922541 | 0.922541 | 0.922541 | 0.922541 | 0.922541 | 0 | 0.863212 | 0.072498 | 2,538 | 58 | 127 | 43.758621 | 0.111725 | 0.01379 | 0 | 0 | 0 | 0 | 0.888021 | 0.868056 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.04 | null | null | 0.04 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
5735da40651626085cbd1ec0e3799e645ad28990 | 136 | py | Python | engine/gl/__init__.py | alexcher-im/sgemu | e85c6834b1057a27ba5c41c357c0de2336a12e2e | [
"Zlib"
] | 5 | 2020-10-24T12:39:52.000Z | 2021-04-04T22:47:44.000Z | engine/gl/__init__.py | alexcher-im/sgemu | e85c6834b1057a27ba5c41c357c0de2336a12e2e | [
"Zlib"
] | null | null | null | engine/gl/__init__.py | alexcher-im/sgemu | e85c6834b1057a27ba5c41c357c0de2336a12e2e | [
"Zlib"
] | null | null | null | from .base import *
from .mesh import *
from .shader import *
from .shader_buffer import *
from .texture import *
from .window import *
| 19.428571 | 28 | 0.735294 | 19 | 136 | 5.210526 | 0.421053 | 0.505051 | 0.323232 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 136 | 6 | 29 | 22.666667 | 0.883929 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9e07255568706c6ae0b2efb91fb074e108ee6cfd | 2,898 | py | Python | isiscb/isisdata/migrations/0050_auto_20161024_1447.py | bgopalachary/IsisCB | c28e3f504eea60ebeff38318d8bb2071abb28ebb | [
"MIT"
] | 4 | 2016-01-25T20:35:33.000Z | 2020-04-07T15:39:52.000Z | isiscb/isisdata/migrations/0050_auto_20161024_1447.py | bgopalachary/IsisCB | c28e3f504eea60ebeff38318d8bb2071abb28ebb | [
"MIT"
] | 41 | 2015-08-19T17:34:41.000Z | 2022-03-11T23:19:01.000Z | isiscb/isisdata/migrations/0050_auto_20161024_1447.py | bgopalachary/IsisCB | c28e3f504eea60ebeff38318d8bb2071abb28ebb | [
"MIT"
] | 2 | 2020-11-25T20:18:18.000Z | 2021-06-24T15:15:41.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
class Migration(migrations.Migration):
dependencies = [
('isisdata', '0049_auto_20160901_1408'),
]
operations = [
migrations.AlterField(
model_name='authority',
name='classification_system',
field=models.CharField(default=b'SPWC', choices=[(b'SPWT', b'Weldon Thesaurus Terms (2002-present)'), (b'SPWC', b'Weldon Classification System (2002-present)'), (b'GUE', b'Guerlac Committee Classification System (1953-2001)'), (b'NEU', b'Neu'), (b'MW', b'Whitrow Classification System (1913-1999)'), (b'SHOT', b'SHOT Thesaurus Terms'), (b'FHSA', b'Forum for the History of Science in America'), (b'SAC', b'Search App Concept'), (b'PN', b'Proper name')], max_length=4, blank=True, help_text=b'Specifies the classification system that is the source of the authority. Used to group resources by the Classification system. The system used currently is the Weldon System. All the other ones are for reference or archival purposes only.', null=True),
),
migrations.AlterField(
model_name='historicalauthority',
name='classification_system',
field=models.CharField(default=b'SPWC', choices=[(b'SPWT', b'Weldon Thesaurus Terms (2002-present)'), (b'SPWC', b'Weldon Classification System (2002-present)'), (b'GUE', b'Guerlac Committee Classification System (1953-2001)'), (b'NEU', b'Neu'), (b'MW', b'Whitrow Classification System (1913-1999)'), (b'SHOT', b'SHOT Thesaurus Terms'), (b'FHSA', b'Forum for the History of Science in America'), (b'SAC', b'Search App Concept'), (b'PN', b'Proper name')], max_length=4, blank=True, help_text=b'Specifies the classification system that is the source of the authority. Used to group resources by the Classification system. The system used currently is the Weldon System. All the other ones are for reference or archival purposes only.', null=True),
),
migrations.AlterField(
model_name='historicalperson',
name='classification_system',
field=models.CharField(default=b'SPWC', choices=[(b'SPWT', b'Weldon Thesaurus Terms (2002-present)'), (b'SPWC', b'Weldon Classification System (2002-present)'), (b'GUE', b'Guerlac Committee Classification System (1953-2001)'), (b'NEU', b'Neu'), (b'MW', b'Whitrow Classification System (1913-1999)'), (b'SHOT', b'SHOT Thesaurus Terms'), (b'FHSA', b'Forum for the History of Science in America'), (b'SAC', b'Search App Concept'), (b'PN', b'Proper name')], max_length=4, blank=True, help_text=b'Specifies the classification system that is the source of the authority. Used to group resources by the Classification system. The system used currently is the Weldon System. All the other ones are for reference or archival purposes only.', null=True),
),
]
| 96.6 | 756 | 0.699793 | 409 | 2,898 | 4.909535 | 0.229829 | 0.179283 | 0.035857 | 0.043327 | 0.88496 | 0.88496 | 0.88496 | 0.88496 | 0.88496 | 0.88496 | 0 | 0.038017 | 0.164941 | 2,898 | 29 | 757 | 99.931034 | 0.791736 | 0.007246 | 0 | 0.521739 | 0 | 0.130435 | 0.609391 | 0.029913 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.086957 | 0 | 0.217391 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f52d9c0c076e44eb0a4e2208bb1518fdaa5855b7 | 18,296 | py | Python | test_main.py | kirubasankars/kdb3 | 5f9fea1bae5093086692bb9565d57286884ff5fb | [
"Apache-2.0"
] | 6 | 2020-01-24T01:03:44.000Z | 2021-11-06T22:16:21.000Z | test_main.py | kirubasankars/kdb3 | 5f9fea1bae5093086692bb9565d57286884ff5fb | [
"Apache-2.0"
] | 2 | 2020-11-29T02:37:10.000Z | 2021-11-10T22:50:56.000Z | test_main.py | kirubasankars/kdb3 | 5f9fea1bae5093086692bb9565d57286884ff5fb | [
"Apache-2.0"
] | 1 | 2020-01-13T19:21:09.000Z | 2020-01-13T19:21:09.000Z | import random
import requests
DBHOST = "http://localhost:8001"
DBNAME = "testdb"
def test_info():
r = requests.get("{}".format(DBHOST))
assert r.status_code == 200, "status code should be 200"
def delete_database():
requests.delete("{}/{}".format(DBHOST, DBNAME))
def test_create_database():
delete_database()
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 200
rs = r.json()
assert rs["doc_count"] == 1
assert rs["deleted_doc_count"] == 0
assert rs["name"] == DBNAME
assert len(rs["update_seq"]) == 138
r = requests.get("{}/_cat/dbs".format(DBHOST))
assert DBNAME in r.json(), "Failed: get created database"
r = requests.get("{}/{}/_design/_views".format(DBHOST, DBNAME))
assert r.status_code == 200
delete_database()
def test_create_database_with_invalid_name():
DBNAME = "$3213324"
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 400, "Failed: expecting bad request"
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 404, "Failed: expecting not found"
r = requests.get("{}/_cat/dbs".format(DBHOST))
assert DBNAME not in r.json(), "Failed: expecting not found"
def test_create_database_exists():
delete_database()
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201, "Failed: create database"
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 412, "Failed: expecting database already exists"
delete_database()
def test_delete_database():
delete_database()
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201, "Failed: create database"
r = requests.delete("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 200, "Failed: delete database"
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 404, "Failed: delete database"
r = requests.get("{}/_cat/dbs".format(DBHOST))
assert DBNAME not in r.json(), "Failed: expecting not found"
delete_database()
def test_single_insert_documents():
delete_database()
seed_data = []
for x in range(1, 13):
if x < 6:
seed_data.append({"_id": x, "foo": "bar"})
else:
seed_data.append({"foo": "bar"})
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
for seed in seed_data:
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert "_id" in rs
assert "_rev" in rs
seed["_id"] = rs["_id"]
seed["_rev"] = rs["_rev"]
for seed in seed_data:
r = requests.get("{}/{}/{}".format(DBHOST, DBNAME, seed["_id"]), headers={"Content-Type": "application/json"})
assert r.status_code == 200
assert r.json()["_rev"][:2] == "1-"
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 200
rs = r.json()
assert rs["doc_count"] == 13
assert rs["deleted_doc_count"] == 0
delete_database()
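# The seed pattern above recurs in most tests below; a helper like this would
# remove the duplication (a sketch only, the tests keep their inline loops):
def _seed_docs(n=12):
    # First five docs carry explicit ids; the rest get server-assigned ids.
    return [{"_id": x, "foo": "bar"} if x < 6 else {"foo": "bar"}
            for x in range(1, n + 1)]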
def test_single_insert_invalid_documents():
delete_database()
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
r = requests.post("{}/{}".format(DBHOST, DBNAME), json={}, headers={"Content-Type": "application/json"})
assert r.status_code == 200
_id = r.json()["_id"]
r = requests.put("{}/{}/{}".format(DBHOST, DBNAME, _id), json=[], headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=[], headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.post("{}/{}".format(DBHOST, DBNAME), json={"_rev":"1-dfasdfsfsdfsdfasdfasfdsadfsdf"},
headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.put("{}/{}/{}".format(DBHOST, DBNAME, _id), json={"_rev": "1-dfasdfsfsdfsdfasdfasfdsadfsdf"}, headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.post("{}/{}".format(DBHOST, DBNAME), json={"_deleted": True},
headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.put("{}/{}/{}".format(DBHOST, DBNAME, _id), json={"deleted": True}, headers={"Content-Type": "application/json"})
assert r.status_code == 409
delete_database()
def test_conflict_single_insert_update_delete_documents():
delete_database()
seed = {"foo": "bar"}
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
rs = r.json()
assert r.status_code == 200
assert "_id" in rs
assert "_rev" in rs
assert rs["_rev"][:2] == "1-"
_id = rs["_id"]
_rev1 = rs["_rev"]
seed["_id"] = _id
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 409
r = requests.put("{}/{}/{}".format(DBHOST, DBNAME, _id), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 409
seed["_rev"] = _rev1
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert rs["_rev"][:2] == "2-"
_rev2 = rs["_rev"]
seed["_rev"] = _rev2
r = requests.put("{}/{}/{}".format(DBHOST, DBNAME, _id), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert rs["_rev"][:2] == "3-"
_rev3 = rs["_rev"]
seed["_rev"] = _rev3
r = requests.delete("{}/{}/{}?rev={}".format(DBHOST, DBNAME, _id, _rev2))
assert r.status_code == 409
r = requests.delete("{}/{}/{}?rev={}".format(DBHOST, DBNAME, _id, _rev3))
assert r.status_code == 200
r = requests.delete("{}/{}/{}?rev={}".format(DBHOST, DBNAME, _id, _rev3))
assert r.status_code == 409
# delete then re-insert: a new write to a deleted id succeeds and bumps the revision generation
r = requests.post("{}/{}".format(DBHOST, DBNAME), json={"_id": _id})
assert r.status_code == 200
assert r.json()['_rev'][:2] == "5-"
delete_database()
def test_single_update_documents():
delete_database()
seed_data = []
for x in range(1, 13):
if x < 6:
seed_data.append({"_id": x, "foo": "bar"})
else:
seed_data.append({"foo": "bar"})
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
# creating
for seed in seed_data:
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert "_id" in rs
assert "_rev" in rs
seed["_id"] = rs["_id"]
seed["_rev"] = rs["_rev"]
# get
for seed in seed_data:
r = requests.get("{}/{}/{}".format(DBHOST, DBNAME, seed["_id"]), headers={"Content-Type": "application/json"})
rs = r.json()
assert r.status_code == 200
assert rs["_rev"][0] == "1", "failed, expecting version number 1"
# update
for seed in seed_data:
r = requests.post("{}/{}".format(DBHOST, DBNAME), headers={"Content-Type": "application/json"}, json=seed)
assert r.status_code == 200
rs = r.json()
assert "_id" in rs
assert "_rev" in rs
seed["_id"] = rs["_id"]
seed["_rev"] = rs["_rev"]
# get
for seed in seed_data:
r = requests.get("{}/{}/{}".format(DBHOST, DBNAME, seed["_id"]), headers={"Content-Type": "application/json"})
rs = r.json()
assert r.status_code == 200
assert rs["_rev"][0] == "2", "failed, expecting version number 2"
# update
for seed in seed_data:
r = requests.put("{}/{}/{}".format(DBHOST, DBNAME, seed["_id"]), headers={"Content-Type": "application/json"}, json=seed)
assert r.status_code == 200
# get
for seed in seed_data:
r = requests.get("{}/{}/{}".format(DBHOST, DBNAME, seed["_id"]),
headers={"Content-Type": "application/json"})
rs = r.json()
assert r.status_code == 200
assert rs["_rev"][0] == "3", "failed, expecting version number 3"
delete_database()
def test_single_delete_documents():
delete_database()
seed_data = []
for x in range(1, 13):
if x < 6:
seed_data.append({"_id": x, "foo": "bar"})
else:
seed_data.append({"foo": "bar"})
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
for seed in seed_data:
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert "_id" in rs
assert "_rev" in rs
seed["_id"] = rs["_id"]
seed["_rev"] = rs["_rev"]
for seed in seed_data[:6]:
r = requests.delete("{}/{}/{}?rev={}".format(DBHOST, DBNAME, seed["_id"], seed["_rev"]), headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert "_id" in rs
assert "_rev" in rs
seed["_id"] = rs["_id"]
seed["_rev"] = rs["_rev"]
seed = seed_data[6]
seed["_deleted"] = True
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
for seed in seed_data[:7]:
r = requests.get("{}/{}/{}".format(DBHOST, DBNAME, seed["_id"]), headers={"Content-Type": "application/json"})
assert r.status_code == 404
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 200
rs = r.json()
assert rs["doc_count"] == 6
assert rs["deleted_doc_count"] == 7
delete_database()
def test_bulk_insert_documents():
delete_database()
seed_data = []
for x in range(1, 13):
if x < 6:
seed_data.append({"_id": x, "foo": "bar"})
else:
seed_data.append({"foo": "bar"})
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
r = requests.post("{}/{}/_bulk_docs".format(DBHOST, DBNAME), json={"_docs": seed_data}, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
for i in range(len(seed_data)):
seed_data[i]["_id"] = rs[i]["_id"]
seed_data[i]["_rev"] = rs[i]["_rev"]
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 200
rs = r.json()
assert rs["doc_count"] == 13
assert rs["deleted_doc_count"] == 0
r = requests.post("{}/{}/_bulk_docs".format(DBHOST, DBNAME), json=[],
headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.post("{}/{}/_bulk_docs".format(DBHOST, DBNAME), json={},
headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.post("{}/{}/_bulk_docs".format(DBHOST, DBNAME), json={"_docs":[]},
headers={"Content-Type": "application/json"})
assert r.status_code == 400
seed = seed_data.pop()
del seed["_rev"]
r = requests.post("{}/{}/_bulk_docs".format(DBHOST, DBNAME), json={"_docs":[{}, {"_id": "with_id"}, {"_rev": "1"}, seed , seed_data.pop(), seed_data.pop()]},
headers={"Content-Type": "application/json"})
rs = r.json()
assert r.status_code == 200
assert len(rs) == 6
assert rs[0]["_rev"][0] == "1"
assert rs[1]["_rev"][0] == "1"
assert rs[4]["_rev"][0] == "2"
assert rs[5]["_rev"][0] == "2"
assert rs[1]["_id"] == "with_id"
assert rs[2]["error"] == "invalid_rev_id"
assert rs[3]["error"] == "doc_conflict"
delete_database()
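# The _bulk_docs response contract exercised above (inferred from the
# assertions, not from a spec): one result entry per submitted doc, in order.
# New docs are created with a "1-" rev, docs carrying a valid _rev are updated
# to a "2-" rev, a malformed _rev yields "invalid_rev_id", and an existing
# _id without a matching _rev yields "doc_conflict".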
def test_bulk_get_documents():
delete_database()
seed_data = []
for x in range(1, 13):
if x < 6:
seed_data.append({"_id": x, "foo": "bar"})
else:
seed_data.append({"foo": "bar"})
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
for seed in seed_data:
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert "_id" in rs
assert "_rev" in rs
seed["_id"] = rs["_id"]
seed["_rev"] = rs["_rev"]
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 200
rs = r.json()
assert rs["doc_count"] == 13
assert rs["deleted_doc_count"] == 0
req_data = {"_docs": [{"_id": x["_id"]} for x in seed_data] }
r = requests.post("{}/{}/_bulk_gets".format(DBHOST, DBNAME), json=req_data,
headers={"Content-Type": "application/json"})
assert r.status_code == 200
assert len(r.json()) == 12
r = requests.post("{}/{}/_bulk_gets".format(DBHOST, DBNAME), json=[],
headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.post("{}/{}/_bulk_gets".format(DBHOST, DBNAME), json={},
headers={"Content-Type": "application/json"})
assert r.status_code == 400
r = requests.post("{}/{}/_bulk_gets".format(DBHOST, DBNAME), json={"_docs":[]},
headers={"Content-Type": "application/json"})
assert r.status_code == 400
req_data_1 = [req_data["_docs"].pop(), req_data["_docs"].pop(), {"_id": "4234"}]
item = req_data["_docs"].pop()
item["_rev"] = "1-34234234"
req_data_1.append(item)
item = req_data["_docs"].pop()
item["_rev"] = "1-12345678123456781234567812345678"
req_data_1.append(item)
r = requests.post("{}/{}/_bulk_gets".format(DBHOST, DBNAME), json={"_docs": req_data_1},
headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert len(rs) == 5
assert rs[0]["_rev"][:2] == "1-"
assert rs[1]["_rev"][:2] == "1-"
assert rs[2]["error"] == "doc_not_found"
assert rs[3]["error"] == "invalid_rev_id"
assert rs[4]["error"] == "doc_not_found"
delete_database()
def test_get_all_docs():
delete_database()
seed_data = []
for x in range(1, 13):
if x < 6:
seed_data.append({"_id": x, "foo": "bar"})
else:
seed_data.append({"foo": "bar"})
r = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 201
for seed in seed_data:
r = requests.post("{}/{}".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert "_id" in rs
assert "_rev" in rs
seed["_id"] = rs["_id"]
seed["_rev"] = rs["_rev"]
r = requests.get("{}/{}".format(DBHOST, DBNAME))
assert r.status_code == 200
rs = r.json()
assert rs["doc_count"] == 13
assert rs["deleted_doc_count"] == 0
r = requests.get("{}/{}/_all_docs".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert len(rs["rows"]) == 10
assert rs["total_rows"] == 13
assert rs["offset"] == 1
r = requests.get("{}/{}/_all_docs?page=1".format(DBHOST, DBNAME), json=seed, headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert len(rs["rows"]) == 10
assert rs["total_rows"] == 13
assert rs["offset"] == 1
r = requests.get("{}/{}/_all_docs?page=2".format(DBHOST, DBNAME), json=seed,
headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert len(rs["rows"]) == 3
assert rs["total_rows"] == 13
assert rs["offset"] == 11
r = requests.get("{}/{}/_all_docs?page=1&limit=13".format(DBHOST, DBNAME), json=seed,
headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert len(rs["rows"]) == 13
assert rs["total_rows"] == 13
assert rs["offset"] == 1
r = requests.get("{}/{}/_all_docs?limit=13".format(DBHOST, DBNAME), json=seed,
headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert len(rs["rows"]) == 13
assert rs["total_rows"] == 13
assert rs["offset"] == 1
items = {item["id"]: item for item in rs["rows"]}
for seed in seed_data:
assert seed["_id"] in items
assert "_design/_views" in items
r = requests.post("{}/{}".format(DBHOST, DBNAME), json={}, headers={"Content-Type": "application/json"})
assert r.status_code == 200
r = requests.get("{}/{}/_all_docs?limit=13".format(DBHOST, DBNAME), json=seed,
headers={"Content-Type": "application/json"})
assert r.status_code == 200
rs = r.json()
assert len(rs["rows"]) == 13
assert rs["total_rows"] == 14
assert rs["offset"] == 1
delete_database()
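# The pagination contract exercised above (inferred from the assertions):
# pages are 1-based, the default limit is 10 rows per page, and "offset" is
# the 1-based index of the first returned row. A sketch of that relation:
def expected_offset(page, limit=10):
    # offset of the first row on a given page
    return (page - 1) * limit + 1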
def test_curd_design_docs():
delete_database()
r1 = requests.put("{}/{}".format(DBHOST, DBNAME))
assert r1.status_code == 201
r2 = requests.get("{}/{}/_design/_views".format(DBHOST, DBNAME))
assert r2.status_code == 200
r3 = requests.put("{}/{}/_design/_views".format(DBHOST, DBNAME), json=r2.json(),
headers={"Content-Type": "application/json"})
assert r3.status_code == 200
assert r3.json()["_rev"][:2] == "2-"
r3 = requests.post("{}/{}".format(DBHOST, DBNAME), json=r3.json(),
headers={"Content-Type": "application/json"})
assert r3.status_code == 200
assert r3.json()["_rev"][:2] == "3-"
delete_database()
# design docs CRUD
# changes api | 31.653979 | 161 | 0.574279 | 2,350 | 18,296 | 4.29617 | 0.053191 | 0.093899 | 0.133716 | 0.119552 | 0.87708 | 0.826862 | 0.797742 | 0.785955 | 0.750099 | 0.74168 | 0 | 0.031835 | 0.229121 | 18,296 | 578 | 162 | 31.653979 | 0.68399 | 0.004427 | 0 | 0.705742 | 0 | 0 | 0.191904 | 0.012028 | 0 | 0 | 0 | 0 | 0.366029 | 1 | 0.035885 | false | 0 | 0.004785 | 0 | 0.04067 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
19b085a995fb8c1d654588c92f689c231a7b7e15 | 58,596 | py | Python | stochastic_gater.py | jfsantos/ift6266h14 | 1f36c7594cd015a6f3509e6022afc12eb622aa0f | [
"MIT"
] | 8 | 2015-06-27T08:46:51.000Z | 2021-10-08T13:39:18.000Z | stochastic_gater.py | jfsantos/ift6266h14 | 1f36c7594cd015a6f3509e6022afc12eb622aa0f | [
"MIT"
] | null | null | null | stochastic_gater.py | jfsantos/ift6266h14 | 1f36c7594cd015a6f3509e6022afc12eb622aa0f | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Wed Feb 13 19:56:19 2013
@author: Nicholas Léonard
"""
import time, sys
from pylearn2.utils import serial
from itertools import izip
from pylearn2.utils import safe_zip
from collections import OrderedDict
from pylearn2.utils import safe_union
import numpy as np
import theano.sparse as S
from theano.gof.op import get_debug_values
from theano.printing import Print
from theano import function
from theano import config
from theano.sandbox.rng_mrg import MRG_RandomStreams
from theano import tensor as T
import theano
from pylearn2.linear.matrixmul import MatrixMul
from pylearn2.models.model import Model
from pylearn2.utils import sharedX
from pylearn2.costs.cost import Cost
from pylearn2.costs.mlp import Default
from pylearn2.models.mlp import MLP, Softmax, Layer, Linear
from pylearn2.space import VectorSpace, Conv2DSpace, CompositeSpace, Space
#from pylearn2_objects import MLPCost
class Stochastic1Cost(Default):
def get_gradients(self, model, data, ** kwargs):
"""
model: a pylearn2 Model instance
X: a batch in model.get_input_space()
Y: a batch in model.get_output_space()
returns: gradients, updates
gradients:
a dictionary mapping from the model's parameters
to their gradients
The default implementation is to compute the gradients
using T.grad applied to the value returned by __call__.
However, subclasses may return other values for the gradient.
For example, an intractable cost may return a sampling-based
approximation to its gradient.
updates:
a dictionary mapping shared variables to updates that must
be applied to them each time these gradients are computed.
This is to facilitate computation of sampling-based approximate
gradients.
The parameters should never appear in the updates dictionary.
This would imply that computing their gradient changes
their value, thus making the gradient value outdated.
"""
try:
cost = self.expr(model=model, data=data, **kwargs)
except TypeError, e:
# If anybody knows how to add type(self) to the exception message
# but still preserve the stack trace, please do so.
# The current code does neither.
e.message += " while calling "+str(type(self))+".__call__"
print str(type(self))
print e.message
raise e
if cost is None:
raise NotImplementedError(str(type(self))+" represents an intractable "
" cost and does not provide a gradient approximation scheme.")
layers = model.layers
params = [self.z]
grads = T.grad(cost, params, disconnected_inputs = 'raise',
consider_constant=model.get_params())
known_grads = OrderedDict(izip(params, grads))
rupdates = OrderedDict()
rgradients = OrderedDict()
'''In reverse order, get gradients one layer at a time.'''
for layer in reversed(layers):
gradients, updates \
= layer.get_gradients(known_grads.copy(), cost)
known_grads.update(gradients)
rupdates.update(updates)
rgradients.update(gradients)
'''print len(rgradients), len(rupdates), len(known_grads)
for param in model.get_params():
print param.name
print 'grads'
for (param, grad) in rgradients.iteritems():
print param.name, grad'''
return rgradients, rupdates
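# The loop above threads gradients backwards through the model: known_grads
# starts from d(cost)/d(self.z), and each layer, visited in reverse order,
# consumes the gradients known so far and contributes its own parameter
# gradients plus any shared-variable updates (e.g. moving-average updates),
# which are merged before moving one layer down.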
def get_test_cost(self, model, X, Y):
state_below = X
for layer in model.layers:
if hasattr(layer, 'test_fprop'):
state_below = layer.test_fprop(state_below)
else:
state_below = layer.fprop(state_below)
y = state_below
MCE = T.mean(T.cast(T.neq(T.argmax(y, axis=1),
T.argmax(Y, axis=1)), dtype='int32'),
dtype=config.floatX)
return MCE
class StochasticBinaryNeuron(Layer):
"""
Formerly Stochastic2
Stochastic Binary Neuron
A linear layer for the continuous part,
and two layers with stochastic outputs and non-linear hidden units
that generate a binary mask for the outputs of the continuous part.
"""
def __init__(self,
dim,
hidden_dim,
layer_name,
mean_loss_coeff = 0.5,
hidden_activation = 'tanh',
sparsity_target = 0.1,
sparsity_cost_coeff = 1.0,
stoch_grad_coeff = 0.01,
linear_activation = None,
irange = [None,None,None],
istdev = [None,None,None],
sparse_init = [None,None,None],
sparse_stdev = [1.,1.,1.],
init_bias = [0.,0.,0.],
W_lr_scale = [None,None,None],
b_lr_scale = [None,None,None],
max_col_norm = [None,None,None],
weight_decay_coeff = [None,None,None]):
'''
params
------
dim:
number of units on output layer
hidden_dim:
number of units on hidden layer of non-linear part
mean_loss_coeff:
weight given to the past moving average when updating it,
versus the weight given to the average of the current batch.
hidden_activation:
activation function used on hidden layer of non-linear part
sparsity_target:
target sparsity of the output layer.
sparsity_cost_coeff:
coefficient of the sparsity constraint when summing costs
weight_decay_coeff:
coefficients of L2 weight decay when summing costs
other:
in the lists of params, the first index is for the linear
part, while the second and third indices are for the first
and second layer of the non-linear part, respectively
'''
self.__dict__.update(locals())
del self.self
def get_lr_scalers(self):
rval = OrderedDict()
for i in range(3):
if self.W_lr_scale[i] is not None:
rval[self.W[i]] = self.W_lr_scale[i]
if self.b_lr_scale[i] is not None:
rval[self.b[i]] = self.b_lr_scale[i]
return rval
def set_input_space(self, space):
self.input_space = space
if isinstance(space, VectorSpace):
self.requires_reformat = False
self.input_dim = space.dim
else:
self.requires_reformat = True
self.input_dim = space.get_total_dimension()
self.desired_space = VectorSpace(self.input_dim)
self.output_space = VectorSpace(self.dim)
self.input_dims = [self.input_dim, self.input_dim, self.hidden_dim]
self.output_dims = [self.dim, self.hidden_dim, self.dim]
self.W = [None,None,None]
self.b = [None,None,None]
for i in range(3):
self._init_inner_layer(i)
e = 1e-6
self.mean_loss_deno = sharedX(e+np.zeros((self.output_dims[0],)))
self.mean_loss_nume = sharedX(e+np.zeros((self.output_dims[0],)))
self.stoch_grad = sharedX(0)
self.kl_grad = sharedX(0)
self.linear_grad = sharedX(0)
def _init_inner_layer(self, idx):
rng = self.mlp.rng
if self.irange[idx] is not None:
assert self.istdev[idx] is None
assert self.sparse_init[idx] is None
W = rng.uniform(-self.irange[idx], self.irange[idx],
(self.input_dims[idx], self.output_dims[idx]))
elif self.istdev[idx] is not None:
assert self.sparse_init[idx] is None
W = rng.randn(self.input_dims[idx], self.output_dims[idx]) \
* self.istdev[idx]
else:
assert self.sparse_init[idx] is not None
W = np.zeros((self.input_dims[idx], self.output_dims[idx]))
for i in xrange(self.output_dims[idx]):
assert self.sparse_init[idx] <= self.input_dims[idx]
for j in xrange(self.sparse_init[idx]):
idx2 = rng.randint(0, self.input_dims[idx])
while W[idx2, i] != 0:
idx2 = rng.randint(0, self.input_dims[idx])
W[idx2, i] = rng.randn()
W *= self.sparse_stdev[idx]
W = sharedX(W)
W.name = self.layer_name + '_W' + str(idx)
b = sharedX( np.zeros((self.output_dims[idx],)) \
+ self.init_bias[idx], \
name = self.layer_name + '_b' + str(idx))
self.W[idx] = W
self.b[idx] = b
def censor_updates(self, updates):
for idx in range(3):
if self.max_col_norm[idx] is not None:
W = self.W[idx]
if W in updates:
updated_W = updates[W]
col_norms = T.sqrt(T.sum(T.sqr(updated_W), axis=0))
desired_norms = T.clip(col_norms, 0, self.max_col_norm[idx])
updates[W] = updated_W * desired_norms / (1e-7 + col_norms)
def get_params(self):
rval = [self.W[0], self.W[1], self.W[2], self.b[0], self.b[1], self.b[2]]
return rval
def get_weights(self):
rval = []
for i in range(3):
W = self.W[i]
rval.append(W.get_value())
return rval
def set_weights(self, weights):
for i in range(3):
W = self.W[i]
W.set_value(weights[i])
def set_biases(self, biases):
for i in range(3):
self.b[i].set_value(biases[i])
def get_biases(self):
rval = []
for i in range(3):
rval.append(self.b[i].get_value())
return rval
def get_weights_format(self):
return ('v', 'h')
def get_weights_topo(self):
raise NotImplementedError()
def get_monitoring_channels(self):
rval = OrderedDict([
('mean_loss_nume_mean', self.mean_loss_nume.mean()),
('mean_loss_deno_mean', self.mean_loss_deno.mean()),
])
rval['stoch_grad'] = self.stoch_grad
rval['kl_grad'] = self.kl_grad
rval['linear_grad'] = self.linear_grad
for i in range(3):
sq_W = T.sqr(self.W[i])
row_norms = T.sqrt(sq_W.sum(axis=1))
col_norms = T.sqrt(sq_W.sum(axis=0))
rval['row_norms_max'+str(i)] = row_norms.max()
rval['col_norms_max'+str(i)] = col_norms.max()
return rval
def get_monitoring_channels_from_state(self, state, target=None):
rval = OrderedDict()
# sparsity of outputs:
rval['mean_output_sparsity'] = self.m_mean.mean()
# proportion of sigmoids that have prob > 0.5
# good when equal to sparsity
floatX = theano.config.floatX
rval['mean_sparsity_prop'] \
= T.cast(T.gt(self.m_mean, 0.5),floatX).mean()
# same as above but for intermediate thresholds:
rval['mean_sparsity_prop0.2'] \
= T.cast(T.gt(self.m_mean, 0.2),floatX).mean()
rval['mean_sparsity_prop0.3'] \
= T.cast(T.gt(self.m_mean, 0.3),floatX).mean()
rval['mean_sparsity_prop0.4'] \
= T.cast(T.gt(self.m_mean, 0.4),floatX).mean()
# or just plain standard deviation (less is bad):
rval['output_stdev'] = self.m_mean.std()
# stdev of unit stdevs (more is bad)
rval['output_meta_stdev'] = self.m_mean.std(axis=0).std()
# max and min proportion of these probs per unit
prop_per_unit = T.cast(T.gt(self.m_mean, 0.5),floatX).mean(0)
# if this is high, it means a unit is likely always active (bad)
rval['max_unit_sparsity_prop'] = prop_per_unit.max()
rval['min_unit_sparsity_prop'] = prop_per_unit.min()
# in both cases, high means units are popular (bad)
# proportion of units with p>0.5 more than 50% of time:
rval['mean_unit_sparsity_meta_prop'] \
= T.cast(T.gt(prop_per_unit,0.5),floatX).mean()
# proportion of units with p>0.5 more than 75% of time:
rval['mean_unit_sparsity_meta_prop2'] \
= T.cast(T.gt(prop_per_unit,0.75),floatX).mean()
return rval
def fprop(self, state_below, threshold=None, stochastic=True):
self.input_space.validate(state_below)
if self.requires_reformat:
if not isinstance(state_below, tuple):
for sb in get_debug_values(state_below):
if sb.shape[0] != self.dbm.batch_size:
raise ValueError("self.dbm.batch_size is %d but got shape of %d" % (self.dbm.batch_size, sb.shape[0]))
assert reduce(lambda x,y: x * y, sb.shape[1:]) == self.input_dim
state_below = self.input_space.format_as(state_below, self.desired_space)
self.x = state_below
# linear part
if isinstance(self.x, S.SparseVariable):
z = S.dot(self.x,self.W[0]) + self.b[0]
else:
z = T.dot(self.x,self.W[0]) + self.b[0]
self.output_activation = None
# activate hidden units of non-linear part
if self.output_activation is None:
self.z = z
elif self.output_activation == 'tanh':
self.z = T.tanh(z)
elif self.output_activation == 'sigmoid':
self.z = T.nnet.sigmoid(z)
elif self.output_activation == 'softmax':
self.z = T.nnet.softmax(z)
elif self.output_activation == 'rectifiedlinear':
self.z = T.maximum(0, z)
else:
raise NotImplementedError()
# first layer non-linear part
if isinstance(self.x, S.SparseVariable):
h = S.dot(self.x,self.W[1]) + self.b[1]
else:
h = T.dot(self.x,self.W[1]) + self.b[1]
# activate hidden units of non-linear part
if self.hidden_activation is None:
self.h = h
elif self.hidden_activation == 'tanh':
self.h = T.tanh(h)
elif self.hidden_activation == 'sigmoid':
self.h = T.nnet.sigmoid(h)
elif self.hidden_activation == 'softmax':
self.h = T.nnet.softmax(h)
elif self.hidden_activation == 'rectifiedlinear':
self.h = T.maximum(0, h)
else:
raise NotImplementedError()
# second layer non-linear part
self.a = T.dot(self.h,self.W[2]) + self.b[2]
# activate non-linear part to get Bernoulli probabilities
self.m_mean = T.nnet.sigmoid(self.a)
if threshold is None:
if stochastic:
# sample from Bernoulli probs to generate a mask
rng = MRG_RandomStreams(self.mlp.rng.randint(2**15))
self.m = rng.binomial(size = self.m_mean.shape, n = 1,
p = self.m_mean, dtype=self.m_mean.type.dtype)
else:
self.m = self.m_mean
else:
# deterministic mask:
self.m = T.cast(T.gt(self.m_mean, threshold), \
theano.config.floatX)
# mask output of linear part with samples from the non-linear part
self.p = self.m * self.z
if self.layer_name is not None:
self.z.name = self.layer_name + '_z'
self.h.name = self.layer_name + '_h'
self.a.name = self.layer_name + '_a'
self.m_mean.name = self.layer_name + '_m_mean'
self.m.name = self.layer_name + '_m'
self.p.name = self.layer_name + '_p'
return self.p
def test_fprop(self, state_below, threshold=None, stochastic=True):
return self.fprop(state_below, threshold, stochastic)
def cost(self, Y, Y_hat):
return self.cost_from_cost_matrix(self.cost_matrix(Y, Y_hat))
def cost_from_cost_matrix(self, cost_matrix):
return cost_matrix.sum(axis=1).mean()
def cost_matrix(self, Y, Y_hat):
return T.sqr(Y - Y_hat)
def get_gradients(self, known_grads, loss):
'''
Computes gradients and updates for this layer given the known
gradients of the upper layers, and the vector of losses for the
batch.
'''
updates = OrderedDict()
cost = self.get_kl_divergence() + self.get_weight_decay()
# gradient of linear part.
params = [self.W[0], self.b[0]]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.m, self.x],
disconnected_inputs='raise')
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.m, self.x],
disconnected_inputs='ignore')
updates[self.linear_grad] = T.abs_(grads[0]).mean()
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients = OrderedDict(izip(params, grads))
# update moving average loss for each unit where 1 was sampled
loss = loss.dimshuffle(0,'x')
delta = (self.m - self.m_mean)
updates[self.mean_loss_nume] = \
(self.mean_loss_coeff * self.mean_loss_nume) \
+ ((1. - self.mean_loss_coeff) * \
(T.sqr(delta) * loss).mean(axis=0))
updates[self.mean_loss_deno] = \
(self.mean_loss_coeff * self.mean_loss_deno) \
+ ((1. - self.mean_loss_coeff) * \
T.sqr(delta).mean(axis=0))
# gradients of non-linear part.
## obtain a lower-variance unbiased estimator by using
## separate moving averages of the loss for each unit
mean_loss = self.mean_loss_nume/self.mean_loss_deno
known_grads[self.a] = \
self.stoch_grad_coeff \
* delta * (loss - mean_loss.dimshuffle('x',0))
params = [self.W[1],self.W[2],self.b[1],self.b[2]]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.z, self.x],
disconnected_inputs='raise')
updates[self.stoch_grad] = T.abs_(grads[1]).mean()
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.z, self.x],
disconnected_inputs='ignore')
updates[self.kl_grad] = T.abs_(cost_grads[1]).mean()
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients.update(OrderedDict(izip(params, grads)))
return gradients, updates
def get_kl_divergence(self):
'''
Minimize the KL-divergence between each unit's Bernoulli
distribution and a Bernoulli distribution of probability
self.sparsity_target.
This could also be modified to keep a running average of unit
samples.
'''
e = 1e-6
cost = - self.sparsity_cost_coeff * ( \
(self.sparsity_target * T.log(e+self.m_mean.mean(axis=0))) \
+((1.-self.sparsity_target) * T.log(e+(1.-self.m_mean.mean(axis=0)))) \
).sum()
return cost
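# In scalar form, with q_j the batch-mean activation of unit j and
# t = sparsity_target, the cost above is
#   -c * sum_j [ t*log(q_j + e) + (1 - t)*log(1 - q_j + e) ]
# i.e. c times the cross-entropy between Bernoulli(t) and Bernoulli(q_j),
# which equals KL(Bernoulli(t) || Bernoulli(q_j)) up to the constant
# entropy of t; its gradient pushes each q_j toward t.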
def get_weight_decay(self):
rval = 0
for i in range(3):
if self.weight_decay_coeff[i] is not None:
rval += self.weight_decay_coeff[i]*T.sqr(self.W[i]).sum()
return rval
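# A minimal numpy sketch of the forward pass implemented by
# StochasticBinaryNeuron above: a linear "expert" output z is masked by a
# binary sample m drawn from sigmoid gater probabilities. Shapes and the rng
# are illustrative assumptions, not part of the pylearn2 layer.
def _sbn_fprop_sketch(x, W0, b0, W1, b1, W2, b2, rng=np.random):
    z = np.dot(x, W0) + b0                                # linear (expert) part
    h = np.tanh(np.dot(x, W1) + b1)                       # gater hidden layer
    m_mean = 1. / (1. + np.exp(-(np.dot(h, W2) + b2)))    # Bernoulli probs
    m = (rng.uniform(size=m_mean.shape) < m_mean).astype(x.dtype)  # sampled mask
    return m * z                                          # gated output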
class GaterOnly(Layer):
"""
No experts.
Formerly Stochastic3
One tanh layer followed by a stochastic sigmoid layer; both
learn using an unbiased estimator of the gradient.
"""
def __init__(self,
dim,
hidden_dim,
layer_name,
mean_loss_coeff = 0.9,
hidden_activation = 'tanh',
sparsity_target = 0.1,
sparsity_cost_coeff = 1.0,
stoch_grad_coeff = 0.01,
linear_activation = None,
irange = [None,None],
istdev = [None,None],
sparse_init = [None,None],
sparse_stdev = [1.,1.],
init_bias = [0.,0.],
W_lr_scale = [None,None],
b_lr_scale = [None,None],
max_col_norm = [None,None],
weight_decay_coeff = [None,None]):
'''
params
------
dim:
number of units on output layer
hidden_dim:
number of units on hidden layer
mean_loss_coeff:
weight given to the past moving average when updating it,
versus the weight given to the average of the current batch.
hidden_activation:
activation function used on hidden layer of non-linear part
sparsity_target:
target sparsity of the output layer.
sparsity_cost_coeff:
coefficient of the sparsity constraint when summing costs
weight_decay_coeff:
coefficients of L2 weight decay when summing costs
other:
in the lists of params, the first index is for the hidden
(tanh) layer and the second index is for the stochastic
sigmoid output layer.
'''
self.__dict__.update(locals())
del self.self
def get_lr_scalers(self):
rval = OrderedDict()
for i in range(2):
if self.W_lr_scale[i] is not None:
rval[self.W[i]] = self.W_lr_scale[i]
if self.b_lr_scale[i] is not None:
rval[self.b[i]] = self.b_lr_scale[i]
return rval
def set_input_space(self, space):
self.input_space = space
if isinstance(space, VectorSpace):
self.requires_reformat = False
self.input_dim = space.dim
else:
self.requires_reformat = True
self.input_dim = space.get_total_dimension()
self.desired_space = VectorSpace(self.input_dim)
self.output_space = VectorSpace(self.dim)
self.input_dims = [self.input_dim, self.hidden_dim]
self.output_dims = [self.hidden_dim, self.dim]
self.W = [None,None]
self.b = [None,None]
for i in range(2):
self._init_inner_layer(i)
e = 1e-6
self.mean_loss_deno = sharedX(e+np.zeros((self.output_dims[1],)))
self.mean_loss_nume = sharedX(e+np.zeros((self.output_dims[1],)))
self.stoch_grad = sharedX(0)
self.kl_grad = sharedX(0)
def _init_inner_layer(self, idx):
rng = self.mlp.rng
if self.irange[idx] is not None:
assert self.istdev[idx] is None
assert self.sparse_init[idx] is None
W = rng.uniform(-self.irange[idx], self.irange[idx],
(self.input_dims[idx], self.output_dims[idx]))
elif self.istdev[idx] is not None:
assert self.sparse_init[idx] is None
W = rng.randn(self.input_dims[idx], self.output_dims[idx]) \
* self.istdev[idx]
else:
assert self.sparse_init[idx] is not None
W = np.zeros((self.input_dims[idx], self.output_dims[idx]))
for i in xrange(self.output_dims[idx]):
assert self.sparse_init[idx] <= self.input_dims[idx]
for j in xrange(self.sparse_init[idx]):
idx2 = rng.randint(0, self.input_dims[idx])
while W[idx2, i] != 0:
idx2 = rng.randint(0, self.input_dims[idx])
W[idx2, i] = rng.randn()
W *= self.sparse_stdev[idx]
W = sharedX(W)
W.name = self.layer_name + '_W' + str(idx)
b = sharedX( np.zeros((self.output_dims[idx],)) \
+ self.init_bias[idx], \
name = self.layer_name + '_b' + str(idx))
self.W[idx] = W
self.b[idx] = b
def censor_updates(self, updates):
for idx in range(2):
if self.max_col_norm[idx] is not None:
W = self.W[idx]
if W in updates:
updated_W = updates[W]
col_norms = T.sqrt(T.sum(T.sqr(updated_W), axis=0))
desired_norms = T.clip(col_norms, 0, self.max_col_norm[idx])
updates[W] = updated_W * desired_norms / (1e-7 + col_norms)
def get_params(self):
rval = [self.W[0], self.W[1], self.b[0], self.b[1]]
return rval
def get_weights(self):
rval = []
for i in range(2):
W = self.W[i]
rval.append(W.get_value())
return rval
def set_weights(self, weights):
for i in range(2):
W = self.W[i]
W.set_value(weights[i])
def set_biases(self, biases):
for i in range(2):
self.b[i].set_value(biases[i])
def get_biases(self):
rval = []
for i in range(2):
rval.append(self.b[i].get_value())
return rval
def get_weights_format(self):
return ('v', 'h')
def get_weights_topo(self):
raise NotImplementedError()
def get_monitoring_channels(self):
rval = OrderedDict([
('mean_loss_nume_mean', self.mean_loss_nume.mean()),
('mean_loss_deno_mean', self.mean_loss_deno.mean()),
])
rval['stoch_grad'] = self.stoch_grad
rval['kl_grad'] = self.kl_grad
for i in range(2):
sq_W = T.sqr(self.W[i])
row_norms = T.sqrt(sq_W.sum(axis=1))
col_norms = T.sqrt(sq_W.sum(axis=0))
rval['row_norms_max'+str(i)] = row_norms.max()
rval['col_norms_max'+str(i)] = col_norms.max()
return rval
def get_monitoring_channels_from_state(self, state, target=None):
rval = OrderedDict()
# sparsity of outputs:
rval['mean_output_sparsity'] = self.m_mean.mean()
# proportion of sigmoids that have prob > 0.5
# good when equal to sparsity
floatX = theano.config.floatX
rval['mean_sparsity_prop'] \
= T.cast(T.gt(self.m_mean, 0.5),floatX).mean()
# same as above but for intermediate thresholds:
rval['mean_sparsity_prop0.2'] \
= T.cast(T.gt(self.m_mean, 0.2),floatX).mean()
rval['mean_sparsity_prop0.3'] \
= T.cast(T.gt(self.m_mean, 0.3),floatX).mean()
rval['mean_sparsity_prop0.4'] \
= T.cast(T.gt(self.m_mean, 0.4),floatX).mean()
# or just plain standard deviation (less is bad):
rval['output_stdev'] = self.m_mean.std()
# stdev of unit stdevs (more is bad)
rval['output_meta_stdev'] = self.m_mean.std(axis=0).std()
# max and min proportion of these probs per unit
prop_per_unit = T.cast(T.gt(self.m_mean, 0.5),floatX).mean(0)
# if this is high, it means a unit is likely always active (bad)
rval['max_unit_sparsity_prop'] = prop_per_unit.max()
rval['min_unit_sparsity_prop'] = prop_per_unit.min()
# in both cases, high means units are popular (bad)
# proportion of units with p>0.5 more than 50% of time:
rval['mean_unit_sparsity_meta_prop'] \
= T.cast(T.gt(prop_per_unit,0.5),floatX).mean()
# proportion of units with p>0.5 more than 75% of time:
rval['mean_unit_sparsity_meta_prop2'] \
= T.cast(T.gt(prop_per_unit,0.75),floatX).mean()
return rval
def fprop(self, state_below, threshold=None, stochastic=True):
self.input_space.validate(state_below)
if self.requires_reformat:
if not isinstance(state_below, tuple):
for sb in get_debug_values(state_below):
if sb.shape[0] != self.dbm.batch_size:
raise ValueError("self.dbm.batch_size is %d but got shape of %d" % (self.dbm.batch_size, sb.shape[0]))
assert reduce(lambda x,y: x * y, sb.shape[1:]) == self.input_dim
state_below = self.input_space.format_as(state_below, self.desired_space)
self.x = state_below
# first layer
if isinstance(self.x, S.SparseVariable):
h = S.dot(self.x,self.W[0]) + self.b[0]
else:
h = T.dot(self.x,self.W[0]) + self.b[0]
# activate hidden units
if self.hidden_activation is None:
self.h = h
elif self.hidden_activation == 'tanh':
self.h = T.tanh(h)
elif self.hidden_activation == 'sigmoid':
self.h = T.nnet.sigmoid(h)
elif self.hidden_activation == 'softmax':
self.h = T.nnet.softmax(h)
elif self.hidden_activation == 'rectifiedlinear':
self.h = T.maximum(0, h)
else:
raise NotImplementedError()
# second layer
self.a = T.dot(self.h,self.W[1]) + self.b[1]
# activate non-linear part to get Bernoulli probabilities
self.m_mean = T.nnet.sigmoid(self.a)
if threshold is None:
if stochastic:
# sample from Bernoulli probs to generate a mask
rng = MRG_RandomStreams(self.mlp.rng.randint(2**15))
self.m = rng.binomial(size = self.m_mean.shape, n = 1,
p = self.m_mean, dtype=self.m_mean.type.dtype)
'''uniform = rng.uniform(size = self.m_mean.shape, dtype=self.m_mean.type.dtype)
self.m = T.cast(T.gt(uniform,self.m_mean),dtype=theano.config.floatX)'''
else:
self.m = self.m_mean
else:
# deterministic mask:
self.m = T.cast(T.gt(self.m_mean, threshold), \
theano.config.floatX)
if self.layer_name is not None:
self.h.name = self.layer_name + '_h'
self.a.name = self.layer_name + '_a'
self.m_mean.name = self.layer_name + '_m_mean'
self.m.name = self.layer_name + '_m'
return self.m
def test_fprop(self, state_below, threshold=None, stochastic=True):
return self.fprop(state_below, threshold, stochastic)
def cost(self, Y, Y_hat):
return self.cost_from_cost_matrix(self.cost_matrix(Y, Y_hat))
def cost_from_cost_matrix(self, cost_matrix):
return cost_matrix.sum(axis=1).mean()
def cost_matrix(self, Y, Y_hat):
return T.sqr(Y - Y_hat)
def get_gradients(self, known_grads, loss):
'''
Computes gradients and updates for this layer given the vector
of losses for the batch.
'''
updates = OrderedDict()
gradients = OrderedDict()
cost = self.get_kl_divergence() + self.get_weight_decay()
# update moving average loss for each unit where 1 was sampled
loss = loss.dimshuffle(0,'x')
delta = (self.m - self.m_mean)
updates[self.mean_loss_nume] = \
(self.mean_loss_coeff * self.mean_loss_nume) \
+ ((1. - self.mean_loss_coeff) * \
(T.sqr(delta) * loss).mean(axis=0))
updates[self.mean_loss_deno] = \
(self.mean_loss_coeff * self.mean_loss_deno) \
+ ((1. - self.mean_loss_coeff) * \
T.sqr(delta).mean(axis=0))
# gradients of non-linear part.
## obtain a lower-variance unbiased estimator by using
## separate moving averages of the loss for each unit
mean_loss = self.mean_loss_nume/self.mean_loss_deno
known_grads[self.a] = \
self.stoch_grad_coeff \
* delta * (loss - mean_loss.dimshuffle('x',0))
params = [self.W[0],self.W[1],self.b[0],self.b[1]]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.x],
disconnected_inputs='raise')
updates[self.stoch_grad] = T.abs_(grads[0]).mean()
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.x],
disconnected_inputs='ignore')
updates[self.kl_grad] = T.abs_(cost_grads[0]).mean()
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients.update(OrderedDict(izip(params, grads)))
return gradients, updates
def get_kl_divergence(self):
'''
Minimize the KL-divergence between each unit's Bernoulli
distribution and a Bernoulli distribution of probability
self.sparsity_target.
This could also be modified to keep a running average of unit
samples.
'''
e = 1e-6
cost = - self.sparsity_cost_coeff * ( \
(self.sparsity_target * T.log(e+self.m_mean.mean(axis=0))) \
+((1.-self.sparsity_target) * T.log(e+(1.-self.m_mean.mean(axis=0)))) \
).sum()
return cost
def get_weight_decay(self):
rval = 0
for i in range(2):
if self.weight_decay_coeff[i] is not None:
rval += self.weight_decay_coeff[i]*T.sqr(self.W[i]).sum()
return rval
class StochasticSoftmax(Softmax):
def __init__(self, n_classes, layer_name, irange = None,
istdev = None,
sparse_init = None, W_lr_scale = None,
b_lr_scale = None, max_row_norm = None,
no_affine = False,
max_col_norm = None, init_bias_target_marginals= None,
weight_decay_coeff = None):
"""
"""
self.weight_decay_coeff = weight_decay_coeff
Softmax.__init__(self, n_classes, layer_name,irange,istdev,
sparse_init,W_lr_scale,b_lr_scale,max_row_norm,
no_affine,max_col_norm,init_bias_target_marginals)
def get_weight_decay(self):
if self.weight_decay_coeff is None:
return None
return self.weight_decay_coeff * T.sqr(self.W).sum()
def fprop(self, state_below):
self.input_space.validate(state_below)
if self.needs_reformat:
state_below = self.input_space.format_as(state_below, self.desired_space)
for value in get_debug_values(state_below):
if self.mlp.batch_size is not None and value.shape[0] != self.mlp.batch_size:
raise ValueError("state_below should have batch size "+str(self.dbm.batch_size)+" but has "+str(value.shape[0]))
self.desired_space.validate(state_below)
assert state_below.ndim == 2
self.x = state_below
if not hasattr(self, 'no_affine'):
self.no_affine = False
if self.no_affine:
Z = state_below
else:
assert self.W.ndim == 2
b = self.b
Z = T.dot(state_below, self.W) + b
rval = T.nnet.softmax(Z)
for value in get_debug_values(rval):
if self.mlp.batch_size is not None:
assert value.shape[0] == self.mlp.batch_size
return rval
def get_gradients(self, known_grads, loss):
params = self.get_params()
cost = self.get_weight_decay()
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.x],
disconnected_inputs='raise')
if cost is not None:
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.x],
disconnected_inputs='raise')
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients = OrderedDict(izip(params, grads))
return gradients, OrderedDict()
class SparseTanh(Linear):
"""
Implementation of the tanh nonlinearity for MLP.
"""
def _linear_part(self, state_below):
# TODO: Refactor More Better(tm)
self.input_space.validate(state_below)
if self.requires_reformat:
if not isinstance(state_below, tuple):
for sb in get_debug_values(state_below):
if sb.shape[0] != self.dbm.batch_size:
raise ValueError("self.dbm.batch_size is %d but got shape of %d" % (self.dbm.batch_size, sb.shape[0]))
assert reduce(lambda x,y: x * y, sb.shape[1:]) == self.input_dim
state_below = self.input_space.format_as(state_below, self.desired_space)
if self.softmax_columns:
W, = self.transformer.get_params()
W = W.T
W = T.nnet.softmax(W)
W = W.T
z = S.dot(state_below, W) + self.b
else:
W, = self.transformer.get_params()
z = S.dot(state_below, W) + self.b
if self.layer_name is not None:
z.name = self.layer_name + '_z'
if self.copy_input:
z = T.concatenate((z, state_below), axis=1)
return z
def fprop(self, state_below):
p = self._linear_part(state_below)
p = T.tanh(p)
return p
def cost(self, *args, **kwargs):
raise NotImplementedError()
class StraightThrough(Layer):
"""
Formerly Stochastic4
Biased low-variance estimator
Straight-Through
"""
def __init__(self,
dim,
hidden_dim,
layer_name,
hidden_activation = 'tanh',
expert_activation = 'linear',
derive_sigmoid = True,
sparsity_target = 0.1,
sparsity_cost_coeff = 1.0,
irange = [None,None,None],
istdev = [None,None,None],
sparse_init = [None,None,None],
sparse_stdev = [1.,1.,1.],
init_bias = [0.,0.,0.],
W_lr_scale = [None,None,None],
b_lr_scale = [None,None,None],
max_col_norm = [None,None,None],
weight_decay_coeff = [None,None,None]):
'''
params
------
dim:
number of units on output layer
hidden_dim:
number of units on hidden layer of non-linear part
hidden_activation:
activation function used on hidden layer of non-linear part
sparsity_target:
target sparsity of the output layer.
sparsity_cost_coeff:
coefficient of the sparsity constraint when summing costs
weight_decay_coeff:
coefficients of L2 weight decay when summing costs
other:
in the lists of params, the first index is for the linear
part, while the second and third indices are for the first
and second layer of the non-linear part, respectively
'''
self.__dict__.update(locals())
del self.self
def get_lr_scalers(self):
rval = OrderedDict()
for i in range(3):
if self.W_lr_scale[i] is not None:
rval[self.W[i]] = self.W_lr_scale[i]
if self.b_lr_scale[i] is not None:
rval[self.b[i]] = self.b_lr_scale[i]
return rval
def activate(self, x, function_name):
if (function_name is None) or (function_name == 'linear'):
y = x
elif function_name == 'tanh':
y = T.tanh(x)
elif function_name == 'sigmoid':
y = T.nnet.sigmoid(x)
elif function_name == 'softmax':
y = T.nnet.softmax(x)
elif function_name == 'rectifiedlinear':
y = T.maximum(0, x)
elif function_name == 'softplus':
y = T.nnet.softplus(x)
elif function_name == 'rectifiedsoftplus':
y = T.nnet.softplus(T.maximum(0, x))
else:
raise NotImplementedError()
return y
def set_input_space(self, space):
self.input_space = space
if isinstance(space, VectorSpace):
self.requires_reformat = False
self.input_dim = space.dim
else:
self.requires_reformat = True
self.input_dim = space.get_total_dimension()
self.desired_space = VectorSpace(self.input_dim)
self.output_space = VectorSpace(self.dim)
self.input_dims = [self.input_dim, self.input_dim, self.hidden_dim]
self.output_dims = [self.dim, self.hidden_dim, self.dim]
self.W = [None,None,None]
self.b = [None,None,None]
for i in range(3):
self._init_inner_layer(i)
self.stoch_grad = sharedX(0)
self.kl_grad = sharedX(0)
self.linear_grad = sharedX(0)
def _init_inner_layer(self, idx):
rng = self.mlp.rng
if self.irange[idx] is not None:
assert self.istdev[idx] is None
assert self.sparse_init[idx] is None
W = rng.uniform(-self.irange[idx], self.irange[idx],
(self.input_dims[idx], self.output_dims[idx]))
elif self.istdev[idx] is not None:
assert self.sparse_init[idx] is None
W = rng.randn(self.input_dims[idx], self.output_dims[idx]) \
* self.istdev[idx]
else:
assert self.sparse_init[idx] is not None
W = np.zeros((self.input_dims[idx], self.output_dims[idx]))
for i in xrange(self.output_dims[idx]):
assert self.sparse_init[idx] <= self.input_dims[idx]
for j in xrange(self.sparse_init[idx]):
idx2 = rng.randint(0, self.input_dims[idx])
while W[idx2, i] != 0:
idx2 = rng.randint(0, self.input_dims[idx])
W[idx2, i] = rng.randn()
W *= self.sparse_stdev[idx]
W = sharedX(W)
W.name = self.layer_name + '_W' + str(idx)
b = sharedX( np.zeros((self.output_dims[idx],)) \
+ self.init_bias[idx], \
name = self.layer_name + '_b' + str(idx))
self.W[idx] = W
self.b[idx] = b
def censor_updates(self, updates):
for idx in range(3):
if self.max_col_norm[idx] is not None:
W = self.W[idx]
if W in updates:
updated_W = updates[W]
col_norms = T.sqrt(T.sum(T.sqr(updated_W), axis=0))
desired_norms = T.clip(col_norms, 0, self.max_col_norm[idx])
updates[W] = updated_W * desired_norms / (1e-7 + col_norms)
def get_params(self):
rval = [self.W[0], self.W[1], self.W[2], self.b[0], self.b[1], self.b[2]]
return rval
def get_weights(self):
rval = []
for i in range(3):
W = self.W[i]
rval.append(W.get_value())
return rval
def set_weights(self, weights):
for i in range(3):
W = self.W[i]
W.set_value(weights[i])
def set_biases(self, biases):
for i in range(3):
self.b[i].set_value(biases[i])
def get_biases(self):
rval = []
for i in range(3):
rval.append(self.b[i].get_value())
return rval
def get_weights_format(self):
return ('v', 'h')
def get_weights_topo(self):
raise NotImplementedError()
def get_monitoring_channels(self):
rval = OrderedDict()
rval['stoch_grad'] = self.stoch_grad
rval['kl_grad'] = self.kl_grad
rval['linear_grad'] = self.linear_grad
for i in range(3):
sq_W = T.sqr(self.W[i])
row_norms = T.sqrt(sq_W.sum(axis=1))
col_norms = T.sqrt(sq_W.sum(axis=0))
rval['row_norms_max'+str(i)] = row_norms.max()
rval['col_norms_max'+str(i)] = col_norms.max()
return rval
def get_monitoring_channels_from_state(self, state, target=None):
rval = OrderedDict()
# sparsity of outputs:
rval['mean_output_sparsity'] = self.m_mean.mean()
# proportion of sigmoids that have prob > 0.5
# good when equal to sparsity
floatX = theano.config.floatX
rval['mean_sparsity_prop'] \
= T.cast(T.gt(self.m_mean, 0.5),floatX).mean()
# same as above but for intermediate thresholds:
rval['mean_sparsity_prop0.2'] \
= T.cast(T.gt(self.m_mean, 0.2),floatX).mean()
rval['mean_sparsity_prop0.3'] \
= T.cast(T.gt(self.m_mean, 0.3),floatX).mean()
rval['mean_sparsity_prop0.4'] \
= T.cast(T.gt(self.m_mean, 0.4),floatX).mean()
# or just plain standard deviation (less is bad):
rval['output_stdev'] = self.m_mean.std()
# stdev of unit stdevs (more is bad)
rval['output_meta_stdev'] = self.m_mean.std(axis=0).std()
# max and min proportion of these probs per unit
prop_per_unit = T.cast(T.gt(self.m_mean, 0.5),floatX).mean(0)
# if this is high, it means a unit is likely always active (bad)
rval['max_unit_sparsity_prop'] = prop_per_unit.max()
rval['min_unit_sparsity_prop'] = prop_per_unit.min()
# in both cases, high means units are popular (bad)
# proportion of units with p>0.5 more than 50% of time:
rval['mean_unit_sparsity_meta_prop'] \
= T.cast(T.gt(prop_per_unit,0.5),floatX).mean()
# proportion of units with p>0.5 more than 75% of time:
rval['mean_unit_sparsity_meta_prop2'] \
= T.cast(T.gt(prop_per_unit,0.75),floatX).mean()
return rval
def fprop(self, state_below, threshold=None, stochastic=True):
self.input_space.validate(state_below)
if self.requires_reformat:
if not isinstance(state_below, tuple):
for sb in get_debug_values(state_below):
if sb.shape[0] != self.dbm.batch_size:
raise ValueError("self.dbm.batch_size is %d but got shape of %d" % (self.dbm.batch_size, sb.shape[0]))
assert reduce(lambda x,y: x * y, sb.shape[1:]) == self.input_dim
state_below = self.input_space.format_as(state_below, self.desired_space)
self.x = state_below
# linear part
if isinstance(self.x, S.SparseVariable):
z = S.dot(self.x,self.W[0]) + self.b[0]
else:
z = T.dot(self.x,self.W[0]) + self.b[0]
self.z = self.activate(z, self.expert_activation)
# first layer non-linear part
if isinstance(self.x, S.SparseVariable):
h = S.dot(self.x,self.W[1]) + self.b[1]
else:
h = T.dot(self.x,self.W[1]) + self.b[1]
self.h = self.activate(h, self.hidden_activation)
# second layer non-linear part
self.a = T.dot(self.h,self.W[2]) + self.b[2]
# activate non-linear part to get Bernoulli probabilities
self.m_mean = T.nnet.sigmoid(self.a)
if threshold is None:
if stochastic:
# sample from Bernoulli probs to generate a mask
rng = MRG_RandomStreams(self.mlp.rng.randint(2**15))
self.m = rng.binomial(size = self.m_mean.shape, n = 1,
p = self.m_mean, dtype=self.m_mean.type.dtype)
else:
self.m = self.m_mean
else:
# deterministic mask:
self.m = T.cast(T.gt(self.m_mean, threshold), \
theano.config.floatX)
# mask output of linear part with samples from the non-linear part
self.p = self.m * self.z
if self.layer_name is not None:
self.z.name = self.layer_name + '_z'
self.h.name = self.layer_name + '_h'
self.a.name = self.layer_name + '_a'
self.m_mean.name = self.layer_name + '_m_mean'
self.m.name = self.layer_name + '_m'
self.p.name = self.layer_name + '_p'
return self.p
def test_fprop(self, state_below, threshold=None, stochastic=True):
return self.fprop(state_below, threshold, stochastic)
def cost(self, Y, Y_hat):
return self.cost_from_cost_matrix(self.cost_matrix(Y, Y_hat))
def cost_from_cost_matrix(self, cost_matrix):
return cost_matrix.sum(axis=1).mean()
def cost_matrix(self, Y, Y_hat):
return T.sqr(Y - Y_hat)
def get_gradients(self, known_grads, loss):
'''
Computes gradients and updates for this layer given the known
gradients of the upper layers, and the vector of losses for the
batch.
'''
updates = OrderedDict()
cost = self.get_kl_divergence() + self.get_weight_decay()
# gradient of linear part.
params = [self.W[0], self.b[0]]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.m, self.x],
disconnected_inputs='raise')
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.m, self.x],
disconnected_inputs='ignore')
updates[self.linear_grad] = T.abs_(grads[0]).mean()
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients = OrderedDict(izip(params, grads))
# gradients of non-linear part:
## start by getting gradients at binary mask:
params = [self.m]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.m, self.x],
disconnected_inputs='raise')
print "grads at bin", grads
# estimate gradient at sigmoid input using the above:
grad_m = grads[0]
if self.derive_sigmoid:
# multiplying by derivative of sigmoid is optional:
known_grads[self.a] \
= grad_m * self.m_mean * (1. - self.m_mean)
else:
known_grads[self.a] = grad_m
params = [self.W[1],self.W[2],self.b[1],self.b[2]]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.z, self.x],
disconnected_inputs='raise')
updates[self.stoch_grad] = T.abs_(grads[1]).mean()
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.z, self.x],
disconnected_inputs='ignore')
updates[self.kl_grad] = T.abs_(cost_grads[1]).mean()
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients.update(OrderedDict(izip(params, grads)))
return gradients, updates
def get_kl_divergence(self):
'''
Minimize the KL-divergence between each unit's Bernoulli
distribution and a Bernoulli distribution of probability
self.sparsity_target.
This could also be modified to keep a running average of unit
samples.
'''
e = 1e-6
cost = - self.sparsity_cost_coeff * ( \
(self.sparsity_target * T.log(e+self.m_mean.mean(axis=0))) \
+((1.-self.sparsity_target) * T.log(e+(1.-self.m_mean.mean(axis=0)))) \
).sum()
return cost
def get_weight_decay(self):
rval = 0
for i in range(3):
if self.weight_decay_coeff[i] is not None:
rval += self.weight_decay_coeff[i]*T.sqr(self.W[i]).sum()
return rval
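# A minimal sketch of the straight-through idea used by the layer above:
# forward with a hard Bernoulli sample, backward as if the sample were the
# sigmoid itself (optionally scaled by its derivative when
# derive_sigmoid=True). Plain numpy, illustrative shapes assumed.
def _straight_through_grad_sketch(grad_m, m_mean, derive_sigmoid=True):
    # grad_m: gradient arriving at the binary mask m
    if derive_sigmoid:
        return grad_m * m_mean * (1. - m_mean)  # pass through sigmoid derivative
    return grad_m                               # pass through unchanged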
class CurveThrough(StraightThrough):
"""
Like Straight-Through (ST) but with the gradient at a (the pre-sigmoid
activation) multiplied by sqr(b - p), where b is the mask and p is the
sigmoid output; or multiplied by Bin(e)/e, where e is a hyper-parameter
between 0 and 1 and Bin is a binomial distribution.
"""
def __init__(self,
dim,
hidden_dim,
layer_name,
hidden_activation = 'tanh',
expert_activation = 'linear',
derive_sigmoid = True,
curve = 'sqr(b-p)',
curve_noise = None,
sparsity_target = 0.1,
sparsity_cost_coeff = 1.0,
irange = [None,None,None],
istdev = [None,None,None],
sparse_init = [None,None,None],
sparse_stdev = [1.,1.,1.],
init_bias = [0.,0.,0.],
W_lr_scale = [None,None,None],
b_lr_scale = [None,None,None],
max_col_norm = [None,None,None],
weight_decay_coeff = [None,None,None]):
'''
params
------
dim:
number of units on output layer
hidden_dim:
number of units on hidden layer of non-linear part
hidden_activation:
activation function used on hidden layer of non-linear part
sparsity_target:
target sparsity of the output layer.
sparsity_cost_coeff:
coefficient of the sparsity constraint when summing costs
weight_decay_coeff:
coefficients of L2 weight decay when summing costs
other:
in the lists of params, the first index is for the linear
part, while the second and third indices are for the first
and second layer of the non-linear part, respectively
'''
self.__dict__.update(locals())
del self.self
def get_gradients(self, known_grads, loss):
'''
Computes gradients and updates for this layer given the known
gradients of the upper layers, and the vector of losses for the
batch.
'''
updates = OrderedDict()
cost = self.get_kl_divergence() + self.get_weight_decay()
# gradient of linear part.
params = [self.W[0], self.b[0]]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.m, self.x],
disconnected_inputs='raise')
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.m, self.x],
disconnected_inputs='ignore')
updates[self.linear_grad] = T.abs_(grads[0]).mean()
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients = OrderedDict(izip(params, grads))
# gradients of non-linear part:
## start by getting gradients at binary mask:
params = [self.m]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.m, self.x],
disconnected_inputs='raise')
print "grads at bin", grads
# estimate gradient at sigmoid input using the above:
grad_a = grads[0]
if self.derive_sigmoid:
# multiplying by derivative of sigmoid is optional:
grad_a *= self.m_mean * (1. - self.m_mean)
if self.curve == 'sqr(b-p)':
grad_a *= T.sqr(self.m - self.m_mean)
elif self.curve == 'Bin(e)/e':
assert (self.curve_noise is not None)
# sample from e to generate a mask
rng = MRG_RandomStreams(self.mlp.rng.randint(2**15))
curve_mask = rng.binomial(size = grad_a.shape, n = 1,
p = self.curve_noise, dtype=grad_a.type.dtype)
grad_a *= curve_mask / self.curve_noise
known_grads[self.a] = grad_a
params = [self.W[1],self.W[2],self.b[1],self.b[2]]
grads = T.grad(cost=None, wrt=params, known_grads=known_grads,
consider_constant=[self.z, self.x],
disconnected_inputs='raise')
updates[self.stoch_grad] = T.abs_(grads[1]).mean()
cost_grads = T.grad(cost=cost, wrt=params,
consider_constant=[self.z, self.x],
disconnected_inputs='ignore')
updates[self.kl_grad] = T.abs_(cost_grads[1]).mean()
for i in range(len(grads)):
grads[i] += cost_grads[i]
gradients.update(OrderedDict(izip(params, grads)))
return gradients, updates
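# Sketch of the two gradient "curves" described in the CurveThrough docstring,
# applied to the straight-through gradient grad_a (plain numpy, illustrative
# shapes and rng assumed):
def _curve_sketch(grad_a, m, m_mean, curve='sqr(b-p)', e=0.5, rng=np.random):
    if curve == 'sqr(b-p)':
        # damp the gradient where the sample agrees with the sigmoid
        return grad_a * np.square(m - m_mean)
    # Bin(e)/e: dropout-style rescaling, unbiased since E[mask / e] = 1
    mask = (rng.uniform(size=grad_a.shape) < e).astype(grad_a.dtype)
    return grad_a * mask / e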
| 37.251113 | 128 | 0.546095 | 7,499 | 58,596 | 4.10268 | 0.065209 | 0.014627 | 0.017844 | 0.011441 | 0.840083 | 0.824644 | 0.817233 | 0.806832 | 0.801047 | 0.796106 | 0 | 0.011238 | 0.350041 | 58,596 | 1,572 | 129 | 37.274809 | 0.796587 | 0.05625 | 0 | 0.81982 | 0 | 0 | 0.035513 | 0.010296 | 0 | 0 | 0 | 0.000636 | 0.023023 | 0 | null | null | 0 | 0.022022 | null | null | 0.005005 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
5fd8405bb243b79ae611269ad834ed9221ccd00b | 116 | py | Python | tests/utils/test_utils_wsgi.py | bitcaster-io/bitcaster | 9f1bad96e00e3bc78a22451731e231d30662b166 | [
"BSD-3-Clause"
] | 4 | 2018-03-01T10:22:30.000Z | 2020-04-04T16:31:11.000Z | tests/utils/test_utils_wsgi.py | bitcaster-io/bitcaster | 9f1bad96e00e3bc78a22451731e231d30662b166 | [
"BSD-3-Clause"
] | 60 | 2018-05-20T04:42:32.000Z | 2022-02-10T17:03:37.000Z | tests/utils/test_utils_wsgi.py | bitcaster-io/bitcaster | 9f1bad96e00e3bc78a22451731e231d30662b166 | [
"BSD-3-Clause"
] | 1 | 2018-08-04T05:06:45.000Z | 2018-08-04T05:06:45.000Z |
from bitcaster.utils.wsgi import get_client_ip
def test_get_client_ip(rf):
assert get_client_ip(rf.get('/'))
| 16.571429 | 46 | 0.758621 | 20 | 116 | 4.05 | 0.6 | 0.333333 | 0.407407 | 0.320988 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12931 | 116 | 6 | 47 | 19.333333 | 0.80198 | 0 | 0 | 0 | 0 | 0 | 0.008696 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
27146205093a7251495ec54449d20a6ba7dd737e | 215 | py | Python | src/test/python/request/test_http_request.py | photowey/pytest-dynamic-framework | 4e7b6d74594191006b50831d42e7aae21e154d56 | [
"Apache-2.0"
] | null | null | null | src/test/python/request/test_http_request.py | photowey/pytest-dynamic-framework | 4e7b6d74594191006b50831d42e7aae21e154d56 | [
"Apache-2.0"
] | null | null | null | src/test/python/request/test_http_request.py | photowey/pytest-dynamic-framework | 4e7b6d74594191006b50831d42e7aae21e154d56 | [
"Apache-2.0"
] | null | null | null | # -*- coding:utf-8 -*-
# ---------------------------------------------
# @file test_http_request
# @description test_http_request
# @author WcJun
# @date 2021/07/19
# ---------------------------------------------
| 21.5 | 47 | 0.381395 | 17 | 215 | 4.588235 | 0.823529 | 0.205128 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046632 | 0.102326 | 215 | 9 | 48 | 23.888889 | 0.357513 | 0.92093 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
27dbdda2cdbac145472cb45605d72e3d6d4526dd | 10,561 | py | Python | aviso-server/auth/tests/test_backend.py | mpejcoch/aviso | 250b5646220fae85725278b3ca80fed4e15a103a | [
"Apache-2.0"
] | 6 | 2021-02-03T17:55:05.000Z | 2022-02-20T08:05:42.000Z | aviso-server/auth/tests/test_backend.py | mpejcoch/aviso | 250b5646220fae85725278b3ca80fed4e15a103a | [
"Apache-2.0"
] | 1 | 2021-04-26T14:42:39.000Z | 2021-04-26T14:42:39.000Z | aviso-server/auth/tests/test_backend.py | mpejcoch/aviso | 250b5646220fae85725278b3ca80fed4e15a103a | [
"Apache-2.0"
] | 2 | 2021-02-09T15:07:41.000Z | 2021-08-13T09:55:30.000Z | # (C) Copyright 1996- ECMWF.
#
# This software is licensed under the terms of the Apache Licence Version 2.0
# which can be obtained at http://www.apache.org/licenses/LICENSE-2.0.
# In applying this licence, ECMWF does not waive the privileges and immunities
# granted to it by virtue of its status as an intergovernmental organisation
# nor does it submit to any jurisdiction.
import json
import os
import pytest
from aviso_auth import config, logger
from aviso_auth.authorisation import Authoriser
from aviso_auth.backend_adapter import BackendAdapter
from aviso_auth.custom_exceptions import InternalSystemError, InvalidInputError
def conf() -> config.Config:  # this automatically configures the logging
return config.Config(conf_path=os.path.expanduser("~/.aviso-auth/testing/config.yaml"))
class RequestDict(dict):
def __init__(self, *args, **kwargs):
dict.__init__(self, *args, **kwargs)
self.__dict__ = self
def test_backend():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
# prepare request
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
# create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": None,
"min_mod_revision": None,
"max_mod_revision": None,
}
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
resp = backend.forward_impl(request)
assert resp
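# The key/range_end pair above follows the etcd v3 range-read convention: to
# read every key under a prefix, the end of the range is the prefix with its
# last byte incremented, and both ends are base64-encoded in the JSON body.
# A minimal sketch of that computation (assuming an ASCII key whose last byte
# is below 0xff; the real helpers are Authoriser._encode_to_str_base64 and
# Authoriser._incr_last_byte):
import base64

def range_for_prefix(key):
    incremented = key[:-1] + chr(ord(key[-1]) + 1)
    return (base64.b64encode(key.encode()).decode(),
            base64.b64encode(incremented.encode()).decode())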
def test_not_existing_dest():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL/not_existing"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
# create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": None,
"min_mod_revision": None,
"max_mod_revision": None,
}
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
resp = backend.forward_impl(request)
assert resp
assert json.loads(resp.decode()) != {}
assert json.loads(resp.decode()).get("kvs") is None
def test_bad_etcd_format_request():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
# create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": "aa", # this returns a 400 as it's expecting a number
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": None,
"min_mod_revision": None,
"max_mod_revision": None,
}
    # prepare request
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
    with pytest.raises(InternalSystemError):
        backend.forward_impl(request)
def test_bad_etcd_request_value():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
    # create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEYY", # this returns a 400 as it's an unknown value
"keys_only": False,
"revision": None,
"min_mod_revision": None,
"max_mod_revision": None,
}
    # prepare request
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
    with pytest.raises(InternalSystemError):
        backend.forward_impl(request)
def test_future_rev():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
    # create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": 10000000000, # this returns a 400 as it's a future revision
"min_mod_revision": None,
"max_mod_revision": None,
}
    # prepare request
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
    with pytest.raises(InternalSystemError):
        backend.forward_impl(request)
@pytest.mark.skip()
def test_compacted_rev():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
    # create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": 72484, # this returns a 400 as it's a compacted revision
"min_mod_revision": None,
"max_mod_revision": None,
}
    # prepare request
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
    with pytest.raises(InvalidInputError):
        backend.forward_impl(request)
def test_range_rev():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
    # create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": None,
"min_mod_revision": 10, # this is a compacted revision
"max_mod_revision": 100000000000, # this is a future revision
}
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
resp = backend.forward_impl(request)
assert resp
assert json.loads(resp.decode()) != {}
def test_range_compacted():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
    # create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": None,
"min_mod_revision": 10, # this is a compacted revision
"max_mod_revision": 90, # this is a compacted revision
}
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
resp = backend.forward_impl(request)
assert resp
assert json.loads(resp.decode()) != {}
def test_range_future():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
    # create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": None,
"min_mod_revision": 1000000000, # this is a future revision
"max_mod_revision": 100000000000, # this is a future revision
}
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
resp = backend.forward_impl(request)
assert resp
assert json.loads(resp.decode()) != {}
def test_from_compacted_rev():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
key = "/ec/diss/SCL"
# encode key
encoded_key = Authoriser._encode_to_str_base64(key)
range_end = Authoriser._encode_to_str_base64(str(Authoriser._incr_last_byte(key), "utf-8"))
    # create the body for the get range on the etcd server
body = {
"key": encoded_key,
"range_end": range_end,
"limit": 100,
"sort_order": "DESCEND",
"sort_target": "KEY",
"keys_only": False,
"revision": None,
"min_mod_revision": 10, # this is a compacted revision
"max_mod_revision": None,
}
request = RequestDict(data=json.dumps(body))
# make the call
backend = BackendAdapter(conf())
resp = backend.forward_impl(request)
assert resp
assert json.loads(resp.decode()) != {}
def test_no_body():
logger.debug(os.environ.get("PYTEST_CURRENT_TEST").split(":")[-1].split(" ")[0])
    # prepare request
request = RequestDict(data=None)
# make the call
backend = BackendAdapter(conf())
    with pytest.raises(InvalidInputError):
        backend.forward_impl(request)
| 33.003125 | 95 | 0.643878 | 1,372 | 10,561 | 4.730321 | 0.124636 | 0.03698 | 0.040062 | 0.064715 | 0.849307 | 0.840986 | 0.835901 | 0.829738 | 0.823267 | 0.823267 | 0 | 0.021702 | 0.227725 | 10,561 | 319 | 96 | 33.106583 | 0.774031 | 0.15633 | 0 | 0.813559 | 0 | 0 | 0.160736 | 0.006551 | 0 | 0 | 0 | 0 | 0.072034 | 1 | 0.055085 | false | 0 | 0.029661 | 0.004237 | 0.09322 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
27f4dedff13db2a12438c374e364390a89d243d1 | 229 | py | Python | zeynep/verification/models/__init__.py | realsuayip/zeynep | 76d525fa529b92525f5e45d42219279bdfcd1125 | [
"BSD-3-Clause"
] | null | null | null | zeynep/verification/models/__init__.py | realsuayip/zeynep | 76d525fa529b92525f5e45d42219279bdfcd1125 | [
"BSD-3-Clause"
] | 1 | 2022-03-24T19:02:23.000Z | 2022-03-29T20:40:36.000Z | zeynep/verification/models/__init__.py | realsuayip/zeynep | 76d525fa529b92525f5e45d42219279bdfcd1125 | [
"BSD-3-Clause"
] | null | null | null | # flake8: noqa
from zeynep.verification.models.email import EmailVerification
from zeynep.verification.models.password import PasswordResetVerification
from zeynep.verification.models.registration import RegistrationVerification
| 45.8 | 76 | 0.886463 | 23 | 229 | 8.826087 | 0.565217 | 0.147783 | 0.325123 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004673 | 0.065502 | 229 | 4 | 77 | 57.25 | 0.943925 | 0.052402 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 8 |
fd65b1cd9424a0116dc7a6280f51bac4eae8826b | 14,402 | py | Python | matchzoo/dataloader/callbacks/padding.py | ChrisRBXiong/MatchZoo-py | 8883d0933a62610d71fec0215dce643630e03b1c | [
"Apache-2.0"
] | null | null | null | matchzoo/dataloader/callbacks/padding.py | ChrisRBXiong/MatchZoo-py | 8883d0933a62610d71fec0215dce643630e03b1c | [
"Apache-2.0"
] | null | null | null | matchzoo/dataloader/callbacks/padding.py | ChrisRBXiong/MatchZoo-py | 8883d0933a62610d71fec0215dce643630e03b1c | [
"Apache-2.0"
] | null | null | null | import typing
import numpy as np
from matchzoo.engine.base_callback import BaseCallback
class BasicPadding(BaseCallback):
"""
Pad data for basic preprocessor.
:param fixed_length_left: Integer. If set, `text_left` will be padded
to this length.
:param fixed_length_right: Integer. If set, `text_right` will be padded
to this length.
:param pad_value: the value to fill text.
:param pad_mode: String, `pre` or `post`:
pad either before or after each sequence.
"""
def __init__(
self,
fixed_length_left: int = None,
fixed_length_right: int = None,
pad_value: typing.Union[int, str] = 0,
pad_mode: str = 'pre',
):
"""Init."""
self._fixed_length_left = fixed_length_left
self._fixed_length_right = fixed_length_right
self._pad_value = pad_value
self._pad_mode = pad_mode
def on_batch_unpacked(self, x: dict, y: np.ndarray):
"""Pad `x['text_left']` and `x['text_right]`."""
pad_length_left = 0
pad_length_right = 0
batch_size = len(x['id_left'])
max_length_left = max(x['length_left'])
max_length_right = max(x['length_right'])
if self._fixed_length_left is None:
pad_length_left = max_length_left
else:
pad_length_left = self._fixed_length_left
if self._fixed_length_right is None:
pad_length_right = max_length_right
else:
pad_length_right = self._fixed_length_right
for key, value in x.items():
if key != 'text_left' and key != 'text_right':
continue
elif key == 'text_left':
padded_value = np.full([batch_size, pad_length_left],
self._pad_value, dtype=value.dtype)
if self._pad_mode == 'post':
for i in range(len(value)):
end_pos = min(len(value[i]), pad_length_left)
if end_pos > 0:
padded_value[i][:end_pos] = value[i][:end_pos]
elif self._pad_mode == 'pre':
for i in range(len(value)):
start_pos = min(len(value[i]), pad_length_left)
if start_pos > 0:
padded_value[i][-start_pos:] = \
value[i][-start_pos:]
else:
                    raise ValueError('{} is not a valid '
'pad mode.'.format(self._pad_mode))
else: # key == 'text_right'
padded_value = np.full([batch_size, pad_length_right],
self._pad_value, dtype=value.dtype)
if self._pad_mode == 'post':
for i in range(len(value)):
end_pos = min(len(value[i]), pad_length_right)
if end_pos > 0:
padded_value[i][:end_pos] = value[i][:end_pos]
elif self._pad_mode == 'pre':
for i in range(len(value)):
start_pos = min(len(value[i]), pad_length_right)
                        if start_pos > 0:  # match the other branches; avoids a shape mismatch when start_pos == 0
padded_value[i][-start_pos:] = \
value[i][-start_pos:]
else:
                    raise ValueError('{} is not a valid '
'pad mode.'.format(self._pad_mode))
x[key] = padded_value
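# Usage sketch (hypothetical variable names): the callback pads a batch in
# place after the dataloader unpacks it, e.g.
#   padding = BasicPadding(fixed_length_left=10, fixed_length_right=40,
#                          pad_value=0, pad_mode='pre')
#   padding.on_batch_unpacked(x_batch, y_batch)
#   # x_batch['text_left'].shape == (batch_size, 10)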
class DRMMPadding(BaseCallback):
"""
Pad data for DRMM Model.
:param fixed_length_left: Integer. If set, `text_left` and
`match_histogram` will be padded to this length.
:param fixed_length_right: Integer. If set, `text_right` will be padded
to this length.
:param pad_value: the value to fill text.
:param pad_mode: String, `pre` or `post`:
pad either before or after each sequence.
"""
def __init__(
self,
fixed_length_left: int = None,
fixed_length_right: int = None,
pad_value: typing.Union[int, str] = 0,
pad_mode: str = 'pre',
):
"""Init."""
self._fixed_length_left = fixed_length_left
self._fixed_length_right = fixed_length_right
self._pad_value = pad_value
self._pad_mode = pad_mode
def on_batch_unpacked(self, x: dict, y: np.ndarray):
"""
Padding.
        Pad `x['text_left']`, `x['text_right']` and `x['match_histogram']`.
"""
pad_length_left = 0
pad_length_right = 0
batch_size = len(x['id_left'])
max_length_left = max(x['length_left'])
max_length_right = max(x['length_right'])
bin_size = len(x['match_histogram'][0][0])
if self._fixed_length_left is None:
pad_length_left = max_length_left
else:
pad_length_left = self._fixed_length_left
if self._fixed_length_right is None:
pad_length_right = max_length_right
else:
pad_length_right = self._fixed_length_right
for key, value in x.items():
if key != 'text_left' and key != 'text_right' and \
key != 'match_histogram':
continue
elif key == 'text_left':
padded_value = np.full([batch_size, pad_length_left],
self._pad_value, dtype=value.dtype)
if self._pad_mode == 'post':
for i in range(len(value)):
end_pos = min(len(value[i]), pad_length_left)
if end_pos > 0:
padded_value[i][:end_pos] = value[i][:end_pos]
elif self._pad_mode == 'pre':
for i in range(len(value)):
start_pos = min(len(value[i]), pad_length_left)
if start_pos > 0:
padded_value[i][-start_pos:] = \
value[i][-start_pos:]
else:
                    raise ValueError('{} is not a valid '
'pad mode.'.format(self._pad_mode))
elif key == 'text_right':
padded_value = np.full([batch_size, pad_length_right],
self._pad_value, dtype=value.dtype)
if self._pad_mode == 'post':
for i in range(len(value)):
end_pos = min(len(value[i]), pad_length_right)
if end_pos > 0:
padded_value[i][:end_pos] = value[i][:end_pos]
elif self._pad_mode == 'pre':
for i in range(len(value)):
start_pos = min(len(value[i]), pad_length_right)
if start_pos > 0:
padded_value[i][-start_pos:] = \
value[i][-start_pos:]
else:
                    raise ValueError('{} is not a valid '
'pad mode.'.format(self._pad_mode))
else: # key == 'match_histogram'
padded_value = np.full(
[batch_size, pad_length_left, bin_size],
self._pad_value, dtype=value.dtype)
if self._pad_mode == 'post':
for i in range(len(value)):
end_pos = min(len(value[i]), pad_length_left)
if end_pos > 0:
padded_value[i][:end_pos] = value[i][:end_pos]
elif self._pad_mode == 'pre':
for i in range(len(value)):
start_pos = min(len(value[i]), pad_length_left)
if start_pos > 0:
padded_value[i][-start_pos:] = \
value[i][-start_pos:]
else:
                    raise ValueError('{} is not a valid '
'pad mode.'.format(self._pad_mode))
x[key] = padded_value
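# Shape note (sketch, hypothetical names): `match_histogram` arrives as
# [batch, left_len, bin_size] and is padded along the left-length axis only,
# so it stays aligned with the padded `text_left`, e.g.
#   DRMMPadding(fixed_length_left=10).on_batch_unpacked(x_batch, y_batch)
#   # x_batch['match_histogram'].shape == (batch_size, 10, bin_size)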
class CDSSMPadding(BaseCallback):
"""
Pad data for cdssm preprocessor.
:param fixed_length_left: Integer. If set, `text_left` will be padded
to this length.
:param fixed_length_right: Integer. If set, `text_right` will be padded
to this length.
:param pad_value: the value to fill text.
:param pad_mode: String, `pre` or `post`:
pad either before or after each sequence.
"""
def __init__(
self,
fixed_length_left: int = None,
fixed_length_right: int = None,
pad_value: typing.Union[int, str] = 0,
pad_mode: str = 'pre',
):
"""Init."""
self._fixed_length_left = fixed_length_left
self._fixed_length_right = fixed_length_right
self._pad_value = pad_value
self._pad_mode = pad_mode
def on_batch_unpacked(self, x: dict, y: np.ndarray):
"""Pad `x['text_left']` and `x['text_right]`."""
pad_length_left = 0
pad_length_right = 0
batch_size = len(x['id_left'])
max_length_left = max(x['length_left'])
max_length_right = max(x['length_right'])
vocab_size = len(x['text_left'][0][0])
if self._fixed_length_left is None:
pad_length_left = max_length_left
else:
pad_length_left = self._fixed_length_left
if self._fixed_length_right is None:
pad_length_right = max_length_right
else:
pad_length_right = self._fixed_length_right
for key, value in x.items():
if key == 'text_left':
padded_value = np.full(
[batch_size, pad_length_left, vocab_size],
fill_value=0, dtype=value.dtype)
if self._pad_mode == 'post':
for i in range(batch_size):
left_len = np.array(value[i]).shape[0]
end_pos = min(left_len, pad_length_left)
if end_pos > 0:
padded_value[i][:end_pos] = value[i][:end_pos]
if end_pos < pad_length_left:
padded_value[i, end_pos:, self._pad_value] = \
[1] * (pad_length_left - end_pos)
elif self._pad_mode == 'pre':
for i in range(batch_size):
left_len = np.array(value[i]).shape[0]
start_pos = min(left_len, pad_length_left)
if start_pos > 0:
padded_value[i][-start_pos:] = \
value[i][-start_pos:]
if start_pos < pad_length_left:
padded_value[i, :-start_pos, self._pad_value] = \
[1] * (pad_length_left - start_pos)
else:
                    raise ValueError('{} is not a valid '
'pad mode.'.format(self._pad_mode))
elif key == 'text_right':
padded_value = np.full(
[batch_size, pad_length_right, vocab_size],
fill_value=0, dtype=value.dtype)
if self._pad_mode == 'post':
for i in range(batch_size):
right_len = np.array(value[i]).shape[0]
end_pos = min(right_len, pad_length_right)
if end_pos > 0:
padded_value[i][:end_pos] = value[i][:end_pos]
if end_pos < pad_length_right:
padded_value[i, end_pos:, self._pad_value] = \
[1] * (pad_length_right - end_pos)
elif self._pad_mode == 'pre':
for i in range(batch_size):
right_len = np.array(value[i]).shape[0]
start_pos = min(right_len, pad_length_right)
if start_pos > 0:
padded_value[i][-start_pos:] = \
value[i][-start_pos:]
if start_pos < pad_length_right:
padded_value[i, :-start_pos, self._pad_value] = \
[1] * (pad_length_right - start_pos)
else:
                    raise ValueError('{} is not a valid '
'pad mode.'.format(self._pad_mode))
else:
continue
x[key] = padded_value
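# One-hot padding sketch: CDSSM text tensors carry a trailing vocab axis, so a
# padded position is not a scalar fill but a one-hot vector with a 1 at index
# `pad_value`. A minimal numpy illustration (assuming vocab_size=4, pad index 0):
#   pad_row = np.zeros(4); pad_row[0] = 1   # -> array([1., 0., 0., 0.])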
class BertPadding(BaseCallback):
"""
Pad data for bert preprocessor.
:param fixed_length_left: Integer. If set, `text_left` will be padded
to this length.
:param fixed_length_right: Integer. If set, `text_right` will be padded
to this length.
:param pad_value: the value to fill text.
:param pad_mode: String, `pre` or `post`:
pad either before or after each sequence.
"""
def __init__(
self,
fixed_length_left: int = None,
fixed_length_right: int = None,
pad_value: typing.Union[int, str] = 0,
pad_mode: str = 'pre',
):
"""Init."""
self._padding = BasicPadding(fixed_length_left=fixed_length_left,
fixed_length_right=fixed_length_right,
pad_value=pad_value,
pad_mode=pad_mode)
def on_batch_unpacked(self, x: dict, y: np.ndarray):
"""Pad `x['text_left']` and `x['text_right]`."""
self._padding.on_batch_unpacked(x, y)
        x['text_left'] = np.insert(x['text_left'], 0, 101, axis=1)  # prepend [CLS] (id 101)
        x['text_right'] = np.insert(x['text_right'], 0, 102, axis=1)  # prepend [SEP] (id 102)
        SEP = [[102]] * len(x['text_right'])
        x['text_right'] = np.append(x['text_right'], SEP, axis=1)  # append trailing [SEP]
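        # Resulting layout (ids 101/102 are [CLS]/[SEP] in the standard BERT
        # vocabulary): text_left becomes [CLS] + A and text_right becomes
        # [SEP] + B + [SEP], so concatenating the two on the model side yields
        # the usual "[CLS] A [SEP] B [SEP]" pair encoding.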
| 42.863095 | 78 | 0.487918 | 1,656 | 14,402 | 3.927536 | 0.064614 | 0.086101 | 0.049969 | 0.02952 | 0.908518 | 0.905443 | 0.89591 | 0.893143 | 0.885763 | 0.873616 | 0 | 0.006154 | 0.413276 | 14,402 | 335 | 79 | 42.991045 | 0.76355 | 0.116095 | 0 | 0.849206 | 0 | 0 | 0.043768 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.031746 | false | 0 | 0.011905 | 0 | 0.059524 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fd6d1a0dad7ae2801f26b51cf285f7cff5b4f731 | 107,493 | py | Python | catalog/migrations/0001_initial.py | ruettet/StageBrainz | 9e89adf1d55119b49b55560411c3ea8a27f9a6ab | [
"Apache-2.0"
] | 3 | 2019-11-03T10:21:56.000Z | 2022-02-22T15:08:34.000Z | catalog/migrations/0001_initial.py | ruettet/StageBrainz | 9e89adf1d55119b49b55560411c3ea8a27f9a6ab | [
"Apache-2.0"
] | 62 | 2019-10-29T07:29:21.000Z | 2022-02-10T11:40:52.000Z | catalog/migrations/0001_initial.py | ruettet/StageBrainz | 9e89adf1d55119b49b55560411c3ea8a27f9a6ab | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.2.6 on 2019-12-02 20:25
from django.db import migrations, models
import django.db.models.deletion
import partial_date.fields
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='EntityCharacter',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_type_str', models.CharField(blank=True, max_length=200, null=True)),
('name', models.CharField(default='', max_length=200)),
('sort_name', models.CharField(default='', max_length=200)),
('disambiguation', models.CharField(default='', max_length=200)),
],
options={
'ordering': ['sort_name'],
'abstract': False,
},
),
migrations.CreateModel(
name='EntityCharacterAliasType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
],
),
migrations.CreateModel(
name='EntityCharacterType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityGenre',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_type_str', models.CharField(blank=True, max_length=200, null=True)),
('name', models.CharField(default='', max_length=200)),
('sort_name', models.CharField(default='', max_length=200)),
('disambiguation', models.CharField(default='', max_length=200)),
],
options={
'ordering': ['sort_name'],
'abstract': False,
},
),
migrations.CreateModel(
name='EntityGenreAliasType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityGenreType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityOrganity',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_type_str', models.CharField(blank=True, max_length=200, null=True)),
('name', models.CharField(default='', max_length=200)),
('sort_name', models.CharField(default='', max_length=200)),
('disambiguation', models.CharField(default='', max_length=200)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
],
options={
'ordering': ['sort_name'],
'abstract': False,
},
),
migrations.CreateModel(
name='EntityOrganityAliasType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityOrganityType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityProduction',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_type_str', models.CharField(blank=True, max_length=200, null=True)),
('name', models.CharField(default='', max_length=200)),
('sort_name', models.CharField(default='', max_length=200)),
('disambiguation', models.CharField(default='', max_length=200)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
],
options={
'ordering': ['sort_name'],
'abstract': False,
},
),
migrations.CreateModel(
name='EntityProductionType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityShow',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_type_str', models.CharField(blank=True, max_length=200, null=True)),
('name', models.CharField(default='', max_length=200)),
('sort_name', models.CharField(default='', max_length=200)),
('disambiguation', models.CharField(default='', max_length=200)),
('when_date', partial_date.fields.PartialDateField(blank=True, null=True)),
('when_time', models.TimeField(blank=True, null=True)),
],
options={
'ordering': ['-when_date', '-when_time'],
},
),
migrations.CreateModel(
name='EntityShowType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_type_str', models.CharField(blank=True, max_length=200, null=True)),
('disambiguation', models.CharField(default='', max_length=200)),
('name', models.URLField()),
],
options={
'ordering': ['name'],
},
),
migrations.CreateModel(
name='EntityUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityWork',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_type_str', models.CharField(blank=True, max_length=200, null=True)),
('name', models.CharField(default='', max_length=200)),
('sort_name', models.CharField(default='', max_length=200)),
('disambiguation', models.CharField(default='', max_length=200)),
('start_date', models.CharField(default='', max_length=10)),
('isbn', models.CharField(blank=True, max_length=20, null=True)),
],
options={
'ordering': ['sort_name'],
'abstract': False,
},
),
migrations.CreateModel(
name='EntityWorkAliasType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
],
),
migrations.CreateModel(
name='EntityWorkType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Locale',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('abbreviation', models.CharField(max_length=10)),
('name', models.CharField(max_length=100)),
('description', models.CharField(max_length=200)),
],
),
migrations.CreateModel(
name='RelationCharacterCharacterType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationCharacterGenreType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationCharacterUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationGenreGenreType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationGenreUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityCharacterType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityGenreType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityOrganityType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityWorkType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionCharacterType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionGenreType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionOrganityType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionProductionType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionWorkType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowCharacterType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowGenreType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowOrganityType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowProductionType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowShowType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowWorkType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationUrlUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationWorkCharacterType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationWorkGenreType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationWorkUrlType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationWorkWorkType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('inverted_name', models.CharField(max_length=200)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Season',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
],
),
migrations.CreateModel(
name='RelationWorkWork',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkwork_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkwork_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkwork_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkwork_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkwork_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='worka', to='catalog.EntityWork')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='workb', to='catalog.EntityWork')),
('relation_type', models.ManyToManyField(to='catalog.RelationWorkWorkType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationWorkUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkurl_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkurl_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkurl_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkurl_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkurl_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWork')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityUrl')),
('relation_type', models.ManyToManyField(to='catalog.RelationWorkUrlType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationWorkGenre',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkgenre_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkgenre_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkgenre_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkgenre_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkgenre_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWork')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenre')),
('relation_type', models.ManyToManyField(to='catalog.RelationWorkGenreType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationWorkCharacter',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkcharacter_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkcharacter_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkcharacter_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkcharacter_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationworkcharacter_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWork')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacter')),
('relation_type', models.ManyToManyField(to='catalog.RelationWorkCharacterType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationUrlUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationurlurl_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationurlurl_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationurlurl_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationurlurl_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationurlurl_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='urla', to='catalog.EntityUrl')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='urlb', to='catalog.EntityUrl')),
('relation_type', models.ManyToManyField(to='catalog.RelationUrlUrlType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowWork',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowwork_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowwork_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowwork_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowwork_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowwork_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityShow')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWork')),
('relation_type', models.ManyToManyField(to='catalog.RelationShowWorkType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowurl_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowurl_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowurl_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowurl_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowurl_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityShow')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityUrl')),
('relation_type', models.ManyToManyField(to='catalog.RelationShowUrlType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowShow',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowshow_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowshow_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowshow_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowshow_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowshow_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='showa', to='catalog.EntityShow')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='showb', to='catalog.EntityShow')),
('relation_type', models.ManyToManyField(to='catalog.RelationShowShowType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowProduction',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowproduction_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowproduction_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowproduction_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowproduction_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowproduction_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityShow')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityProduction')),
('relation_type', models.ManyToManyField(to='catalog.RelationShowProductionType')),
],
options={
'ordering': ['entity_a'],
},
),
migrations.CreateModel(
name='RelationShowOrganity',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshoworganity_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshoworganity_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshoworganity_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshoworganity_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshoworganity_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityShow')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganity')),
('relation_type', models.ManyToManyField(to='catalog.RelationShowOrganityType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowGenre',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowgenre_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowgenre_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowgenre_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowgenre_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowgenre_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityShow')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenre')),
('relation_type', models.ManyToManyField(to='catalog.RelationShowGenreType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationShowCharacter',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowcharacter_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowcharacter_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowcharacter_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowcharacter_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationshowcharacter_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityShow')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacter')),
('relation_type', models.ManyToManyField(to='catalog.RelationShowCharacterType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionWork',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionwork_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionwork_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionwork_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionwork_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionwork_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityProduction')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWork')),
('relation_type', models.ManyToManyField(to='catalog.RelationProductionWorkType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionurl_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionurl_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionurl_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionurl_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionurl_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityProduction')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityUrl')),
('relation_type', models.ManyToManyField(to='catalog.RelationProductionUrlType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionProduction',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionproduction_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionproduction_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionproduction_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionproduction_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionproduction_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='productiona', to='catalog.EntityProduction')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='productionb', to='catalog.EntityProduction')),
('relation_type', models.ManyToManyField(to='catalog.RelationProductionProductionType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionOrganity',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionorganity_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionorganity_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionorganity_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionorganity_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductionorganity_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityProduction')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganity')),
('relation_type', models.ManyToManyField(to='catalog.RelationProductionOrganityType')),
],
options={
'ordering': ['-highlighted_relation', 'entity_b'],
},
),
migrations.CreateModel(
name='RelationProductionGenre',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductiongenre_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductiongenre_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductiongenre_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductiongenre_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductiongenre_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityProduction')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenre')),
('relation_type', models.ManyToManyField(to='catalog.RelationProductionGenreType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationProductionCharacter',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductioncharacter_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductioncharacter_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductioncharacter_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductioncharacter_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationproductioncharacter_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityProduction')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacter')),
('relation_type', models.ManyToManyField(to='catalog.RelationProductionCharacterType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityWork',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitywork_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitywork_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitywork_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitywork_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitywork_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganity')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWork')),
('relation_type', models.ManyToManyField(to='catalog.RelationOrganityWorkType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityurl_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityurl_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityurl_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityurl_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityurl_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganity')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityUrl')),
('relation_type', models.ManyToManyField(to='catalog.RelationOrganityUrlType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityOrganity',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityorganity_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityorganity_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityorganity_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityorganity_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganityorganity_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='organitya', to='catalog.EntityOrganity')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='organityb', to='catalog.EntityOrganity')),
('relation_type', models.ManyToManyField(to='catalog.RelationOrganityOrganityType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityGenre',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitygenre_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitygenre_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitygenre_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitygenre_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitygenre_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganity')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenre')),
('relation_type', models.ManyToManyField(to='catalog.RelationOrganityGenreType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationOrganityCharacter',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitycharacter_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitycharacter_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitycharacter_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitycharacter_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationorganitycharacter_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganity')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacter')),
('relation_type', models.ManyToManyField(to='catalog.RelationOrganityCharacterType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationGenreUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenreurl_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenreurl_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenreurl_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenreurl_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenreurl_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenre')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityUrl')),
('relation_type', models.ManyToManyField(to='catalog.RelationGenreUrlType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationGenreGenre',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenregenre_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenregenre_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenregenre_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenregenre_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationgenregenre_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='genrea', to='catalog.EntityGenre')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='genreb', to='catalog.EntityGenre')),
('relation_type', models.ManyToManyField(to='catalog.RelationGenreGenreType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationCharacterUrl',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharacterurl_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharacterurl_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharacterurl_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharacterurl_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharacterurl_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacter')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityUrl')),
('relation_type', models.ManyToManyField(to='catalog.RelationCharacterUrlType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationCharacterGenre',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactergenre_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactergenre_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactergenre_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactergenre_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactergenre_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacter')),
('entity_b', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenre')),
('relation_type', models.ManyToManyField(to='catalog.RelationCharacterGenreType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='RelationCharacterCharacter',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('entity_a_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('entity_b_credited_as', models.CharField(blank=True, max_length=200, null=True)),
('relation_name', models.CharField(blank=True, max_length=200, null=True)),
('start_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('end_date', partial_date.fields.PartialDateField(blank=True, default=None, null=True)),
('highlighted_relation', models.BooleanField(blank=True, null=True)),
('inverted_relation', models.BooleanField(default=False)),
('context_of_production_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_show_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_work_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_organity_str', models.CharField(blank=True, max_length=200, null=True)),
('context_of_character', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactercharacter_context_character', to='catalog.EntityCharacter')),
('context_of_organity', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactercharacter_context_organity', to='catalog.EntityOrganity')),
('context_of_production', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactercharacter_context_production', to='catalog.EntityProduction')),
('context_of_show', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactercharacter_context_show', to='catalog.EntityShow')),
('context_of_work', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='relationcharactercharacter_context_work', to='catalog.EntityWork')),
('entity_a', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='charactera', to='catalog.EntityCharacter')),
('entity_b', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='characterb', to='catalog.EntityCharacter')),
('relation_type', models.ManyToManyField(to='catalog.RelationCharacterCharacterType')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='EntityWorkAlias',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('alias_type', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWorkAliasType')),
('locale', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.Locale')),
('super_entity', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityWork')),
],
options={
'abstract': False,
},
),
migrations.AddField(
model_name='entitywork',
name='entity_type',
field=models.ManyToManyField(blank=True, to='catalog.EntityWorkType'),
),
migrations.AddField(
model_name='entityurl',
name='entity_type',
field=models.ManyToManyField(blank=True, to='catalog.EntityUrlType'),
),
migrations.AddField(
model_name='entityshow',
name='entity_type',
field=models.ManyToManyField(blank=True, to='catalog.EntityShowType'),
),
migrations.AddField(
model_name='entityproduction',
name='entity_type',
field=models.ManyToManyField(blank=True, to='catalog.EntityProductionType'),
),
migrations.AddField(
model_name='entityproduction',
name='season',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.Season'),
),
migrations.CreateModel(
name='EntityOrganityAlias',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('alias_type', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganityAliasType')),
('locale', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.Locale')),
('super_entity', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityOrganity')),
],
options={
'abstract': False,
},
),
migrations.AddField(
model_name='entityorganity',
name='entity_type',
field=models.ManyToManyField(blank=True, to='catalog.EntityOrganityType'),
),
migrations.CreateModel(
name='EntityGenreAlias',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('alias_type', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenreAliasType')),
('locale', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='catalog.Locale')),
('super_entity', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityGenre')),
],
options={
'abstract': False,
},
),
migrations.AddField(
model_name='entitygenre',
name='entity_type',
field=models.ManyToManyField(blank=True, to='catalog.EntityGenreType'),
),
migrations.CreateModel(
name='EntityCharacterAlias',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('alias_type', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacterAliasType')),
('super_entity', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='catalog.EntityCharacter')),
],
),
migrations.AddField(
model_name='entitycharacter',
name='entity_type',
field=models.ManyToManyField(blank=True, to='catalog.EntityCharacterType'),
),
]
| 74.2868 | 224 | 0.6434 | 11,237 | 107,493 | 5.936549 | 0.016552 | 0.065433 | 0.057563 | 0.083467 | 0.947908 | 0.946619 | 0.926607 | 0.923084 | 0.923084 | 0.916743 | 0 | 0.011743 | 0.22047 | 107,493 | 1,446 | 225 | 74.338174 | 0.784364 | 0.000419 | 0 | 0.730368 | 1 | 0 | 0.222521 | 0.114895 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.002085 | 0 | 0.004864 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
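Every Relation* model created above repeats the same block of fields — credited-as names, a relation name, partial start/end dates, highlight/invert flags, and five context_of_* string/foreign-key pairs — and each model's options carry 'abstract': False, the usual footprint of an abstract base class. A minimal sketch of what that shared base could look like (class name and layout are assumptions, not taken from the repository); note how Django's '%(class)s' placeholder would generate exactly the related names seen in the migration:

# Hypothetical reconstruction, for illustration only.
from django.db import models
from partial_date.fields import PartialDateField  # third-party field referenced by the migration

class RelationBase(models.Model):
    entity_a_credited_as = models.CharField(blank=True, max_length=200, null=True)
    entity_b_credited_as = models.CharField(blank=True, max_length=200, null=True)
    relation_name = models.CharField(blank=True, max_length=200, null=True)
    start_date = PartialDateField(blank=True, default=None, null=True)
    end_date = PartialDateField(blank=True, default=None, null=True)
    highlighted_relation = models.BooleanField(blank=True, null=True)
    inverted_relation = models.BooleanField(default=False)
    context_of_show_str = models.CharField(blank=True, max_length=200, null=True)
    # ...the other four context_of_*_str fields follow the same pattern...
    context_of_show = models.ForeignKey(
        'catalog.EntityShow', blank=True, null=True,
        on_delete=models.PROTECT,
        related_name='%(class)s_context_show',  # e.g. relationshowshow_context_show
    )
    # ...context_of_work, _production, _character and _organity analogously...

    class Meta:
        abstract = True

Concrete subclasses would then add only entity_a, entity_b and the relation_type many-to-many field, which is precisely the part that varies between the CreateModel operations above.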
fd7fb3620fecf9f58a2310809c9979b0d733fc88 | 5,445 | py | Python | Technical_Indicators/MACD.py | vhn0912/Finance | 39cf49d4d778d322537531cee4ce3981cc9951f9 | ["MIT"] | 441 | 2020-04-22T02:21:19.000Z | 2022-03-29T15:00:24.000Z | Technical_Indicators/MACD.py | happydasch/Finance | 4f6c5ea8f60fb0dc3b965ffb9628df83c2ecef35 | ["MIT"] | 5 | 2020-07-06T15:19:58.000Z | 2021-07-23T18:32:29.000Z | Technical_Indicators/MACD.py | happydasch/Finance | 4f6c5ea8f60fb0dc3b965ffb9628df83c2ecef35 | ["MIT"] | 111 | 2020-04-21T11:40:39.000Z | 2022-03-20T07:26:17.000Z |
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import warnings
warnings.filterwarnings("ignore")
import yfinance as yf
yf.pdr_override()
import datetime as dt
# Inputs: ticker symbol and a three-year lookback window
symbol = 'FB'
start = dt.date.today() - dt.timedelta(days = 365*3)
end = dt.date.today()
# Read data
df = yf.download(symbol,start,end)
import talib as ta
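# MACD = 12-period EMA - 26-period EMA; signal = 9-period EMA of MACD; histogram = MACD - signal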
df['macd'], df['macdsignal'], df['macdhist'] = ta.MACD(df['Adj Close'], fastperiod=12, slowperiod=26, signalperiod=9)
df = df.dropna()
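# For reference, the same three series can be reproduced without TA-Lib using
# plain pandas EMAs (a sketch of the standard MACD definition; TA-Lib seeds its
# EMAs from an SMA, so the earliest values differ slightly). The helper name is
# illustrative and is not used below:
def macd_from_pandas(close, fast=12, slow=26, signal=9):
    ema_fast = close.ewm(span=fast, adjust=False).mean()           # fast (12-period) EMA
    ema_slow = close.ewm(span=slow, adjust=False).mean()           # slow (26-period) EMA
    macd_line = ema_fast - ema_slow                                # MACD line
    signal_line = macd_line.ewm(span=signal, adjust=False).mean()  # 9-period signal line
    return macd_line, signal_line, macd_line - signal_line         # histogram = macd - signal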
# Line Chart
fig = plt.figure(figsize=(14,7))
ax1 = plt.subplot(2, 1, 1)
ax1.plot(df.index, df['Adj Close'])
ax1.axhline(y=df['Adj Close'].mean(),color='r')
ax1.grid()
#ax1.grid(True, which='both')
#ax1.grid(which='minor', linestyle='-', linewidth='0.5', color='black')
#ax1.grid(which='major', linestyle='-', linewidth='0.5', color='red')
#ax1.minorticks_on()
#ax1.legend(loc='best')
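# Volume overlay on a hidden twin axis; a ylim of 3*max keeps the bars in the bottom third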
ax1v = ax1.twinx()
ax1v.fill_between(df.index[0:],0, df.Volume[0:], facecolor='#0079a3', alpha=0.4)
ax1v.axes.yaxis.set_ticklabels([])
ax1v.set_ylim(0, 3*df.Volume.max())
ax1.set_title('Stock '+ symbol +' Closing Price')
ax1.set_ylabel('Price')
labels = ['macd','macdsignal']
ax2 = plt.subplot(2, 1, 2)
ax2.plot(df[['macd','macdsignal']], label=labels)
ax2.bar(df.index, df['macdhist'], label='macdhist')
ax2.grid()
ax2.set_ylabel('MACD')
ax2.set_xlabel('Date')
ax2.legend(loc='best')
plt.show()
# Line Chart (MACD histogram colored by sign)
fig = plt.figure(figsize=(14,7))
ax1 = plt.subplot(2, 1, 1)
ax1.plot(df.index, df['Adj Close'])
ax1.axhline(y=df['Adj Close'].mean(),color='r')
ax1.grid()
#ax1.grid(True, which='both')
#ax1.grid(which='minor', linestyle='-', linewidth='0.5', color='black')
#ax1.grid(which='major', linestyle='-', linewidth='0.5', color='red')
#ax1.minorticks_on()
#ax1.legend(loc='best')
ax1v = ax1.twinx()
ax1v.fill_between(df.index[0:],0, df.Volume[0:], facecolor='#0079a3', alpha=0.4)
ax1v.axes.yaxis.set_ticklabels([])
ax1v.set_ylim(0, 3*df.Volume.max())
ax1.set_title('Stock '+ symbol +' Closing Price')
ax1.set_ylabel('Price')
labels = ['macd','macdsignal']
ax2 = plt.subplot(2, 1, 2)
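# Flag bars with a positive MACD histogram so they can be drawn green (others red)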
df['positive'] = df['macdhist'] > 0
ax2.plot(df[['macd','macdsignal']], label=labels)
ax2.bar(df.index, df['macdhist'], color=df.positive.map({True: 'g', False: 'r'}), label='macdhist')
ax2.grid()
ax2.set_ylabel('MACD')
ax2.set_xlabel('Date')
ax2.legend(loc='best')
plt.show()
# Candlestick with MACD
from matplotlib import dates as mdates
dfc = df.copy()
dfc['macd'], dfc['macdsignal'], dfc['macdhist'] = ta.MACD(dfc['Adj Close'], fastperiod=12, slowperiod=26, signalperiod=9)
dfc['VolumePositive'] = dfc['Open'] < dfc['Adj Close']
dfc = dfc.dropna()
dfc = dfc.reset_index()
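# candlestick_ohlc expects the date column as matplotlib float dates, not Timestamps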
dfc['Date'] = mdates.date2num(dfc['Date'].tolist())
from mplfinance.original_flavor import candlestick_ohlc
fig = plt.figure(figsize=(14,7))
ax1 = plt.subplot(2, 1, 1)
candlestick_ohlc(ax1,dfc.values, width=0.5, colorup='g', colordown='r', alpha=1.0)
ax1.xaxis_date()
ax1.xaxis.set_major_formatter(mdates.DateFormatter('%d-%m-%Y'))
ax1.grid(True, which='both')
ax1.minorticks_on()
ax1v = ax1.twinx()
ax1v.fill_between(dfc.Date, 0, dfc.Volume[0:], facecolor='#0079a3', alpha=0.4)
ax1v.axes.yaxis.set_ticklabels([])
ax1v.set_ylim(0, 3*df.Volume.max())
ax1.set_title('Stock '+ symbol +' Closing Price')
ax1.set_ylabel('Price')
labels = ['macd','macdsignal']
ax2 = plt.subplot(2, 1, 2)
ax2.plot(df[['macd','macdsignal']], label=labels)
ax2.bar(df.index, df['macdhist'], label='macdhist')
ax2.grid()
ax2.set_ylabel('MACD')
ax2.set_xlabel('Date')
ax2.legend(loc='best')
plt.show()
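# Candlestick version with sign-colored volume bars and MACD histogram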
fig = plt.figure(figsize=(14,7))
ax1 = plt.subplot(2, 1, 1)
candlestick_ohlc(ax1,dfc.values, width=0.5, colorup='g', colordown='r', alpha=1.0)
ax1.xaxis_date()
ax1.xaxis.set_major_formatter(mdates.DateFormatter('%d-%m-%Y'))
ax1.grid(True, which='both')
ax1.minorticks_on()
ax1v = ax1.twinx()
colors = dfc.VolumePositive.map({True: 'g', False: 'r'})
ax1v.bar(dfc.Date, dfc['Volume'], color=colors, alpha=0.4)
ax1v.axes.yaxis.set_ticklabels([])
ax1v.set_ylim(0, 3*df.Volume.max())
ax1.set_title('Stock '+ symbol +' Closing Price')
ax1.set_ylabel('Price')
labels = ['macd','macdsignal']
ax2 = plt.subplot(2, 1, 2)
df['positive'] = df['macdhist'] > 0
ax2.plot(df[['macd','macdsignal']], label=labels)
ax2.bar(df.index, df['macdhist'], color=df.positive.map({True: 'g', False: 'r'}), label='macdhist')
ax2.grid()
ax2.set_ylabel('MACD')
ax2.set_xlabel('Date')
ax2.legend(loc='best')
plt.show()
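# Three-panel variant: candlestick + volume overlay, MACD, and a separate volume panel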
fig = plt.figure(figsize=(14,7))
ax1 = plt.subplot(3, 1, 1)
candlestick_ohlc(ax1,dfc.values, width=0.5, colorup='g', colordown='r', alpha=1.0)
ax1.xaxis_date()
ax1.xaxis.set_major_formatter(mdates.DateFormatter('%d-%m-%Y'))
ax1.grid(True, which='both')
ax1.minorticks_on()
ax1v = ax1.twinx()
colors = dfc.VolumePositive.map({True: 'g', False: 'r'})
ax1v.bar(dfc.Date, dfc['Volume'], color=colors, alpha=0.4)
ax1v.axes.yaxis.set_ticklabels([])
ax1v.set_ylim(0, 3*df.Volume.max())
ax1.set_title('Stock '+ symbol +' Closing Price')
ax1.set_ylabel('Price')
labels = ['macd','macdsignal']
ax2 = plt.subplot(3, 1, 2)
df['positive'] = df['macdhist'] > 0
ax2.plot(df[['macd','macdsignal']], label=labels)
ax2.bar(df.index, df['macdhist'], color=df.positive.map({True: 'g', False: 'r'}), label='macdhist')
ax2.grid()
ax2.set_ylabel('MACD')
ax2.set_xlabel('Date')
ax2.legend(loc='best')
ax3 = plt.subplot(3, 1, 3)
ax3.bar(dfc.Date, dfc['Volume'], color=dfc.VolumePositive.map({True: 'g', False: 'r'}))
ax3.grid()
ax3.set_ylabel('Volume')
ax3.set_xlabel('Date')
plt.show()
| 32.60479 | 121 | 0.688705 | 888 | 5,445 | 4.161036 | 0.156532 | 0.02977 | 0.023816 | 0.025981 | 0.828687 | 0.828687 | 0.818133 | 0.809743 | 0.785386 | 0.785386 | 0 | 0.047915 | 0.08393 | 5,445 | 167 | 122 | 32.60479 | 0.692863 | 0.087236 | 0 | 0.775362 | 0 | 0 | 0.140319 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.065217 | 0 | 0.065217 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7
fdbf3142ae2cdfa1f8c6da2c804e0633d873d144 | 11,563 | py | Python | ingestion/functions/parsing/variants/variants_test.py | globaldothealth/list | 26e63c83e0066aa57a341c1b2282c7f4d6bbd6f0 | ["MIT"] | 25 | 2020-09-01T23:03:21.000Z | 2022-01-12T08:08:31.000Z | ingestion/functions/parsing/variants/variants_test.py | globaldothealth/list | 26e63c83e0066aa57a341c1b2282c7f4d6bbd6f0 | ["MIT"] | 1,342 | 2020-07-27T09:51:00.000Z | 2022-03-31T17:03:35.000Z | ingestion/functions/parsing/variants/variants_test.py | open-covid-data/healthmap-gdo-temp | 5af5c9e2f7dcefa9039dc6b3a2e2c5094566fc2e | ["MIT"] | 7 | 2020-08-31T00:15:17.000Z | 2020-11-17T12:01:03.000Z |
import os
import unittest
from variants import variants
_SOURCE_ID = "placeholder_ID"
_SOURCE_URL = "placeholder_URL"
class VariantsTest(unittest.TestCase):
def setUp(self):
self.maxDiff = 5000
def test_parse(self):
current_dir = os.path.dirname(__file__)
sample_data_file = os.path.join(current_dir, "sample_data.csv")
result = variants.parse_cases(
sample_data_file, _SOURCE_ID, _SOURCE_URL)
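        # assertCountEqual ignores ordering: every expected record must appear exactly once.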
self.assertCountEqual(list(result),
[{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.wtoc.com/2021/01/28/health-department-nations-first-cases-south-african-covid-variant-found-sc/',
'sourceEntryId': 'voc_1'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/28/2020Z', 'end': '12/28/2020Z'}}],
'demographics': {'ageRange': {'start': 18.0, 'end': 120.0}},
'location': {'query': 'South Carolina, United States'},
'travelHistory': {'traveledPrior30Days': False},
'notes': 'Lowcountry region'},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.wtoc.com/2021/01/28/health-department-nations-first-cases-south-african-covid-variant-found-sc/',
'sourceEntryId': 'voc_2'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/28/2020Z', 'end': '12/28/2020Z'}}],
'demographics': {'ageRange': {'start': 18.0, 'end': 120.0}},
'location': {'query': 'South Carolina, United States'},
'travelHistory': {'traveledPrior30Days': False},
'notes': 'Pee Dee region'},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.theguardian.com/world/2021/jan/27/encouraging-signs-new-zealand-hopes-for-covid-all-clear-after-no-new-cases-reported',
'sourceEntryId': 'voc_3'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '01/24/2021Z', 'end': '01/27/2021Z'}}],
'additionalSources': ['https://www.theguardian.com/world/2021/jan/24/new-zealand-records-first-covid-community-case-intwo-months'],
'demographics': {'ageRange': {'start': 56.0, 'end': 56.0},
'nationalities': ['New Zealander'],
'gender': 'Female'},
'location': {'query': 'Auckland, Auckland, New Zealand'},
'travelHistory': {'traveledPrior30Days': True,
'travel': [{'methods': ['Flight'],
'location': {'query': 'London, United Kingdom'},
'dateRange': {'start': '12/30/2020Z', 'end': '12/30/2020Z'}}]}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.theguardian.com/world/2021/jan/27/encouraging-signs-new-zealand-hopes-for-covid-all-clear-after-no-new-cases-reported',
'sourceEntryId': 'voc_4'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '01/27/2021Z', 'end': '01/27/2021Z'}}],
'location': {'query': 'Auckland, Auckland, New Zealand'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.theguardian.com/world/2021/jan/27/encouraging-signs-new-zealand-hopes-for-covid-all-clear-after-no-new-cases-reported',
'sourceEntryId': 'voc_5'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '01/27/2021Z', 'end': '01/27/2021Z'}}],
'location': {'query': 'Auckland, Auckland, New Zealand'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_6'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_7'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_8'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_9'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_10'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_11'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_12'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_13'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_14'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_15'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_16'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_17'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_18'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_19'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_20'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_21'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_22'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_23'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_24'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_25'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_26'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://twitter.com/danieljbridges/status/1344227535870164992',
'sourceEntryId': 'voc_27'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/23/2020Z', 'end': '12/23/2020Z'}}],
'location': {'query': 'Zambia'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.sabcnews.com/sabcnews/botswana-government-adjusts-national-lockdown-regulations/',
'sourceEntryId': 'voc_28'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '01/01/2021Z', 'end': '01/07/2021Z'}}],
'location': {'query': 'Botswana'}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.france24.com/en/france/20201231-france-detects-first-case-of-south-african-strain-of-covid-19',
'sourceEntryId': 'voc_29'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '12/31/2020Z', 'end': '12/31/2020Z'}}],
'location': {'query': 'Haut-Rhin, France'},
'travelHistory': {'traveledPrior30Days': True,
'travel': [{'methods': ['Flight'],
'location': {'query': 'South Africa'}}]}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'http://outbreaknewstoday.com/sweden-reports-1st-south-african-covid-19-virus-variant-49394/',
'sourceEntryId': 'voc_30'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '01/03/2021Z', 'end': '01/03/2021Z'}}],
'location': {'query': 'Sweden'},
'travelHistory': {'traveledPrior30Days': True,
'travel': [{'location': {'query': 'South Africa'}}]}},
{'caseReference': {'sourceId': 'placeholder_ID',
'sourceUrl': 'https://www.vg.no/nyheter/innenriks/i/rgzer8/mer-smittsom-virusmutasjon-fra-soer-afrika-paavist-i-norge',
'sourceEntryId': 'voc_31'},
'events': [{'name': 'confirmed',
'dateRange': {'start': '01/04/2021Z', 'end': '01/04/2021Z'}}],
'location': {'query': 'Norway'},
'travelHistory': {'traveledPrior30Days': True,
'travel': [{'methods': ['Flight'],
'location': {'query': 'South Africa'},
'dateRange': {'start': '12/31/2020Z', 'end': '12/01/2020Z'}}]}}])
| 50.49345 | 148 | 0.640751 | 1,202 | 11,563 | 6.093178 | 0.155574 | 0.024031 | 0.054069 | 0.14391 | 0.84189 | 0.839569 | 0.812534 | 0.79178 | 0.780857 | 0.765838 | 0 | 0.10675 | 0.112082 | 11,563 | 228 | 149 | 50.714912 | 0.606604 | 0 | 0 | 0.67713 | 0 | 0.044843 | 0.654761 | 0 | 0 | 0 | 0 | 0 | 0.004484 | 1 | 0.008969 | false | 0 | 0.013453 | 0 | 0.026906 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fdd6e2638bc82c495a99fa923d03b68229ae2966 | 10,370 | py | Python | examples/tof-viewer/external/newton_host_driver/src/host_api/examples/python/tests/dms_eval_tests/fc_asm_mscm_xtal/fc_asm_mscm_xtal.py | rick-yhchen1013/aditof-sdk-rework | 911465dd1e05dd0b1c5107197b3b4dc3a10f77f9 | [
"MIT"
] | 5 | 2021-09-22T10:04:47.000Z | 2022-02-08T17:55:09.000Z | examples/tof-viewer/external/newton_host_driver/src/host_api/examples/python/tests/dms_eval_tests/fc_asm_mscm_xtal/fc_asm_mscm_xtal.py | rick-yhchen1013/aditof-sdk-rework | 911465dd1e05dd0b1c5107197b3b4dc3a10f77f9 | [
"MIT"
] | 99 | 2021-02-01T12:45:09.000Z | 2022-03-08T09:54:13.000Z | examples/tof-viewer/external/newton_host_driver/src/host_api/examples/python/tests/dms_eval_tests/fc_asm_mscm_xtal/fc_asm_mscm_xtal.py | rick-yhchen1013/aditof-sdk-rework | 911465dd1e05dd0b1c5107197b3b4dc3a10f77f9 | [
"MIT"
] | 4 | 2021-08-09T12:32:55.000Z | 2021-12-13T05:38:55.000Z | #!/usr/bin/env python
""" Script generated from simulation of the fc_asm_mscm_xtal test case.
Usage:
fc_asm_mscm_xtal.py [--no_reset]
Options:
--help      Shows this help message.
--no_reset  Skips the initial reset of the device.
"""
from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals
from docopt import docopt
import sys
import io
import os
import time
import struct
import subprocess
import ctypes
from collections import OrderedDict
import threading
from newton_control_main import newton as newton
if __name__ == "__main__":
performReset = True
args = docopt(__doc__, version='0.1')
rc = newton.adi_newton_config( 0 )
if rc != 0:
print( "ERROR: newton.adi_newton_config return an error (" + str( rc ) + ")." )
sys.exit( rc )
if args['--no_reset']:
performReset = False
    if performReset:
newton.adi_reset_newton( newton.PIN_MODE_HSP_DEBUG )
newton.adi_check_register_py( 0x0142, 0x0500 ) # pll_status
newton.adi_write_register( 0x0114, 0x000f ) # ckgen_ctrl
newton.adi_write_register( 0x0140, 0x0332 ) # pll_ctrl
newton.adi_write_register( 0x0140, 0x0330 ) # pll_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0142, 0x0500 ) # pll_status
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0142, 0x0501 ) # pll_status
newton.adi_write_register( 0x0138, 0x00a0 ) # lsctrl0_s1
newton.adi_write_register( 0x013a, 0x0005 ) # lsmod_en
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
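    # Sweep the upper nibble of sspll_ctrl2_s1 from 0x0 to 0xf; each value is
    # committed through regif_ctrl and the error status is verified afterwards.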
newton.adi_write_register( 0x0158, 0x000e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x100e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x200e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x300e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x400e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x500e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x600e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x700e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x800e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0x900e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0xa00e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0xb00e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0xc00e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0xd00e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0xe00e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0158, 0xf00e ) # sspll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0140, 0x0233 ) # pll_ctrl
newton.adi_write_register( 0x0140, 0x0033 ) # pll_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0142, 0x0502 ) # pll_status
newton.adi_write_register( 0x018e, 0x8a2a ) # ana_serial_spare_0
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0138, 0x00c0 ) # lsctrl0_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x018e, 0x8aea ) # ana_serial_spare_0
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0138, 0x00b0 ) # lsctrl0_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x0140, 0x0323 ) # pll_ctrl
newton.adi_write_register( 0x0140, 0x0303 ) # pll_ctrl
newton.adi_write_register( 0x0118, 0x0000 ) # clk_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0142, 0x0512 ) # pll_status
newton.adi_write_register( 0x0138, 0x000a ) # lsctrl0_s1
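    # Repeat the upper-nibble sweep for adcpll_ctrl2_s1, with the same
    # regif_ctrl commit and error check at every step.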
newton.adi_write_register( 0x010a, 0x0114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x1114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x2114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x3114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x4114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x5114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x6114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x7114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x8114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0x9114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0xa114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
newton.adi_write_register( 0x010a, 0xb114 ) # adcpll_ctrl2_s1
newton.adi_write_register( 0x0150, 0x0105 ) # regif_ctrl
newton.adi_check_register_py( 0x0032, 0x0000 ) # errorStatus
newton.adi_check_register_py( 0x0150, 0x0101 ) # regif_ctrl
| 52.908163 | 85 | 0.771938 | 1,407 | 10,370 | 5.285004 | 0.105188 | 0.186391 | 0.143088 | 0.224852 | 0.874126 | 0.871705 | 0.849247 | 0.825175 | 0.825175 | 0.825175 | 0 | 0.173268 | 0.147927 | 10,370 | 195 | 86 | 53.179487 | 0.668289 | 0.193057 | 0 | 0.596591 | 1 | 0 | 0.008771 | 0.002924 | 0 | 0 | 0.220733 | 0 | 0 | 1 | 0 | false | 0 | 0.079545 | 0 | 0.079545 | 0.011364 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
fde508c121ac1c7efc9c7284b7fdfa3f40b1f644 | 103 | py | Python | img2dataset/__init__.py | heurainbow/img2dataset | 3d23d73e6e270a0fa1619e91d7728f87c52d3181 | [
"MIT"
] | 5 | 2022-01-16T13:54:52.000Z | 2022-03-10T20:27:20.000Z | img2dataset/__init__.py | yogesh-iitj/img2dataset | d4f8d33ce29d771daa1a8124211a820f9f17d149 | [
"MIT"
] | null | null | null | img2dataset/__init__.py | yogesh-iitj/img2dataset | d4f8d33ce29d771daa1a8124211a820f9f17d149 | [
"MIT"
] | null | null | null | """Img2dataset"""
from img2dataset.downloader import main
from img2dataset.downloader import download
| 20.6 | 43 | 0.825243 | 11 | 103 | 7.727273 | 0.545455 | 0.352941 | 0.588235 | 0.729412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 0.097087 | 103 | 4 | 44 | 25.75 | 0.88172 | 0.106796 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
a903ee271ab80a82c67a430d1f69f2022213a4ad | 3,117 | py | Python | tests/processor/test_ConditionCodeFlags.py | jamesjiang52/Bitwise | c71f151d23034b3f9e2a939f637be0eaa16c45c3 | [
"MIT"
] | null | null | null | tests/processor/test_ConditionCodeFlags.py | jamesjiang52/Bitwise | c71f151d23034b3f9e2a939f637be0eaa16c45c3 | [
"MIT"
] | null | null | null | tests/processor/test_ConditionCodeFlags.py | jamesjiang52/Bitwise | c71f151d23034b3f9e2a939f637be0eaa16c45c3 | [
"MIT"
] | null | null | null | import bitwise as bw
class TestConditionCodeFlags:
def test_ConditionCodeFlags(self):
data_bus = bw.wire.Bus16()
overflow = bw.wire.Wire()
carry_out = bw.wire.Wire()
enable = bw.wire.Wire()
clock = bw.wire.Wire()
z = bw.wire.Wire()
v = bw.wire.Wire()
n = bw.wire.Wire()
c = bw.wire.Wire()
flags = bw.wire.Bus4(z, v, n, c)
a = bw.processor.ConditionCodeFlags(
data_bus,
overflow,
carry_out,
enable,
clock,
z, v, n, c
)
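        # The flags bus is (z, v, n, c) = (zero, overflow, negative, carry).
        # Each block below drives the inputs, pulses the clock 0 -> 1, and
        # checks the latched flag values.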
enable.value = 1
data_bus.wire_values = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1)
overflow.value = 0
carry_out.value = 0
clock.value = 0
clock.value = 1
assert flags.wire_values == (0, 0, 0, 0)
data_bus.wire_values = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
overflow.value = 0
carry_out.value = 0
clock.value = 0
clock.value = 1
assert flags.wire_values == (1, 0, 0, 0)
data_bus.wire_values = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
overflow.value = 1
carry_out.value = 0
clock.value = 0
clock.value = 1
assert flags.wire_values == (1, 1, 0, 0)
data_bus.wire_values = (1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
overflow.value = 0
carry_out.value = 0
clock.value = 0
clock.value = 1
assert flags.wire_values == (0, 0, 1, 0)
data_bus.wire_values = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1)
overflow.value = 0
carry_out.value = 1
clock.value = 0
clock.value = 1
assert flags.wire_values == (0, 0, 0, 1)
data_bus.wire_values = (1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
overflow.value = 0
carry_out.value = 1
clock.value = 0
clock.value = 1
assert flags.wire_values == (0, 0, 1, 1)
data_bus.wire_values = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
overflow.value = 1
carry_out.value = 1
clock.value = 0
clock.value = 1
assert flags.wire_values == (1, 1, 0, 1)
enable.value = 0
data_bus.wire_values = (1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
overflow.value = 0
carry_out.value = 0
clock.value = 0
clock.value = 1
assert flags.wire_values == (1, 1, 0, 1)
data_bus.wire_values = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
overflow.value = 0
carry_out.value = 0
clock.value = 0
clock.value = 1
assert flags.wire_values == (1, 1, 0, 1)
print(a.__doc__)
print(a)
a(
data_bus=(1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0),
overflow=0,
carry_out=1,
enable=1,
clock=0,
z=None,
v=None,
n=None,
c=None
)
a(clock=1)
assert flags.wire_values == (0, 0, 1, 1)
| 28.59633 | 79 | 0.466795 | 498 | 3,117 | 2.825301 | 0.070281 | 0.220327 | 0.294243 | 0.355366 | 0.72779 | 0.72779 | 0.727079 | 0.727079 | 0.727079 | 0.7086 | 0 | 0.129747 | 0.391723 | 3,117 | 108 | 80 | 28.861111 | 0.612342 | 0 | 0 | 0.543478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108696 | 1 | 0.01087 | false | 0 | 0.01087 | 0 | 0.032609 | 0.021739 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8bf72b6778e26adcc4f198ed7c1fcf053f8c34f0 | 1,307 | py | Python | oxe-api/test/resource/media/test_get_document.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/media/test_get_document.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | oxe-api/test/resource/media/test_get_document.py | CybersecurityLuxembourg/openxeco | 8d4e5578bde6a07f5d6d569b16b4de224abf7bf0 | [
"BSD-2-Clause"
] | null | null | null | from datetime import datetime
from test.BaseCase import BaseCase
class TestGetDocument(BaseCase):
@BaseCase.login
def test_ok(self, token):
self.db.insert({
"id": 50,
"filename": "empty_pdf.pdf",
"size": 10,
"creation_date": datetime.today(),
}, self.db.tables["Document"])
response = self.application.get('/media/get_document/empty_pdf.pdf',
headers=self.get_standard_header(token))
self.assertEqual(200, response.status_code)
@BaseCase.login
def test_ok_with_id(self, token):
self.db.insert({
"id": 50,
"filename": "empty_pdf.pdf",
"size": 10,
"creation_date": datetime.today(),
}, self.db.tables["Document"])
response = self.application.get('/media/get_document/50',
headers=self.get_standard_header(token))
self.assertEqual(200, response.status_code)
@BaseCase.login
def test_unexisting_id(self, token):
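        # Nothing was inserted beforehand, so the lookup must fail with a 422.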
response = self.application.get('/media/get_document/empty_pdf.pdf',
headers=self.get_standard_header(token))
self.assertEqual("422 Object not found", response.status)
| 31.119048 | 80 | 0.577659 | 140 | 1,307 | 5.228571 | 0.307143 | 0.061475 | 0.060109 | 0.081967 | 0.803279 | 0.770492 | 0.770492 | 0.770492 | 0.770492 | 0.770492 | 0 | 0.020856 | 0.302984 | 1,307 | 41 | 81 | 31.878049 | 0.782656 | 0 | 0 | 0.733333 | 0 | 0 | 0.156083 | 0.06733 | 0 | 0 | 0 | 0 | 0.1 | 1 | 0.1 | false | 0 | 0.066667 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e31ff98459e9fbb31ea5899eb12f3afa0d1db5cc | 4,874 | py | Python | data_structures/queues/tests/test_doubly_linked_list_based_deque.py | vinta/fuck-coding-interviews | 915ff55963430e81134a35f65f511e5684c52f11 | [
"MIT"
] | 590 | 2020-06-17T08:26:47.000Z | 2022-03-30T18:47:32.000Z | data_structures/queues/tests/test_doubly_linked_list_based_deque.py | parvathirajan/fuck-coding-interviews | 915ff55963430e81134a35f65f511e5684c52f11 | [
"MIT"
] | 12 | 2020-07-14T09:24:32.000Z | 2020-11-02T03:43:47.000Z | data_structures/queues/tests/test_doubly_linked_list_based_deque.py | parvathirajan/fuck-coding-interviews | 915ff55963430e81134a35f65f511e5684c52f11 | [
"MIT"
] | 75 | 2020-07-29T06:50:13.000Z | 2022-03-13T16:14:57.000Z | # coding: utf-8
import unittest
from data_structures.queues.doubly_linked_list_based_deque import DoublyLinkedListBasedDeque
class TestCase(unittest.TestCase):
def setUp(self):
self.deque = DoublyLinkedListBasedDeque()
def test_append(self):
self.deque.append(0)
self.assertEqual(self.deque.linked_list.head.value, 0)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.value, 0)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.deque.append(1)
self.assertEqual(self.deque.linked_list.head.value, 0)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.value, 1)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.deque.append(2)
self.assertEqual(self.deque.linked_list.head.value, 0)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.value, 2)
self.assertEqual(self.deque.linked_list.tail.next, None)
expected = [0, 1, 2]
self.assertEqual(len(self.deque), 3)
self.assertEqual(list(self.deque), expected)
self.assertEqual(list(reversed(self.deque)), list(reversed(expected)))
def test_append_left(self):
self.deque.append_left(0)
self.assertEqual(self.deque.linked_list.head.value, 0)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.value, 0)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.deque.append_left(1)
self.assertEqual(self.deque.linked_list.head.value, 1)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.value, 0)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.deque.append_left(2)
self.assertEqual(self.deque.linked_list.head.value, 2)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.value, 0)
self.assertEqual(self.deque.linked_list.tail.next, None)
expected = [2, 1, 0]
self.assertEqual(len(self.deque), 3)
self.assertEqual(list(self.deque), expected)
self.assertEqual(list(reversed(self.deque)), list(reversed(expected)))
def test_pop(self):
with self.assertRaises(ValueError):
print(self.deque.pop())
self.deque.append(0)
self.deque.append(1)
self.deque.append(2)
self.assertEqual(self.deque.pop(), 2)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.assertEqual(self.deque.pop(), 1)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.assertEqual(self.deque.pop(), 0)
self.assertEqual(self.deque.linked_list.head, None)
self.assertEqual(self.deque.linked_list.tail, None)
self.assertEqual(len(self.deque), 0)
self.assertEqual(list(self.deque), [])
self.assertEqual(list(reversed(self.deque)), [])
with self.assertRaises(ValueError):
print(self.deque.pop())
def test_pop_left(self):
with self.assertRaises(ValueError):
print(self.deque.pop_left())
self.deque.append(0)
self.deque.append(1)
self.deque.append(2)
self.assertEqual(self.deque.pop_left(), 0)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.assertEqual(self.deque.pop_left(), 1)
self.assertEqual(self.deque.linked_list.head.previous, None)
self.assertEqual(self.deque.linked_list.tail.next, None)
self.assertEqual(self.deque.pop_left(), 2)
self.assertEqual(self.deque.linked_list.head, None)
self.assertEqual(self.deque.linked_list.tail, None)
self.assertEqual(len(self.deque), 0)
self.assertEqual(list(self.deque), [])
self.assertEqual(list(reversed(self.deque)), [])
with self.assertRaises(ValueError):
print(self.deque.pop_left())
def test(self):
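        # Interleave appends and pops on both ends, then verify the final head and tail.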
self.deque.append(0)
self.deque.append(1)
self.assertEqual(self.deque.pop(), 1)
self.deque.append(2)
self.deque.append_left(-1)
self.assertEqual(self.deque.pop_left(), -1)
self.deque.append_left(-2)
self.deque.append_left(-3)
self.assertEqual(self.deque.linked_list.head.value, -3)
self.assertEqual(self.deque.linked_list.tail.value, 2)
if __name__ == '__main__':
unittest.main()
| 41.65812 | 92 | 0.675421 | 638 | 4,874 | 5.051724 | 0.070533 | 0.226187 | 0.271176 | 0.342538 | 0.914366 | 0.909711 | 0.892647 | 0.889544 | 0.865653 | 0.762023 | 0 | 0.012977 | 0.193681 | 4,874 | 116 | 93 | 42.017241 | 0.807125 | 0.002667 | 0 | 0.693878 | 0 | 0 | 0.001646 | 0 | 0 | 0 | 0 | 0 | 0.632653 | 1 | 0.061224 | false | 0 | 0.020408 | 0 | 0.091837 | 0.040816 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
8b61c27337f5458ff123c8b47e807593144bfd34 | 77 | py | Python | python/testData/multipleArgumentsCompletion/suggestArgumentsForParametersWithDefaultValue.after.py | 06needhamt/intellij-community | 63d7b8030e4fdefeb4760e511e289f7e6b3a5c5b | [
"Apache-2.0"
] | null | null | null | python/testData/multipleArgumentsCompletion/suggestArgumentsForParametersWithDefaultValue.after.py | 06needhamt/intellij-community | 63d7b8030e4fdefeb4760e511e289f7e6b3a5c5b | [
"Apache-2.0"
] | null | null | null | python/testData/multipleArgumentsCompletion/suggestArgumentsForParametersWithDefaultValue.after.py | 06needhamt/intellij-community | 63d7b8030e4fdefeb4760e511e289f7e6b3a5c5b | [
"Apache-2.0"
] | null | null | null | def foo(x, y, z=42):
pass
x = 42
y = 100500
z = 42
foo(x, y, z)<caret>
| 8.555556 | 20 | 0.506494 | 18 | 77 | 2.166667 | 0.5 | 0.205128 | 0.25641 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 0.298701 | 77 | 8 | 21 | 9.625 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.166667 | 0 | null | null | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
8bbd0d991b08f6537044f2c2b814be2e123682db | 162 | py | Python | wgm/cli/wgmcontroller/__init__.py | Skylaski-VPN/WireGuard-Gateway-Manager | 5cacbbc2318fdf662cd4793a786e3c7b9b74c5c4 | [
"MIT"
] | 1 | 2021-11-28T21:26:58.000Z | 2021-11-28T21:26:58.000Z | wgm/cli/wgmcontroller/__init__.py | Skylaski-VPN/WireGuard-Gateway-Manager | 5cacbbc2318fdf662cd4793a786e3c7b9b74c5c4 | [
"MIT"
] | 2 | 2021-04-07T18:10:07.000Z | 2021-04-07T21:41:35.000Z | wgm/cli/wgmcontroller/__init__.py | Skylaski-VPN/WireGuard-Gateway-Manager | 5cacbbc2318fdf662cd4793a786e3c7b9b74c5c4 | [
"MIT"
] | 2 | 2021-04-07T16:13:55.000Z | 2021-04-23T18:33:22.000Z | # wgmcontroller
from .controller import attach_gw
from .controller import detach_gw
from .controller import attach_client
from .controller import detach_client
| 20.25 | 37 | 0.845679 | 21 | 162 | 6.333333 | 0.380952 | 0.421053 | 0.601504 | 0.390977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.123457 | 162 | 7 | 38 | 23.142857 | 0.93662 | 0.080247 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
8bc30705bbb1dc754426129ce62e6e6f6a6126c7 | 12,229 | py | Python | lreid/operation/test_p_s.py | TPCD/LifelongReID | cb33f9c29fe398e7546db345fab1c338dda8252f | [
"MIT"
] | 63 | 2021-03-20T15:33:11.000Z | 2022-03-30T03:04:14.000Z | lreid/operation/test_p_s.py | TPCD/LifelongReID | cb33f9c29fe398e7546db345fab1c338dda8252f | [
"MIT"
] | 5 | 2021-03-23T08:04:21.000Z | 2022-03-10T02:28:43.000Z | lreid/operation/test_p_s.py | TPCD/LifelongReID | cb33f9c29fe398e7546db345fab1c338dda8252f | [
"MIT"
] | 10 | 2021-04-30T11:14:10.000Z | 2022-03-18T16:44:55.000Z | import torch
from lreid.tools import time_now, CatMeter
from lreid.evaluation import (fast_evaluate_rank, compute_distance_matrix)
def fast_test_p_s(config, base, loaders, current_step, if_test_forget=True):
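    # Evaluates the current task network (and, if enabled, the metagraph-fused
    # features) on every test loader; returns per-dataset mAP/Rank-1 metrics
    # plus a printable summary string.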
# using Cython test during train
# return mAP, Rank-1
base.set_all_model_eval()
print(f'****** start perform fast testing! ******')
# meters
# compute query and gallery features
def _cmc_map(_query_features_meter, _gallery_features_meter):
query_features = _query_features_meter.get_val()
gallery_features = _gallery_features_meter.get_val()
distance_matrix = compute_distance_matrix(query_features, gallery_features, config.test_metric)
distance_matrix = distance_matrix.data.cpu().numpy()
CMC, mAP = fast_evaluate_rank(distance_matrix,
query_pids_meter.get_val_numpy(),
gallery_pids_meter.get_val_numpy(),
query_cids_meter.get_val_numpy(),
gallery_cids_meter.get_val_numpy(),
max_rank=50,
use_metric_cuhk03=False,
use_cython=True)
return CMC[0] * 100, mAP * 100
results_dict = {}
for dataset_name, temp_loaders in loaders.test_loader_dict.items():
query_features_meter, query_pids_meter, query_cids_meter = CatMeter(), CatMeter(), CatMeter()
gallery_features_meter, gallery_pids_meter, gallery_cids_meter = CatMeter(), CatMeter(), CatMeter()
query_metagraph_features_meter, query_metagraph_pids_meter, query_metagraph_cids_meter = CatMeter(), CatMeter(), CatMeter()
gallery_metagraph_features_meter, gallery_metagraph_pids_meter, gallery_metagraph_cids_meter = CatMeter(), CatMeter(), CatMeter()
query_fuse_features_meter, query_fuse_pids_meter, query_fuse_cids_meter = CatMeter(), CatMeter(), CatMeter()
gallery_fuse_features_meter, gallery_fuse_pids_meter, gallery_fuse_cids_meter = CatMeter(), CatMeter(), CatMeter()
print(time_now(), f' {dataset_name} feature start ')
with torch.no_grad():
for loader_id, loader in enumerate(temp_loaders):
for data in loader:
                    # compute features
images, pids, cids = data[0:3]
images = images.to(base.device)
features, featuremaps = base.model_dict['tasknet'](images, current_step)
if config.if_test_metagraph:
features_metagraph, _ = base.model_dict['metagraph'](features)
features_fuse = features + features_metagraph
# save as query features
if loader_id == 0:
query_features_meter.update(features.data)
if config.if_test_metagraph:
query_fuse_features_meter.update(features_fuse.data)
query_metagraph_features_meter.update(features_metagraph.data)
query_pids_meter.update(pids)
query_cids_meter.update(cids)
# save as gallery features
elif loader_id == 1:
gallery_features_meter.update(features.data)
if config.if_test_metagraph:
gallery_metagraph_features_meter.update(features_metagraph.data)
gallery_fuse_features_meter.update(features_fuse.data)
gallery_pids_meter.update(pids)
gallery_cids_meter.update(cids)
# print(f'Save distance matrix to RegDB_three_stream_dist({current_step}).npy')
#
#
#
# np.save(os.path.join(config.feature_save_path, f'query_features_({dataset_name})_({current_step}).pth'),
# query_features_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'query_pids_({dataset_name})_({current_step}).pth'),
# query_pids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'query_cids_({dataset_name})_({current_step}).pth'),
# query_cids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_features_({dataset_name})_({current_step}).pth'),
# gallery_features_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_pids_({dataset_name})_({current_step}).pth'),
# gallery_pids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_cids_({dataset_name})_({current_step}).pth'),
# gallery_cids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'query_fuse_features_({dataset_name})_({current_step}).pth'),
# query_fuse_features_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_fuse_features_({dataset_name})_({current_step}).pth'),
# gallery_fuse_features_meter.get_val_numpy())
print(time_now(), f' {dataset_name} feature done')
        rank1, mAP = _cmc_map(query_features_meter, gallery_features_meter)
        results_dict[f'{dataset_name}_tasknet_mAP'], results_dict[f'{dataset_name}_tasknet_Rank1'] = mAP, rank1
if config.if_test_metagraph:
# rank1, map = _cmc_map(query_metagraph_features_meter, gallery_metagraph_features_meter)
# results_dict['metagraph_mAP'], results_dict['metagraph_Rank1'] = map, rank1
            rank1, mAP = _cmc_map(query_fuse_features_meter, gallery_fuse_features_meter)
            results_dict[f'{dataset_name}_fuse_mAP'], results_dict[f'{dataset_name}_fuse_Rank1'] = mAP, rank1
results_str = ''
for criterion, value in results_dict.items():
results_str = results_str + f'\n{criterion}: {value}'
return results_dict, results_str
def save_and_fast_test_p_s(config, base, loaders, current_step, current_epoch,if_test_forget=True):
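    # Identical evaluation path to fast_test_p_s; current_epoch is only
    # referenced by the commented-out per-epoch feature dumps below.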
# using Cython test during train
# return mAP, Rank-1
base.set_all_model_eval()
print(f'****** start perform fast testing! ******')
# meters
# compute query and gallery features
def _cmc_map(_query_features_meter, _gallery_features_meter):
query_features = _query_features_meter.get_val()
gallery_features = _gallery_features_meter.get_val()
distance_matrix = compute_distance_matrix(query_features, gallery_features, config.test_metric)
distance_matrix = distance_matrix.data.cpu().numpy()
CMC, mAP = fast_evaluate_rank(distance_matrix,
query_pids_meter.get_val_numpy(),
gallery_pids_meter.get_val_numpy(),
query_cids_meter.get_val_numpy(),
gallery_cids_meter.get_val_numpy(),
max_rank=50,
use_metric_cuhk03=False,
use_cython=True)
return CMC[0] * 100, mAP * 100
results_dict = {}
for dataset_name, temp_loaders in loaders.test_loader_dict.items():
query_features_meter, query_pids_meter, query_cids_meter = CatMeter(), CatMeter(), CatMeter()
gallery_features_meter, gallery_pids_meter, gallery_cids_meter = CatMeter(), CatMeter(), CatMeter()
query_metagraph_features_meter, query_metagraph_pids_meter, query_metagraph_cids_meter = CatMeter(), CatMeter(), CatMeter()
gallery_metagraph_features_meter, gallery_metagraph_pids_meter, gallery_metagraph_cids_meter = CatMeter(), CatMeter(), CatMeter()
query_fuse_features_meter, query_fuse_pids_meter, query_fuse_cids_meter = CatMeter(), CatMeter(), CatMeter()
gallery_fuse_features_meter, gallery_fuse_pids_meter, gallery_fuse_cids_meter = CatMeter(), CatMeter(), CatMeter()
print(time_now(), f' {dataset_name} feature start ')
with torch.no_grad():
for loader_id, loader in enumerate(temp_loaders):
for data in loader:
                    # compute features
images, pids, cids = data[0:3]
images = images.to(base.device)
features, featuremaps = base.model_dict['tasknet'](images, current_step)
if config.if_test_metagraph:
features_metagraph, _ = base.model_dict['metagraph'](features)
features_fuse = features + features_metagraph
# save as query features
if loader_id == 0:
query_features_meter.update(features.data)
if config.if_test_metagraph:
query_fuse_features_meter.update(features_fuse.data)
query_metagraph_features_meter.update(features_metagraph.data)
query_pids_meter.update(pids)
query_cids_meter.update(cids)
# save as gallery features
elif loader_id == 1:
gallery_features_meter.update(features.data)
if config.if_test_metagraph:
gallery_metagraph_features_meter.update(features_metagraph.data)
gallery_fuse_features_meter.update(features_fuse.data)
gallery_pids_meter.update(pids)
gallery_cids_meter.update(cids)
# print(f'Save ({dataset_name}) => feature ({current_step}).npy')
#
# np.save(os.path.join(config.feature_save_path, f'query_features_({dataset_name})_({current_step})_({current_epoch}).pth'),
# query_features_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'query_pids_({dataset_name})_({current_step})_({current_epoch}).pth'),
# query_pids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'query_cids_({dataset_name})_({current_step})_({current_epoch}).pth'),
# query_cids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_features_({dataset_name})_({current_step})_({current_epoch}).pth'),
# gallery_features_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_pids_({dataset_name})_({current_step})_({current_epoch}).pth'),
# gallery_pids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_cids_({dataset_name})_({current_step})_({current_epoch}).pth'),
# gallery_cids_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'query_fuse_features_({dataset_name})_({current_step})_({current_epoch}).pth'),
# query_fuse_features_meter.get_val_numpy())
# np.save(os.path.join(config.feature_save_path, f'gallery_fuse_features_({dataset_name})_({current_step})_({current_epoch}).pth'),
# gallery_fuse_features_meter.get_val_numpy())
print(time_now(), f' {dataset_name} feature done')
        rank1, mAP = _cmc_map(query_features_meter, gallery_features_meter)
        results_dict[f'{dataset_name}_tasknet_mAP'], results_dict[f'{dataset_name}_tasknet_Rank1'] = mAP, rank1
if config.if_test_metagraph:
# rank1, map = _cmc_map(query_metagraph_features_meter, gallery_metagraph_features_meter)
# results_dict['metagraph_mAP'], results_dict['metagraph_Rank1'] = map, rank1
            rank1, mAP = _cmc_map(query_fuse_features_meter, gallery_fuse_features_meter)
            results_dict[f'{dataset_name}_fuse_mAP'], results_dict[f'{dataset_name}_fuse_Rank1'] = mAP, rank1
results_str = ''
for criterion, value in results_dict.items():
results_str = results_str + f'\n{criterion}: {value}'
return results_dict, results_str
| 60.840796 | 139 | 0.635784 | 1,417 | 12,229 | 5.038109 | 0.077629 | 0.094691 | 0.043143 | 0.053789 | 0.973806 | 0.973806 | 0.973806 | 0.973806 | 0.967082 | 0.949713 | 0 | 0.005574 | 0.266498 | 12,229 | 200 | 140 | 61.145 | 0.790301 | 0.277619 | 0 | 0.958678 | 0 | 0 | 0.054479 | 0.023251 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033058 | false | 0 | 0.024793 | 0 | 0.090909 | 0.049587 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
47a148e2b035b71003b3de793e4e7c5fbb8559b9 | 35 | py | Python | src/print_helper/__init__.py | bjoern-hempel/pytorch-classification | 8a4bd6aef488360b88234b008d1d7308469bc5d8 | [
"MIT"
] | null | null | null | src/print_helper/__init__.py | bjoern-hempel/pytorch-classification | 8a4bd6aef488360b88234b008d1d7308469bc5d8 | [
"MIT"
] | null | null | null | src/print_helper/__init__.py | bjoern-hempel/pytorch-classification | 8a4bd6aef488360b88234b008d1d7308469bc5d8 | [
"MIT"
] | null | null | null | # __init__.py
from .print import *
| 11.666667 | 20 | 0.714286 | 5 | 35 | 4.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 35 | 2 | 21 | 17.5 | 0.724138 | 0.314286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 7 |
47a1c4300cbe4addca9cda3b4239e59068d963dc | 160 | py | Python | avionix_airflow/kubernetes/cloud/__init__.py | zbrookle/avionix_airflow | a9b4665ce7699bcee7252a3f10d588a57c1f32c4 | [
"BSD-3-Clause"
] | 5 | 2020-08-31T07:33:47.000Z | 2022-01-19T09:03:09.000Z | avionix_airflow/kubernetes/cloud/__init__.py | zbrookle/avionix_airflow | a9b4665ce7699bcee7252a3f10d588a57c1f32c4 | [
"BSD-3-Clause"
] | 20 | 2020-07-28T23:39:22.000Z | 2020-10-06T20:21:32.000Z | avionix_airflow/kubernetes/cloud/__init__.py | zbrookle/avionix_airflow | a9b4665ce7699bcee7252a3f10d588a57c1f32c4 | [
"BSD-3-Clause"
] | 1 | 2021-09-27T14:48:41.000Z | 2021-09-27T14:48:41.000Z | # flake8: noqa
from avionix_airflow.kubernetes.cloud.aws.aws_options import AwsOptions
from avionix_airflow.kubernetes.cloud.cloud_options import CloudOptions
| 32 | 71 | 0.86875 | 21 | 160 | 6.428571 | 0.571429 | 0.162963 | 0.266667 | 0.414815 | 0.488889 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006757 | 0.075 | 160 | 4 | 72 | 40 | 0.905405 | 0.075 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
47ba7ec6c3f17684485578a8bfcb3ed0159c4e30 | 104 | py | Python | pysoup/utils/assets.py | illBeRoy/pysoup | 742fd6630e1be27c275cb8dc6ee94412472cb20b | [
"MIT"
] | 4 | 2016-02-21T12:40:44.000Z | 2019-06-13T13:23:19.000Z | pysoup/utils/assets.py | illBeRoy/pysoup | 742fd6630e1be27c275cb8dc6ee94412472cb20b | [
"MIT"
] | null | null | null | pysoup/utils/assets.py | illBeRoy/pysoup | 742fd6630e1be27c275cb8dc6ee94412472cb20b | [
"MIT"
] | 1 | 2020-07-16T12:22:12.000Z | 2020-07-16T12:22:12.000Z | LOGO = '''
X X X
X X X
+------------------+
+----------------+
+------------+
'''
| 11.555556 | 20 | 0.096154 | 7 | 104 | 1.428571 | 0.285714 | 1 | 1.2 | 1.2 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.336538 | 104 | 8 | 21 | 13 | 0.144928 | 0 | 0 | 0.285714 | 0 | 0 | 0.865385 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
47bc426662bd0292464a6afdc5b54a5a71c3aa2d | 33,856 | py | Python | utils_methods.py | venkatesh-saligrama/FedDyn | 98c519e48ffcacbf318d33ad97c63cc99871ac57 | [
"MIT"
] | 15 | 2021-08-13T03:07:29.000Z | 2022-03-29T06:31:18.000Z | utils_methods.py | venkatesh-saligrama/FedDyn | 98c519e48ffcacbf318d33ad97c63cc99871ac57 | [
"MIT"
] | 1 | 2021-11-02T16:42:11.000Z | 2022-01-06T19:46:02.000Z | utils_methods.py | venkatesh-saligrama/FedDyn | 98c519e48ffcacbf318d33ad97c63cc99871ac57 | [
"MIT"
] | 7 | 2021-09-14T16:26:13.000Z | 2022-03-22T13:07:47.000Z | from utils_libs import *
from utils_dataset import *
from utils_models import *
from utils_general import *
### Methods
def train_FedAvg(data_obj, act_prob ,learning_rate, batch_size, epoch, com_amount, print_per, weight_decay, model_func, init_model, save_period, lr_decay_per_round, rand_seed=0):
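    # FedAvg: each round, activate clients i.i.d. with probability act_prob,
    # train each selected client locally, then average the client parameters
    # weighted by local dataset size.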
method_name = 'FedAvg'
    n_clnt = data_obj.n_client
    clnt_x = data_obj.clnt_x; clnt_y = data_obj.clnt_y
cent_x = np.concatenate(clnt_x, axis=0)
cent_y = np.concatenate(clnt_y, axis=0)
weight_list = np.asarray([len(clnt_y[i]) for i in range(n_clnt)])
weight_list = weight_list.reshape((n_clnt, 1))
if not os.path.exists('Output/%s/%s' %(data_obj.name, method_name)):
os.mkdir('Output/%s/%s' %(data_obj.name, method_name))
n_save_instances = int(com_amount / save_period)
fed_mdls_sel = list(range(n_save_instances)); fed_mdls_all = list(range(n_save_instances))
trn_perf_sel = np.zeros((com_amount, 2)); trn_perf_all = np.zeros((com_amount, 2))
tst_perf_sel = np.zeros((com_amount, 2)); tst_perf_all = np.zeros((com_amount, 2))
n_par = len(get_mdl_params([model_func()])[0])
init_par_list=get_mdl_params([init_model], n_par)[0]
clnt_params_list=np.ones(n_clnt).astype('float32').reshape(-1, 1) * init_par_list.reshape(1, -1) # n_clnt X n_par
clnt_models = list(range(n_clnt))
avg_model = model_func().to(device)
avg_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
all_model = model_func().to(device)
all_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
if os.path.exists('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount)):
# Load performances and models...
for j in range(n_save_instances):
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_sel[j] = fed_model
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_all[j] = fed_model
trn_perf_sel = np.load('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, com_amount))
trn_perf_all = np.load('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, com_amount))
tst_perf_sel = np.load('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, com_amount))
tst_perf_all = np.load('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount))
clnt_params_list = np.load('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, com_amount))
else:
for i in range(com_amount):
inc_seed = 0
            while True:
# Fix randomness in client selection
np.random.seed(i + rand_seed + inc_seed)
act_list = np.random.uniform(size=n_clnt)
act_clients = act_list <= act_prob
selected_clnts = np.sort(np.where(act_clients)[0])
inc_seed += 1
if len(selected_clnts) != 0:
break
print('Selected Clients: %s' %(', '.join(['%2d' %item for item in selected_clnts])))
for clnt in selected_clnts:
print('---- Training client %d' %clnt)
trn_x = clnt_x[clnt]
trn_y = clnt_y[clnt]
clnt_models[clnt] = model_func().to(device)
clnt_models[clnt].load_state_dict(copy.deepcopy(dict(avg_model.named_parameters())))
for params in clnt_models[clnt].parameters():
params.requires_grad = True
clnt_models[clnt] = train_model(clnt_models[clnt], trn_x, trn_y, learning_rate * (lr_decay_per_round ** i), batch_size, epoch, print_per, weight_decay, data_obj.dataset)
clnt_params_list[clnt] = get_mdl_params([clnt_models[clnt]], n_par)[0]
# Scale with weights
avg_model = set_client_from_params(model_func(), np.sum(clnt_params_list[selected_clnts]*weight_list[selected_clnts]/np.sum(weight_list[selected_clnts]), axis = 0))
all_model = set_client_from_params(model_func(), np.sum(clnt_params_list*weight_list/np.sum(weight_list), axis = 0))
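            # avg_model: weighted average over this round's selected clients only;
            # all_model: weighted average over every client's latest parameters.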
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, avg_model, data_obj.dataset)
tst_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, avg_model, data_obj.dataset)
trn_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, all_model, data_obj.dataset)
tst_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, all_model, data_obj.dataset)
trn_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
if ((i+1) % save_period == 0):
torch.save(avg_model.state_dict(), 'Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (i+1)))
torch.save(all_model.state_dict(), 'Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (i+1)))
np.save('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, (i+1)), clnt_params_list)
np.save('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, (i+1)), trn_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, (i+1)), tst_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, (i+1)), trn_perf_all[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, (i+1)), tst_perf_all[:i+1])
if (i+1) > save_period:
if os.path.exists('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period)):
# Delete the previous saved arrays
os.remove('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, i+1-save_period))
if ((i+1) % save_period == 0):
fed_mdls_sel[i//save_period] = avg_model
fed_mdls_all[i//save_period] = all_model
return fed_mdls_sel, trn_perf_sel, tst_perf_sel, fed_mdls_all, trn_perf_all, tst_perf_all
def train_SCAFFOLD(data_obj, act_prob, learning_rate, batch_size, n_minibatch, com_amount, print_per, weight_decay, model_func, init_model, save_period, lr_decay_per_round, rand_seed=0, global_learning_rate=1):
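# SCAFFOLD (Karimireddy et al., 2020) corrects client drift with control
# variates: each client keeps a state c_i and the server keeps c (stored
# together in state_param_list, with the server state in the last row).
# Local SGD steps are shifted by (c - c_i) so non-IID clients still move
# toward the global optimum.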
method_name = 'Scaffold'
n_clnt=data_obj.n_client
clnt_x = data_obj.clnt_x; clnt_y=data_obj.clnt_y
cent_x = np.concatenate(clnt_x, axis=0)
cent_y = np.concatenate(clnt_y, axis=0)
weight_list = np.asarray([len(clnt_y[i]) for i in range(n_clnt)])
weight_list = weight_list / np.sum(weight_list) * n_clnt # normalize so the weights average to 1 across clients
if not os.path.exists('Output/%s/%s' %(data_obj.name, method_name)):
os.mkdir('Output/%s/%s' %(data_obj.name, method_name))
n_save_instances = int(com_amount / save_period)
fed_mdls_sel = list(range(n_save_instances)); fed_mdls_all = list(range(n_save_instances))
trn_perf_sel = np.zeros((com_amount, 2)); trn_perf_all = np.zeros((com_amount, 2))
tst_perf_sel = np.zeros((com_amount, 2)); tst_perf_all = np.zeros((com_amount, 2))
n_par = len(get_mdl_params([model_func()])[0])
state_param_list = np.zeros((n_clnt+1, n_par)).astype('float32') # per-client control variates plus the server (cloud) state in the last row
init_par_list=get_mdl_params([init_model], n_par)[0]
clnt_params_list=np.ones(n_clnt).astype('float32').reshape(-1, 1) * init_par_list.reshape(1, -1) # n_clnt X n_par
clnt_models = list(range(n_clnt))
avg_model = model_func().to(device)
avg_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
all_model = model_func().to(device)
all_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
if os.path.exists('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount)):
# Load performances and models...
for j in range(n_save_instances):
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_sel[j] = fed_model
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_all[j] = fed_model
trn_perf_sel = np.load('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, com_amount))
trn_perf_all = np.load('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, com_amount))
tst_perf_sel = np.load('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, com_amount))
tst_perf_all = np.load('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount))
clnt_params_list = np.load('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, com_amount))
state_param_list = np.load('Output/%s/%s/%d_state_param_list.npy' %(data_obj.name, method_name, com_amount))
else:
for i in range(com_amount):
inc_seed = 0
while(True):
# Fix randomness in client selection
np.random.seed(i + rand_seed + inc_seed)
act_list = np.random.uniform(size=n_clnt)
act_clients = act_list <= act_prob
selected_clnts = np.sort(np.where(act_clients)[0])
inc_seed += 1
if len(selected_clnts) != 0:
break
print('Selected Clients: %s' %(', '.join(['%2d' %item for item in selected_clnts])))
delta_c_sum = np.zeros(n_par)
prev_params = get_mdl_params([avg_model], n_par)[0]
for clnt in selected_clnts:
print('---- Training client %d' %clnt)
trn_x = clnt_x[clnt]
trn_y = clnt_y[clnt]
clnt_models[clnt] = model_func().to(device)
clnt_models[clnt].load_state_dict(copy.deepcopy(dict(avg_model.named_parameters())))
for params in clnt_models[clnt].parameters():
params.requires_grad = True
# Scale down the server variate c by the client weight before forming the drift-correction term
state_params_diff_curr = torch.tensor(-state_param_list[clnt] + state_param_list[-1]/weight_list[clnt], dtype=torch.float32, device=device)
clnt_models[clnt] = train_scaffold_mdl(clnt_models[clnt], model_func, state_params_diff_curr, trn_x, trn_y, learning_rate * (lr_decay_per_round ** i), batch_size, n_minibatch, print_per, weight_decay, data_obj.dataset)
curr_model_param = get_mdl_params([clnt_models[clnt]], n_par)[0]
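# Option II control-variate update from the SCAFFOLD paper (next line):
# c_i^+ = c_i - c + (x - y_i) / (K * eta_l), with x = prev_params (the
# round's starting global model), y_i the local result and K = n_minibatch.
# Note the base learning_rate is used here, not the per-round decayed rate
# applied to the local steps above.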
new_c = state_param_list[clnt] - state_param_list[-1] + 1/n_minibatch/learning_rate * (prev_params - curr_model_param)
# Scale the state change back up by the client weight before accumulating
delta_c_sum += (new_c - state_param_list[clnt])*weight_list[clnt]
state_param_list[clnt] = new_c
clnt_params_list[clnt] = curr_model_param
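# Server update (next two lines): the new global model is
# x + eta_g * (mean_S(y_i) - x), expressed below as a convex combination
# with global_learning_rate = eta_g, and the server variate absorbs (1/N)
# times the weight-scaled sum of the clients' state changes.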
avg_model_params = global_learning_rate*np.mean(clnt_params_list[selected_clnts], axis = 0) + (1-global_learning_rate)*prev_params
state_param_list[-1] += 1 / n_clnt * delta_c_sum
avg_model = set_client_from_params(model_func().to(device), avg_model_params)
all_model = set_client_from_params(model_func(), np.mean(clnt_params_list, axis = 0))
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, avg_model, data_obj.dataset)
tst_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, avg_model, data_obj.dataset)
trn_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, all_model, data_obj.dataset)
tst_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, all_model, data_obj.dataset)
trn_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
if ((i+1) % save_period == 0):
torch.save(avg_model.state_dict(), 'Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (i+1)))
torch.save(all_model.state_dict(), 'Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (i+1)))
np.save('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, (i+1)), clnt_params_list)
np.save('Output/%s/%s/%d_state_param_list.npy' %(data_obj.name, method_name, (i+1)), state_param_list)
np.save('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, (i+1)), trn_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, (i+1)), tst_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, (i+1)), trn_perf_all[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, (i+1)), tst_perf_all[:i+1])
if (i+1) > save_period:
if os.path.exists('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period)):
os.remove('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_state_param_list.npy' %(data_obj.name, method_name, i+1-save_period))
if ((i+1) % save_period == 0):
fed_mdls_sel[i//save_period] = avg_model
fed_mdls_all[i//save_period] = all_model
return fed_mdls_sel, trn_perf_sel, tst_perf_sel, fed_mdls_all, trn_perf_all, tst_perf_all
def train_FedDyn(data_obj, act_prob, learning_rate, batch_size, epoch, com_amount, print_per, weight_decay, model_func, init_model, alpha_coef, save_period, lr_decay_per_round, rand_seed=0):
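# FedDyn (Acar et al., 2021) adds a dynamic regularizer to each client's
# objective. local_param_list accumulates each client's drift from the
# server model (playing the role of -h_i/alpha in the paper), and the cloud
# model is recovered by adding the mean drift back onto the average of the
# active clients' models.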
method_name = 'FedDyn'
n_clnt = data_obj.n_client
clnt_x = data_obj.clnt_x; clnt_y=data_obj.clnt_y
cent_x = np.concatenate(clnt_x, axis=0)
cent_y = np.concatenate(clnt_y, axis=0)
weight_list = np.asarray([len(clnt_y[i]) for i in range(n_clnt)])
weight_list = weight_list / np.sum(weight_list) * n_clnt
if not os.path.exists('Output/%s/%s' %(data_obj.name, method_name)):
os.mkdir('Output/%s/%s' %(data_obj.name, method_name))
n_save_instances = int(com_amount / save_period)
fed_mdls_sel = list(range(n_save_instances)) # Avg active clients
fed_mdls_all = list(range(n_save_instances)) # Avg all clients
fed_mdls_cld = list(range(n_save_instances)) # Cloud models
trn_perf_sel = np.zeros((com_amount, 2)); trn_perf_all = np.zeros((com_amount, 2))
tst_perf_sel = np.zeros((com_amount, 2)); tst_perf_all = np.zeros((com_amount, 2))
n_par = len(get_mdl_params([model_func()])[0])
local_param_list = np.zeros((n_clnt, n_par)).astype('float32')
init_par_list=get_mdl_params([init_model], n_par)[0]
clnt_params_list = np.ones(n_clnt).astype('float32').reshape(-1, 1) * init_par_list.reshape(1, -1) # n_clnt X n_par
clnt_models = list(range(n_clnt))
avg_model = model_func().to(device)
avg_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
all_model = model_func().to(device)
all_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
cld_model = model_func().to(device)
cld_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
cld_mdl_param = get_mdl_params([cld_model], n_par)[0]
if os.path.exists('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount)):
# Load performances and models...
for j in range(n_save_instances):
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_sel[j] = fed_model
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_all[j] = fed_model
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_cld.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_cld[j] = fed_model
trn_perf_sel = np.load('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, com_amount))
trn_perf_all = np.load('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, com_amount))
tst_perf_sel = np.load('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, com_amount))
tst_perf_all = np.load('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount))
clnt_params_list = np.load('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, com_amount))
local_param_list = np.load('Output/%s/%s/%d_local_param_list.npy' %(data_obj.name, method_name, com_amount))
else:
for i in range(com_amount):
inc_seed = 0
while(True):
# Fix randomness in client selection
np.random.seed(i + rand_seed + inc_seed)
act_list = np.random.uniform(size=n_clnt)
act_clients = act_list <= act_prob
selected_clnts = np.sort(np.where(act_clients)[0])
unselected_clnts = np.sort(np.where(~act_clients)[0])
inc_seed += 1
if len(selected_clnts) != 0:
break
print('Selected Clients: %s' %(', '.join(['%2d' %item for item in selected_clnts])))
cld_mdl_param_tensor = torch.tensor(cld_mdl_param, dtype=torch.float32, device=device)
for clnt in selected_clnts:
# Train locally
print('---- Training client %d' %clnt)
trn_x = clnt_x[clnt]
trn_y = clnt_y[clnt]
clnt_models[clnt] = model_func().to(device)
model = clnt_models[clnt]
# Warm start from current avg model
model.load_state_dict(copy.deepcopy(dict(cld_model.named_parameters())))
for params in model.parameters():
params.requires_grad = True
# Scale down
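# Dividing alpha by the datapoint-normalized client weight keeps the
# effective strength of the dynamic regularizer comparable across clients
# with different amounts of data.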
alpha_coef_adpt = alpha_coef / weight_list[clnt] # adaptive alpha coef
local_param_list_curr = torch.tensor(local_param_list[clnt], dtype=torch.float32, device=device)
clnt_models[clnt] = train_feddyn_mdl(model, model_func, alpha_coef_adpt, cld_mdl_param_tensor, local_param_list_curr, trn_x, trn_y, learning_rate * (lr_decay_per_round ** i), batch_size, epoch, print_per, weight_decay, data_obj.dataset)
curr_model_par = get_mdl_params([clnt_models[clnt]], n_par)[0]
# No need to scale the history terms back up: they store -grad/alpha, and alpha is already scaled per client above.
local_param_list[clnt] += curr_model_par-cld_mdl_param
clnt_params_list[clnt] = curr_model_par
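# Cloud model recovery (next two lines): the server model is the mean of the
# active clients' parameters plus the mean accumulated drift, which
# corresponds to the -(1/alpha) * h correction term in the FedDyn paper.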
avg_mdl_param = np.mean(clnt_params_list[selected_clnts], axis = 0)
cld_mdl_param = avg_mdl_param + np.mean(local_param_list, axis=0)
avg_model = set_client_from_params(model_func(), avg_mdl_param)
all_model = set_client_from_params(model_func(), np.mean(clnt_params_list, axis = 0))
cld_model = set_client_from_params(model_func().to(device), cld_mdl_param)
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, avg_model, data_obj.dataset)
tst_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, avg_model, data_obj.dataset)
trn_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, all_model, data_obj.dataset)
tst_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, all_model, data_obj.dataset)
trn_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
if ((i+1) % save_period == 0):
torch.save(avg_model.state_dict(), 'Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (i+1)))
torch.save(all_model.state_dict(), 'Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (i+1)))
torch.save(cld_model.state_dict(), 'Output/%s/%s/%d_com_cld.pt' %(data_obj.name, method_name, (i+1)))
np.save('Output/%s/%s/%d_local_param_list.npy' %(data_obj.name, method_name, (i+1)), local_param_list)
np.save('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, (i+1)), clnt_params_list)
np.save('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, (i+1)), trn_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, (i+1)), tst_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, (i+1)), trn_perf_all[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, (i+1)), tst_perf_all[:i+1])
if (i+1) > save_period:
# Delete the previous saved arrays
os.remove('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_local_param_list.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, i+1-save_period))
if ((i+1) % save_period == 0):
fed_mdls_sel[i//save_period] = avg_model
fed_mdls_all[i//save_period] = all_model
fed_mdls_cld[i//save_period] = cld_model
return fed_mdls_sel, trn_perf_sel, tst_perf_sel, fed_mdls_all, trn_perf_all, tst_perf_all, fed_mdls_cld
def train_FedProx(data_obj, act_prob ,learning_rate, batch_size, epoch, com_amount, print_per, weight_decay, model_func, init_model, save_period, mu, lr_decay_per_round, rand_seed=0):
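# FedProx (Li et al., 2020) is FedAvg with a proximal term: each client
# minimizes its local loss plus (mu/2) * ||w - w_global||^2, which limits
# how far local updates can drift from the current global model.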
method_name = 'FedProx'
n_clnt=data_obj.n_client
clnt_x = data_obj.clnt_x; clnt_y=data_obj.clnt_y
cent_x = np.concatenate(clnt_x, axis=0)
cent_y = np.concatenate(clnt_y, axis=0)
# Weight clients by their number of datapoints (the weighting used in this implementation)
weight_list = np.asarray([len(clnt_y[i]) for i in range(n_clnt)])
weight_list = weight_list.reshape((n_clnt, 1))
if not os.path.exists('Output/%s/%s' %(data_obj.name, method_name)):
os.mkdir('Output/%s/%s' %(data_obj.name, method_name))
n_save_instances = int(com_amount / save_period)
fed_mdls_sel = list(range(n_save_instances)); fed_mdls_all = list(range(n_save_instances))
trn_perf_sel = np.zeros((com_amount, 2)); trn_perf_all = np.zeros((com_amount, 2))
tst_perf_sel = np.zeros((com_amount, 2)); tst_perf_all = np.zeros((com_amount, 2))
n_par = len(get_mdl_params([model_func()])[0])
init_par_list=get_mdl_params([init_model], n_par)[0]
clnt_params_list=np.ones(n_clnt).astype('float32').reshape(-1, 1) * init_par_list.reshape(1, -1) # n_clnt X n_par
clnt_models = list(range(n_clnt))
avg_model = model_func().to(device)
avg_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
all_model = model_func().to(device)
all_model.load_state_dict(copy.deepcopy(dict(init_model.named_parameters())))
if os.path.exists('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount)):
# Load performances and models...
for j in range(n_save_instances):
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_sel[j] = fed_model
fed_model = model_func()
fed_model.load_state_dict(torch.load('Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (j+1)*save_period)))
fed_model.eval()
fed_model = fed_model.to(device)
fed_mdls_all[j] = fed_model
trn_perf_sel = np.load('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, com_amount))
trn_perf_all = np.load('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, com_amount))
tst_perf_sel = np.load('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, com_amount))
tst_perf_all = np.load('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, com_amount))
clnt_params_list = np.load('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, com_amount))
else:
for i in range(com_amount):
inc_seed = 0
while(True):
# Fix randomness in client selection
np.random.seed(i + rand_seed + inc_seed)
act_list = np.random.uniform(size=n_clnt)
act_clients = act_list <= act_prob
selected_clnts = np.sort(np.where(act_clients)[0])
inc_seed += 1
if len(selected_clnts) != 0:
break
print('Selected Clients: %s' %(', '.join(['%2d' %item for item in selected_clnts])))
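# Snapshot of the current global parameters (next two lines); it serves as
# the anchor w^t of the proximal penalty (mu/2) * ||w - w^t||^2 inside
# train_fedprox_mdl.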
avg_model_param = get_mdl_params([avg_model], n_par)[0]
avg_model_param_tensor = torch.tensor(avg_model_param, dtype=torch.float32, device=device)
for clnt in selected_clnts:
print('---- Training client %d' %clnt)
trn_x = clnt_x[clnt]
trn_y = clnt_y[clnt]
clnt_models[clnt] = model_func().to(device)
clnt_models[clnt].load_state_dict(copy.deepcopy(dict(avg_model.named_parameters())))
for params in clnt_models[clnt].parameters():
params.requires_grad = True
clnt_models[clnt] = train_fedprox_mdl(clnt_models[clnt], avg_model_param_tensor, mu, trn_x, trn_y, learning_rate * (lr_decay_per_round ** i), batch_size, epoch, print_per, weight_decay, data_obj.dataset)
clnt_params_list[clnt] = get_mdl_params([clnt_models[clnt]], n_par)[0]
# Scale with weights
avg_model = set_client_from_params(model_func(), np.sum(clnt_params_list[selected_clnts]*weight_list[selected_clnts]/np.sum(weight_list[selected_clnts]), axis = 0))
all_model = set_client_from_params(model_func(), np.sum(clnt_params_list*weight_list/np.sum(weight_list), axis = 0))
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, avg_model, data_obj.dataset)
tst_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, avg_model, data_obj.dataset)
trn_perf_sel[i] = [loss_tst, acc_tst]
print("**** Communication sel %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(data_obj.tst_x, data_obj.tst_y, all_model, data_obj.dataset)
tst_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Test Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
###
loss_tst, acc_tst = get_acc_loss(cent_x, cent_y, all_model, data_obj.dataset)
trn_perf_all[i] = [loss_tst, acc_tst]
print("**** Communication all %3d, Cent Accuracy: %.4f, Loss: %.4f" %(i+1, acc_tst, loss_tst))
if ((i+1) % save_period == 0):
torch.save(avg_model.state_dict(), 'Output/%s/%s/%d_com_sel.pt' %(data_obj.name, method_name, (i+1)))
torch.save(all_model.state_dict(), 'Output/%s/%s/%d_com_all.pt' %(data_obj.name, method_name, (i+1)))
np.save('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, (i+1)), clnt_params_list)
np.save('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, (i+1)), trn_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, (i+1)), tst_perf_sel[:i+1])
np.save('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, (i+1)), trn_perf_all[:i+1])
np.save('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, (i+1)), tst_perf_all[:i+1])
if (i+1) > save_period:
if os.path.exists('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period)):
# Delete the previous saved arrays
os.remove('Output/%s/%s/%d_com_trn_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_sel.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_trn_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_com_tst_perf_all.npy' %(data_obj.name, method_name, i+1-save_period))
os.remove('Output/%s/%s/%d_clnt_params_list.npy' %(data_obj.name, method_name, i+1-save_period))
if ((i+1) % save_period == 0):
fed_mdls_sel[i//save_period] = avg_model
fed_mdls_all[i//save_period] = all_model
return fed_mdls_sel, trn_perf_sel, tst_perf_sel, fed_mdls_all, trn_perf_all, tst_perf_all | 58.072041 | 252 | 0.626122 | 5,250 | 33,856 | 3.674476 | 0.036952 | 0.054792 | 0.041055 | 0.087243 | 0.934425 | 0.926028 | 0.916386 | 0.91058 | 0.896014 | 0.880203 | 0 | 0.010952 | 0.23408 | 33,856 | 583 | 253 | 58.072041 | 0.732984 | 0.024161 | 0 | 0.852381 | 0 | 0 | 0.133416 | 0.093941 | 0 | 0 | 0 | 0 | 0 | 1 | 0.009524 | false | 0 | 0.009524 | 0 | 0.028571 | 0.07619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d07395b80f1aa28853744e79c31a389fc20a19c8 | 63 | py | Python | ChromProcess/Loading/peak_collection/__init__.py | thijsdejong10/ChromProcess | aba9c261824d0f29e0a92d7ca7c4a78e03249d62 | [
"BSD-3-Clause"
] | null | null | null | ChromProcess/Loading/peak_collection/__init__.py | thijsdejong10/ChromProcess | aba9c261824d0f29e0a92d7ca7c4a78e03249d62 | [
"BSD-3-Clause"
] | 1 | 2022-01-21T16:14:05.000Z | 2022-01-21T16:14:05.000Z | ChromProcess/Loading/peak_collection/__init__.py | thijsdejong10/ChromProcess | aba9c261824d0f29e0a92d7ca7c4a78e03249d62 | [
"BSD-3-Clause"
] | 1 | 2022-01-18T16:17:05.000Z | 2022-01-18T16:17:05.000Z | from .peak_collection_from_csv import peak_collection_from_csv
| 31.5 | 62 | 0.920635 | 10 | 63 | 5.2 | 0.5 | 0.538462 | 0.692308 | 0.807692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063492 | 63 | 1 | 63 | 63 | 0.881356 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
d0f74d5ad6ac2676f5933a500f28daafd0d68813 | 3,076 | py | Python | Curve fitting/Least squares regression/RegMinimos.py | luismnd/my-implementations---numeric-methods | adfa016495277f4ea0cbcbc75d63f2eacea0fe5c | [
"MIT"
] | null | null | null | Curve fitting/Least squares regression/RegMinimos.py | luismnd/my-implementations---numeric-methods | adfa016495277f4ea0cbcbc75d63f2eacea0fe5c | [
"MIT"
] | null | null | null | Curve fitting/Least squares regression/RegMinimos.py | luismnd/my-implementations---numeric-methods | adfa016495277f4ea0cbcbc75d63f2eacea0fe5c | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import numpy as np
import pylab as plt
def polinomial(orden):
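# Least-squares polynomial fit via the normal equations: build the Vandermonde
# matrix Z with Z[i, j] = x_i**j and solve (Z^T Z) a = Z^T y for the degree-
# 'orden' coefficient vector a. An equivalent, numerically safer one-liner
# (shown only as a sketch, not used below) would be:
#   a, *_ = np.linalg.lstsq(np.vander(x, m, increasing=True), y, rcond=None)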
datos=np.loadtxt('Retina.txt')
x=datos[:,0]
y=datos[:,1]
m=orden+1
n=len(x)
z=np.zeros((n,m))
y2=np.zeros((n))
for i in range(0,n,1):
for j in range(0,m,1):
z[i,j]=x[i]**j
zt=z.transpose()
mult=np.dot(zt,z)
a=np.linalg.solve(mult,np.dot(zt,y))
print('a', orden, '=', a)
sum2=0
for i in range(0,n,1):
s=0
for j in range(0,m,1):
s=s+a[j]*x[i]**j
y2[i]=s
sum2=sum2+(y[i]-s)**2
print('R^2 =', np.sqrt(sum2/(n-(m+1))))
'''Plot handling'''
# Create a figure so it can be saved to a file
# fig=plt.figure(orden)
# Plot the measured data x vs. y as a dash-dot line
plt.plot(x,y,'-.')
# Plot the fitted values x vs. y2 as a solid line
plt.plot(x,y2,'-')
# Add a legend
plt.legend(['Datos','Aproximacion'])
# Add a title
plt.title('Intensidad Reflejada de Luz vs Posicion Laser')
# Label the x axis
plt.xlabel('Posicion laser')
# Label the y axis
plt.ylabel('Intensidad Refleja de Luz')
if orden ==1:
plt.savefig('Figura1.png')
if orden==2:
plt.savefig('Figura2.png')
if orden==3:
plt.savefig('Figura3.png')
plt.show()
def exponencial(orden):
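# Exponential fit y = exp(p(x)): linearize with yl = log(y), fit the
# polynomial p to (x, yl) by least squares, then map predictions back
# through exp.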
datos=np.loadtxt('Retina.txt')
x=datos[:,0]
y=datos[:,1]
yl=np.log(y)
m=orden+1
n=len(x)
z=np.zeros((n,m))
y2=np.zeros((n))
for i in range(0,n,1):
for j in range(0,m,1):
z[i,j]=x[i]**j
zt=z.transpose()
mult=np.dot(zt,z)
a=np.linalg.solve(mult,np.dot(zt,yl)) # fit the log-transformed data so the model is y = exp(p(x))
print('a', orden, '=', a)
sum2=0
for i in range(0,n,1):
s=0
for j in range(0,m,1):
s=s+a[j]*x[i]**j
# map the fitted log-polynomial back to the original scale
y2[i]=np.exp(s)
sum2=sum2+(y[i]-y2[i])**2
print('R^2 =', np.sqrt(sum2/(n-(m+1))))
'''Plot handling'''
# Create a figure so it can be saved to a file
# fig=plt.figure(orden)
# Plot the measured data x vs. y as a dash-dot line
plt.plot(x,y,'-.')
# Plot the fitted values x vs. y2 as a solid line
plt.plot(x,y2,'-')
# Add a legend
plt.legend(['Datos','Aproximacion'])
# Add a title
plt.title('Intensidad Reflejada de Luz vs Posicion Laser')
# Label the x axis
plt.xlabel('Posicion laser')
# Label the y axis
plt.ylabel('Intensidad Refleja de Luz')
if orden ==1:
plt.savefig('Figura1.png')
if orden==2:
plt.savefig('Figura2.png')
if orden==3:
plt.savefig('Figura3.png')
plt.show()
polinomial(1)
polinomial(2)
polinomial(3)
exponencial(1)
exponencial(2)
exponencial(3)
| 26.290598 | 93 | 0.537386 | 487 | 3,076 | 3.394251 | 0.205339 | 0.033878 | 0.038717 | 0.026618 | 0.91712 | 0.91712 | 0.91712 | 0.91712 | 0.91712 | 0.91712 | 0 | 0.031367 | 0.305592 | 3,076 | 116 | 94 | 26.517241 | 0.742509 | 0.198635 | 0 | 0.878049 | 0 | 0 | 0.134034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.02439 | null | null | 0.04878 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ef8b05e68e020510f4d76b6be5780bed1a635cf4 | 725 | py | Python | medialib/__init__.py | ppcrong/pymisc2 | 06013a7842d7633e8ddf8241718e5ffdb51d9d0c | [
"Apache-2.0"
] | null | null | null | medialib/__init__.py | ppcrong/pymisc2 | 06013a7842d7633e8ddf8241718e5ffdb51d9d0c | [
"Apache-2.0"
] | null | null | null | medialib/__init__.py | ppcrong/pymisc2 | 06013a7842d7633e8ddf8241718e5ffdb51d9d0c | [
"Apache-2.0"
] | null | null | null | DICT_RESOLUTIONS = {
'1920x1080': {'w': 1920, 'h': 1080, 'default': False},
'1280x720': {'w': 1280, 'h': 720, 'default': False},
'1024x768': {'w': 1024, 'h': 768, 'default': False},
'800x600': {'w': 800, 'h': 600, 'default': False},
'640x480': {'w': 640, 'h': 480, 'default': False},
'352x240': {'w': 352, 'h': 240, 'default': False}, # 352x240 (SIF); 352x288 is CIF
'320x240': {'w': 320, 'h': 240, 'default': False}
}
DICT_RESOLUTIONS_MIN = {
'1920x1080': {'w': 1920, 'h': 1080, 'default': False},
'1280x720': {'w': 1280, 'h': 720, 'default': False},
'1024x768': {'w': 1024, 'h': 768, 'default': False},
'800x600': {'w': 800, 'h': 600, 'default': False},
'640x480': {'w': 640, 'h': 480, 'default': True},
}
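# Example lookup (sketch): the default resolution is the entry whose
# 'default' flag is True, e.g.
#   w, h = next((v['w'], v['h']) for v in DICT_RESOLUTIONS_MIN.values() if v['default'])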
| 40.277778 | 58 | 0.517241 | 89 | 725 | 4.179775 | 0.325843 | 0.354839 | 0.075269 | 0.080645 | 0.741935 | 0.741935 | 0.741935 | 0.741935 | 0.741935 | 0.741935 | 0 | 0.273973 | 0.194483 | 725 | 17 | 59 | 42.647059 | 0.363014 | 0 | 0 | 0.5 | 0 | 0 | 0.275862 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
eff288b9459ed65f3e7747655ba5795f933f9922 | 118 | py | Python | InSituToolkit/analysis/__init__.py | czbiohub/InSituToolkit | 4c97a29f87e755e61451f4a9875d0bdf397864b5 | [
"MIT"
] | 4 | 2019-10-16T15:37:07.000Z | 2020-11-01T00:04:07.000Z | InSituToolkit/analysis/__init__.py | czbiohub/InSituToolkit | 4c97a29f87e755e61451f4a9875d0bdf397864b5 | [
"MIT"
] | 4 | 2019-07-19T01:45:57.000Z | 2020-03-04T00:57:33.000Z | InSituToolkit/analysis/__init__.py | czbiohub/InSituToolkit | 4c97a29f87e755e61451f4a9875d0bdf397864b5 | [
"MIT"
] | 1 | 2019-11-20T21:02:04.000Z | 2019-11-20T21:02:04.000Z | from .save_stack import save_stack
from .results_viewer import view_results
from .stack_from_tif import stack_from_tif | 39.333333 | 42 | 0.881356 | 20 | 118 | 4.8 | 0.4 | 0.28125 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09322 | 118 | 3 | 42 | 39.333333 | 0.897196 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
4bda4f283e8b7d2cf22e65d1dc4a7304d148c173 | 11,059 | py | Python | openapi_client/api/votes_api.py | osuka/dognews-scraper | 12373064061157083a48ced8e2cabf9d1ace30a5 | [
"MIT"
] | 1 | 2019-11-15T13:19:36.000Z | 2019-11-15T13:19:36.000Z | openapi_client/api/votes_api.py | osuka/news-extractor | 12373064061157083a48ced8e2cabf9d1ace30a5 | [
"MIT"
] | null | null | null | openapi_client/api/votes_api.py | osuka/news-extractor | 12373064061157083a48ced8e2cabf9d1ace30a5 | [
"MIT"
] | null | null | null | """
Dognews Server API
Dognews Server client API # noqa: E501
The version of the OpenAPI document: 1.0.0
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from openapi_client.api_client import ApiClient, Endpoint as _Endpoint
from openapi_client.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from openapi_client.model.vote import Vote
class VotesApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def __votes_destroy(
self,
id,
**kwargs
):
"""votes_destroy # noqa: E501
Vote management, through /votes (put, patch, destroy) or through **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrModeratorOrStaff`: *Blocks update/partial_updated/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user * AND if the user is not in the Moderators group Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.votes_destroy(id, async_req=True)
>>> result = thread.get()
Args:
id (int): A unique integer value identifying this vote.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
None
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['id'] = \
id
return self.call_with_http_info(**kwargs)
self.votes_destroy = _Endpoint(
settings={
'response_type': None,
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/votes/{id}',
'operation_id': 'votes_destroy',
'http_method': 'DELETE',
'servers': None,
},
params_map={
'all': [
'id',
],
'required': [
'id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'id':
(int,),
},
'attribute_map': {
'id': 'id',
},
'location_map': {
'id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [],
'content_type': [],
},
api_client=api_client,
callable=__votes_destroy
)
def __votes_retrieve(
self,
id,
**kwargs
):
"""votes_retrieve # noqa: E501
Vote management, through /votes (put, patch, destroy) or through **Permission restrictions:** + `IsAuthenticated`: *Rejects all operations if the user is not authenticated* + `IsOwnerOrModeratorOrStaff`: *Blocks update/partial_updated/destroy if: * the user is NOT in the staff group * AND if the model has a property called 'owner' and its value differs from the request user * AND if the user is not in the Moderators group Everything else is allowed* + `DjangoModelPermissions`: *The request is authenticated using `django.contrib.auth` permissions. See: https://docs.djangoproject.com/en/dev/topics/auth/#permissions It ensures that the user is authenticated, and has the appropriate `add`/`change`/`delete` permissions on the model. This permission can only be applied against view classes that provide a `.queryset` attribute.* # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.votes_retrieve(id, async_req=True)
>>> result = thread.get()
Args:
id (int): A unique integer value identifying this vote.
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
Vote
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['id'] = \
id
return self.call_with_http_info(**kwargs)
self.votes_retrieve = _Endpoint(
settings={
'response_type': (Vote,),
'auth': [
'basicAuth',
'cookieAuth',
'jwtAuth',
'tokenAuth'
],
'endpoint_path': '/votes/{id}',
'operation_id': 'votes_retrieve',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'id',
],
'required': [
'id',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'id':
(int,),
},
'attribute_map': {
'id': 'id',
},
'location_map': {
'id': 'path',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client,
callable=__votes_retrieve
)
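# Typical use (sketch; the host URL and the id below are placeholders, not
# values from this project):
#   import openapi_client
#   from openapi_client.api.votes_api import VotesApi
#   configuration = openapi_client.Configuration(host="https://example.org/api")
#   with openapi_client.ApiClient(configuration) as api_client:
#       vote = VotesApi(api_client).votes_retrieve(1)
#       VotesApi(api_client).votes_destroy(1)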
| 39.496429 | 898 | 0.500678 | 1,046 | 11,059 | 5.110899 | 0.213193 | 0.023569 | 0.013468 | 0.012346 | 0.845866 | 0.845866 | 0.845866 | 0.845866 | 0.830153 | 0.830153 | 0 | 0.004501 | 0.417398 | 11,059 | 279 | 899 | 39.637993 | 0.825237 | 0.440998 | 0 | 0.616667 | 1 | 0 | 0.187757 | 0.024263 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016667 | false | 0 | 0.027778 | 0 | 0.061111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0872f38289409c87e2fdf8f22c60e4d85add907d | 111 | py | Python | info.py | Jiminger/CSMuseum | 13de6d7ae289da045de9a75cec83ccc229715baa | [
"MIT"
] | null | null | null | info.py | Jiminger/CSMuseum | 13de6d7ae289da045de9a75cec83ccc229715baa | [
"MIT"
] | 3 | 2022-03-25T05:40:00.000Z | 2022-03-25T15:26:43.000Z | info.py | Jiminger/CSMuseum | 13de6d7ae289da045de9a75cec83ccc229715baa | [
"MIT"
] | null | null | null | """ Used to get database credentials """
def get_db_user():
return ""
def get_db_pass():
return ""
| 11.1 | 40 | 0.612613 | 15 | 111 | 4.266667 | 0.666667 | 0.1875 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.243243 | 111 | 9 | 41 | 12.333333 | 0.761905 | 0.288288 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.25 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 8 |
087568da79ed48fa1ab11987f0fa39f774146c8d | 11,294 | py | Python | example/test2/room.py | dmilos/IceRay | 4e01f141363c0d126d3c700c1f5f892967e3d520 | [
"MIT-0"
] | 2 | 2020-09-04T12:27:15.000Z | 2022-01-17T14:49:40.000Z | example/test2/room.py | dmilos/IceRay | 4e01f141363c0d126d3c700c1f5f892967e3d520 | [
"MIT-0"
] | null | null | null | example/test2/room.py | dmilos/IceRay | 4e01f141363c0d126d3c700c1f5f892967e3d520 | [
"MIT-0"
] | 1 | 2020-09-04T12:27:52.000Z | 2020-09-04T12:27:52.000Z | import math
import IceRayPy
Coord3D = IceRayPy.type.math.coord.Scalar3D
def vacuum( P_dll, P_config = None, P_light = None, P_exponat = None ):
geometry = IceRayPy.core.geometry.volumetric.Vacuum( P_dll )
wrapper = IceRayPy.core.object.Wrapper( P_dll )
wrapper.geometrySet( geometry )
return wrapper
def plate( P_dll, P_config = { 'level': - 1.01, 'size' : 3, 'shadow': False, 'pigment': None }, P_light = None, P_exponat = None ):
level = -1.00001
if( 'level' in P_config ):
level = P_config['level']
size = 3
if( 'size' in P_config ):
size = P_config['size']
geometry = IceRayPy.core.geometry.simple.Box( P_dll )
geometry.box( Coord3D( -size, -size, level - 0.1) , Coord3D( size, size, level ) )
wrapper = IceRayPy.core.object.Wrapper( P_dll )
I_scene = { 'light': P_light, 'barrier' : P_exponat }
if( 'shadow' in P_config ):
if( False == P_config['shadow'] ):
I_scene['barrier'] = IceRayPy.core.geometry.volumetric.Vacuum( P_dll )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ) )
if( 'pigment' in P_config ):
pigment = P_config['pigment']
#pigment = IceRayPy.utility.material.pattern.Checker( P_dll, I_scene )
wrapper.pigment( pigment )
wrapper.geometrySet( geometry )
return wrapper
def plane( P_dll, P_config = { 'level': - 1.001, 'shadow': False, 'pigment': None }, P_light = None, P_exponat = None ):
level = -1.0001
if( 'level' in P_config ):
level = P_config['level']
geometry = IceRayPy.core.geometry.simple.Plane( P_dll )
geometry.origin( Coord3D(0, 0, level ) )
I_scene = { 'light': P_light, 'barrier' : P_exponat }
if( 'shadow' in P_config ):
if( False == P_config['shadow'] ):
I_scene['barrier'] = IceRayPy.core.geometry.volumetric.Vacuum( P_dll )
# Default to a checker pattern; an explicit pigment supplied via P_config takes precedence.
pigment = IceRayPy.utility.material.pattern.Checker( P_dll, I_scene )
if( 'pigment' in P_config ):
pigment = P_config['pigment']
wrapper = IceRayPy.core.object.Wrapper( P_dll )
wrapper.pigment( pigment )
wrapper.geometrySet( geometry )
return wrapper
def cornell(
P_dll
,P_config = None
,P_light = None
,P_exponat = None
): # non-classic
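# Builds a (non-classic) Cornell box: six axis-aligned slabs of thickness
# 'wall' enclosing the interior [lo, hi]. The left/right walls get tinted
# Lambert pigments, while the foreground wall appears to act as a perfect
# mirror (transmission.reflect.One).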
I_dimension = [ 8, 8, 4 ] # [ 6, 6, 3.5 ]
I_move = [ 0, 0, I_dimension[2]/2-1 ]
wall = 0.1
lo = Coord3D()
lo[0] = -I_dimension[0]/2 + I_move[0]
lo[1] = -I_dimension[1]/2 + I_move[1]
lo[2] = -I_dimension[2]/2 + I_move[2]
hi = Coord3D()
hi[0] = +I_dimension[0]/2 + I_move[0]
hi[1] = +I_dimension[1]/2 + I_move[1]
hi[2] = +I_dimension[2]/2 + I_move[2]
I_scene = { 'light': P_light, 'barrier' : P_exponat }
if( 'shadow' in P_config ):
if( False == P_config['shadow'] ):
I_scene['barrier'] = IceRayPy.core.geometry.volumetric.Vacuum( P_dll )
leftG = IceRayPy.core.geometry.simple.Box( P_dll )
leftG.box( Coord3D( lo[0]-wall, lo[1], lo[2]) , Coord3D(lo[0], hi[1], hi[2]) )
leftW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 1, 0.33, 0.33 ) )
leftW.pigment( pigment )
leftW.geometrySet( leftG )
rightG = IceRayPy.core.geometry.simple.Box( P_dll )
rightG.box( Coord3D( hi[0], lo[1], lo[2]) , Coord3D(hi[0]+ wall,hi[1], hi[2]) )
rightW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.33, 1, 0.33 ) )
rightW.pigment( pigment )
rightW.geometrySet( rightG )
backgroundG = IceRayPy.core.geometry.simple.Box( P_dll )
backgroundG.box( Coord3D( lo[0], lo[1]-wall, lo[2] ) , Coord3D( hi[0], lo[1], hi[2] ) )
backgroundW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.33, 0.33, 1 ) )
backgroundW.pigment( pigment )
backgroundW.geometrySet( backgroundG )
foregroundG = IceRayPy.core.geometry.simple.Box( P_dll )
foregroundG.box( Coord3D( lo[0], hi[1], lo[2] ), Coord3D( hi[0], hi[1] + wall, hi[2] ) )
foregroundW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.transmission.reflect.One( P_dll, I_scene )
foregroundW.pigment( pigment )
foregroundW.geometrySet( foregroundG )
floorG = IceRayPy.core.geometry.simple.Box( P_dll )
floorG.box( Coord3D( lo[0], lo[1], lo[2]-wall ) , Coord3D( hi[0], hi[1], lo[2] ) ) # y-extent uses hi[1] (hi[0] coincides only for square rooms)
floorW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ) )
floorW.pigment( pigment )
floorW.geometrySet( floorG )
ceilG = IceRayPy.core.geometry.simple.Box( P_dll )
ceilG.box( Coord3D( lo[0], lo[1], hi[2] ), Coord3D( hi[0], hi[1], hi[2] + wall ) )
ceilW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ) )
ceilW.pigment( pigment )
ceilW.geometrySet( ceilG )
rtss = IceRayPy.core.geometry.rtss.Object( P_dll )
list = IceRayPy.core.geometry.rtss.List( P_dll )
rtss.rtss( list )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, leftW.cast2Geometry(), leftW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, rightW.cast2Geometry(), rightW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, backgroundW.cast2Geometry(), backgroundW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, foregroundW.cast2Geometry(), foregroundW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, floorW.cast2Geometry(), floorW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, ceilW.cast2Geometry(), ceilW ) )
wrapper = IceRayPy.core.object.Wrapper( P_dll )
wrapper.geometrySet( rtss )
return wrapper
G_option = 3
G_angle = 2
G_size = 4
def cornell_radiosity(
P_dll
,P_config = None
,P_light = None
,P_exponat = None
): # non-classic
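# Radiosity-oriented variant of the box above: the left wall is mirrored and
# the floor uses one of the 'blossom' scattered-reflection pigments, chosen
# via the module globals G_option / G_angle / G_size.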
I_dimension = [ 10, 10, 5 ]
I_move = [ 0, 0, I_dimension[2]/2-1 ]
wall = 0.1
lo = Coord3D()
lo[0] = -I_dimension[0]/2 + I_move[0]
lo[1] = -I_dimension[1]/2 + I_move[1]
lo[2] = -I_dimension[2]/2 + I_move[2]
hi = Coord3D()
hi[0] = +I_dimension[0]/2 + I_move[0]
hi[1] = +I_dimension[1]/2 + I_move[1]
hi[2] = +I_dimension[2]/2 + I_move[2]
I_scene = { 'light': P_light, 'barrier' : P_exponat }
if( 'shadow' in P_config ):
if( False == P_config['shadow'] ):
I_scene['barrier'] = IceRayPy.core.geometry.volumetric.Vacuum( P_dll )
leftG = IceRayPy.core.geometry.simple.Box( P_dll )
leftG.box( Coord3D( lo[0]-wall, lo[1], lo[2]) , Coord3D(lo[0], hi[1], hi[2]) )
leftW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.transmission.reflect.One( P_dll, I_scene )
leftW.pigment( pigment )
leftW.geometrySet( leftG )
rightG = IceRayPy.core.geometry.simple.Box( P_dll )
rightG.box( Coord3D( hi[0], lo[1], lo[2]) , Coord3D(hi[0]+ wall,hi[1], hi[2]) )
rightW = IceRayPy.core.object.Wrapper( P_dll )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.33, 0.33, 1 ) )
rightW.pigment( pigment )
rightW.geometrySet( rightG )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 1, 0.33, 0.33 ) )
backgroundG = IceRayPy.core.geometry.simple.Box( P_dll )
backgroundG.box( Coord3D( lo[0], lo[1]-wall, lo[2] ) , Coord3D( hi[0], lo[1], hi[2] ) )
backgroundW = IceRayPy.core.object.Wrapper( P_dll )
backgroundW.pigment( pigment )
backgroundW.geometrySet( backgroundG )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.33, 1, 0.33 ) )
foregroundG = IceRayPy.core.geometry.simple.Box( P_dll )
foregroundG.box( Coord3D( lo[0], hi[1], lo[2] ), Coord3D( hi[0], hi[1] + wall, hi[2] ) )
foregroundW = IceRayPy.core.object.Wrapper( P_dll )
foregroundW.pigment( pigment )
foregroundW.geometrySet( foregroundG )
global G_option
global G_angle
global G_size
if 1 == G_option :
pigment = IceRayPy.utility.material.transmission.blossom.VDC( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ), 0, 16, math.radians(G_angle) )
if 2 == G_option :
pigment = IceRayPy.utility.material.transmission.blossom.Random( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ), 0, 3, math.radians(G_angle) )
if 3 == G_option :
pigment = IceRayPy.utility.material.transmission.blossom.Hexagon( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ), 0, G_size, math.radians(G_angle) )
if 4 == G_option :
pigment = IceRayPy.utility.material.transmission.blossom.Grid( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ), 0, 4, math.radians(G_angle) )
G_option = 3
G_angle = G_angle + 2
G_size = G_size + 1
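# Mutating the globals means each successive call renders the next point of
# the (angle, size) parameter sweep.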
floorG = IceRayPy.core.geometry.simple.Box( P_dll )
floorG.box( Coord3D( lo[0], lo[1], lo[2]-wall ) , Coord3D( hi[0], hi[1], lo[2] ) ) # y-extent uses hi[1] (hi[0] coincides only for square rooms)
floorW = IceRayPy.core.object.Wrapper( P_dll )
floorW.pigment( pigment )
floorW.geometrySet( floorG )
pigment = IceRayPy.utility.material.illumination.Lambert( P_dll, I_scene, IceRayPy.type.color.RGB( 0.5, 0.5, 0.5 ) )
ceilG = IceRayPy.core.geometry.simple.Box( P_dll )
ceilG.box( Coord3D( lo[0], lo[1], hi[2] ), Coord3D( hi[0], hi[1], hi[2] + wall ) )
ceilW = IceRayPy.core.object.Wrapper( P_dll )
ceilW.pigment( pigment )
ceilW.geometrySet( ceilG )
rtss = IceRayPy.core.geometry.rtss.Object( P_dll )
list = IceRayPy.core.geometry.rtss.List( P_dll )
rtss.rtss( list )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, leftW.cast2Geometry(), leftW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, rightW.cast2Geometry(), rightW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, backgroundW.cast2Geometry(), backgroundW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, foregroundW.cast2Geometry(), foregroundW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, floorW.cast2Geometry(), floorW ) )
rtss.push( IceRayPy.core.geometry.Pretender( P_dll, ceilW.cast2Geometry(), ceilW ) )
wrapper = IceRayPy.core.object.Wrapper( P_dll )
wrapper.geometrySet( rtss )
return wrapper
| 43.272031 | 167 | 0.632814 | 1,581 | 11,294 | 4.393422 | 0.063251 | 0.044918 | 0.100777 | 0.031673 | 0.947884 | 0.927008 | 0.860495 | 0.844803 | 0.808379 | 0.773971 | 0 | 0.037412 | 0.221356 | 11,294 | 260 | 168 | 43.438462 | 0.752445 | 0.018328 | 0 | 0.821782 | 0 | 0 | 0.020331 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024752 | false | 0 | 0.009901 | 0 | 0.059406 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
089798de1e569934ece26955144566e01c66ddf5 | 26,123 | py | Python | ablations_code/ablations.py | leopauly/Observation-Learning-Simulations | 462c04a87c45aae51537b8ea5b44646afa31d3a5 | [
"MIT"
] | 49 | 2017-12-11T11:00:02.000Z | 2022-03-30T05:19:31.000Z | ablations_code/ablations.py | leopauly/Observation-Learning-Simulations | 462c04a87c45aae51537b8ea5b44646afa31d3a5 | [
"MIT"
] | 2 | 2018-01-01T17:39:56.000Z | 2019-07-24T04:49:08.000Z | ablations_code/ablations.py | leopauly/Observation-Learning-Simulations | 462c04a87c45aae51537b8ea5b44646afa31d3a5 | [
"MIT"
] | 12 | 2017-12-13T11:52:17.000Z | 2020-12-03T00:53:29.000Z | import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow import gfile
import imageio
import pickle
import scipy.misc
import sys
from IPython.display import HTML
import argparse
def transform(image, resize_height=36, resize_width=64):
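# Resize and rescale uint8 pixels to [-1, 1]; inverse_transform below undoes
# the rescaling. Note scipy.misc.imresize was removed in SciPy >= 1.3, so a
# newer stack would need PIL or skimage.transform.resize here.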
cropped_image = scipy.misc.imresize(image, [resize_height, resize_width])
return np.array(cropped_image)/127.5 - 1.
def inverse_transform(images):
return (images+1.)/2.
def lrelu(x, leak=0.2, name="lrelu"):
return tf.maximum(x, leak*x)
def conv2d(input_, output_dim,
k_h=5, k_w=5, d_h=2, d_w=2, stddev=0.02,
name="conv2d"):
with tf.variable_scope(name):
w = tf.get_variable('w', [k_h, k_w, input_.get_shape()[-1], output_dim],
initializer=tf.truncated_normal_initializer(stddev=stddev))
# print("c", w.get_shape())
conv = tf.nn.conv2d(input_, w, strides=[1, d_h, d_w, 1], padding='SAME')
biases = tf.get_variable('biases', [output_dim], initializer=tf.constant_initializer(0.0))
conv = tf.reshape(tf.nn.bias_add(conv, biases), conv.get_shape())
return conv
class batch_norm(object):
def __init__(self, epsilon=1e-5, momentum = 0.9, name="batch_norm"):
with tf.variable_scope(name):
self.epsilon = epsilon
self.momentum = momentum
self.name = name
def __call__(self, x):
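# 'tftrain' (like 'keep_prob' used further below) is assumed to be a
# module-level placeholder defined elsewhere in the full script; it is not
# an attribute of this class.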
return tf.contrib.layers.batch_norm(x,
decay=self.momentum,
updates_collections=None,
epsilon=self.epsilon,
scale=True,
is_training=tftrain,
scope=self.name)
def linear(input_, output_size, scope=None, stddev=0.02, bias_start=0.0, with_w=False):
shape = input_.get_shape().as_list()
with tf.variable_scope(scope or "Linear"):
matrix = tf.get_variable("Matrix", [shape[1], output_size], tf.float32,
tf.random_normal_initializer(stddev=stddev))
bias = tf.get_variable("bias", [output_size],
initializer=tf.constant_initializer(bias_start))
if with_w:
return tf.matmul(input_, matrix) + bias, matrix, bias
else:
return tf.matmul(input_, matrix) + bias
def deconv2d(input_, output_shape,
k_h=5, k_w=5, d_h=2, d_w=2, stddev=0.02,
name="deconv2d", with_w=False):
with tf.variable_scope(name):
# filter : [height, width, output_channels, in_channels]
w = tf.get_variable('w', [k_h, k_w, output_shape[-1], input_.get_shape()[-1]],
initializer=tf.random_normal_initializer(stddev=stddev))
# print("w", w.get_shape())
try:
deconv = tf.nn.conv2d_transpose(input_, w, output_shape=output_shape,
strides=[1, d_h, d_w, 1])
# Support for versions of TensorFlow before 0.7.0
except AttributeError:
deconv = tf.nn.deconv2d(input_, w, output_shape=output_shape,
strides=[1, d_h, d_w, 1])
biases = tf.get_variable('biases', [output_shape[-1]], initializer=tf.constant_initializer(0.0))
deconv = tf.reshape(tf.nn.bias_add(deconv, biases), deconv.get_shape())
if with_w:
return deconv, w, biases
else:
return deconv
class ContextAEPushReal:
def __init__(self, gf_dim=64, df_dim=64,
gfc_dim=1024, dfc_dim=1024,
c_dim=3):
self.gf_dim = gf_dim
self.df_dim = df_dim
self.c_dim = c_dim
self.gfc_dim = gfc_dim
self.dfc_dim = dfc_dim
def build(self, image, ablation_type):
imgshape = image.get_shape().as_list()
print(imgshape)
self.output_height, self.output_width = imgshape[-3:-1]
self.batch_size = imgshape[1]
featsize = 100
srcimg = image[0]
tgtimg = image[2]
tgtctx = image[1]
nf0 = 32
nf1 = 16
nf2 = 16
nf3 = 8
ns0 = 1
ns1 = 2
ns2 = 1
ns3 = 2
# with tf.variable_scope("conv_context") as scope:
def encode(img):
img_h0 = lrelu(conv2d(img, nf0, d_h=ns0, d_w=ns0, name='h0_conv'))
img_h1 = lrelu(conv2d(img_h0, nf1, d_h=ns1, d_w=ns1, name='h1_conv'))
img_h2 = lrelu(conv2d(img_h1, nf2, d_h=ns2, d_w=ns2, name='h2_conv'))
img_h3 = lrelu(conv2d(img_h2, nf3, d_h=ns3, d_w=ns3, name='h3_conv'))
print(img_h3.get_shape())
img_h4 = lrelu(linear(tf.nn.dropout(tf.reshape(img_h3, [self.batch_size, -1]), keep_prob), featsize, 'h4_lin'))
img_z = lrelu(linear(tf.nn.dropout(img_h4, keep_prob), featsize, 'hz_lin'))
return img_h0, img_h1, img_h2, img_h3, img_h4, img_z
with tf.variable_scope("conv") as scope:
srcimg_h0, srcimg_h1, srcimg_h2, srcimg_h3, srcimg_h4, srcimg_z = encode(srcimg)
scope.reuse_variables()
tgtimg_h0, tgtimg_h1, tgtimg_h2, tgtimg_h3, tgtimg_h4, tgtimg_z = encode(tgtimg)
tgtctx_h0, tgtctx_h1, tgtctx_h2, tgtctx_h3, tgtctx_h4, tgtctx_z = encode(tgtctx)
with tf.variable_scope("translate") as scope:
trans_h0 = lrelu(linear(tf.nn.dropout(tf.concat([srcimg_z, tgtctx_z], 1), keep_prob), featsize, 'trans_h0'))
trans_z = linear(tf.nn.dropout(trans_h0, keep_prob), featsize, 'trans_z')
self.translated_z = trans_z
s_h, s_w = self.output_height, self.output_width
s_h0, s_h1, s_h2, s_h3 = \
int(s_h/ns0), int(s_h/ns0/ns1), int(s_h/ns0/ns1/ns2), int(s_h/ns0/ns1/ns2/ns3)
s_w0, s_w1, s_w2, s_w3 = \
int(s_w/ns0), int(s_w/ns0/ns1), int(s_w/ns0/ns1/ns2), int(s_w/ns0/ns1/ns2/ns3)
def decode(z, skip_h3, skip_h2, skip_h1, skip_h0):
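# U-Net style decoder: each deconvolution also receives the matching encoder
# activation of the context image (skip_h*) concatenated on the channel
# axis, so spatial detail bypasses the latent bottleneck.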
z_ = lrelu(linear(tf.nn.dropout(z, keep_prob), nf3*s_h3*s_w3, 'd_h0_lin'))
h0 = tf.nn.dropout(tf.reshape(z_, [-1, s_h3, s_w3, nf3]), keep_prob)
h1 = lrelu(deconv2d(tf.concat([h0, skip_h3], 3),
[self.batch_size, s_h2, s_w2, nf2], name='d_h1', d_h=ns3, d_w=ns3))
h2 = lrelu(deconv2d(tf.concat([h1, skip_h2], 3),
[self.batch_size, s_h1, s_w1, nf1], name='d_h2', d_h=ns2, d_w=ns2))
h3 = lrelu(deconv2d(tf.concat([h2, skip_h1], 3),
[self.batch_size, s_h0, s_w0, nf0], name='d_h3', d_h=ns1, d_w=ns1))
print(h3.get_shape())
h4 = deconv2d(tf.concat([h3, skip_h0], 3),
[self.batch_size, s_h, s_w, self.c_dim], name='d_h4', d_h=ns0, d_w=ns0)
return h4
with tf.variable_scope("deconv") as scope:
output_h4 = decode(trans_z, tgtctx_h3, tgtctx_h2, tgtctx_h1, tgtctx_h0)
scope.reuse_variables()
truthoutput_h4 = decode(tgtimg_z, tgtctx_h3, tgtctx_h2, tgtctx_h1, tgtctx_h0)
self.simloss = tf.reduce_mean((trans_z - tgtimg_z) ** 2) * 1e3
print(tgtimg_z.get_shape())
self.out = output_h4
self.out2 = truthoutput_h4
print(self.out.get_shape())
self.recon1 = tf.nn.l2_loss(tgtimg - self.out)
self.recon2 = tf.nn.l2_loss(tgtimg - self.out2)
if ablation_type == "None":
self.loss = self.recon1 + self.recon2 + self.simloss
elif ablation_type == "L2":
self.loss = self.recon1 + self.recon2
elif ablation_type == "L2L3":
self.loss = self.recon1
elif ablation_type == "L1":
self.loss = self.recon2 + self.simloss
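
# Input convention shared by the build() methods in this module (inferred from
# the indexing above and the (3, batch, H, W, C) placeholder in __main__):
# image[0] is the source frame, image[1] the target context frame, and
# image[2] the target frame to reconstruct.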


class ContextAEPush:
    def __init__(self, gf_dim=64, df_dim=64,
                 gfc_dim=1024, dfc_dim=1024,
                 c_dim=3):
        self.gf_dim = gf_dim
        self.df_dim = df_dim
        self.c_dim = c_dim
        self.gfc_dim = gfc_dim
        self.dfc_dim = dfc_dim

    def build(self, image, ablation_type):
        imgshape = image.get_shape().as_list()
        print(imgshape)
        self.output_height, self.output_width = imgshape[-3:-1]
        self.batch_size = imgshape[1]
        featsize = 1024
        srcimg = image[0]
        tgtimg = image[2]
        tgtctx = image[1]

        with tf.variable_scope("conv_context") as scope:
            tgtctx_h0 = lrelu(conv2d(tgtctx, self.df_dim, name='h0_conv'))
            tgtctx_h1 = lrelu(conv2d(tgtctx_h0, self.df_dim*2, name='h1_conv'))
            tgtctx_h2 = lrelu(conv2d(tgtctx_h1, self.df_dim*4, name='h2_conv'))
            tgtctx_h3 = lrelu(conv2d(tgtctx_h2, self.df_dim*8, name='h3_conv'))
            tgtctx_h4 = lrelu(linear(tf.reshape(tgtctx_h3, [self.batch_size, -1]), featsize, 'h4_lin'))
            tgtctx_z = linear(tgtctx_h4, featsize, 'hz_lin')

        with tf.variable_scope("conv") as scope:
            srcimg_h0 = lrelu(conv2d(srcimg, self.df_dim, name='h0_conv'))
            srcimg_h1 = lrelu(conv2d(srcimg_h0, self.df_dim*2, name='h1_conv'))
            srcimg_h2 = lrelu(conv2d(srcimg_h1, self.df_dim*4, name='h2_conv'))
            srcimg_h3 = lrelu(conv2d(srcimg_h2, self.df_dim*8, name='h3_conv'))
            print(srcimg_h3.get_shape())
            srcimg_h4 = lrelu(linear(tf.reshape(srcimg_h3, [self.batch_size, -1]), featsize, 'h4_lin'))
            srcimg_z = lrelu(linear(srcimg_h4, featsize, 'hz_lin'))
            scope.reuse_variables()
            tgtimg_h0 = lrelu(conv2d(tgtimg, self.df_dim, name='h0_conv'))
            tgtimg_h1 = lrelu(conv2d(tgtimg_h0, self.df_dim*2, name='h1_conv'))
            tgtimg_h2 = lrelu(conv2d(tgtimg_h1, self.df_dim*4, name='h2_conv'))
            tgtimg_h3 = lrelu(conv2d(tgtimg_h2, self.df_dim*8, name='h3_conv'))
            tgtimg_h4 = lrelu(linear(tf.reshape(tgtimg_h3, [self.batch_size, -1]), featsize, 'h4_lin'))
            tgtimg_z = lrelu(linear(tgtimg_h4, featsize, 'hz_lin'))

        with tf.variable_scope("translate") as scope:
            trans_h0 = lrelu(linear(tf.concat([srcimg_z, tgtctx_z], 1), featsize, 'trans_h0'))
            trans_z = linear(trans_h0, featsize, 'trans_z')
            self.translated_z = trans_z

        with tf.variable_scope("deconv") as scope:
            s_h, s_w = self.output_height, self.output_width
            s_h2, s_h4, s_h8, s_h16 = \
                int(s_h/2), int(s_h/4), int(s_h/8), int(s_h/16)
            s_w2, s_w4, s_w8, s_w16 = \
                int(s_w/2), int(s_w/4), int(s_w/8), int(s_w/16)
            output_z_ = lrelu(linear(trans_z, self.gf_dim*8*s_h16*s_w16, 'd_h0_lin'))
            output_h0 = tf.reshape(output_z_, [-1, s_h16, s_w16, self.gf_dim * 8])
            output_h1 = lrelu(deconv2d(tf.concat([output_h0, tgtctx_h3], 3),
                                       [self.batch_size, s_h8, s_w8, self.gf_dim*4], name='d_h1'))
            output_h2 = lrelu(deconv2d(tf.concat([output_h1, tgtctx_h2], 3),
                                       [self.batch_size, s_h4, s_w4, self.gf_dim*2], name='d_h2'))
            output_h3 = lrelu(deconv2d(tf.concat([output_h2, tgtctx_h1], 3),
                                       [self.batch_size, s_h2, s_w2, self.gf_dim*1], name='d_h3'))
            output_h4 = deconv2d(tf.concat([output_h3, tgtctx_h0], 3),
                                 [self.batch_size, s_h, s_w, self.c_dim], name='d_h4')
            scope.reuse_variables()
            truthoutput_z_ = lrelu(linear(tgtimg_z, self.gf_dim*8*s_h16*s_w16, 'd_h0_lin'))
            truthoutput_h0 = tf.reshape(truthoutput_z_, [-1, s_h16, s_w16, self.gf_dim * 8])
            truthoutput_h1 = lrelu(deconv2d(tf.concat([truthoutput_h0, tgtctx_h3], 3),
                                            [self.batch_size, s_h8, s_w8, self.gf_dim*4], name='d_h1'))
            truthoutput_h2 = lrelu(deconv2d(tf.concat([truthoutput_h1, tgtctx_h2], 3),
                                            [self.batch_size, s_h4, s_w4, self.gf_dim*2], name='d_h2'))
            truthoutput_h3 = lrelu(deconv2d(tf.concat([truthoutput_h2, tgtctx_h1], 3),
                                            [self.batch_size, s_h2, s_w2, self.gf_dim*1], name='d_h3'))
            truthoutput_h4 = deconv2d(tf.concat([truthoutput_h3, tgtctx_h0], 3),
                                      [self.batch_size, s_h, s_w, self.c_dim], name='d_h4')

        self.simloss = tf.reduce_mean((trans_z - tgtimg_z) ** 2) * 1e3
        mean, var = tf.nn.moments(tgtimg_z, axes=[0])
        print(var.get_shape())
        # self.simloss /= tf.reduce_mean(var)
        print(tgtimg_z.get_shape())
        self.out = output_h4  # + contextimg # tf.nn.tanh(h4)
        self.out2 = truthoutput_h4
        self.recon1 = tf.nn.l2_loss(tgtimg - self.out)
        self.recon2 = tf.nn.l2_loss(tgtimg - self.out2)
        # default loss; overridden below when ablation_type matches one of the cases
        self.loss = self.recon1 + self.recon2 + self.simloss
        if ablation_type == "None":
            self.loss = self.recon1 + self.recon2 + self.simloss
        elif ablation_type == "L2":
            self.loss = self.recon1 + self.recon2
        elif ablation_type == "L2L3":
            self.loss = self.recon1
        elif ablation_type == "L1":
            self.loss = self.recon2 + self.simloss


class ContextAEReach:
    def __init__(self, gf_dim=64, df_dim=64,
                 gfc_dim=1024, dfc_dim=1024,
                 c_dim=3):
        self.gf_dim = gf_dim
        self.df_dim = df_dim
        self.c_dim = c_dim
        self.gfc_dim = gfc_dim
        self.dfc_dim = dfc_dim

    def build(self, image, ablation_type):
        imgshape = image.get_shape().as_list()
        print(imgshape)
        self.output_height, self.output_width = imgshape[-3:-1]
        self.batch_size = imgshape[1]
        featsize = 1024
        srcimg = image[0]
        tgtimg = image[2]
        tgtctx = image[1]

        with tf.variable_scope("conv_context") as scope:
            tgtctx_h0 = lrelu(conv2d(tgtctx, self.df_dim, name='h0_conv'))
            tgtctx_h1 = lrelu(conv2d(tgtctx_h0, self.df_dim*2, name='h1_conv'))
            tgtctx_h2 = lrelu(conv2d(tgtctx_h1, self.df_dim*4, name='h2_conv'))
            tgtctx_h3 = lrelu(conv2d(tgtctx_h2, self.df_dim*8, name='h3_conv'))
            tgtctx_h4 = lrelu(linear(tf.reshape(tgtctx_h3, [self.batch_size, -1]), featsize, 'h4_lin'))
            tgtctx_z = linear(tgtctx_h4, featsize, 'hz_lin')

        with tf.variable_scope("conv") as scope:
            srcimg_h0 = lrelu(conv2d(srcimg, self.df_dim, name='h0_conv'))
            srcimg_h1 = lrelu(conv2d(srcimg_h0, self.df_dim*2, name='h1_conv'))
            srcimg_h2 = lrelu(conv2d(srcimg_h1, self.df_dim*4, name='h2_conv'))
            srcimg_h3 = lrelu(conv2d(srcimg_h2, self.df_dim*8, name='h3_conv'))
            print(srcimg_h3.get_shape())
            srcimg_h4 = lrelu(linear(tf.reshape(srcimg_h3, [self.batch_size, -1]), featsize, 'h4_lin'))
            srcimg_z = lrelu(linear(srcimg_h4, featsize, 'hz_lin'))
            scope.reuse_variables()
            tgtimg_h0 = lrelu(conv2d(tgtimg, self.df_dim, name='h0_conv'))
            tgtimg_h1 = lrelu(conv2d(tgtimg_h0, self.df_dim*2, name='h1_conv'))
            tgtimg_h2 = lrelu(conv2d(tgtimg_h1, self.df_dim*4, name='h2_conv'))
            tgtimg_h3 = lrelu(conv2d(tgtimg_h2, self.df_dim*8, name='h3_conv'))
            tgtimg_h4 = lrelu(linear(tf.reshape(tgtimg_h3, [self.batch_size, -1]), featsize, 'h4_lin'))
            tgtimg_z = lrelu(linear(tgtimg_h4, featsize, 'hz_lin'))

        with tf.variable_scope("translate") as scope:
            trans_h0 = lrelu(linear(tf.concat([srcimg_z, tgtctx_z], 1), featsize, 'trans_h0'))
            trans_z = linear(trans_h0, featsize, 'trans_z')
            self.translated_z = trans_z

        with tf.variable_scope("deconv") as scope:
            s_h, s_w = self.output_height, self.output_width
            s_h2, s_h4, s_h8, s_h16 = \
                int(s_h/2), int(s_h/4), int(s_h/8), int(s_h/16)
            s_w2, s_w4, s_w8, s_w16 = \
                int(s_w/2), int(s_w/4), int(s_w/8), int(s_w/16)
            output_z_ = lrelu(linear(trans_z, self.gf_dim*8*s_h16*s_w16, 'd_h0_lin'))
            output_h0 = tf.reshape(output_z_, [-1, s_h16, s_w16, self.gf_dim * 8])
            output_h1 = lrelu(deconv2d(tf.concat([output_h0, tgtctx_h3], 3),
                                       [self.batch_size, s_h8, s_w8, self.gf_dim*4], name='d_h1'))
            output_h2 = lrelu(deconv2d(tf.concat([output_h1, tgtctx_h2], 3),
                                       [self.batch_size, s_h4, s_w4, self.gf_dim*2], name='d_h2'))
            output_h3 = lrelu(deconv2d(tf.concat([output_h2, tgtctx_h1], 3),
                                       [self.batch_size, s_h2, s_w2, self.gf_dim*1], name='d_h3'))
            output_h4 = deconv2d(tf.concat([output_h3, tgtctx_h0], 3),
                                 [self.batch_size, s_h, s_w, self.c_dim], name='d_h4')
            scope.reuse_variables()
            truthoutput_z_ = lrelu(linear(tgtimg_z, self.gf_dim*8*s_h16*s_w16, 'd_h0_lin'))
            truthoutput_h0 = tf.reshape(truthoutput_z_, [-1, s_h16, s_w16, self.gf_dim * 8])
            truthoutput_h1 = lrelu(deconv2d(tf.concat([truthoutput_h0, tgtctx_h3], 3),
                                            [self.batch_size, s_h8, s_w8, self.gf_dim*4], name='d_h1'))
            truthoutput_h2 = lrelu(deconv2d(tf.concat([truthoutput_h1, tgtctx_h2], 3),
                                            [self.batch_size, s_h4, s_w4, self.gf_dim*2], name='d_h2'))
            truthoutput_h3 = lrelu(deconv2d(tf.concat([truthoutput_h2, tgtctx_h1], 3),
                                            [self.batch_size, s_h2, s_w2, self.gf_dim*1], name='d_h3'))
            truthoutput_h4 = deconv2d(tf.concat([truthoutput_h3, tgtctx_h0], 3),
                                      [self.batch_size, s_h, s_w, self.c_dim], name='d_h4')

        self.simloss = tf.reduce_mean((trans_z - tgtimg_z) ** 2) * 1e3
        mean, var = tf.nn.moments(tgtimg_z, axes=[0])
        print(var.get_shape())
        # self.simloss /= tf.reduce_mean(var)
        print(tgtimg_z.get_shape())
        self.out = output_h4  # + contextimg # tf.nn.tanh(h4)
        self.out2 = truthoutput_h4
        self.recon1 = tf.nn.l2_loss(tgtimg - self.out)
        self.recon2 = tf.nn.l2_loss(tgtimg - self.out2)
        # self.loss = self.recon1 + self.recon2 + self.simloss
        if ablation_type == "None":
            self.loss = self.recon1 + self.recon2 + self.simloss
        elif ablation_type == "L2":
            self.loss = self.recon1 + self.recon2
        elif ablation_type == "L2L3":
            self.loss = self.recon1
        elif ablation_type == "L1":
            self.loss = self.recon2 + self.simloss


class ContextAESweep:
    def __init__(self, gf_dim=64, df_dim=64,
                 gfc_dim=1024, dfc_dim=1024,
                 c_dim=3):
        self.gf_dim = gf_dim
        self.df_dim = df_dim
        self.c_dim = c_dim
        self.gfc_dim = gfc_dim
        self.dfc_dim = dfc_dim

    def build(self, image, ablation_type):
        imgshape = image.get_shape().as_list()
        print(imgshape)
        self.output_height, self.output_width = imgshape[-3:-1]
        self.batch_size = imgshape[1]
        featsize = 100
        srcimg = image[0]
        tgtimg = image[2]
        tgtctx = image[1]
        nf0 = 32
        nf1 = 16
        nf2 = 16
        nf3 = 8
        ns0 = 1
        ns1 = 2
        ns2 = 1
        ns3 = 2
        # with tf.variable_scope("conv_context") as scope:

        def encode(img):
            img_h0 = lrelu(conv2d(img, nf0, d_h=ns0, d_w=ns0, name='h0_conv'))
            img_h1 = lrelu(conv2d(img_h0, nf1, d_h=ns1, d_w=ns1, name='h1_conv'))
            img_h2 = lrelu(conv2d(img_h1, nf2, d_h=ns2, d_w=ns2, name='h2_conv'))
            img_h3 = lrelu(conv2d(img_h2, nf3, d_h=ns3, d_w=ns3, name='h3_conv'))
            print(img_h3.get_shape())
            img_h4 = lrelu(linear(tf.nn.dropout(tf.reshape(img_h3, [self.batch_size, -1]), keep_prob), featsize, 'h4_lin'))
            img_z = lrelu(linear(tf.nn.dropout(img_h4, keep_prob), featsize, 'hz_lin'))
            return img_h0, img_h1, img_h2, img_h3, img_h4, img_z

        with tf.variable_scope("conv") as scope:
            srcimg_h0, srcimg_h1, srcimg_h2, srcimg_h3, srcimg_h4, srcimg_z = encode(srcimg)
            scope.reuse_variables()
            tgtimg_h0, tgtimg_h1, tgtimg_h2, tgtimg_h3, tgtimg_h4, tgtimg_z = encode(tgtimg)
            tgtctx_h0, tgtctx_h1, tgtctx_h2, tgtctx_h3, tgtctx_h4, tgtctx_z = encode(tgtctx)

        with tf.variable_scope("translate") as scope:
            trans_h0 = lrelu(linear(tf.nn.dropout(tf.concat([srcimg_z, tgtctx_z], 1), keep_prob), featsize, 'trans_h0'))
            trans_z = linear(tf.nn.dropout(trans_h0, keep_prob), featsize, 'trans_z')
            self.translated_z = trans_z

        s_h, s_w = self.output_height, self.output_width
        s_h0, s_h1, s_h2, s_h3 = \
            int(s_h/ns0), int(s_h/ns0/ns1), int(s_h/ns0/ns1/ns2), int(s_h/ns0/ns1/ns2/ns3)
        s_w0, s_w1, s_w2, s_w3 = \
            int(s_w/ns0), int(s_w/ns0/ns1), int(s_w/ns0/ns1/ns2), int(s_w/ns0/ns1/ns2/ns3)

        def decode(z, skip_h3, skip_h2, skip_h1, skip_h0):
            z_ = lrelu(linear(tf.nn.dropout(z, keep_prob), nf3*s_h3*s_w3, 'd_h0_lin'))
            h0 = tf.nn.dropout(tf.reshape(z_, [-1, s_h3, s_w3, nf3]), keep_prob)
            # debug hook disabled: dropping into an interactive shell here
            # would block graph construction on every call to decode()
            # import IPython
            # IPython.embed()
            h1 = lrelu(deconv2d(tf.concat([h0, skip_h3], 3),
                                [self.batch_size, s_h2, s_w2, nf2], name='d_h1', d_h=ns3, d_w=ns3))
            h2 = lrelu(deconv2d(tf.concat([h1, skip_h2], 3),
                                [self.batch_size, s_h1, s_w1, nf1], name='d_h2', d_h=ns2, d_w=ns2))
            h3 = lrelu(deconv2d(tf.concat([h2, skip_h1], 3),
                                [self.batch_size, s_h0, s_w0, nf0], name='d_h3', d_h=ns1, d_w=ns1))
            print(h3.get_shape())
            h4 = deconv2d(tf.concat([h3, skip_h0], 3),
                          [self.batch_size, s_h, s_w, self.c_dim], name='d_h4', d_h=ns0, d_w=ns0)
            return h4

        with tf.variable_scope("deconv") as scope:
            output_h4 = decode(trans_z, tgtctx_h3, tgtctx_h2, tgtctx_h1, tgtctx_h0)
            scope.reuse_variables()
            truthoutput_h4 = decode(tgtimg_z, tgtctx_h3, tgtctx_h2, tgtctx_h1, tgtctx_h0)

        self.simloss = tf.reduce_mean((trans_z - tgtimg_z) ** 2) * 1e3
        print(tgtimg_z.get_shape())
        self.out = output_h4
        self.out2 = truthoutput_h4
        print(self.out.get_shape())
        self.recon1 = tf.nn.l2_loss(tgtimg - self.out)
        self.recon2 = tf.nn.l2_loss(tgtimg - self.out2)
        # default loss; overridden below when ablation_type matches one of the cases
        self.loss = self.recon1 + self.recon2 + self.simloss
        if ablation_type == "None":
            self.loss = self.recon1 + self.recon2 + self.simloss
        elif ablation_type == "L2":
            self.loss = self.recon1 + self.recon2
        elif ablation_type == "L2L3":
            self.loss = self.recon1
        elif ablation_type == "L1":
            self.loss = self.recon2 + self.simloss
if __name__ == "__main__":
#TODO: add in an argparse
parser = argparse.ArgumentParser(description='Run ablations on models')
parser.add_argument('experiment_type', type=str,
help='type of ablation')
parser.add_argument('ablation_type', type=str,
help='type of ablation')
parser.add_argument('data_location', type=str,
help='data_location')
args = parser.parse_args()
vdata = np.load(args.data_location)
tf.reset_default_graph()
idim = (36, 64)
keep_prob = tf.placeholder(tf.float32, name='keep_prob')
tftrain = tf.placeholder(tf.bool, name='tftrain')
batch_size=100
if (args.experiment_type == "reach") or (args.experiment_type == "push"):
idim = (48, 48)
tfinput = tf.placeholder(tf.float32, (3, batch_size) + idim + (3, ), name='x')
if args.experiment_type == "reach":
test = ContextAEReach()
elif args.experiment_type == "push":
test = ContextAEPush()
elif args.experiment_type == "pushreal":
test = ContextAEPushReal()
elif args.experiment_type == "sweep":
test = ContextAESweep()
test.build(tfinput, args.ablation_type)
config = tf.ConfigProto()
config.gpu_options.allow_growth=True
sess = tf.Session(config=config)
learning_rate = tf.placeholder(tf.float32, shape=[])
optimizer = tf.train.AdamOptimizer(learning_rate).minimize(test.loss)
sess.run(tf.global_variables_initializer())
allloss = []
validloss = []
itr = 0
saver = tf.train.Saver()
n = vdata.shape[1]
nlen = vdata.shape[0]
ntrain = int(0.8*n)
nvalid = n - ntrain
validdata = vdata[:, ntrain:]
traindata = vdata[:, :ntrain]
while True:
choicesrc = np.random.choice(ntrain, batch_size)
choicetgt = np.random.choice(ntrain, batch_size)
srcdata = traindata[np.arange(0, batch_size) % nlen, choicesrc]
tgtdata = traindata[np.arange(0, batch_size) % nlen, choicetgt]
tgtctx = traindata[0, choicetgt]
batch = [srcdata, tgtctx, tgtdata]
_, loss, sim, r1, r2 = sess.run( [optimizer, test.loss, test.simloss, test.recon1, test.recon2],
{tfinput: batch, learning_rate:1e-4, tftrain:False, keep_prob:0.5})
if itr % 4 == 0:
print(loss, sim, r1, r2)
allloss.append(loss)
if itr % 40 == 0:
choicesrc = np.random.choice(nvalid, batch_size)
choicetgt = np.random.choice(nvalid, batch_size)
srcdata = validdata[np.arange(0, batch_size) % nlen, choicesrc]
tgtdata = validdata[np.arange(0, batch_size) % nlen, choicetgt]
tgtctx = validdata[0, choicetgt]
batch = [srcdata, tgtctx, tgtdata]
loss, sim, r1, r2 = sess.run([test.loss, test.simloss, test.recon1, test.recon2],
{tfinput: batch, tftrain:False, keep_prob:1.0})
print(loss, sim, r1, r2,'E')
validloss.append(loss)
saver.save(sess, 'ablation_' + str(args.experiment_type) + '_' + str(args.ablation_type) + "_" + str(itr))
if itr == 30000 or (itr>30000 and itr%10000 == 0):
import IPython
IPython.embed()
itr += 1 | 46.317376 | 123 | 0.591777 | 3,817 | 26,123 | 3.787791 | 0.07388 | 0.028635 | 0.03237 | 0.02324 | 0.812491 | 0.801425 | 0.768917 | 0.765528 | 0.755153 | 0.752525 | 0 | 0.054324 | 0.274892 | 26,123 | 564 | 124 | 46.317376 | 0.708954 | 0.018949 | 0 | 0.736515 | 0 | 0 | 0.035617 | 0 | 0 | 0 | 0 | 0.001773 | 0 | 1 | 0.041494 | false | 0 | 0.026971 | 0.006224 | 0.105809 | 0.041494 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4104c5aaff9b97c65248d6fc9399c81996d73373 | 10,802 | py | Python | src/spitfire/time/stepcontrol.py | sandialabs/Spitfire | 65670e3ba5d1ccb4ac72524b77957706345c5bf6 | [
"Apache-2.0"
] | 11 | 2020-03-20T02:10:17.000Z | 2021-12-14T10:08:09.000Z | src/spitfire/time/stepcontrol.py | sandialabs/Spitfire | 65670e3ba5d1ccb4ac72524b77957706345c5bf6 | [
"Apache-2.0"
] | 18 | 2020-03-18T18:58:56.000Z | 2021-12-21T02:35:35.000Z | src/spitfire/time/stepcontrol.py | sandialabs/Spitfire | 65670e3ba5d1ccb4ac72524b77957706345c5bf6 | [
"Apache-2.0"
] | 2 | 2021-05-31T17:24:56.000Z | 2021-06-20T05:27:41.000Z | """
This module contains controllers for adaptive time stepping based on embedded temporal error estimation.
"""
# Spitfire - a Python-C++ library for building tabulated chemistry models and solving differential equations
# Copyright 2019 National Technology & Engineering Solutions of Sandia, LLC (NTESS).
#
# You should have received a copy of the 3-clause BSD License
# along with this program. If not, see <https://opensource.org/licenses/BSD-3-Clause>.
#
# Questions? Contact Mike Hansen (mahanse@sandia.gov)
from numpy import zeros, min, copy
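
# All controllers in this module share a duck-typed interface rather than a
# common base class: each is a callable taking (step_count, step, step_output,
# ...) and returning the next step size, and each provides first_step_size(),
# last_step_size(), target_error(), and step_size_is_constant().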


class ConstantTimeStep(object):
    """A simple wrapper class for a constant time step.

    **Constructor**:

    Parameters
    ----------
    step_size : float
        the size of the time step
    """

    def __init__(self, step_size):
        self.step_size = step_size

    def __call__(self, *args, **kwargs):
        return self.step_size

    def first_step_size(self):
        """Obtain the initial step size"""
        return self.step_size

    def last_step_size(self):
        """Obtain the most recent step size"""
        return self.step_size

    def target_error(self):
        """Obtain the target error (defined on every controller to keep the interface uniform without a base class)"""
        return -1.

    def step_size_is_constant(self):
        """Whether this controller uses a constant step size"""
        return True


class PIController(object):
    """A PI controller on the embedded temporal error estimate

    **Constructor**:

    Parameters
    ----------
    kp : float
        the modal gain of the proportional control mode (default: 0.06666666667)
    ki : float
        the modal gain of the integral control mode (default: 0.1333333333)
    target_error : float
        the target error for the controller (default: 1.e-4)
    max_step : float
        the maximum allowable time step (default: 1.e4)
    max_ramp : float
        the maximum allowable rate of increase of the time step (default: 1.1)
    first_step : float
        the initial step size (default: 1.e-3)
    """

    def __init__(self, kp=0.06666666667, ki=0.1333333333,
                 target_error=1.e-4, max_step=1.e4,
                 max_ramp=1.1, first_step=1.e-3):
        self._kp = kp
        self._ki = ki
        self._target_error = target_error
        self._max_step = max_step
        self._max_ramp = max_ramp
        self._first_step = first_step
        self._number_of_old_values = 2
        self._err_history = zeros(self._number_of_old_values)
        self._step_history = zeros(self._number_of_old_values)

    def __call__(self, step_count, step, step_output, *args, **kwargs):
        error = step_output.temporal_error
        if error < 1.e-16:
            return min([step * self._max_ramp, self._max_step])
        if step_count < self._number_of_old_values - 1:
            self._err_history[step_count] = error
            self._step_history[step_count] = step
        else:
            self._err_history[:-1] = self._err_history[1:]
            self._step_history[:-1] = self._step_history[1:]
            self._err_history[-1] = error
            self._step_history[-1] = step
        if step_count == 0:
            mod = min([self._max_ramp, (self._target_error / error) ** self._ki])
        else:
            mod = min([self._max_ramp,
                       (self._target_error / error) ** self._ki * (self._err_history[-1] / error) ** self._kp])
        return min([step * mod, self._max_step])

    def first_step_size(self):
        """Obtain the initial step size"""
        return self._first_step

    def last_step_size(self):
        """Obtain the most recent step size"""
        return self._step_history[-1]

    def target_error(self):
        """Obtain the target error"""
        return self._target_error

    def step_size_is_constant(self):
        """Whether this controller uses a constant step size"""
        return False
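
# In step-size form, the PI update implemented by PIController.__call__ is
#
#     dt_new = min(dt * min(max_ramp, (e_target / e) ** ki * (e_hist / e) ** kp), max_step)
#
# where e is the current embedded error estimate and e_hist is the most
# recently stored entry of the error history.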


class CascadeController(object):
    """A two-level cascade control system on the embedded temporal error estimate and the ratio of two estimates.
    The stepper method must support multiple embedded error estimates.

    **Constructor**:

    Parameters
    ----------
    kp : float
        the modal gain of the proportional control mode for the error controller (default: 0.06666666667)
    ki : float
        the modal gain of the integral control mode for the error controller (default: 0.1333333333)
    ratio_kp : float
        the modal gain of the proportional control mode for the ratio controller (default: 0.1)
    ratio_ki : float
        the modal gain of the integral control mode for the ratio controller (default: 0.3)
    target_ratio : float
        the value of the target ratio for the controller (default: 1.e-2)
    initial_target_error : float
        the initial value of the target error for the controller (default: 1.e-10)
    max_step : float
        the maximum allowable time step (default: 1.e-3)
    max_ramp : float
        the maximum allowable rate of increase of the time step (default: 1.1)
    first_step : float
        the initial step size (default: 1.e-6)
    """

    def __init__(self, kp=0.06666666667, ki=0.1333333333,
                 ratio_kp=0.1, ratio_ki=0.3,
                 initial_target_error=1.e-10, target_ratio=1.e-2,
                 max_step=1.e-3, max_ramp=1.1, first_step=1.e-6):
        self._kp = kp
        self._ki = ki
        self._ratio_kp = ratio_kp
        self._ratio_ki = ratio_ki
        self._target_ratio = target_ratio
        self._initial_target_error = initial_target_error
        self._target_error = copy(initial_target_error)
        self._max_step = max_step
        self._max_ramp = max_ramp
        self._first_step = first_step
        self._number_of_old_values = 2
        self._ratio_history = zeros(self._number_of_old_values)
        self._err_history = zeros(self._number_of_old_values)
        self._step_history = zeros(self._number_of_old_values)

    def __call__(self, step_count, step, step_output, *args, **kwargs):
        error = step_output.temporal_error
        ratio = error / (1.e-12 + step_output.extra_errors[0])
        if error < 1.e-16:
            return min([step * self._max_ramp, self._max_step])
        if step_count < self._number_of_old_values - 1:
            self._err_history[step_count] = error
            self._ratio_history[step_count] = ratio
            self._step_history[step_count] = step
        else:
            self._err_history[:-1] = self._err_history[1:]
            self._ratio_history[:-1] = self._ratio_history[1:]
            self._step_history[:-1] = self._step_history[1:]
            self._err_history[-1] = error
            self._ratio_history[-1] = ratio
            self._step_history[-1] = step
        if step_count == 0:
            mod = min([self._max_ramp, (self._target_error / error) ** self._ki])
            err_mod = (self._target_ratio / ratio) ** self._ratio_ki
        else:
            mod = min([self._max_ramp,
                       (self._target_error / error) ** self._ki * (self._err_history[-1] / error) ** self._kp])
            err_mod = min([self._max_ramp, (self._target_ratio / ratio) ** self._ratio_ki * (
                self._ratio_history[-1] / ratio) ** self._ratio_kp])
        self._target_error *= err_mod
        return min([step * mod, self._max_step])

    def first_step_size(self):
        """Obtain the initial step size"""
        return self._first_step

    def last_step_size(self):
        """Obtain the most recent step size"""
        return self._step_history[-1]

    def target_error(self):
        """Obtain the (adaptively updated) target error"""
        return self._target_error

    def step_size_is_constant(self):
        """Whether this controller uses a constant step size"""
        return False
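
# Design note on the cascade above: the outer loop nudges the target error
# itself (self._target_error *= err_mod) so that the ratio of the two embedded
# error estimates is driven toward target_ratio, while the inner PI loop
# adapts the step size toward that moving target error, as in PIController.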


class RatioController(object):
    """A PI controller on the ratio of two embedded temporal error estimates
    The stepper method must support multiple embedded error estimates.

    **Constructor**:

    Parameters
    ----------
    kp : float
        the modal gain of the proportional control mode (default: 0.1)
    ki : float
        the modal gain of the integral control mode (default: 0.3)
    target_ratio : float
        the target error ratio for the controller (default: 1.e-2)
    max_step : float
        the maximum allowable time step (default: 1.e-3)
    max_ramp : float
        the maximum allowable rate of increase of the time step (default: 1.1)
    first_step : float
        the initial step size (default: 1.e-6)
    """

    def __init__(self, kp=0.1, ki=0.3,
                 target_ratio=1.e-2, max_step=1.e-3,
                 max_ramp=1.1, first_step=1.e-6):
        self._kp = kp
        self._ki = ki
        self._target_ratio = target_ratio
        self._max_step = max_step
        self._max_ramp = max_ramp
        self._first_step = first_step
        self._number_of_old_values = 2
        self._ratio_history = zeros(self._number_of_old_values)
        self._step_history = zeros(self._number_of_old_values)

    def __call__(self, step_count, step, step_output, *args, **kwargs):
        error = step_output.temporal_error
        ratio = error / (1.e-12 + step_output.extra_errors[0])
        if error < 1.e-16:
            return min([step * self._max_ramp, self._max_step])
        if step_count < self._number_of_old_values - 1:
            self._ratio_history[step_count] = ratio
            self._step_history[step_count] = step
        else:
            self._ratio_history[:-1] = self._ratio_history[1:]
            self._step_history[:-1] = self._step_history[1:]
            self._ratio_history[-1] = ratio
            self._step_history[-1] = step
        if step_count == 0:
            mod = min([self._max_ramp, (self._target_ratio / ratio) ** self._ki])
        else:
            mod = min([self._max_ramp,
                       (self._target_ratio / ratio) ** self._ki * (self._ratio_history[-1] / ratio) ** self._kp])
        return min([step * mod, self._max_step])

    def first_step_size(self):
        """Obtain the initial step size"""
        return self._first_step

    def last_step_size(self):
        """Obtain the most recent step size"""
        return self._step_history[-1]

    def target_error(self):
        """Obtain the target error (effectively unbounded, since this controller targets a ratio instead)"""
        return 1.e305

    def step_size_is_constant(self):
        """Whether this controller uses a constant step size"""
        return False
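

# Illustrative usage sketch (not part of the original module): drive a
# PIController with synthetic step results. The SimpleNamespace stand-in and
# the error sequence below are hypothetical; a real time stepper would pass
# its own result object carrying a `temporal_error` attribute.
if __name__ == "__main__":
    from types import SimpleNamespace

    controller = PIController(target_error=1.e-4, max_step=1.e-2, first_step=1.e-6)
    dt = controller.first_step_size()
    for n, err in enumerate([3.e-4, 2.e-4, 8.e-5, 5.e-5]):
        # the controller returns the next step size given the current error
        dt = controller(n, dt, SimpleNamespace(temporal_error=err))
        print('step', n, 'next dt =', dt)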
| 38.169611 | 128 | 0.626365 | 1,479 | 10,802 | 4.302231 | 0.11021 | 0.042747 | 0.042433 | 0.030646 | 0.832626 | 0.829797 | 0.812353 | 0.783435 | 0.752318 | 0.752318 | 0 | 0.029197 | 0.277078 | 10,802 | 282 | 129 | 38.304965 | 0.785632 | 0.354194 | 0 | 0.794521 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.164384 | false | 0 | 0.006849 | 0.006849 | 0.356164 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
de5de7002324906d0d22d395d4fb5c794a5c9c06 | 783 | py | Python | metadata-ingestion/tests/unit/test_config_clean.py | pramodbiligiri/datahub | 892adbcf330a9c7c687a293dd3edeca9fa0e2fd8 | [
"Apache-2.0"
] | 3,586 | 2020-01-27T11:09:57.000Z | 2022-03-15T16:13:30.000Z | metadata-ingestion/tests/unit/test_config_clean.py | iamduyu/datahub | 4c33124e8f5582749877e30ac2b0c0c1bfa06f42 | [
"Apache-2.0"
] | 1,678 | 2020-01-27T20:51:01.000Z | 2022-03-15T15:22:02.000Z | metadata-ingestion/tests/unit/test_config_clean.py | iamduyu/datahub | 4c33124e8f5582749877e30ac2b0c0c1bfa06f42 | [
"Apache-2.0"
] | 924 | 2020-01-28T20:10:50.000Z | 2022-03-15T10:01:23.000Z | from datahub.utilities import config_clean


def test_url_without_slash_suffix():
    assert (
        config_clean.remove_trailing_slashes("http://example.com")
        == "http://example.com"
    )


def test_url_with_suffix():
    assert (
        config_clean.remove_trailing_slashes("http://example.com/")
        == "http://example.com"
    )


def test_url_with_multiple_slashes():
    assert (
        config_clean.remove_trailing_slashes("http://example.com/a/b/c")
        == "http://example.com/a/b/c"
    )
    assert (
        config_clean.remove_trailing_slashes("http://example.com/a/b/c/")
        == "http://example.com/a/b/c"
    )
    assert (
        config_clean.remove_trailing_slashes("http://example.com/a/b/c///")
        == "http://example.com/a/b/c"
    )
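

# Illustrative alternative (not in the original module): the same cases can be
# expressed more compactly with pytest parametrization.
#
# import pytest
#
# @pytest.mark.parametrize("url, expected", [
#     ("http://example.com", "http://example.com"),
#     ("http://example.com/", "http://example.com"),
#     ("http://example.com/a/b/c///", "http://example.com/a/b/c"),
# ])
# def test_remove_trailing_slashes(url, expected):
#     assert config_clean.remove_trailing_slashes(url) == expected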
| 25.258065 | 75 | 0.624521 | 101 | 783 | 4.574257 | 0.237624 | 0.238095 | 0.30303 | 0.194805 | 0.839827 | 0.839827 | 0.839827 | 0.839827 | 0.839827 | 0.839827 | 0 | 0 | 0.212005 | 783 | 30 | 76 | 26.1 | 0.748784 | 0 | 0 | 0.416667 | 0 | 0 | 0.282248 | 0 | 0 | 0 | 0 | 0 | 0.208333 | 1 | 0.125 | true | 0 | 0.041667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
de61a43aba4bc5b50d446da1c1ecb058e7e3e800 | 30,507 | py | Python | authnzerver/tests/test_server.py | waqasbhatti/authnzerver | d40fa38601f4f11e966fc52e11ad6fe1116bb145 | [
"MIT"
] | 3 | 2019-06-02T12:57:08.000Z | 2020-04-01T14:00:12.000Z | authnzerver/tests/test_server.py | waqasbhatti/authnzerver | d40fa38601f4f11e966fc52e11ad6fe1116bb145 | [
"MIT"
] | 7 | 2020-03-17T21:55:41.000Z | 2020-07-07T22:58:48.000Z | authnzerver/tests/test_server.py | waqasbhatti/authnzerver | d40fa38601f4f11e966fc52e11ad6fe1116bb145 | [
"MIT"
] | 2 | 2020-03-04T06:56:27.000Z | 2020-03-24T08:39:11.000Z | '''test_server.py - Waqas Bhatti (waqas.afzal.bhatti@gmail.com) - Mar 2020
License: MIT. See the LICENSE file for details.
This tests the actual running server.
'''
import secrets
import subprocess
import requests
import os.path
import time
from datetime import datetime, timedelta
from pytest import mark
from authnzerver.autosetup import autogen_secrets_authdb
from authnzerver.messaging import encrypt_message, decrypt_message
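
# Shared pattern for all tests below: build a request dict with 'request',
# 'body', 'reqid', and 'client_ipaddr' keys, encrypt it with the server's
# shared secret via encrypt_message(), POST the ciphertext to the running
# authnzerver, then decrypt_message() the response text with the same secret.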


@mark.skipif(
    os.environ.get("GITHUB_WORKFLOW", None) is not None,
    reason="github doesn't allow server tests probably"
)
def test_server_with_env(monkeypatch, tmpdir):
    '''
    This tests if the server starts fine with all config in the environment.
    '''

    # the basedir will be the pytest-provided temporary directory
    basedir = str(tmpdir)

    # we'll make the auth DB and secrets file first
    authdb_path, creds, secrets_file, salt_file, env_file = (
        autogen_secrets_authdb(
            basedir,
            interactive=False
        )
    )

    # read in the secrets file for the secret
    with open(secrets_file, 'r') as infd:
        secret = infd.read().strip('\n')

    # read in the salts file for the salt
    with open(salt_file, 'r') as infd:
        salt = infd.read().strip('\n')

    # read the creds file so we can try logging in
    with open(creds, 'r') as infd:
        useremail, password = infd.read().strip('\n').split()

    # get a temp directory
    tmpdir = os.path.join('/tmp', 'authnzrv-%s' % secrets.token_urlsafe(8))

    server_listen = '127.0.0.1'
    server_port = '18158'

    # set up the environment
    monkeypatch.setenv("AUTHNZERVER_AUTHDB", authdb_path)
    monkeypatch.setenv("AUTHNZERVER_BASEDIR", basedir)
    monkeypatch.setenv("AUTHNZERVER_CACHEDIR", tmpdir)
    monkeypatch.setenv("AUTHNZERVER_DEBUGMODE", "0")
    monkeypatch.setenv("AUTHNZERVER_LISTEN", server_listen)
    monkeypatch.setenv("AUTHNZERVER_PORT", server_port)
    monkeypatch.setenv("AUTHNZERVER_SECRET", secret)
    monkeypatch.setenv("AUTHNZERVER_PIISALT", salt)
    monkeypatch.setenv("AUTHNZERVER_SESSIONEXPIRY", "60")
    monkeypatch.setenv("AUTHNZERVER_WORKERS", "1")
    monkeypatch.setenv("AUTHNZERVER_EMAILSERVER", "smtp.test.org")
    monkeypatch.setenv("AUTHNZERVER_EMAILPORT", "25")
    monkeypatch.setenv("AUTHNZERVER_EMAILUSER", "testuser")
    monkeypatch.setenv("AUTHNZERVER_EMAILPASS", "testpass")
    monkeypatch.setenv(
        "AUTHNZERVER_RATELIMITS",
        "ipaddr:300;user:360;session:120;apikey:720;burst:150;"
        "user-new:50;user-login:50"
    )

    # launch the server subprocess
    p = subprocess.Popen("authnzrv", shell=True)

    # wait 2.5 seconds for the server to start
    time.sleep(2.5)

    try:

        #
        # 1. hit the server with a request for a new session
        #

        # create a new anonymous session token
        session_payload = {
            'user_id': 2,
            'user_agent': 'Mozzarella Killerwhale',
            'expires': datetime.utcnow() + timedelta(hours=1),
            'ip_address': '1.1.1.1',
            'extra_info_json': {'pref_datasets_always_private': True}
        }
        request_dict = {
            'request': 'session-new',
            'body': session_payload,
            'reqid': 101,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=1.0
        )
        resp.raise_for_status()

        # decrypt the response
        response_dict = decrypt_message(resp.text, secret)
        assert response_dict['reqid'] == request_dict['reqid']
        assert response_dict['success'] is True
        assert isinstance(response_dict['response'], dict)
        assert response_dict['response']['session_token'] is not None

        #
        # 2. log in as the superuser
        #
        request_dict = {
            'request': 'user-login',
            'body': {
                'session_token': response_dict['response']['session_token'],
                'email': useremail,
                'password': password
            },
            'reqid': 102,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=1.0
        )
        resp.raise_for_status()

        # decrypt the response
        response_dict = decrypt_message(resp.text, secret)
        assert response_dict['reqid'] == request_dict['reqid']
        assert response_dict['success'] is True
        assert isinstance(response_dict['response'], dict)
        assert response_dict['response']['user_id'] == 1

    #
    # kill the server at the end
    #
    finally:
        p.terminate()
        try:
            p.communicate(timeout=3.0)
            p.kill()
        except Exception:
            pass

# make sure to kill authnzrv on some Linux machines. use lsof and the
# port number to find the remaining authnzrv processes and kill them
# subprocess.call(
#     "lsof | grep 18158 | awk '{ print $2 }' | sort | uniq | xargs kill -2",
#     shell=True
# )


@mark.skipif(
    os.environ.get("GITHUB_WORKFLOW", None) is not None,
    reason="github doesn't allow server tests probably"
)
def test_server_invalid_logins(monkeypatch, tmpdir):
    '''This tests if the server responds appropriately to invalid logins.

    The timing difference between successive failed logins should increase
    roughly exponentially.
    '''

    # the basedir will be the pytest-provided temporary directory
    basedir = str(tmpdir)

    # we'll make the auth DB and secrets file first
    authdb_path, creds, secrets_file, salt_file, env_file = (
        autogen_secrets_authdb(
            basedir,
            interactive=False
        )
    )

    # read in the secrets file for the secret
    with open(secrets_file, 'r') as infd:
        secret = infd.read().strip('\n')

    # read in the salts file for the salt
    with open(salt_file, 'r') as infd:
        salt = infd.read().strip('\n')

    # read the creds file so we can try logging in
    with open(creds, 'r') as infd:
        useremail, password = infd.read().strip('\n').split()

    # get a temp directory
    tmpdir = os.path.join('/tmp', 'authnzrv-%s' % secrets.token_urlsafe(8))

    server_listen = '127.0.0.1'
    server_port = '18158'

    # set up the environment
    monkeypatch.setenv("AUTHNZERVER_AUTHDB", authdb_path)
    monkeypatch.setenv("AUTHNZERVER_BASEDIR", basedir)
    monkeypatch.setenv("AUTHNZERVER_CACHEDIR", tmpdir)
    monkeypatch.setenv("AUTHNZERVER_DEBUGMODE", "0")
    monkeypatch.setenv("AUTHNZERVER_LISTEN", server_listen)
    monkeypatch.setenv("AUTHNZERVER_PORT", server_port)
    monkeypatch.setenv("AUTHNZERVER_SECRET", secret)
    monkeypatch.setenv("AUTHNZERVER_PIISALT", salt)
    monkeypatch.setenv("AUTHNZERVER_SESSIONEXPIRY", "60")
    monkeypatch.setenv("AUTHNZERVER_WORKERS", "1")
    monkeypatch.setenv("AUTHNZERVER_EMAILSERVER", "smtp.test.org")
    monkeypatch.setenv("AUTHNZERVER_EMAILPORT", "25")
    monkeypatch.setenv("AUTHNZERVER_EMAILUSER", "testuser")
    monkeypatch.setenv("AUTHNZERVER_EMAILPASS", "testpass")
    monkeypatch.setenv(
        "AUTHNZERVER_RATELIMITS",
        "ipaddr:300;user:360;session:120;apikey:720;burst:150;"
        "user-new:50;user-login:50"
    )

    # launch the server subprocess
    p = subprocess.Popen("authnzrv", shell=True)

    # wait 2.5 seconds for the server to start
    time.sleep(2.5)

    timing = []

    try:

        #
        # attempt to log in as the superuser several times with the wrong
        # password
        #
        for i in range(5):

            # create a new anonymous session token
            session_payload = {
                'user_id': 2,
                'user_agent': 'Mozzarella Killerwhale',
                'expires': datetime.utcnow() + timedelta(hours=1),
                'ip_address': '1.1.1.1',
                'extra_info_json': {'pref_datasets_always_private': True}
            }
            request_dict = {
                'request': 'session-new',
                'body': session_payload,
                'reqid': i,
                'client_ipaddr': '1.2.3.4'
            }
            encrypted_request = encrypt_message(request_dict, secret)

            # send the request to the authnzerver
            resp = requests.post(
                'http://%s:%s' % (server_listen, server_port),
                data=encrypted_request,
                timeout=1.0
            )
            resp.raise_for_status()

            # decrypt the response
            session_dict = decrypt_message(resp.text, secret)
            assert session_dict['reqid'] == request_dict['reqid']
            assert session_dict['success'] is True
            assert isinstance(session_dict['response'], dict)
            assert session_dict['response']['session_token'] is not None

            request_dict = {
                'request': 'user-login',
                'body': {
                    'session_token': session_dict['response']['session_token'],
                    'email': useremail,
                    'password': '%s-%i' % (password, i)
                },
                'reqid': 10*i + 10,
                'client_ipaddr': '1.2.3.4'
            }
            encrypted_request = encrypt_message(request_dict, secret)

            start_login_time = time.monotonic()

            # send the request to the authnzerver
            resp = requests.post(
                'http://%s:%s' % (server_listen, server_port),
                data=encrypted_request,
                timeout=60.0
            )
            resp.raise_for_status()

            timing.append(time.monotonic() - start_login_time)

            # decrypt the response
            response_dict = decrypt_message(resp.text, secret)
            assert response_dict['reqid'] == request_dict['reqid']
            assert response_dict['success'] is False
            assert isinstance(response_dict['response'], dict)
            assert response_dict['response']['user_id'] is None

        #
        # check if the timings follow the expected trend
        #
        diffs = [timing[x+1] - timing[x] for x in range(4)]
        diffs_increasing = all(diffs[x+1] > diffs[x] for x in range(3))
        assert diffs_increasing is True

        # now log in with the correct password and see if the login time goes
        # back to normal
        session_payload = {
            'user_id': 2,
            'user_agent': 'Mozzarella Killerwhale',
            'expires': datetime.utcnow() + timedelta(hours=1),
            'ip_address': '1.1.1.1',
            'extra_info_json': {'pref_datasets_always_private': True}
        }
        request_dict = {
            'request': 'session-new',
            'body': session_payload,
            'reqid': 1004,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=1.0
        )
        resp.raise_for_status()

        # decrypt the response
        session_dict = decrypt_message(resp.text, secret)
        assert session_dict['reqid'] == request_dict['reqid']
        assert session_dict['success'] is True
        assert isinstance(session_dict['response'], dict)
        assert session_dict['response']['session_token'] is not None

        request_dict = {
            'request': 'user-login',
            'body': {
                'session_token': session_dict['response']['session_token'],
                'email': useremail,
                'password': password
            },
            'reqid': 1005,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        start_login_time = time.monotonic()

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=60.0
        )
        resp.raise_for_status()

        timing.append(time.monotonic() - start_login_time)

        # decrypt the response
        response_dict = decrypt_message(resp.text, secret)
        assert response_dict['reqid'] == request_dict['reqid']
        assert response_dict['success'] is True
        assert isinstance(response_dict['response'], dict)
        assert response_dict['response']['user_id'] == 1

        # the latest time should be less than the 1st time (when throttling was
        # activated) and also less than the immediately previous time
        assert ((timing[-1] < timing[0]) and (timing[-1] < timing[-2]))

    finally:
        #
        # kill the server at the end
        #
        p.terminate()
        try:
            p.communicate(timeout=3.0)
            p.kill()
        except Exception:
            pass

# make sure to kill authnzrv on some Linux machines. use lsof and the
# port number to find the remaining authnzrv processes and kill them
# subprocess.call(
#     "lsof | grep 18158 | awk '{ print $2 }' | sort | uniq | xargs kill -2",
#     shell=True
# )


@mark.skipif(
    os.environ.get("GITHUB_WORKFLOW", None) is not None,
    reason="github doesn't allow server tests probably"
)
def test_server_invalid_logins_with_lock(monkeypatch, tmpdir):
    '''This tests if the server responds appropriately to invalid logins.

    The timing difference between successive failed logins should increase
    roughly exponentially. In addition, the user should be locked out of their
    account for the configured amount of time and unlocked thereafter.
    '''

    # the basedir will be the pytest-provided temporary directory
    basedir = str(tmpdir)

    # we'll make the auth DB and secrets file first
    authdb_path, creds, secrets_file, salt_file, env_file = (
        autogen_secrets_authdb(
            basedir,
            interactive=False
        )
    )

    # read in the secrets file for the secret
    with open(secrets_file, 'r') as infd:
        secret = infd.read().strip('\n')

    # read in the salts file for the salt
    with open(salt_file, 'r') as infd:
        salt = infd.read().strip('\n')

    # read the creds file so we can try logging in
    with open(creds, 'r') as infd:
        useremail, password = infd.read().strip('\n').split()

    # get a temp directory
    tmpdir = os.path.join('/tmp', 'authnzrv-%s' % secrets.token_urlsafe(8))

    server_listen = '127.0.0.1'
    server_port = '18158'

    # set up the environment
    monkeypatch.setenv("AUTHNZERVER_AUTHDB", authdb_path)
    monkeypatch.setenv("AUTHNZERVER_BASEDIR", basedir)
    monkeypatch.setenv("AUTHNZERVER_CACHEDIR", tmpdir)
    monkeypatch.setenv("AUTHNZERVER_DEBUGMODE", "0")
    monkeypatch.setenv("AUTHNZERVER_LISTEN", server_listen)
    monkeypatch.setenv("AUTHNZERVER_PORT", server_port)
    monkeypatch.setenv("AUTHNZERVER_SECRET", secret)
    monkeypatch.setenv("AUTHNZERVER_PIISALT", salt)
    monkeypatch.setenv("AUTHNZERVER_SESSIONEXPIRY", "60")
    monkeypatch.setenv("AUTHNZERVER_WORKERS", "1")
    monkeypatch.setenv("AUTHNZERVER_EMAILSERVER", "smtp.test.org")
    monkeypatch.setenv("AUTHNZERVER_EMAILPORT", "25")
    monkeypatch.setenv("AUTHNZERVER_EMAILUSER", "testuser")
    monkeypatch.setenv("AUTHNZERVER_EMAILPASS", "testpass")
    monkeypatch.setenv("AUTHNZERVER_USERLOCKTRIES", "2")
    monkeypatch.setenv("AUTHNZERVER_USERLOCKTIME", "20")
    monkeypatch.setenv(
        "AUTHNZERVER_RATELIMITS",
        "ipaddr:300;user:360;session:120;apikey:720;burst:150;"
        "user-new:50;user-login:50"
    )

    # launch the server subprocess
    p = subprocess.Popen("authnzrv", shell=True)

    # wait 2.5 seconds for the server to start
    time.sleep(2.5)

    timing = []

    try:

        #
        # attempt to log in as the superuser several times with the wrong
        # password
        #
        for i in range(4):

            # create a new anonymous session token
            session_payload = {
                'user_id': 2,
                'user_agent': 'Mozzarella Killerwhale',
                'expires': datetime.utcnow() + timedelta(hours=1),
                'ip_address': '1.1.1.1',
                'extra_info_json': {'pref_datasets_always_private': True}
            }
            request_dict = {
                'request': 'session-new',
                'body': session_payload,
                'reqid': i,
                'client_ipaddr': '1.2.3.4'
            }
            encrypted_request = encrypt_message(request_dict, secret)

            # send the request to the authnzerver
            resp = requests.post(
                'http://%s:%s' % (server_listen, server_port),
                data=encrypted_request,
                timeout=1.0
            )
            resp.raise_for_status()

            # decrypt the response
            session_dict = decrypt_message(resp.text, secret)
            assert session_dict['reqid'] == request_dict['reqid']
            assert session_dict['success'] is True
            assert isinstance(session_dict['response'], dict)
            assert session_dict['response']['session_token'] is not None

            request_dict = {
                'request': 'user-login',
                'body': {
                    'session_token': session_dict['response']['session_token'],
                    'email': useremail,
                    'password': '%s-%i' % (password, i)
                },
                'reqid': 10*i + 10,
                'client_ipaddr': '1.2.3.4'
            }
            encrypted_request = encrypt_message(request_dict, secret)

            start_login_time = time.monotonic()

            # send the request to the authnzerver
            resp = requests.post(
                'http://%s:%s' % (server_listen, server_port),
                data=encrypted_request,
                timeout=60.0
            )
            resp.raise_for_status()

            timing.append(time.monotonic() - start_login_time)

            # decrypt the response
            response_dict = decrypt_message(resp.text, secret)
            assert response_dict['reqid'] == request_dict['reqid']
            assert response_dict['success'] is False
            assert isinstance(response_dict['response'], dict)
            assert response_dict['response']['user_id'] is None

            # for the last attempts, we should get back a "locked" account
            # message
            if i >= 2:
                assert (
                    "Your user account has been locked "
                    "after repeated login failures. "
                    "Try again in an hour or "
                    "contact the server admins."
                ) in response_dict['messages']

        # wait 30 seconds for the lock time to expire
        time.sleep(30)

        # now log in with the correct password and see if we can log in now
        session_payload = {
            'user_id': 2,
            'user_agent': 'Mozzarella Killerwhale',
            'expires': datetime.utcnow() + timedelta(hours=1),
            'ip_address': '1.1.1.1',
            'extra_info_json': {'pref_datasets_always_private': True}
        }
        request_dict = {
            'request': 'session-new',
            'body': session_payload,
            'reqid': 1004,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=1.0
        )
        resp.raise_for_status()

        # decrypt the response
        session_dict = decrypt_message(resp.text, secret)
        assert session_dict['reqid'] == request_dict['reqid']
        assert session_dict['success'] is True
        assert isinstance(session_dict['response'], dict)
        assert session_dict['response']['session_token'] is not None

        request_dict = {
            'request': 'user-login',
            'body': {
                'session_token': session_dict['response']['session_token'],
                'email': useremail,
                'password': password
            },
            'reqid': 1005,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        start_login_time = time.monotonic()

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=60.0
        )
        resp.raise_for_status()

        timing.append(time.monotonic() - start_login_time)

        # decrypt the response
        response_dict = decrypt_message(resp.text, secret)
        assert response_dict['reqid'] == request_dict['reqid']
        assert response_dict['success'] is True
        assert isinstance(response_dict['response'], dict)
        assert response_dict['response']['user_id'] == 1
        assert response_dict['response']['user_role'] == 'superuser'

    finally:
        #
        # kill the server at the end
        #
        p.terminate()
        try:
            p.communicate(timeout=3.0)
            p.kill()
        except Exception:
            pass

# make sure to kill authnzrv on some Linux machines. use lsof and the
# port number to find the remaining authnzrv processes and kill them
# subprocess.call(
#     "lsof | grep 18158 | awk '{ print $2 }' | sort | uniq | xargs kill -2",
#     shell=True
# )


@mark.skipif(
    os.environ.get("GITHUB_WORKFLOW", None) is not None,
    reason="github doesn't allow server tests probably"
)
def test_server_invalid_passchecks_with_lock(monkeypatch, tmpdir):
    '''This tests if the server responds appropriately to invalid password checks.

    The timing difference between successive failed password checks should
    increase roughly exponentially. In addition, the user should be locked out
    of their account for the configured amount of time and unlocked thereafter.
    '''

    # the basedir will be the pytest-provided temporary directory
    basedir = str(tmpdir)

    # we'll make the auth DB and secrets file first
    authdb_path, creds, secrets_file, salt_file, env_file = (
        autogen_secrets_authdb(
            basedir,
            interactive=False
        )
    )

    # read in the secrets file for the secret
    with open(secrets_file, 'r') as infd:
        secret = infd.read().strip('\n')

    # read in the salts file for the salt
    with open(salt_file, 'r') as infd:
        salt = infd.read().strip('\n')

    # read the creds file so we can try logging in
    with open(creds, 'r') as infd:
        useremail, password = infd.read().strip('\n').split()

    # get a temp directory
    tmpdir = os.path.join('/tmp', 'authnzrv-%s' % secrets.token_urlsafe(8))

    server_listen = '127.0.0.1'
    server_port = '18158'

    # set up the environment
    monkeypatch.setenv("AUTHNZERVER_AUTHDB", authdb_path)
    monkeypatch.setenv("AUTHNZERVER_BASEDIR", basedir)
    monkeypatch.setenv("AUTHNZERVER_CACHEDIR", tmpdir)
    monkeypatch.setenv("AUTHNZERVER_DEBUGMODE", "0")
    monkeypatch.setenv("AUTHNZERVER_LISTEN", server_listen)
    monkeypatch.setenv("AUTHNZERVER_PORT", server_port)
    monkeypatch.setenv("AUTHNZERVER_SECRET", secret)
    monkeypatch.setenv("AUTHNZERVER_PIISALT", salt)
    monkeypatch.setenv("AUTHNZERVER_SESSIONEXPIRY", "60")
    monkeypatch.setenv("AUTHNZERVER_WORKERS", "1")
    monkeypatch.setenv("AUTHNZERVER_EMAILSERVER", "smtp.test.org")
    monkeypatch.setenv("AUTHNZERVER_EMAILPORT", "25")
    monkeypatch.setenv("AUTHNZERVER_EMAILUSER", "testuser")
    monkeypatch.setenv("AUTHNZERVER_EMAILPASS", "testpass")
    monkeypatch.setenv("AUTHNZERVER_USERLOCKTRIES", "2")
    monkeypatch.setenv("AUTHNZERVER_USERLOCKTIME", "20")
    monkeypatch.setenv(
        "AUTHNZERVER_RATELIMITS",
        "ipaddr:300;user:360;session:120;apikey:720;burst:150;"
        "user-new:50;user-login:50"
    )

    # launch the server subprocess
    p = subprocess.Popen("authnzrv", shell=True)

    # wait 2.5 seconds for the server to start
    time.sleep(2.5)

    timing = []

    try:

        #
        # attempt to log in as the superuser several times with the wrong
        # password
        #
        for i in range(4):

            # create a new anonymous session token
            session_payload = {
                'user_id': 2,
                'user_agent': 'Mozzarella Killerwhale',
                'expires': datetime.utcnow() + timedelta(hours=1),
                'ip_address': '1.1.1.1',
                'extra_info_json': {'pref_datasets_always_private': True}
            }
            request_dict = {
                'request': 'session-new',
                'body': session_payload,
                'reqid': i,
                'client_ipaddr': '1.2.3.4'
            }
            encrypted_request = encrypt_message(request_dict, secret)

            # send the request to the authnzerver
            resp = requests.post(
                'http://%s:%s' % (server_listen, server_port),
                data=encrypted_request,
                timeout=1.0
            )
            resp.raise_for_status()

            # decrypt the response
            session_dict = decrypt_message(resp.text, secret)
            assert session_dict['reqid'] == request_dict['reqid']
            assert session_dict['success'] is True
            assert isinstance(session_dict['response'], dict)
            assert session_dict['response']['session_token'] is not None

            request_dict = {
                'request': 'user-passcheck-nosession',
                'body': {
                    'email': useremail,
                    'password': '%s-%i' % (password, i)
                },
                'reqid': 10*i + 10,
                'client_ipaddr': '1.2.3.4'
            }
            encrypted_request = encrypt_message(request_dict, secret)

            start_login_time = time.monotonic()

            # send the request to the authnzerver
            resp = requests.post(
                'http://%s:%s' % (server_listen, server_port),
                data=encrypted_request,
                timeout=60.0
            )
            resp.raise_for_status()

            timing.append(time.monotonic() - start_login_time)

            # decrypt the response
            response_dict = decrypt_message(resp.text, secret)
            assert response_dict['reqid'] == request_dict['reqid']
            assert response_dict['success'] is False
            assert isinstance(response_dict['response'], dict)
            assert response_dict['response']['user_id'] is None

            # for the last attempts, we should get back a "locked" account
            # message
            if i >= 2:
                assert (
                    "Your user account has been locked "
                    "after repeated login failures. "
                    "Try again in an hour or "
                    "contact the server admins."
                ) in response_dict['messages']

        # wait 30 seconds for the lock time to expire
        time.sleep(30)

        # now log in with the correct password and see if we can log in now
        session_payload = {
            'user_id': 2,
            'user_agent': 'Mozzarella Killerwhale',
            'expires': datetime.utcnow() + timedelta(hours=1),
            'ip_address': '1.1.1.1',
            'extra_info_json': {'pref_datasets_always_private': True}
        }
        request_dict = {
            'request': 'session-new',
            'body': session_payload,
            'reqid': 1004,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=1.0
        )
        resp.raise_for_status()

        # decrypt the response
        session_dict = decrypt_message(resp.text, secret)
        assert session_dict['reqid'] == request_dict['reqid']
        assert session_dict['success'] is True
        assert isinstance(session_dict['response'], dict)
        assert session_dict['response']['session_token'] is not None

        request_dict = {
            'request': 'user-passcheck-nosession',
            'body': {
                'email': useremail,
                'password': password
            },
            'reqid': 1005,
            'client_ipaddr': '1.2.3.4'
        }
        encrypted_request = encrypt_message(request_dict, secret)

        start_login_time = time.monotonic()

        # send the request to the authnzerver
        resp = requests.post(
            'http://%s:%s' % (server_listen, server_port),
            data=encrypted_request,
            timeout=60.0
        )
        resp.raise_for_status()

        timing.append(time.monotonic() - start_login_time)

        # decrypt the response
        response_dict = decrypt_message(resp.text, secret)
        assert response_dict['reqid'] == request_dict['reqid']
        assert response_dict['success'] is True
        assert isinstance(response_dict['response'], dict)
        assert response_dict['response']['user_id'] == 1
        assert response_dict['response']['user_role'] == 'superuser'

    finally:
        #
        # kill the server at the end
        #
        p.terminate()
        try:
            p.communicate(timeout=3.0)
            p.kill()
        except Exception:
            pass

# make sure to kill authnzrv on some Linux machines. use lsof and the
# port number to find the remaining authnzrv processes and kill them
# subprocess.call(
#     "lsof | grep 18158 | awk '{ print $2 }' | sort | uniq | xargs kill -2",
#     shell=True
# )
| 33.016234 | 85 | 0.591504 | 3,402 | 30,507 | 5.153145 | 0.092887 | 0.062061 | 0.102219 | 0.01118 | 0.954481 | 0.952655 | 0.949062 | 0.945753 | 0.9433 | 0.937824 | 0 | 0.019342 | 0.301767 | 30,507 | 923 | 86 | 33.052004 | 0.803671 | 0.172977 | 0 | 0.862069 | 0 | 0 | 0.197472 | 0.050798 | 0 | 0 | 0 | 0 | 0.106897 | 1 | 0.006897 | false | 0.037931 | 0.015517 | 0 | 0.022414 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
de7850ca776dfcd293ff8f951bb81ff4d98e10e9 | 49 | py | Python | Cisco/2.1.2.12.py | martageraldo/aula-python-cisco | 6f4061c4200bb611f90a4231e1646ae8a0066e99 | [
"MIT"
] | null | null | null | Cisco/2.1.2.12.py | martageraldo/aula-python-cisco | 6f4061c4200bb611f90a4231e1646ae8a0066e99 | [
"MIT"
] | null | null | null | Cisco/2.1.2.12.py | martageraldo/aula-python-cisco | 6f4061c4200bb611f90a4231e1646ae8a0066e99 | [
"MIT"
] | null | null | null | print(2+2)
print(4-4)   # 0
print(8*2)   # 16
print(10/2)  # 5.0 ("/" is true division in Python 3)
| 9.8 | 12 | 0.591837 | 12 | 49 | 2.416667 | 0.416667 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.219512 | 0.163265 | 49 | 4 | 13 | 12.25 | 0.487805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
deab496993cc870cfd7dba13050bf29640868a58 | 256 | py | Python | artemis/fileman/experiment_record.py | peteroconnor-bc/artemis | ad2871fae7d986bf10580eec27aee5b7315adad5 | [
"BSD-2-Clause-FreeBSD"
] | 235 | 2016-08-26T14:18:51.000Z | 2022-03-13T10:54:39.000Z | artemis/fileman/experiment_record.py | peteroconnor-bc/artemis | ad2871fae7d986bf10580eec27aee5b7315adad5 | [
"BSD-2-Clause-FreeBSD"
] | 112 | 2016-04-30T11:48:38.000Z | 2021-01-12T20:17:32.000Z | artemis/fileman/experiment_record.py | peteroconnor-bc/artemis | ad2871fae7d986bf10580eec27aee5b7315adad5 | [
"BSD-2-Clause-FreeBSD"
] | 31 | 2016-11-05T19:09:19.000Z | 2021-09-13T07:35:40.000Z |
from artemis.experiments.experiment_record import *
import logging
logging.getLogger(__name__).warning('The module artemis.fileman.experiment_record is deprecated and will eventually be removed. Import artemis.experiments.experiment_record instead.')
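
# Migration sketch implied by the warning above (illustrative; the symbol name
# is hypothetical, and any name re-exported by the star import follows the
# same substitution): change
#     from artemis.fileman.experiment_record import ExperimentRecord
# to
#     from artemis.experiments.experiment_record import ExperimentRecord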
| 32 | 184 | 0.839844 | 31 | 256 | 6.709677 | 0.677419 | 0.230769 | 0.269231 | 0.326923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 256 | 7 | 185 | 36.571429 | 0.896552 | 0 | 0 | 0 | 0 | 0.333333 | 0.570866 | 0.275591 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
dedaade4cbca70764fda5e310e7558e5ef1fdaa9 | 126 | py | Python | backend/app/app/api/utils/db.py | GadyTal/Dor2Door | 85fdac54f2088d965edf269ff49064831bf0f1fb | [
"MIT"
] | 1 | 2020-11-02T07:56:08.000Z | 2020-11-02T07:56:08.000Z | backend/app/app/api/utils/db.py | GadyTal/Dor2Door | 85fdac54f2088d965edf269ff49064831bf0f1fb | [
"MIT"
] | null | null | null | backend/app/app/api/utils/db.py | GadyTal/Dor2Door | 85fdac54f2088d965edf269ff49064831bf0f1fb | [
"MIT"
] | 1 | 2021-08-23T22:49:47.000Z | 2021-08-23T22:49:47.000Z | from starlette.requests import Request
# TODO: Check this version
def get_db(request: Request):
return request.state.db
| 18 | 38 | 0.769841 | 18 | 126 | 5.333333 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15873 | 126 | 6 | 39 | 21 | 0.90566 | 0.190476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
deded1e54e33c26b508ecbe7fbece132cf27861e | 3,458 | py | Python | ocrb/op3/exp_variants/variants.py | ecker-lab/object-centric-representation-benchmark | 2bcfc491d49ff9badc099af20b67c43eea2666d8 | [
"MIT"
] | 25 | 2020-08-07T10:35:11.000Z | 2021-11-02T19:10:28.000Z | ocrb/op3/exp_variants/variants.py | ecker-lab/object-centric-representation-benchmark | 2bcfc491d49ff9badc099af20b67c43eea2666d8 | [
"MIT"
] | 1 | 2021-03-01T10:45:01.000Z | 2021-03-01T11:12:17.000Z | ocrb/op3/exp_variants/variants.py | ecker-lab/object-centric-representation-benchmark | 2bcfc491d49ff9badc099af20b67c43eea2666d8 | [
"MIT"
] | 2 | 2020-08-07T10:35:17.000Z | 2020-12-20T07:51:19.000Z | object_room_variant = dict(
op3_args=dict(
refinement_model_type="size_dependent_conv", # size_dependent_conv, size_dependent_conv_no_share
decoder_model_type="reg", # reg, reg_no_share
dynamics_model_type="reg_ac32", # reg_ac32, reg_ac32_no_share
sto_repsize=64,
det_repsize=64,
extra_args=dict(
beta=1e-2,
deterministic_sampling=False
),
K=8
),
schedule_args=dict( # Arguments for TrainingScheduler
seed_steps=4,
T=5, # Max number of steps into the future we want to go or max length of a schedule
schedule_type='rprp', # single_step_physics, curriculum, static_iodine, rprp, next_step, random_alternating
loss_type='iodine_dynamics',
r_steps=2,
),
training_args=dict( # Arguments for OP3Trainer
batch_size=16, # Change to appropriate constant based off dataset size
lr=3e-4,
),
num_epochs=300,
save_period=5,
dataparallel=True,
path='ocrb/data/datasets',
ckpt_dir='ocrb/op3/ckpts',
num_workers=1,
n_steps=10,
debug=False,
split=True
)
mot_sprite_variant = dict(
op3_args=dict(
refinement_model_type="size_dependent_conv", # size_dependent_conv, size_dependent_conv_no_share
decoder_model_type="reg", # reg, reg_no_share
dynamics_model_type="reg_ac32", # reg_ac32, reg_ac32_no_share
sto_repsize=64,
det_repsize=64,
extra_args=dict(
beta=1e-2,
deterministic_sampling=False
),
K=5
),
schedule_args=dict( # Arguments for TrainingScheduler
seed_steps=4,
T=5, # Max number of steps into the future we want to go or max length of a schedule
schedule_type='rprp', # single_step_physics, curriculum, static_iodine, rprp, next_step, random_alternating
loss_type='iodine_dynamics',
r_steps=2,
),
training_args=dict( # Arguments for OP3Trainer
batch_size=16, # Change to appropriate constant based off dataset size
lr=1e-4
),
num_epochs=300,
save_period=5,
dataparallel=True,
path='ocrb/data/datasets',
ckpt_dir='ocrb/op3/ckpts',
num_workers=1,
n_steps=10,
debug=False,
split=True
)
multidsprites_videos_variant = dict(
op3_args=dict(
refinement_model_type="size_dependent_conv", # size_dependent_conv, size_dependent_conv_no_share
decoder_model_type="reg", # reg, reg_no_share
dynamics_model_type="reg_ac32", # reg_ac32, reg_ac32_no_share
sto_repsize=64,
det_repsize=64,
extra_args=dict(
beta=1e-2,
deterministic_sampling=False
),
K=6
),
schedule_args=dict( # Arguments for TrainingScheduler
seed_steps=4,
T=5, # Max number of steps into the future we want to go or max length of a schedule
schedule_type='rprp', # single_step_physics, curriculum, static_iodine, rprp, next_step, random_alternating
loss_type='iodine_dynamics',
r_steps=2,
),
training_args=dict( # Arguments for OP3Trainer
batch_size=16, # Change to appropriate constant based off dataset size
lr=3e-4,
),
num_epochs=300,
save_period=5,
dataparallel=True,
path='ocrb/data/datasets',
ckpt_dir='ocrb/op3/ckpts',
num_workers=1,
n_steps=10,
debug=False,
split=True
) | 33.25 | 116 | 0.653846 | 459 | 3,458 | 4.625272 | 0.222222 | 0.045219 | 0.072068 | 0.05935 | 0.979746 | 0.979746 | 0.979746 | 0.979746 | 0.979746 | 0.979746 | 0 | 0.035156 | 0.259688 | 3,458 | 104 | 117 | 33.25 | 0.794141 | 0.319838 | 0 | 0.901961 | 0 | 0 | 0.104381 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
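The three variant dicts above differ only in K (8, 5, 6) and one learning rate; everything else is copied verbatim three times. A hedged refactoring sketch that derives variants from a shared base — make_variant and the double-underscore override syntax are illustrative, not part of ocrb, and the base is trimmed to a few keys for brevity:

import copy

BASE_VARIANT = dict(
    op3_args=dict(sto_repsize=64, det_repsize=64, K=8),
    training_args=dict(batch_size=16, lr=3e-4),
    num_epochs=300,
)

def make_variant(**overrides):
    # Deep-copy so nested dicts are not shared between variants, then
    # apply each override along its double-underscore path.
    variant = copy.deepcopy(BASE_VARIANT)
    for path, value in overrides.items():
        node = variant
        *heads, leaf = path.split("__")
        for head in heads:
            node = node[head]
        node[leaf] = value
    return variant

object_room_variant = make_variant()  # keeps K=8, lr=3e-4
mot_sprite_variant = make_variant(op3_args__K=5, training_args__lr=1e-4)
multidsprites_videos_variant = make_variant(op3_args__K=6)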
def107bfa8142532d7551f0fa0482a750848d383 | 227 | py | Python | src/movement/no.py | Quanta-Robotics/Robot-Blueberry | 7b7e77e09ac5e9ec5afd947e0db1ecc8773e56da | [
"MIT"
] | 25 | 2021-06-08T07:09:30.000Z | 2021-12-30T06:28:35.000Z | src/movement/no.py | ICT-CoU/Robot-Blueberry | d19fd1be037df9d67de64df57a87006d74cd6c43 | [
"MIT"
] | 2 | 2021-05-23T12:54:51.000Z | 2021-06-07T17:47:56.000Z | src/movement/no.py | ICT-CoU/Robot-Blueberry | d19fd1be037df9d67de64df57a87006d74cd6c43 | [
"MIT"
] | 14 | 2021-06-08T13:02:28.000Z | 2021-12-30T20:07:18.000Z | from expression import *
changeDegreeGpio([0],[0],5,0.03)
time.sleep(0.5)
changeDegreeGpio([0],[90],5,0.03)
time.sleep(0.5)
changeDegreeGpio([0],[180],5,0.03)
time.sleep(0.5)
changeDegreeGpio([0],[90],5,0.03)
time.sleep(0.5)
| 18.916667 | 34 | 0.687225 | 43 | 227 | 3.627907 | 0.255814 | 0.064103 | 0.102564 | 0.205128 | 0.737179 | 0.737179 | 0.737179 | 0.737179 | 0.737179 | 0.628205 | 0 | 0.169014 | 0.061674 | 227 | 11 | 35 | 20.636364 | 0.56338 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
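The script above drives a single servo (pin index 0) through a 0 -> 90 -> 180 -> 90 head-shake with fixed pauses. The same gesture reads more clearly as data plus a loop; this sketch assumes changeDegreeGpio keeps the (pins, degrees, step, delay) call shape used above:

import time

from expression import changeDegreeGpio  # project-local helper, assumed importable

for angle in (0, 90, 180, 90):
    changeDegreeGpio([0], [angle], 5, 0.03)
    time.sleep(0.5)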
720a749ae109791d4b6c2a3eae8e257fa7d6d91f | 115 | py | Python | ctf_uk_flag.py | sebabeva/ctf-despegar-2017 | 738776d3bab952a6ec4a3443fd5b3319005aaea3 | [
"MIT"
] | null | null | null | ctf_uk_flag.py | sebabeva/ctf-despegar-2017 | 738776d3bab952a6ec4a3443fd5b3319005aaea3 | [
"MIT"
] | null | null | null | ctf_uk_flag.py | sebabeva/ctf-despegar-2017 | 738776d3bab952a6ec4a3443fd5b3319005aaea3 | [
"MIT"
] | null | null | null | #!/usr/bin/python
print '%x\n' % (0x30167b0eb4eef511ec82272b4b47a2d71471 ^ 0x1319057cb23c1dcbf616876372617fff8b48) | 38.333333 | 96 | 0.826087 | 8 | 115 | 11.875 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.472222 | 0.06087 | 115 | 3 | 96 | 38.333333 | 0.407407 | 0.13913 | 0 | 0 | 0 | 0 | 0.040404 | 0 | 0 | 0 | 0.767677 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
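The one-liner above XORs two 144-bit constants and prints the result in hex via the Python 2 print statement. A Python 3 equivalent of the same computation:

a = 0x30167b0eb4eef511ec82272b4b47a2d71471
b = 0x1319057cb23c1dcbf616876372617fff8b48
# f-string formatting replaces the old "'%x\n' %" print-statement idiom.
print(f"{a ^ b:x}")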
a0fc2921ca92c4b88cca3d26e890d14254978367 | 23,276 | py | Python | libica/openapi/libtes/api/tasks_api.py | umccr-illumina/libica | 916d27eea499f29bee590268b84208effb0cc576 | [
"MIT"
] | null | null | null | libica/openapi/libtes/api/tasks_api.py | umccr-illumina/libica | 916d27eea499f29bee590268b84208effb0cc576 | [
"MIT"
] | 4 | 2021-11-15T10:47:51.000Z | 2022-02-22T04:43:20.000Z | libica/openapi/libtes/api/tasks_api.py | umccr-illumina/libica | 916d27eea499f29bee590268b84208effb0cc576 | [
"MIT"
] | null | null | null | # coding: utf-8
"""
Task Execution Service
    No description provided (generated by OpenAPI Generator https://github.com/openapitools/openapi-generator)  # noqa: E501
The version of the OpenAPI document: v1
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from libica.openapi.libtes.api_client import ApiClient
from libica.openapi.libtes.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class TasksApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_task(self, **kwargs): # noqa: E501
"""Create a Task # noqa: E501
Creates a task. Returns the ID associated with the new task. Also returns the task version ID associated with the new task, if provided. Substitutions can be defined in the following format: \"{{string}}\", and specified at launch time. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_task(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param CreateTaskRequest body:
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Task
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.create_task_with_http_info(**kwargs) # noqa: E501
def create_task_with_http_info(self, **kwargs): # noqa: E501
"""Create a Task # noqa: E501
Creates a task. Returns the ID associated with the new task. Also returns the task version ID associated with the new task, if provided. Substitutions can be defined in the following format: \"{{string}}\", and specified at launch time. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_task_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param CreateTaskRequest body:
        :param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(Task, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'body'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method create_task" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json-patch+json', 'application/json', 'text/json', 'application/*+json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/v1/tasks', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Task', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def get_task(self, task_id, **kwargs): # noqa: E501
"""Get the details of a Task # noqa: E501
Gets the details of a Task for a given task ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_task(task_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str task_id: (required)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: TaskSummary
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.get_task_with_http_info(task_id, **kwargs) # noqa: E501
def get_task_with_http_info(self, task_id, **kwargs): # noqa: E501
"""Get the details of a Task # noqa: E501
Gets the details of a Task for a given task ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_task_with_http_info(task_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str task_id: (required)
        :param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(TaskSummary, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'task_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_task" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'task_id' is set
if self.api_client.client_side_validation and ('task_id' not in local_var_params or # noqa: E501
local_var_params['task_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `task_id` when calling `get_task`") # noqa: E501
collection_formats = {}
path_params = {}
if 'task_id' in local_var_params:
path_params['taskId'] = local_var_params['task_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/v1/tasks/{taskId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskSummary', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def list_tasks(self, **kwargs): # noqa: E501
"""Get a list of tasks # noqa: E501
Gets a list of tasks accessible by the current tenant ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_tasks(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str names: Name: Optional parameter to filter the returned list. Case-Sensitive
:param str acls: Name: Optional parameter to filter the returned list. Case-Sensitive
:param int page_size: Optional parameter to define the page size returned. Valid inputs range from 1-1000.
:param str sort: Sort: Optional parameter to set the sort of the returned list. Valid fields include: name, timeCreated, timeModified. The sort can be specified as asc or desc. (Default: asc.)
:param str page_token: pageToken: Optional parameter for navigation after initial listing. Valid values include firstPageToken, nextPageToken, and previousPageToken (provided in the list response)
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: TaskSummaryPagedItems
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.list_tasks_with_http_info(**kwargs) # noqa: E501
def list_tasks_with_http_info(self, **kwargs): # noqa: E501
"""Get a list of tasks # noqa: E501
Gets a list of tasks accessible by the current tenant ID. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.list_tasks_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str names: Name: Optional parameter to filter the returned list. Case-Sensitive
:param str acls: Name: Optional parameter to filter the returned list. Case-Sensitive
:param int page_size: Optional parameter to define the page size returned. Valid inputs range from 1-1000.
:param str sort: Sort: Optional parameter to set the sort of the returned list. Valid fields include: name, timeCreated, timeModified. The sort can be specified as asc or desc. (Default: asc.)
:param str page_token: pageToken: Optional parameter for navigation after initial listing. Valid values include firstPageToken, nextPageToken, and previousPageToken (provided in the list response)
        :param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(TaskSummaryPagedItems, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'names',
'acls',
'page_size',
'sort',
'page_token'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method list_tasks" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'names' in local_var_params and local_var_params['names'] is not None: # noqa: E501
query_params.append(('names', local_var_params['names'])) # noqa: E501
if 'acls' in local_var_params and local_var_params['acls'] is not None: # noqa: E501
query_params.append(('acls', local_var_params['acls'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('pageSize', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
if 'page_token' in local_var_params and local_var_params['page_token'] is not None: # noqa: E501
query_params.append(('pageToken', local_var_params['page_token'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/v1/tasks', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TaskSummaryPagedItems', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
def update_task(self, task_id, **kwargs): # noqa: E501
"""Update an existing task. # noqa: E501
        Updates the task with a given ID. The task's name and description can be updated. The task's name must remain unique.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_task(task_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str task_id: (required)
:param UpdateTaskRequest body: Details of the task to be updated.
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Task
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
return self.update_task_with_http_info(task_id, **kwargs) # noqa: E501
def update_task_with_http_info(self, task_id, **kwargs): # noqa: E501
"""Update an existing task. # noqa: E501
        Updates the task with a given ID. The task's name and description can be updated. The task's name must remain unique.  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_task_with_http_info(task_id, async_req=True)
>>> result = thread.get()
:param async_req bool: execute request asynchronously
:param str task_id: (required)
:param UpdateTaskRequest body: Details of the task to be updated.
        :param _return_http_data_only: response data without status code
and headers
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: tuple(Task, status_code(int), headers(HTTPHeaderDict))
If the method is called asynchronously,
returns the request thread.
"""
local_var_params = locals()
all_params = [
'task_id',
'body'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method update_task" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'task_id' is set
if self.api_client.client_side_validation and ('task_id' not in local_var_params or # noqa: E501
local_var_params['task_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `task_id` when calling `update_task`") # noqa: E501
collection_formats = {}
path_params = {}
if 'task_id' in local_var_params:
path_params['taskId'] = local_var_params['task_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in local_var_params:
body_params = local_var_params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json-patch+json', 'application/json', 'text/json', 'application/*+json']) # noqa: E501
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/v1/tasks/{taskId}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Task', # noqa: E501
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats)
| 44.847784 | 258 | 0.597482 | 2,677 | 23,276 | 4.97684 | 0.091147 | 0.040231 | 0.061998 | 0.027021 | 0.926218 | 0.918637 | 0.918637 | 0.911206 | 0.891016 | 0.885461 | 0 | 0.014892 | 0.327805 | 23,276 | 518 | 259 | 44.934363 | 0.836636 | 0.490548 | 0 | 0.694215 | 1 | 0 | 0.159569 | 0.032587 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03719 | false | 0 | 0.020661 | 0 | 0.095041 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
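A typical call sequence for a generated client like TasksApi above: construct the ApiClient (real use would also configure the host and the Bearer token that auth_settings expects), wrap it, and call an endpoint method. The import paths and method names below are taken from the file itself; the parameter values are illustrative:

from libica.openapi.libtes.api_client import ApiClient
from libica.openapi.libtes.api.tasks_api import TasksApi

api = TasksApi(ApiClient())

# Synchronous, paged, sorted listing; parameters mirror list_tasks' docstring.
page = api.list_tasks(page_size=100, sort="name:asc")

# Asynchronous form: async_req=True hands back a thread and .get() joins it,
# exactly as the doctest-style examples in the docstrings show.
thread = api.list_tasks(async_req=True)
result = thread.get()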
9d008f722b3929c682f280943d078ad5808b5794 | 369 | py | Python | core/algos/__init__.py | firehose-dataset/congrad | 20792f43aa89beae75454e30b82b2e1280ed3106 | [
"MIT"
] | 9 | 2020-07-21T14:37:22.000Z | 2021-07-14T12:44:13.000Z | core/algos/__init__.py | firehose-dataset/congrad | 20792f43aa89beae75454e30b82b2e1280ed3106 | [
"MIT"
] | 2 | 2020-09-22T18:05:03.000Z | 2020-11-19T09:42:21.000Z | core/algos/__init__.py | firehose-dataset/congrad | 20792f43aa89beae75454e30b82b2e1280ed3106 | [
"MIT"
] | 2 | 2020-07-21T16:39:12.000Z | 2020-07-30T02:20:47.000Z | from core.algos.agem import AGEM
from core.algos.online import OnlineOnly
from core.algos.replay import ReplayOnly
from core.algos.mr import MixedReplay
from core.algos.congrad_agem import ConGraD_AGEM
from core.algos.congrad_online import ConGraD_OnlineOnly
from core.algos.congrad_replay import ConGraD_ReplayOnly
from core.algos.congrad_mr import ConGraD_MixedReplay | 46.125 | 56 | 0.872629 | 56 | 369 | 5.607143 | 0.214286 | 0.203822 | 0.33121 | 0.254777 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084011 | 369 | 8 | 57 | 46.125 | 0.928994 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
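The __init__ above flattens eight algorithm classes into the package root, so callers import from core.algos directly instead of from its submodules. A short usage sketch — the registry and constructor arguments are illustrative; only the import is taken from the file:

from core.algos import AGEM, MixedReplay

# Flat re-exports make string-keyed registries like this cheap to build.
ALGOS = {"agem": AGEM, "mr": MixedReplay}

def build_algo(name, **kwargs):
    return ALGOS[name](**kwargs)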
9d2355a68c919788cae3ae9ed090f9fbe86907d3 | 4,671 | py | Python | day2.py | JustinSGray/AoC2017 | 8b0d18ce77e4ec0b9ab859bcfb457b386b617c37 | [
"Apache-2.0"
] | null | null | null | day2.py | JustinSGray/AoC2017 | 8b0d18ce77e4ec0b9ab859bcfb457b386b617c37 | [
"Apache-2.0"
] | null | null | null | day2.py | JustinSGray/AoC2017 | 8b0d18ce77e4ec0b9ab859bcfb457b386b617c37 | [
"Apache-2.0"
] | null | null | null | def _prep_input(input):
return [row.split() for row in input.split("\n")]
def checksum(input):
data = _prep_input(input)
checks = []
for row in data:
i_row = [int(s) for s in row]
big = max(i_row)
little = min(i_row)
checks.append(big - little)
return checks, sum(checks)
def mod_checksum(input):
data = _prep_input(input)
checks = []
for row in data:
l_row = len(row)
i_row = [int(s) for s in row]
for i in range(l_row):
found = False
for j in range(i+1, l_row):
# print(i,j, row[i], row[j])
if (i_row[i] % i_row[j] == 0):
checks.append(i_row[i]//i_row[j])
found = True
break
elif (i_row[j] % i_row[i] == 0):
checks.append(i_row[j]//i_row[i])
found = True
break
if found:
break
return checks, sum(checks)
if __name__ == "__main__":
###########
# Part 1
###########
test_input = """5 1 9 5
7 5 3
2 4 6 8"""
e_vals = [8,4,6]
checks, sum_checks = checksum(test_input)
print('part 1 test', e_vals, checks)
print(' ', 18, sum_checks)
real_input = """5806 6444 1281 38 267 1835 223 4912 5995 230 4395 2986 6048 4719 216 1201
74 127 226 84 174 280 94 159 198 305 124 106 205 99 177 294
1332 52 54 655 56 170 843 707 1273 1163 89 23 43 1300 1383 1229
5653 236 1944 3807 5356 246 222 1999 4872 206 5265 5397 5220 5538 286 917
3512 3132 2826 3664 2814 549 3408 3384 142 120 160 114 1395 2074 1816 2357
100 2000 112 103 2122 113 92 522 1650 929 1281 2286 2259 1068 1089 651
646 490 297 60 424 234 48 491 245 523 229 189 174 627 441 598
2321 555 2413 2378 157 27 194 2512 117 140 2287 277 2635 1374 1496 1698
101 1177 104 89 542 2033 1724 1197 474 1041 1803 770 87 1869 1183 553
1393 92 105 1395 1000 85 391 1360 1529 1367 1063 688 642 102 999 638
4627 223 188 5529 2406 4980 2384 2024 4610 279 249 2331 4660 4350 3264 242
769 779 502 75 1105 53 55 931 1056 1195 65 292 1234 1164 678 1032
2554 75 4406 484 2285 226 5666 245 4972 3739 5185 1543 230 236 3621 5387
826 4028 4274 163 5303 4610 145 5779 157 4994 5053 186 5060 3082 2186 4882
588 345 67 286 743 54 802 776 29 44 107 63 303 372 41 810
128 2088 3422 111 3312 740 3024 1946 920 131 112 477 3386 2392 1108 2741"""
print('part 1 real answer', checksum(real_input))
###########
    # Part 2
###########
test_input = """5 9 2 8
9 4 7 3
3 8 6 5"""
e_vals = [4, 3, 2]
checks, sum_checks = mod_checksum(test_input)
print('part 2 test', e_vals, checks)
print(' ', 9, sum_checks)
real_input = """5806 6444 1281 38 267 1835 223 4912 5995 230 4395 2986 6048 4719 216 1201
74 127 226 84 174 280 94 159 198 305 124 106 205 99 177 294
1332 52 54 655 56 170 843 707 1273 1163 89 23 43 1300 1383 1229
5653 236 1944 3807 5356 246 222 1999 4872 206 5265 5397 5220 5538 286 917
3512 3132 2826 3664 2814 549 3408 3384 142 120 160 114 1395 2074 1816 2357
100 2000 112 103 2122 113 92 522 1650 929 1281 2286 2259 1068 1089 651
646 490 297 60 424 234 48 491 245 523 229 189 174 627 441 598
2321 555 2413 2378 157 27 194 2512 117 140 2287 277 2635 1374 1496 1698
101 1177 104 89 542 2033 1724 1197 474 1041 1803 770 87 1869 1183 553
1393 92 105 1395 1000 85 391 1360 1529 1367 1063 688 642 102 999 638
4627 223 188 5529 2406 4980 2384 2024 4610 279 249 2331 4660 4350 3264 242
769 779 502 75 1105 53 55 931 1056 1195 65 292 1234 1164 678 1032
2554 75 4406 484 2285 226 5666 245 4972 3739 5185 1543 230 236 3621 5387
826 4028 4274 163 5303 4610 145 5779 157 4994 5053 186 5060 3082 2186 4882
588 345 67 286 743 54 802 776 29 44 107 63 303 372 41 810
128 2088 3422 111 3312 740 3024 1946 920 131 112 477 3386 2392 1108 2741"""
print('part 2 real answer', mod_checksum(real_input))
| 42.463636 | 124 | 0.542496 | 754 | 4,671 | 3.297082 | 0.387268 | 0.020917 | 0.010056 | 0.016895 | 0.838294 | 0.778761 | 0.762671 | 0.762671 | 0.748994 | 0.748994 | 0 | 0.627149 | 0.402269 | 4,671 | 109 | 125 | 42.853211 | 0.263252 | 0.008563 | 0 | 0.5875 | 0 | 0 | 0.669577 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0375 | false | 0 | 0 | 0.0125 | 0.075 | 0.075 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
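mod_checksum above scans each row's pairs with two nested index loops and explicit found/break flags; itertools.permutations expresses the same search over ordered pairs in one line. A sketch of an equivalent per-row reduction (it assumes, as the puzzle guarantees, exactly one evenly dividing pair per row):

from itertools import permutations

def row_div(row):
    # First ordered pair (a, b) with a divisible by b gives the row's quotient.
    return next(a // b for a, b in permutations(row, 2) if a % b == 0)

assert row_div([5, 9, 2, 8]) == 4
assert row_div([9, 4, 7, 3]) == 3
assert row_div([3, 8, 6, 5]) == 2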
19c92291756d9cfc39e8bc77efedcb69a48dd73b | 239,799 | py | Python | jamf/api/mobile_device_prestages_api.py | jensenbox/python-jamf | 85213085b1064a00375a7aa7df5e33c19f5178eb | [
"RSA-MD"
] | 1 | 2021-04-20T15:28:57.000Z | 2021-04-20T15:28:57.000Z | jamf/api/mobile_device_prestages_api.py | jensenbox/python-jamf | 85213085b1064a00375a7aa7df5e33c19f5178eb | [
"RSA-MD"
] | null | null | null | jamf/api/mobile_device_prestages_api.py | jensenbox/python-jamf | 85213085b1064a00375a7aa7df5e33c19f5178eb | [
"RSA-MD"
] | null | null | null | # coding: utf-8
"""
Jamf Pro API
    ## Overview This is a sample Jamf Pro server which allows for usage without any authentication. The Jamf Pro environment which supports the Try it Out functionality does not run the current beta version of Jamf Pro, thus any newly added endpoints will result in an error and should be used solely for documentation purposes.  # noqa: E501
The version of the OpenAPI document: 10.25.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from jamf.api_client import ApiClient
from jamf.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class MobileDevicePrestagesApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def v1_mobile_device_prestages_get(self, **kwargs): # noqa: E501
"""Search for sorted and paged Mobile Device Prestages # noqa: E501
Search for sorted and paged mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_get(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: MobileDevicePrestageSearchResults
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_get_with_http_info(**kwargs) # noqa: E501
def v1_mobile_device_prestages_get_with_http_info(self, **kwargs): # noqa: E501
"""Search for sorted and paged Mobile Device Prestages # noqa: E501
Search for sorted and paged mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_get_with_http_info(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(MobileDevicePrestageSearchResults, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'page',
'size',
'pagesize',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'size' in local_var_params and local_var_params['size'] is not None: # noqa: E501
query_params.append(('size', local_var_params['size'])) # noqa: E501
if 'pagesize' in local_var_params and local_var_params['pagesize'] is not None: # noqa: E501
query_params.append(('pagesize', local_var_params['pagesize'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "MobileDevicePrestageSearchResults",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_mobile_device_prestages_id_attachments_delete(self, id, file_attachment_delete, **kwargs): # noqa: E501
"""Remove an attachment for a Mobile Device Prestage # noqa: E501
Remove an attachment for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_attachments_delete(id, file_attachment_delete, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param file_attachment_delete: (required)
:type file_attachment_delete: FileAttachmentDelete
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_attachments_delete_with_http_info(id, file_attachment_delete, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_attachments_delete_with_http_info(self, id, file_attachment_delete, **kwargs): # noqa: E501
"""Remove an attachment for a Mobile Device Prestage # noqa: E501
Remove an attachment for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_attachments_delete_with_http_info(id, file_attachment_delete, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param file_attachment_delete: (required)
:type file_attachment_delete: FileAttachmentDelete
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id',
'file_attachment_delete'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_attachments_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_attachments_delete`") # noqa: E501
# verify the required parameter 'file_attachment_delete' is set
if self.api_client.client_side_validation and ('file_attachment_delete' not in local_var_params or # noqa: E501
local_var_params['file_attachment_delete'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `file_attachment_delete` when calling `v1_mobile_device_prestages_id_attachments_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'file_attachment_delete' in local_var_params:
body_params = local_var_params['file_attachment_delete']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/attachments', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_mobile_device_prestages_id_attachments_get(self, id, **kwargs): # noqa: E501
"""Get attachments for a Mobile Device Prestage # noqa: E501
Get attachments for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_attachments_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[FileAttachment]
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_attachments_get_with_http_info(id, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_attachments_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get attachments for a Mobile Device Prestage # noqa: E501
Get attachments for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_attachments_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[FileAttachment], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_attachments_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_attachments_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[FileAttachment]",
404: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/attachments', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_mobile_device_prestages_id_attachments_post(self, id, file, **kwargs): # noqa: E501
"""Add an attachment to a Mobile Device Prestage # noqa: E501
Add an attachment to a Mobile Device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_attachments_post(id, file, async_req=True)
>>> result = thread.get()
:param id: Identifier of the Mobile Device Prestage the attachment should be assigned to (required)
:type id: int
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageFileAttachment
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_attachments_post_with_http_info(id, file, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_attachments_post_with_http_info(self, id, file, **kwargs): # noqa: E501
"""Add an attachment to a Mobile Device Prestage # noqa: E501
Add an attachment to a Mobile Device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_attachments_post_with_http_info(id, file, async_req=True)
>>> result = thread.get()
:param id: Identifier of the Mobile Device Prestage the attachment should be assigned to (required)
:type id: int
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageFileAttachment, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'file'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_attachments_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_attachments_post`") # noqa: E501
# verify the required parameter 'file' is set
if self.api_client.client_side_validation and ('file' not in local_var_params or # noqa: E501
local_var_params['file'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `file` when calling `v1_mobile_device_prestages_id_attachments_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in local_var_params:
local_var_files['file'] = local_var_params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "PrestageFileAttachment",
404: "ApiError",
413: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/attachments', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def v1_mobile_device_prestages_id_delete(self, id, **kwargs): # noqa: E501
"""Delete a Mobile Device Prestage with the supplied id # noqa: E501
Deletes a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_delete(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_delete_with_http_info(id, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_delete_with_http_info(self, id, **kwargs): # noqa: E501
"""Delete a Mobile Device Prestage with the supplied id # noqa: E501
Deletes a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_delete_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
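# Usage sketch (hypothetical id; assumes `api` is a configured instance of
# this API class). The plain delete method returns None, so callers that need
# the HTTP status should use the *_with_http_info variant:
#
#     data, status, headers = api.v1_mobile_device_prestages_id_delete_with_http_info(42)
#     if 200 <= status < 300:
#         print("prestage deleted")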
def v1_mobile_device_prestages_id_get(self, id, **kwargs): # noqa: E501
"""Retrieve a Mobile Device Prestage with the supplied id # noqa: E501
Retrieves a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetMobileDevicePrestage
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_get_with_http_info(id, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve a Mobile Device Prestage with the supplied id # noqa: E501
Retrieves a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetMobileDevicePrestage, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "GetMobileDevicePrestage",
404: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
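# Usage sketch (hypothetical id). Passing async_req=True makes the call
# return an async result object whose .get() yields the deserialized
# GetMobileDevicePrestage:
#
#     thread = api.v1_mobile_device_prestages_id_get(42, async_req=True)
#     prestage = thread.get()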
def v1_mobile_device_prestages_id_history_get(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Mobile Device Prestage history objects # noqa: E501
Gets sorted and paged mobile device prestage history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_history_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: HistorySearchResults
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_history_get_with_http_info(id, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_history_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Mobile Device Prestage history objects # noqa: E501
Gets sorted and paged mobile device prestage history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_history_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param page:
:type page: int
:param size:
:type size: int
:param pagesize:
:type pagesize: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(HistorySearchResults, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'page',
'size',
'pagesize',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_history_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_history_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'size' in local_var_params and local_var_params['size'] is not None: # noqa: E501
query_params.append(('size', local_var_params['size'])) # noqa: E501
if 'pagesize' in local_var_params and local_var_params['pagesize'] is not None: # noqa: E501
query_params.append(('pagesize', local_var_params['pagesize'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
collection_formats['sort'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "HistorySearchResults",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/history', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
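# Usage sketch (hypothetical values). Because the 'sort' collection format is
# 'multi', each list element is sent as its own sort= query parameter:
#
#     history = api.v1_mobile_device_prestages_id_history_get(
#         42, page=0, page_size=50, sort=["date,desc", "name,asc"])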
def v1_mobile_device_prestages_id_history_post(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Mobile Device Prestage history object notes # noqa: E501
Adds mobile device prestage history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_history_post(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: ObjectHistory
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_history_post_with_http_info(id, object_history_note, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_history_post_with_http_info(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Mobile Device Prestage history object notes # noqa: E501
Adds mobile device prestage history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_history_post_with_http_info(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(ObjectHistory, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'object_history_note'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_history_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_history_post`") # noqa: E501
# verify the required parameter 'object_history_note' is set
if self.api_client.client_side_validation and ('object_history_note' not in local_var_params or # noqa: E501
local_var_params['object_history_note'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `object_history_note` when calling `v1_mobile_device_prestages_id_history_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'object_history_note' in local_var_params:
body_params = local_var_params['object_history_note']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "ObjectHistory",
503: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/history', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
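# Usage sketch (hypothetical note text; assumes the generated
# ObjectHistoryNote model accepts its fields as keyword arguments):
#
#     note = ObjectHistoryNote(note="Scoped replacement devices")
#     created = api.v1_mobile_device_prestages_id_history_post(42, note)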
def v1_mobile_device_prestages_id_put(self, id, put_mobile_device_prestage, **kwargs): # noqa: E501
"""Update a Mobile Device Prestage # noqa: E501
Updates a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_put(id, put_mobile_device_prestage, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param put_mobile_device_prestage: Mobile Device Prestage to update (required)
:type put_mobile_device_prestage: PutMobileDevicePrestage
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetMobileDevicePrestage
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_put_with_http_info(id, put_mobile_device_prestage, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_put_with_http_info(self, id, put_mobile_device_prestage, **kwargs): # noqa: E501
"""Update a Mobile Device Prestage # noqa: E501
Updates a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_put_with_http_info(id, put_mobile_device_prestage, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param put_mobile_device_prestage: Mobile Device Prestage to update (required)
:type put_mobile_device_prestage: PutMobileDevicePrestage
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetMobileDevicePrestage, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'put_mobile_device_prestage'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_put`") # noqa: E501
# verify the required parameter 'put_mobile_device_prestage' is set
if self.api_client.client_side_validation and ('put_mobile_device_prestage' not in local_var_params or # noqa: E501
local_var_params['put_mobile_device_prestage'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `put_mobile_device_prestage` when calling `v1_mobile_device_prestages_id_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'put_mobile_device_prestage' in local_var_params:
body_params = local_var_params['put_mobile_device_prestage']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "GetMobileDevicePrestage",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
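# Usage sketch (hypothetical id). A common pattern is read-modify-write:
# fetch the prestage, copy its fields into a PutMobileDevicePrestage,
# adjust them, then PUT the full object back:
#
#     current = api.v1_mobile_device_prestages_id_get(42)
#     # ... build `updated` (a PutMobileDevicePrestage) from `current` ...
#     result = api.v1_mobile_device_prestages_id_put(42, updated)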
def v1_mobile_device_prestages_id_scope_delete(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Remove Device Scope for a specific Mobile Device Prestage # noqa: E501
Remove device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_delete(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param prestage_scope_update: Serial Numbers to remove from scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponse
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_scope_delete_with_http_info(id, prestage_scope_update, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_scope_delete_with_http_info(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Remove Device Scope for a specific Mobile Device Prestage # noqa: E501
Remove device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_delete_with_http_info(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param prestage_scope_update: Serial Numbers to remove from scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'prestage_scope_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_scope_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_scope_delete`") # noqa: E501
# verify the required parameter 'prestage_scope_update' is set
if self.api_client.client_side_validation and ('prestage_scope_update' not in local_var_params or # noqa: E501
local_var_params['prestage_scope_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `prestage_scope_update` when calling `v1_mobile_device_prestages_id_scope_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'prestage_scope_update' in local_var_params:
body_params = local_var_params['prestage_scope_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponse",
400: "ApiError",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/scope', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
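# Usage sketch (hypothetical serial number; field names on the generated
# PrestageScopeUpdate model are assumed). The DELETE body names the serials
# to remove from this prestage's scope:
#
#     update = PrestageScopeUpdate(serial_numbers=["DMPWXXXXXXXX"])
#     remaining = api.v1_mobile_device_prestages_id_scope_delete(42, update)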
def v1_mobile_device_prestages_id_scope_get(self, id, **kwargs): # noqa: E501
"""Get Device Scope for a specific Mobile Device Prestage # noqa: E501
Get device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponse
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_scope_get_with_http_info(id, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_scope_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get Device Scope for a specific Mobile Device Prestage # noqa: E501
Get device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_scope_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_scope_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponse",
404: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/scope', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
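# Usage sketch. With _preload_content=False the raw urllib3.HTTPResponse is
# returned instead of a deserialized PrestageScopeResponse:
#
#     raw = api.v1_mobile_device_prestages_id_scope_get(42, _preload_content=False)
#     payload = raw.data  # undecoded response bytes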
def v1_mobile_device_prestages_id_scope_post(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Add Device Scope for a specific Mobile Device Prestage # noqa: E501
Add device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_post(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponse
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_scope_post_with_http_info(id, prestage_scope_update, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_scope_post_with_http_info(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Add Device Scope for a specific Mobile Device Prestage # noqa: E501
Add device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_post_with_http_info(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'prestage_scope_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_scope_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_scope_post`") # noqa: E501
# verify the required parameter 'prestage_scope_update' is set
if self.api_client.client_side_validation and ('prestage_scope_update' not in local_var_params or # noqa: E501
local_var_params['prestage_scope_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `prestage_scope_update` when calling `v1_mobile_device_prestages_id_scope_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'prestage_scope_update' in local_var_params:
body_params = local_var_params['prestage_scope_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponse",
400: "ApiError",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/scope', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
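# Usage sketch (hypothetical serials). 400/404/409 responses deserialize to
# ApiError and are raised by the client as exceptions, so wrap the call
# (assuming this package exposes an ApiException type):
#
#     try:
#         api.v1_mobile_device_prestages_id_scope_post(
#             42, PrestageScopeUpdate(serial_numbers=["C02XXXXXXXXX"]))
#     except ApiException as exc:
#         print(exc.status, exc.body)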
def v1_mobile_device_prestages_id_scope_put(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Replace Device Scope for a specific Mobile Device Prestage # noqa: E501
Replace device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_put(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponse
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_id_scope_put_with_http_info(id, prestage_scope_update, **kwargs) # noqa: E501
def v1_mobile_device_prestages_id_scope_put_with_http_info(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Replace Device Scope for a specific Mobile Device Prestage # noqa: E501
Replace device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_id_scope_put_with_http_info(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'prestage_scope_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_id_scope_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_id_scope_put`") # noqa: E501
# verify the required parameter 'prestage_scope_update' is set
if self.api_client.client_side_validation and ('prestage_scope_update' not in local_var_params or # noqa: E501
local_var_params['prestage_scope_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `prestage_scope_update` when calling `v1_mobile_device_prestages_id_scope_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'prestage_scope_update' in local_var_params:
body_params = local_var_params['prestage_scope_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponse",
400: "ApiError",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/{id}/scope', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
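# Usage note: unlike the POST variant, which adds serial numbers to the
# existing scope, this PUT replaces the whole device scope with the supplied
# list. Sketch (hypothetical serials):
#
#     replacement = PrestageScopeUpdate(serial_numbers=["C02YYYYYYYYY"])
#     scope = api.v1_mobile_device_prestages_id_scope_put(42, replacement)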
def v1_mobile_device_prestages_post(self, mobile_device_prestage, **kwargs): # noqa: E501
"""Create a Mobile Device Prestage # noqa: E501
Create a mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_post(mobile_device_prestage, async_req=True)
>>> result = thread.get()
:param mobile_device_prestage: Mobile Device Prestage to create. ids defined in this body will be ignored (required)
:type mobile_device_prestage: MobileDevicePrestage
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetMobileDevicePrestage
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_post_with_http_info(mobile_device_prestage, **kwargs) # noqa: E501
def v1_mobile_device_prestages_post_with_http_info(self, mobile_device_prestage, **kwargs): # noqa: E501
"""Create a Mobile Device Prestage # noqa: E501
Create a mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_post_with_http_info(mobile_device_prestage, async_req=True)
>>> result = thread.get()
:param mobile_device_prestage: Mobile Device Prestage to create. ids defined in this body will be ignored (required)
:type mobile_device_prestage: MobileDevicePrestage
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetMobileDevicePrestage, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'mobile_device_prestage'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'mobile_device_prestage' is set
if self.api_client.client_side_validation and ('mobile_device_prestage' not in local_var_params or # noqa: E501
local_var_params['mobile_device_prestage'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `mobile_device_prestage` when calling `v1_mobile_device_prestages_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'mobile_device_prestage' in local_var_params:
body_params = local_var_params['mobile_device_prestage']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "GetMobileDevicePrestage",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
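# Usage sketch (hypothetical model instance). Any ids supplied in the request
# body are ignored; the server-assigned id comes back on the created object:
#
#     created = api.v1_mobile_device_prestages_post(mobile_device_prestage)
#     print(created.id)  # assumes the generated model exposes `id`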
def v1_mobile_device_prestages_scope_get(self, **kwargs): # noqa: E501
"""Get all Device Scope for all Mobile Device Prestages # noqa: E501
Get all device scope for all mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_scope_get(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScope
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_scope_get_with_http_info(**kwargs) # noqa: E501
def v1_mobile_device_prestages_scope_get_with_http_info(self, **kwargs): # noqa: E501
"""Get all Device Scope for all Mobile Device Prestages # noqa: E501
Get all device scope for all mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_scope_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScope, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_scope_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScope",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/scope', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
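# Usage sketch. _request_timeout accepts either a single number (total
# timeout) or a (connection, read) tuple:
#
#     all_scope = api.v1_mobile_device_prestages_scope_get(
#         _request_timeout=(3.05, 27))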
def v1_mobile_device_prestages_sync_get(self, **kwargs): # noqa: E501
"""Get all Prestage sync States for all prestages # noqa: E501
Get all prestage sync states for all prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_sync_get(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[PrestageSyncStatus]
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_sync_get_with_http_info(**kwargs) # noqa: E501
def v1_mobile_device_prestages_sync_get_with_http_info(self, **kwargs): # noqa: E501
"""Get all Prestage sync States for all prestages # noqa: E501
Get all prestage sync states for all prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_sync_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data without the HTTP
status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[PrestageSyncStatus], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_sync_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[PrestageSyncStatus]",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/sync', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
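# Usage sketch (hypothetical auth dict; the exact key names expected by the
# underlying api_client are assumed). _request_auth overrides the spec's
# auth_settings for this call only:
#
#     statuses = api.v1_mobile_device_prestages_sync_get(
#         _request_auth={"in": "header", "key": "Authorization",
#                        "value": "Bearer <token>"})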
def v1_mobile_device_prestages_sync_id_get(self, id, **kwargs): # noqa: E501
"""Get all prestage sync states for a single prestage # noqa: E501
Get all prestage sync states for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_sync_id_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[PrestageSyncStatus]
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_sync_id_get_with_http_info(id, **kwargs) # noqa: E501
def v1_mobile_device_prestages_sync_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get all prestage sync states for a single prestage # noqa: E501
Get all prestage sync states for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_sync_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[PrestageSyncStatus], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_sync_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_sync_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[PrestageSyncStatus]",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/sync/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
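# Usage sketch (illustrative): list every sync state recorded for a single
# prestage. Assumes `api` is an instance of this API class; the id value
# below is hypothetical.
#
#   statuses = api.v1_mobile_device_prestages_sync_id_get(1)
#   print(len(statuses), "sync states for prestage 1")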
def v1_mobile_device_prestages_sync_id_latest_get(self, id, **kwargs): # noqa: E501
"""Get the latest Sync State for a single Prestage # noqa: E501
Get the latest sync state for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_sync_id_latest_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageSyncStatus
"""
kwargs['_return_http_data_only'] = True
return self.v1_mobile_device_prestages_sync_id_latest_get_with_http_info(id, **kwargs) # noqa: E501
def v1_mobile_device_prestages_sync_id_latest_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get the latest Sync State for a single Prestage # noqa: E501
Get the latest sync state for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v1_mobile_device_prestages_sync_id_latest_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: int
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageSyncStatus, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_mobile_device_prestages_sync_id_latest_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v1_mobile_device_prestages_sync_id_latest_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageSyncStatus",
}
return self.api_client.call_api(
'/v1/mobile-device-prestages/sync/{id}/latest', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
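# Usage sketch (illustrative): the *_with_http_info variant also exposes the
# status code and response headers. Assumes `api` is an instance of this API
# class; the id value is hypothetical.
#
#   latest, status_code, headers = \
#       api.v1_mobile_device_prestages_sync_id_latest_get_with_http_info(1)
#   assert status_code == 200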
def v2_mobile_device_prestages_get(self, **kwargs): # noqa: E501
"""Get sorted and paged Mobile Device Prestages # noqa: E501
Gets sorted and paged mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_get(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: MobileDevicePrestageSearchResultsV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_get_with_http_info(**kwargs) # noqa: E501
def v2_mobile_device_prestages_get_with_http_info(self, **kwargs): # noqa: E501
"""Get sorted and paged Mobile Device Prestages # noqa: E501
Gets sorted and paged mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_get_with_http_info(async_req=True)
>>> result = thread.get()
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property:asc/desc. Multiple sort criteria are supported and must be separated with a comma. Example: sort=date:desc,name:asc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(MobileDevicePrestageSearchResultsV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'page',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
collection_formats['sort'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "MobileDevicePrestageSearchResultsV2",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
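# Usage sketch (illustrative): page through prestages 100 at a time, sorted
# by id using this endpoint's 'property:asc/desc' sort format. Treating
# `page` as 0-based is an assumption. Assumes `api` is an instance of this
# API class.
#
#   results = api.v2_mobile_device_prestages_get(page=0, page_size=100,
#                                                sort=['id:asc'])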
def v2_mobile_device_prestages_id_attachments_delete_multiple_post(self, id, ids, **kwargs): # noqa: E501
"""Remove an attachment for a Mobile Device Prestage # noqa: E501
Remove an attachment for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_attachments_delete_multiple_post(id, ids, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param ids: (required)
:type ids: Ids
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_attachments_delete_multiple_post_with_http_info(id, ids, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_attachments_delete_multiple_post_with_http_info(self, id, ids, **kwargs): # noqa: E501
"""Remove an attachment for a Mobile Device Prestage # noqa: E501
Remove an attachment for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_attachments_delete_multiple_post_with_http_info(id, ids, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param ids: (required)
:type ids: Ids
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id',
'ids'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_attachments_delete_multiple_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_attachments_delete_multiple_post`") # noqa: E501
# verify the required parameter 'ids' is set
if self.api_client.client_side_validation and ('ids' not in local_var_params or # noqa: E501
local_var_params['ids'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `ids` when calling `v2_mobile_device_prestages_id_attachments_delete_multiple_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'ids' in local_var_params:
body_params = local_var_params['ids']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/attachments/delete-multiple', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
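# Usage sketch (illustrative): delete several attachments in one call. `Ids`
# is the generated request model named in the docstring; its `ids` field
# name and all values below are assumptions.
#
#   body = Ids(ids=['1', '2'])
#   api.v2_mobile_device_prestages_id_attachments_delete_multiple_post('10', body)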
def v2_mobile_device_prestages_id_attachments_get(self, id, **kwargs): # noqa: E501
"""Get attachments for a Mobile Device Prestage # noqa: E501
Get attachments for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_attachments_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[FileAttachmentV2]
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_attachments_get_with_http_info(id, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_attachments_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get attachments for a Mobile Device Prestage # noqa: E501
Get attachments for a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_attachments_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[FileAttachmentV2], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_attachments_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_attachments_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[FileAttachmentV2]",
404: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/attachments', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
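# Usage sketch (illustrative): list a prestage's attachments and inspect
# each FileAttachmentV2. Assumes `api` is an instance of this API class;
# '10' is a hypothetical prestage id.
#
#   for attachment in api.v2_mobile_device_prestages_id_attachments_get('10'):
#       print(attachment)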
def v2_mobile_device_prestages_id_attachments_post(self, id, file, **kwargs): # noqa: E501
"""Add an attachment to a Mobile Device Prestage # noqa: E501
Add an attachment to a Mobile Device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_attachments_post(id, file, async_req=True)
>>> result = thread.get()
:param id: Identifier of the Mobile Device Prestage the attachment should be assigned to (required)
:type id: str
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageFileAttachmentV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_attachments_post_with_http_info(id, file, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_attachments_post_with_http_info(self, id, file, **kwargs): # noqa: E501
"""Add an attachment to a Mobile Device Prestage # noqa: E501
Add an attachment to a Mobile Device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_attachments_post_with_http_info(id, file, async_req=True)
>>> result = thread.get()
:param id: Identifier of the Mobile Device Prestage the attachment should be assigned to (required)
:type id: str
:param file: The file to upload (required)
:type file: file
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageFileAttachmentV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'file'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_attachments_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_attachments_post`") # noqa: E501
# verify the required parameter 'file' is set
if self.api_client.client_side_validation and ('file' not in local_var_params or # noqa: E501
local_var_params['file'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `file` when calling `v2_mobile_device_prestages_id_attachments_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
if 'file' in local_var_params:
local_var_files['file'] = local_var_params['file'] # noqa: E501
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['multipart/form-data']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "PrestageFileAttachmentV2",
404: "ApiError",
413: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/attachments', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
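# Usage sketch (illustrative): upload a file as a prestage attachment via
# multipart/form-data. OpenAPI-generated Python clients usually accept a
# filesystem path for `file` parameters; the path and id are hypothetical.
#
#   created = api.v2_mobile_device_prestages_id_attachments_post(
#       '10', '/tmp/enrollment-logo.png')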
def v2_mobile_device_prestages_id_delete(self, id, **kwargs): # noqa: E501
"""Delete a Mobile Device Prestage with the supplied id # noqa: E501
Deletes a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_delete(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_delete_with_http_info(id, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_delete_with_http_info(self, id, **kwargs): # noqa: E501
"""Delete a Mobile Device Prestage with the supplied id # noqa: E501
Deletes a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_delete_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: None
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_delete" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
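# Usage sketch (illustrative): delete a prestage. The endpoint returns no
# body, so on success the call returns None; non-2xx responses surface as
# exceptions from the ApiClient. The id is hypothetical.
#
#   api.v2_mobile_device_prestages_id_delete('10')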
def v2_mobile_device_prestages_id_get(self, id, **kwargs): # noqa: E501
"""Retrieve a Mobile Device Prestage with the supplied id # noqa: E501
Retrieves a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetMobileDevicePrestageV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_get_with_http_info(id, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieve a Mobile Device Prestage with the supplied id # noqa: E501
Retrieves a Mobile Device Prestage with the supplied id # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetMobileDevicePrestageV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "GetMobileDevicePrestageV2",
404: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
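# Usage sketch (illustrative): fetch one prestage asynchronously, following
# the async_req pattern described in the docstring above. Assumes `api` is
# an instance of this API class; the id is hypothetical.
#
#   thread = api.v2_mobile_device_prestages_id_get('10', async_req=True)
#   prestage = thread.get()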
def v2_mobile_device_prestages_id_history_get(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Mobile Device Prestage history objects # noqa: E501
Gets sorted and paged mobile device prestage history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_history_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: HistorySearchResults
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_history_get_with_http_info(id, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_history_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get sorted and paged Mobile Device Prestage history objects # noqa: E501
Gets sorted and paged mobile device prestage history objects # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_history_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param page:
:type page: int
:param page_size:
:type page_size: int
:param sort: Sorting criteria in the format: property,asc/desc. Default sort order is descending. Multiple sort criteria are supported and must be entered on separate lines in Swagger UI. In the URI the 'sort' query param is duplicated for each sort criterion, e.g., ...&sort=name%2Casc&sort=date%2Cdesc
:type sort: list[str]
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(HistorySearchResults, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'page',
'page_size',
'sort'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_history_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_history_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
if 'page' in local_var_params and local_var_params['page'] is not None: # noqa: E501
query_params.append(('page', local_var_params['page'])) # noqa: E501
if 'page_size' in local_var_params and local_var_params['page_size'] is not None: # noqa: E501
query_params.append(('page-size', local_var_params['page_size'])) # noqa: E501
if 'sort' in local_var_params and local_var_params['sort'] is not None: # noqa: E501
query_params.append(('sort', local_var_params['sort'])) # noqa: E501
collection_formats['sort'] = 'multi' # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "HistorySearchResults",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/history', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
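# Usage sketch (illustrative): newest history records first. Note that this
# endpoint's sort format uses a comma ('property,asc/desc'), unlike the
# colon form used by v2_mobile_device_prestages_get. Values are hypothetical.
#
#   history = api.v2_mobile_device_prestages_id_history_get(
#       '10', page=0, page_size=50, sort=['date,desc'])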
def v2_mobile_device_prestages_id_history_post(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Mobile Device Prestage history object notes # noqa: E501
Adds mobile device prestage history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_history_post(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: HrefResponse
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_history_post_with_http_info(id, object_history_note, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_history_post_with_http_info(self, id, object_history_note, **kwargs): # noqa: E501
"""Add Mobile Device Prestage history object notes # noqa: E501
Adds mobile device prestage history object notes # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_history_post_with_http_info(id, object_history_note, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param object_history_note: History notes to create (required)
:type object_history_note: ObjectHistoryNote
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(HrefResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'object_history_note'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_history_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_history_post`") # noqa: E501
# verify the required parameter 'object_history_note' is set
if self.api_client.client_side_validation and ('object_history_note' not in local_var_params or # noqa: E501
local_var_params['object_history_note'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `object_history_note` when calling `v2_mobile_device_prestages_id_history_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'object_history_note' in local_var_params:
body_params = local_var_params['object_history_note']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "HrefResponse",
503: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/history', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
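# Usage sketch (illustrative): add a note to a prestage's history.
# `ObjectHistoryNote` is the generated model named in the docstring; its
# `note` field name is an assumption, as are the values.
#
#   note = ObjectHistoryNote(note='Rotated enrollment profile')
#   href = api.v2_mobile_device_prestages_id_history_post('10', note)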
def v2_mobile_device_prestages_id_put(self, id, put_mobile_device_prestage_v2, **kwargs): # noqa: E501
"""Update a Mobile Device Prestage # noqa: E501
Updates a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_put(id, put_mobile_device_prestage_v2, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param put_mobile_device_prestage_v2: Mobile Device Prestage to update (required)
:type put_mobile_device_prestage_v2: PutMobileDevicePrestageV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: GetMobileDevicePrestageV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_put_with_http_info(id, put_mobile_device_prestage_v2, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_put_with_http_info(self, id, put_mobile_device_prestage_v2, **kwargs): # noqa: E501
"""Update a Mobile Device Prestage # noqa: E501
Updates a Mobile Device Prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_put_with_http_info(id, put_mobile_device_prestage_v2, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param put_mobile_device_prestage_v2: Mobile Device Prestage to update (required)
:type put_mobile_device_prestage_v2: PutMobileDevicePrestageV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: response data without the HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(GetMobileDevicePrestageV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'put_mobile_device_prestage_v2'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_put`") # noqa: E501
# verify the required parameter 'put_mobile_device_prestage_v2' is set
if self.api_client.client_side_validation and ('put_mobile_device_prestage_v2' not in local_var_params or # noqa: E501
local_var_params['put_mobile_device_prestage_v2'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `put_mobile_device_prestage_v2` when calling `v2_mobile_device_prestages_id_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'put_mobile_device_prestage_v2' in local_var_params:
body_params = local_var_params['put_mobile_device_prestage_v2']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "GetMobileDevicePrestageV2",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
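# Usage sketch (illustrative): PUT replaces the whole prestage, so a common
# pattern is read-modify-write. Passing the fetched object back and the
# `display_name` field are assumptions about the generated models; the
# documented request type is PutMobileDevicePrestageV2.
#
#   current = api.v2_mobile_device_prestages_id_get('10')
#   current.display_name = 'Renamed prestage'
#   updated = api.v2_mobile_device_prestages_id_put('10', current)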
def v2_mobile_device_prestages_id_scope_delete_multiple_post(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Remove Device Scope for a specific Mobile Device Prestage # noqa: E501
Remove device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_delete_multiple_post(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param prestage_scope_update: Serial Numbers to remove from scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If a single
number is provided, it is used as the total
request timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponseV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_scope_delete_multiple_post_with_http_info(id, prestage_scope_update, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_scope_delete_multiple_post_with_http_info(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Remove Device Scope for a specific Mobile Device Prestage # noqa: E501
Remove device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_delete_multiple_post_with_http_info(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param prestage_scope_update: Serial Numbers to remove from scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponseV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'prestage_scope_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_scope_delete_multiple_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_scope_delete_multiple_post`") # noqa: E501
# verify the required parameter 'prestage_scope_update' is set
if self.api_client.client_side_validation and ('prestage_scope_update' not in local_var_params or # noqa: E501
local_var_params['prestage_scope_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `prestage_scope_update` when calling `v2_mobile_device_prestages_id_scope_delete_multiple_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'prestage_scope_update' in local_var_params:
body_params = local_var_params['prestage_scope_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponseV2",
400: "ApiError",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/scope/delete-multiple', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
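# Usage sketch: removing serial numbers from a prestage's scope. `api` is an
# assumed instance of this class; the PrestageScopeUpdate model name comes
# from the docstring above, but its field names here are assumptions:
#
#     update = PrestageScopeUpdate(serial_numbers=['C02XXXXXXXXX'],
#                                  version_lock=1)  # hypothetical fields
#     scope = api.v2_mobile_device_prestages_id_scope_delete_multiple_post(
#         '17', update)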
def v2_mobile_device_prestages_id_scope_get(self, id, **kwargs): # noqa: E501
"""Get Device Scope for a specific Mobile Device Prestage # noqa: E501
Get device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponseV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_scope_get_with_http_info(id, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_scope_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get Device Scope for a specific Mobile Device Prestage # noqa: E501
Get device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponseV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_scope_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_scope_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponseV2",
404: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/scope', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
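# Usage sketch: reading the current scope with the raw-response variant so
# the HTTP status and headers are available too. `api` and the id '17' are
# illustrative:
#
#     data, status, headers = \
#         api.v2_mobile_device_prestages_id_scope_get_with_http_info('17')
#     if status == 200:
#         print(data)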
def v2_mobile_device_prestages_id_scope_post(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Add Device Scope for a specific Mobile Device Prestage # noqa: E501
Add device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_post(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponseV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_scope_post_with_http_info(id, prestage_scope_update, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_scope_post_with_http_info(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Add Device Scope for a specific Mobile Device Prestage # noqa: E501
Add device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_post_with_http_info(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponseV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'prestage_scope_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_scope_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_scope_post`") # noqa: E501
# verify the required parameter 'prestage_scope_update' is set
if self.api_client.client_side_validation and ('prestage_scope_update' not in local_var_params or # noqa: E501
local_var_params['prestage_scope_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `prestage_scope_update` when calling `v2_mobile_device_prestages_id_scope_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'prestage_scope_update' in local_var_params:
body_params = local_var_params['prestage_scope_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponseV2",
400: "ApiError",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/scope', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
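# Usage sketch: adding devices to scope asynchronously. Per the docstring,
# async_req=True returns a thread-like object whose get() yields the result;
# `api`, the id and `update` are illustrative:
#
#     thread = api.v2_mobile_device_prestages_id_scope_post(
#         '17', update, async_req=True)
#     scope = thread.get()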
def v2_mobile_device_prestages_id_scope_put(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Replace Device Scope for a specific Mobile Device Prestage # noqa: E501
Replace device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_put(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeResponseV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_scope_put_with_http_info(id, prestage_scope_update, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_scope_put_with_http_info(self, id, prestage_scope_update, **kwargs): # noqa: E501
"""Replace Device Scope for a specific Mobile Device Prestage # noqa: E501
Replace device scope for a specific mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_scope_put_with_http_info(id, prestage_scope_update, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param prestage_scope_update: Serial Numbers to scope (required)
:type prestage_scope_update: PrestageScopeUpdate
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeResponseV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id',
'prestage_scope_update'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_scope_put" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_scope_put`") # noqa: E501
# verify the required parameter 'prestage_scope_update' is set
if self.api_client.client_side_validation and ('prestage_scope_update' not in local_var_params or # noqa: E501
local_var_params['prestage_scope_update'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `prestage_scope_update` when calling `v2_mobile_device_prestages_id_scope_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'prestage_scope_update' in local_var_params:
body_params = local_var_params['prestage_scope_update']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeResponseV2",
400: "ApiError",
404: "ApiError",
409: "ApiError",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/scope', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
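# Usage sketch: replacing the whole scope rather than appending to it, with a
# per-request (connect, read) timeout tuple as documented above. Values are
# illustrative:
#
#     scope = api.v2_mobile_device_prestages_id_scope_put(
#         '17', update, _request_timeout=(3.0, 10.0))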
def v2_mobile_device_prestages_id_syncs_get(self, id, **kwargs): # noqa: E501
"""Get all prestage sync states for a single prestage # noqa: E501
Get all prestage sync states for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_syncs_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[PrestageSyncStatusV2]
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_syncs_get_with_http_info(id, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_syncs_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get all prestage sync states for a single prestage # noqa: E501
Get all prestage sync states for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_syncs_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[PrestageSyncStatusV2], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_syncs_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_syncs_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[PrestageSyncStatusV2]",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/syncs', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
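# Usage sketch: listing every recorded sync state for one prestage; the call
# returns list[PrestageSyncStatusV2] per the docstring. `api` is an assumed
# instance of this class:
#
#     for sync in api.v2_mobile_device_prestages_id_syncs_get('17'):
#         print(sync)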
def v2_mobile_device_prestages_id_syncs_latest_get(self, id, **kwargs): # noqa: E501
"""Get the latest Sync State for a single Prestage # noqa: E501
Get the latest sync state for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_syncs_latest_get(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageSyncStatusV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_id_syncs_latest_get_with_http_info(id, **kwargs) # noqa: E501
def v2_mobile_device_prestages_id_syncs_latest_get_with_http_info(self, id, **kwargs): # noqa: E501
"""Get the latest Sync State for a single Prestage # noqa: E501
Get the latest sync state for a single prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_id_syncs_latest_get_with_http_info(id, async_req=True)
>>> result = thread.get()
:param id: Mobile Device Prestage identifier (required)
:type id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageSyncStatusV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_id_syncs_latest_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in local_var_params or # noqa: E501
local_var_params['id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `id` when calling `v2_mobile_device_prestages_id_syncs_latest_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in local_var_params:
path_params['id'] = local_var_params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageSyncStatusV2",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/{id}/syncs/latest', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
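# Usage sketch: fetching the latest sync state without client-side
# deserialization; with _preload_content=False the raw urllib3 response is
# returned, as documented above:
#
#     raw = api.v2_mobile_device_prestages_id_syncs_latest_get(
#         '17', _preload_content=False)
#     body = raw.data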
def v2_mobile_device_prestages_post(self, mobile_device_prestage_v2, **kwargs): # noqa: E501
"""Create a Mobile Device Prestage # noqa: E501
Create a mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_post(mobile_device_prestage_v2, async_req=True)
>>> result = thread.get()
:param mobile_device_prestage_v2: Mobile Device Prestage to create. IDs defined in this body will be ignored (required)
:type mobile_device_prestage_v2: MobileDevicePrestageV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: HrefResponse
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_post_with_http_info(mobile_device_prestage_v2, **kwargs) # noqa: E501
def v2_mobile_device_prestages_post_with_http_info(self, mobile_device_prestage_v2, **kwargs): # noqa: E501
"""Create a Mobile Device Prestage # noqa: E501
Create a mobile device prestage # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_post_with_http_info(mobile_device_prestage_v2, async_req=True)
>>> result = thread.get()
:param mobile_device_prestage_v2: Mobile Device Prestage to create. IDs defined in this body will be ignored (required)
:type mobile_device_prestage_v2: MobileDevicePrestageV2
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(HrefResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'mobile_device_prestage_v2'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_post" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'mobile_device_prestage_v2' is set
if self.api_client.client_side_validation and ('mobile_device_prestage_v2' not in local_var_params or # noqa: E501
local_var_params['mobile_device_prestage_v2'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `mobile_device_prestage_v2` when calling `v2_mobile_device_prestages_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'mobile_device_prestage_v2' in local_var_params:
body_params = local_var_params['mobile_device_prestage_v2']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
201: "HrefResponse",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
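# Usage sketch: creating a prestage. The MobileDevicePrestageV2 model name is
# taken from the parameter above, but its constructor fields here are
# assumptions; on success the 201 body deserializes to HrefResponse:
#
#     prestage = MobileDevicePrestageV2(display_name='Classroom iPads')
#     href = api.v2_mobile_device_prestages_post(prestage)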
def v2_mobile_device_prestages_scope_get(self, **kwargs): # noqa: E501
"""Get all Device Scope for all Mobile Device Prestages # noqa: E501
Get all device scope for all mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_scope_get(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PrestageScopeV2
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_scope_get_with_http_info(**kwargs) # noqa: E501
def v2_mobile_device_prestages_scope_get_with_http_info(self, **kwargs): # noqa: E501
"""Get all Device Scope for all Mobile Device Prestages # noqa: E501
Get all device scope for all mobile device prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_scope_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PrestageScopeV2, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_scope_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "PrestageScopeV2",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/scope', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
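# Usage sketch: pulling the scope of every prestage in one call, e.g. to
# build a serial-number lookup. `api` is an assumed instance of this class:
#
#     all_scope = api.v2_mobile_device_prestages_scope_get()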
def v2_mobile_device_prestages_syncs_get(self, **kwargs): # noqa: E501
"""Get all Prestage sync States for all prestages # noqa: E501
Get all prestage sync states for all prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_syncs_get(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: list[PrestageSyncStatusV2]
"""
kwargs['_return_http_data_only'] = True
return self.v2_mobile_device_prestages_syncs_get_with_http_info(**kwargs) # noqa: E501
def v2_mobile_device_prestages_syncs_get_with_http_info(self, **kwargs): # noqa: E501
"""Get all Prestage sync States for all prestages # noqa: E501
Get all prestage sync states for all prestages # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.v2_mobile_device_prestages_syncs_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _return_http_data_only: return the response data only, without
the status code and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number is provided, it will be the total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for that request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(list[PrestageSyncStatusV2], status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method v2_mobile_device_prestages_syncs_get" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
response_types_map = {
200: "list[PrestageSyncStatusV2]",
}
return self.api_client.call_api(
'/v2/mobile-device-prestages/syncs', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
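# Usage sketch: overriding authentication for a single request via
# _request_auth, as documented above; the dict layout and token shown are
# assumptions, not taken from this file:
#
#     syncs = api.v2_mobile_device_prestages_syncs_get(
#         _request_auth={'in': 'header', 'key': 'Authorization',
#                        'value': 'Bearer <token>'})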
| 46.337971 | 342 | 0.605157 | 26,997 | 239,799 | 5.0959 | 0.011816 | 0.03617 | 0.055766 | 0.037115 | 0.989751 | 0.989751 | 0.98922 | 0.988966 | 0.988552 | 0.987585 | 0 | 0.015008 | 0.327287 | 239,799 | 5,174 | 343 | 46.346927 | 0.837815 | 0.481949 | 0 | 0.785214 | 0 | 0 | 0.187559 | 0.085188 | 0 | 0 | 0 | 0 | 0 | 1 | 0.031934 | false | 0 | 0.002187 | 0 | 0.066054 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a0419bf30db77a00aee71030782816b1b3f01371 | 6,769 | py | Python | axonius_api_client/tests/tests_api/tests_assets/test_users.py | geransmith/axonius_api_client | 09fd564d62f0ddf7aa44db14a509eaafaf0c930f | [
"MIT"
] | null | null | null | axonius_api_client/tests/tests_api/tests_assets/test_users.py | geransmith/axonius_api_client | 09fd564d62f0ddf7aa44db14a509eaafaf0c930f | [
"MIT"
] | null | null | null | axonius_api_client/tests/tests_api/tests_assets/test_users.py | geransmith/axonius_api_client | 09fd564d62f0ddf7aa44db14a509eaafaf0c930f | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Test suite for assets."""
import pytest
from .base_assets import (
AssetsPrivate,
AssetsPublic,
ModelMixinsBase,
check_assets,
get_field_values,
load_test_data,
)
class TestUsers(AssetsPrivate, AssetsPublic, ModelMixinsBase):
"""Pass."""
@pytest.fixture(scope="class")
def apiobj(self, api_users):
"""Pass."""
return load_test_data(apiobj=api_users)
def test_get_by_username(self, apiobj):
"""Pass."""
field = apiobj.FIELD_USERNAME
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
rows = apiobj.get_by_username(
value=value, field=field, fields_map=apiobj.TEST_DATA["fields_map"],
)
check_assets(rows)
assert len(rows) == 1
rows_values = get_field_values(rows=rows, field=field)
assert value in rows_values
def test_get_by_username_equals_not(self, apiobj):
"""Pass."""
field = apiobj.FIELD_USERNAME
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
rows = apiobj.get_by_username(
value=value,
field=field,
fields_map=apiobj.TEST_DATA["fields_map"],
not_flag=True,
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
assert value not in rows_values
def test_get_by_usernames(self, apiobj):
"""Pass."""
field = apiobj.FIELD_USERNAME
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
values = values[0:2]
rows = apiobj.get_by_usernames(
values=values, field=field, fields_map=apiobj.TEST_DATA["fields_map"],
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
for value in values:
assert value in rows_values
def test_get_by_usernames_not(self, apiobj):
"""Pass."""
field = apiobj.FIELD_USERNAME
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
values = values[0:2]
rows = apiobj.get_by_usernames(
values=values,
field=field,
fields_map=apiobj.TEST_DATA["fields_map"],
not_flag=True,
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
for value in values:
assert value not in rows_values
def test_get_by_username_regex(self, apiobj):
"""Pass."""
field = apiobj.FIELD_USERNAME
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
regex_value = value[0:5]
rows = apiobj.get_by_username_regex(
value=regex_value, fields_map=apiobj.TEST_DATA["fields_map"], field=field,
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
assert value in rows_values
def test_get_by_username_regex_not(self, apiobj):
"""Pass."""
field = apiobj.FIELD_USERNAME
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
regex_value = value[0:5]
rows = apiobj.get_by_username_regex(
value=regex_value,
field=field,
not_flag=True,
fields_map=apiobj.TEST_DATA["fields_map"],
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
assert value not in rows_values
def test_get_by_mail(self, apiobj):
"""Pass."""
field = apiobj.FIELD_MAIL
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
rows = apiobj.get_by_mail(
value=value, field=field, fields_map=apiobj.TEST_DATA["fields_map"],
)
check_assets(rows)
assert len(rows) >= 1
rows_values = get_field_values(rows=rows, field=field)
assert value in rows_values
def test_get_by_mail_equals_not(self, apiobj):
"""Pass."""
field = apiobj.FIELD_MAIL
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
rows = apiobj.get_by_mail(
value=value,
field=field,
fields_map=apiobj.TEST_DATA["fields_map"],
not_flag=True,
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
assert value not in rows_values
def test_get_by_mails(self, apiobj):
"""Pass."""
field = apiobj.FIELD_MAIL
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
values = values[0:2]
rows = apiobj.get_by_mails(
values=values, field=field, fields_map=apiobj.TEST_DATA["fields_map"],
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
for value in values:
assert value in rows_values
def test_get_by_mails_not(self, apiobj):
"""Pass."""
field = apiobj.FIELD_MAIL
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
values = values[0:2]
rows = apiobj.get_by_mails(
values=values,
field=field,
fields_map=apiobj.TEST_DATA["fields_map"],
not_flag=True,
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
for value in values:
assert value not in rows_values
def test_get_by_mail_regex(self, apiobj):
"""Pass."""
field = apiobj.FIELD_MAIL
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
regex_value = value[0:5]
rows = apiobj.get_by_mail_regex(
value=regex_value, field=field, fields_map=apiobj.TEST_DATA["fields_map"],
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
assert value in rows_values
def test_get_by_mail_regex_not(self, apiobj):
"""Pass."""
field = apiobj.FIELD_MAIL
values = get_field_values(rows=apiobj.TEST_DATA["assets"], field=field)
value = values[0]
regex_value = value[0:5]
rows = apiobj.get_by_mail_regex(
value=regex_value,
field=field,
fields_map=apiobj.TEST_DATA["fields_map"],
not_flag=True,
)
check_assets(rows)
rows_values = get_field_values(rows=rows, field=field)
assert value not in rows_values
| 29.430435 | 86 | 0.613532 | 847 | 6,769 | 4.615112 | 0.061393 | 0.092095 | 0.089537 | 0.122794 | 0.924277 | 0.919161 | 0.916603 | 0.908416 | 0.898184 | 0.891021 | 0 | 0.005561 | 0.28276 | 6,769 | 229 | 87 | 29.558952 | 0.799588 | 0.019057 | 0 | 0.745342 | 0 | 0 | 0.030012 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 1 | 0.080745 | false | 0 | 0.012422 | 0 | 0.10559 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a0629a51938546797eb101243071ffcbb5171b49 | 68,046 | py | Python | workflow/test/stash/stash_facade_test.py | mibexsoftware/alfred-stash-workflow | 5cdba4d14c8998b937c1aa6af8e3417251fac540 | [
"MIT"
] | 13 | 2016-03-31T16:19:59.000Z | 2019-09-26T20:47:57.000Z | workflow/test/stash/stash_facade_test.py | mibexsoftware/alfred-stash-workflow | 5cdba4d14c8998b937c1aa6af8e3417251fac540 | [
"MIT"
] | 6 | 2015-09-18T15:24:43.000Z | 2019-10-23T16:51:39.000Z | workflow/test/stash/stash_facade_test.py | mibexsoftware/alfred-stash-workflow | 5cdba4d14c8998b937c1aa6af8e3417251fac540 | [
"MIT"
] | 3 | 2015-09-16T18:05:32.000Z | 2020-01-04T19:41:21.000Z | # -*- coding: utf-8 -*-
from distutils.version import StrictVersion
from unittest import TestCase
import httpretty
from src.stash.project import Project
from src.stash.pull_request import PullRequest
from src.stash.pull_request_suggestion import PullRequestSuggestion
from src.stash.repository import Repository
from src.stash.stash_facade import StashFacade
class TestStashFacade(TestCase):
def setUp(self):
self.stash_facade = StashFacade(stash_host='http://localhost:7990/stash')
@httpretty.activate
def test_get_all_projects(self):
# GIVEN
self._mock_projects_rest_call()
# WHEN
projects = self.stash_facade.all_projects()
# THEN
self.assertEqual(
[Project(key='PRJ',
name='My Cool Project',
description='The description for my cool project.',
link='http://link/to/project')],
projects
)
@httpretty.activate
def test_get_all_repositories(self):
# GIVEN
self._mock_repos_rest_call()
# WHEN
repositories = self.stash_facade.all_repositories()
# THEN
self.assertEqual([Repository(name='My repo 1', slug='my-repo1', link='http://link/to/repository1',
project_key='PRJ', project_name='My Cool Project', public=False, fork=False,
clone_url='https://<baseURL>/scm/PRJ/my-repo1.git'),
Repository(name='My repo 2', slug='my-repo2', link='http://link/to/repository2',
project_key='PRJ', project_name='My Cool Project', public=True, fork=True,
clone_url='https://<baseURL>/scm/PRJ/my-repo2.git')],
repositories)
@httpretty.activate
def test_my_pull_requests_to_review(self):
# GIVEN
self._mock_pull_requests_rest_call()
# WHEN
pull_requests = self.stash_facade.my_pull_requests_to_review(StrictVersion('4.10.0'))
# THEN
self.assertEqual(
[PullRequest(pull_request_id=1,
from_branch='dev',
to_branch='master',
title='Dev',
link='http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/pull-requests/1',
repo_name='rep_1',
project_key='PROJECT_1')],
pull_requests
)
@httpretty.activate
def test_my_created_pull_requests(self):
# GIVEN
self._mock_pull_requests_rest_call()
# WHEN
pull_requests = self.stash_facade.my_created_pull_requests(StrictVersion('4.10.0'))
# THEN
self.assertEqual(
[PullRequest(pull_request_id=1,
from_branch='dev',
to_branch='master',
title='Dev',
link='http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/pull-requests/1',
repo_name='rep_1',
project_key='PROJECT_1')],
pull_requests
)
@httpretty.activate
def test_open_pull_requests(self):
# GIVEN
self._mock_repos_rest_call()
self._mock_open_pull_requests_rest_call()
# WHEN
pull_requests = self.stash_facade.open_pull_requests()
# THEN
self.assertEqual(
[PullRequest(pull_request_id=1,
from_branch='dev',
to_branch='master',
title='Dev',
link='http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/pull-requests/1',
repo_name='rep_1',
project_key='PROJECT_1'),
PullRequest(pull_request_id=1,
from_branch='dev',
to_branch='master',
title='Dev',
link='http://localhost:7990/stash/projects/PROJECT_1/repos/rep_2/pull-requests/1',
repo_name='rep_2',
project_key='PROJECT_1')],
pull_requests
)
@httpretty.activate
def test_pull_request_suggestions(self):
# GIVEN
self._mock_pull_request_suggestions_rest_call()
# WHEN
suggestions = self.stash_facade.my_pull_request_suggestions(StrictVersion('4.10.0'))
# THEN
self.assertEqual(
PullRequestSuggestion(change_time=1478187004000,
repo_slug='alfred-stash-workflow',
repo_name='Alfred Stash Workflow',
project_name='Mira',
project_key='MIRA',
project_url='https://localhost:7990/bitbucket/projects/MIRA',
ref_id='refs/heads/master',
display_id='master'),
suggestions[0]
)
def _mock_pull_request_suggestions_rest_call(self):
httpretty.register_uri(httpretty.GET, "http://localhost:7990/stash/rest/api/1.0/dashboard/pull-request-suggestions",
body='''{
"size": 1,
"limit": 3,
"isLastPage": true,
"values": [
{
"changeTime": 1478187004000,
"refChange": {
"ref": {
"id": "refs/heads/master",
"displayId": "master",
"type": "BRANCH"
},
"refId": "refs/heads/master",
"fromHash": "f9431d41a78f766138220f73a3b50286766df770",
"toHash": "41bf84305faa3a5882f1eeb19a3298098f574bf9",
"type": "UPDATE"
},
"repository": {
"slug": "alfred-stash-workflow",
"id": 352,
"name": "Alfred Stash Workflow",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": false,
"project": {
"key": "MIRA",
"id": 41,
"name": "Mira",
"description": "Our phantastic Atlassian plug-ins",
"public": false,
"type": "NORMAL",
"links": {
"self": [
{
"href": "https://localhost:7990/bitbucket/projects/MIRA"
}
]
}
},
"public": false,
"links": {
"clone": [
{
"href": "https://mrueegg@localhost:7990/bitbucket/scm/mira/alfred-stash-workflow.git",
"name": "http"
},
{
"href": "ssh://git@localhost:8442/mira/alfred-stash-workflow.git",
"name": "ssh"
}
],
"self": [
{
"href": "https://localhost:7990/bitbucket/projects/MIRA/repos/alfred-stash-workflow/browse"
}
]
}
},
"fromRef": {
"id": "refs/heads/master",
"displayId": "master",
"type": "BRANCH"
},
"toRef": {
"id": "refs/heads/master",
"displayId": "master",
"type": "BRANCH"
}
}
],
"start": 0
}''',
content_type="application/json")
def _mock_pull_requests_rest_call(self):
httpretty.register_uri(httpretty.GET, "http://localhost:7990/stash/rest/api/1.0/inbox/pull-requests",
body='''{
"size": 1,
"limit": 25,
"isLastPage": true,
"values": [
{
"id": 1,
"version": 0,
"title": "Dev",
"description": "* a couple of changes",
"state": "OPEN",
"open": true,
"closed": false,
"createdDate": 1436283154855,
"updatedDate": 1436283154855,
"fromRef": {
"id": "refs/heads/dev",
"displayId": "dev",
"latestChangeset": "bf97bf79c6d2b14757d6a929a576a65be296cc20",
"repository": {
"slug": "rep_1",
"id": 11,
"name": "rep_1",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PROJECT_1",
"id": 1,
"name": "Project 1",
"description": "Default configuration project #1",
"public": false,
"type": "NORMAL",
"link": {
"url": "/projects/PROJECT_1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1"
}
]
}
},
"public": false,
"link": {
"url": "/projects/PROJECT_1/repos/rep_1/browse",
"rel": "self"
},
"cloneUrl": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/project_1/rep_1.git",
"name": "ssh"
},
{
"href": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/browse"
}
]
}
}
},
"toRef": {
"id": "refs/heads/master",
"displayId": "master",
"latestChangeset": "0c38f167ab09ceb7d9ec1bb3d41ff3993a34d803",
"repository": {
"slug": "rep_1",
"id": 11,
"name": "rep_1",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PROJECT_1",
"id": 1,
"name": "Project 1",
"description": "Default configuration project #1",
"public": false,
"type": "NORMAL",
"link": {
"url": "/projects/PROJECT_1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1"
}
]
}
},
"public": false,
"link": {
"url": "/projects/PROJECT_1/repos/rep_1/browse",
"rel": "self"
},
"cloneUrl": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/project_1/rep_1.git",
"name": "ssh"
},
{
"href": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/browse"
}
]
}
}
},
"locked": false,
"author": {
"user": {
"name": "admin",
"emailAddress": "admin@example.com",
"id": 1,
"displayName": "Administrator",
"active": true,
"slug": "admin",
"type": "NORMAL",
"link": {
"url": "/users/admin",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/users/admin"
}
]
}
},
"role": "AUTHOR",
"approved": false
},
"reviewers": [],
"participants": [],
"attributes": {
"resolvedTaskCount": [
"0"
],
"commentCount": [
"5"
],
"openTaskCount": [
"0"
]
},
"link": {
"url": "/projects/PROJECT_1/repos/rep_1/pull-requests/1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/pull-requests/1"
}
]
}
}
],
"start": 0
}''',
content_type="application/json")
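# Stubs the open-pull-requests endpoint for my-repo1 and my-repo2 (one fixture each).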
def _mock_open_pull_requests_rest_call(self):
httpretty.register_uri(httpretty.GET,
"http://localhost:7990/stash/rest/api/1.0/projects/PRJ/repos/my-repo1/pull-requests?limit=100",
body='''{
"size": 1,
"limit": 25,
"isLastPage": true,
"values": [
{
"id": 1,
"version": 0,
"title": "Dev",
"description": "* a couple of changes",
"state": "OPEN",
"open": true,
"closed": false,
"createdDate": 1436283154855,
"updatedDate": 1436283154855,
"fromRef": {
"id": "refs/heads/dev",
"displayId": "dev",
"latestChangeset": "bf97bf79c6d2b14757d6a929a576a65be296cc20",
"repository": {
"slug": "rep_1",
"id": 11,
"name": "rep_1",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PROJECT_1",
"id": 1,
"name": "Project 1",
"description": "Default configuration project #1",
"public": false,
"type": "NORMAL",
"link": {
"url": "/projects/PROJECT_1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1"
}
]
}
},
"public": false,
"link": {
"url": "/projects/PROJECT_1/repos/rep_1/browse",
"rel": "self"
},
"cloneUrl": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/project_1/rep_1.git",
"name": "ssh"
},
{
"href": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/browse"
}
]
}
}
},
"toRef": {
"id": "refs/heads/master",
"displayId": "master",
"latestChangeset": "0c38f167ab09ceb7d9ec1bb3d41ff3993a34d803",
"repository": {
"slug": "rep_1",
"id": 11,
"name": "rep_1",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PROJECT_1",
"id": 1,
"name": "Project 1",
"description": "Default configuration project #1",
"public": false,
"type": "NORMAL",
"link": {
"url": "/projects/PROJECT_1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1"
}
]
}
},
"public": false,
"link": {
"url": "/projects/PROJECT_1/repos/rep_1/browse",
"rel": "self"
},
"cloneUrl": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/project_1/rep_1.git",
"name": "ssh"
},
{
"href": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/browse"
}
]
}
}
},
"locked": false,
"author": {
"user": {
"name": "admin",
"emailAddress": "admin@example.com",
"id": 1,
"displayName": "Administrator",
"active": true,
"slug": "admin",
"type": "NORMAL",
"link": {
"url": "/users/admin",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/users/admin"
}
]
}
},
"role": "AUTHOR",
"approved": false
},
"reviewers": [],
"participants": [],
"attributes": {
"resolvedTaskCount": [
"0"
],
"commentCount": [
"5"
],
"openTaskCount": [
"0"
]
},
"link": {
"url": "/projects/PROJECT_1/repos/rep_1/pull-requests/1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/pull-requests/1"
}
]
}
}
],
"start": 0
}''',
content_type="application/json")
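# Same fixture shape for the second repository; this PR lives in rep_2.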
httpretty.register_uri(httpretty.GET,
"http://localhost:7990/stash/rest/api/1.0/projects/PRJ/repos/my-repo2/pull-requests?limit=100",
body='''{
"size": 1,
"limit": 25,
"isLastPage": true,
"values": [
{
"id": 1,
"version": 0,
"title": "Dev",
"description": "* a couple of changes",
"state": "OPEN",
"open": true,
"closed": false,
"createdDate": 1436283154855,
"updatedDate": 1436283154855,
"fromRef": {
"id": "refs/heads/dev",
"displayId": "dev",
"latestChangeset": "bf97bf79c6d2b14757d6a929a576a65be296cc20",
"repository": {
"slug": "rep_2",
"id": 11,
"name": "rep_2",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PROJECT_1",
"id": 1,
"name": "Project 1",
"description": "Default configuration project #1",
"public": false,
"type": "NORMAL",
"link": {
"url": "/projects/PROJECT_1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1"
}
]
}
},
"public": false,
"link": {
"url": "/projects/PROJECT_1/repos/rep_2/browse",
"rel": "self"
},
"cloneUrl": "http://admin@localhost:7990/stash/scm/project_1/rep_2.git",
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/project_1/rep_2.git",
"name": "ssh"
},
{
"href": "http://admin@localhost:7990/stash/scm/project_1/rep_2.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/browse"
}
]
}
}
},
"toRef": {
"id": "refs/heads/master",
"displayId": "master",
"latestChangeset": "0c38f167ab09ceb7d9ec1bb3d41ff3993a34d803",
"repository": {
"slug": "rep_1",
"id": 11,
"name": "rep_1",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PROJECT_1",
"id": 1,
"name": "Project 1",
"description": "Default configuration project #1",
"public": false,
"type": "NORMAL",
"link": {
"url": "/projects/PROJECT_1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1"
}
]
}
},
"public": false,
"link": {
"url": "/projects/PROJECT_1/repos/rep_2/browse",
"rel": "self"
},
"cloneUrl": "http://admin@localhost:7990/stash/scm/project_1/rep_2.git",
"links": {
"clone": [
{
"href": "ssh://git@localhost:7999/project_1/rep_2.git",
"name": "ssh"
},
{
"href": "http://admin@localhost:7990/stash/scm/project_1/rep_1.git",
"name": "http"
}
],
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_1/browse"
}
]
}
}
},
"locked": false,
"author": {
"user": {
"name": "admin",
"emailAddress": "admin@example.com",
"id": 1,
"displayName": "Administrator",
"active": true,
"slug": "admin",
"type": "NORMAL",
"link": {
"url": "/users/admin",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/users/admin"
}
]
}
},
"role": "AUTHOR",
"approved": false
},
"reviewers": [],
"participants": [],
"attributes": {
"resolvedTaskCount": [
"0"
],
"commentCount": [
"5"
],
"openTaskCount": [
"0"
]
},
"link": {
"url": "/projects/PROJECT_1/repos/rep_2/pull-requests/1",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://localhost:7990/stash/projects/PROJECT_1/repos/rep_2/pull-requests/1"
}
]
}
}
],
"start": 0
}''',
content_type="application/json")
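# Stubs the project listing with a single project, PRJ.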
def _mock_projects_rest_call(self):
httpretty.register_uri(httpretty.GET, "http://localhost:7990/stash/rest/api/1.0/projects",
body='''{
"size": 1,
"limit": 25,
"isLastPage": true,
"values": [
{
"key": "PRJ",
"id": 1,
"name": "My Cool Project",
"description": "The description for my cool project.",
"public": true,
"type": "NORMAL",
"link": {
"url": "http://link/to/project",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://link/to/project"
}
]
}
}
],
"start": 0
}''',
content_type="application/json")
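# Stubs the repository listing with two one-item pages to exercise pagination.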
def _mock_repos_rest_call(self):
httpretty.register_uri(httpretty.GET, "http://localhost:7990/stash/rest/api/1.0/repos",
responses=[
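# Page 1: isLastPage is false and nextPageStart is set, so the client must request the next page.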
httpretty.Response(body='''{
"size": 1,
"limit": 1,
"isLastPage": false,
"values": [
{
"slug": "my-repo1",
"id": 1,
"name": "My repo 1",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PRJ",
"id": 1,
"name": "My Cool Project",
"description": "The description for my cool project.",
"public": true,
"type": "NORMAL",
"link": {
"url": "http://link/to/project",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://link/to/project"
}
]
}
},
"public": false,
"cloneUrl": "https://<baseURL>/scm/PRJ/my-repo1.git",
"link": {
"url": "http://link/to/repository",
"rel": "self"
},
"links": {
"clone": [
{
"href": "https://<baseURL>/scm/PRJ/my-repo1.git",
"name": "http"
},
{
"href": "ssh://git@<baseURL>/PRJ/my-repo.git",
"name": "ssh"
}
],
"self": [
{
"href": "http://link/to/repository1",
"rel": "self"
}
]
}
}
],
"start": 0,
"nextPageStart": 1
}'''),
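# Page 2: isLastPage is true, terminating the pagination loop.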
httpretty.Response(body='''{
"size": 1,
"limit": 1,
"isLastPage": true,
"values": [
{
"slug": "my-repo2",
"id": 1,
"name": "My repo 2",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"origin": {
"slug": "my-repo",
"id": 1,
"name": "My repo",
"scmId": "git",
"state": "AVAILABLE",
"statusMessage": "Available",
"forkable": true,
"project": {
"key": "PRJ",
"id": 1,
"name": "My Cool Project",
"description": "The description for my cool project.",
"public": true,
"type": "NORMAL",
"link": {
"url": "http://link/to/project",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://link/to/project"
}
]
}
}
},
"project": {
"key": "PRJ",
"id": 1,
"name": "My Cool Project",
"description": "The description for my cool project.",
"public": true,
"type": "NORMAL",
"link": {
"url": "http://link/to/project",
"rel": "self"
},
"links": {
"self": [
{
"href": "http://link/to/project"
}
]
}
},
"public": true,
"cloneUrl": "https://<baseURL>/scm/PRJ/my-repo2.git",
"link": {
"url": "http://link/to/repository",
"rel": "self"
},
"links": {
"clone": [
{
"href": "https://<baseURL>/scm/PRJ/my-repo2.git",
"name": "http"
},
{
"href": "ssh://git@<baseURL>/PRJ/my-repo.git",
"name": "ssh"
}
],
"self": [
{
"href": "http://link/to/repository2",
"rel": "self"
}
]
}
}
],
"start": 1
}''')
],
content_type="application/json")
| 72.159067 | 159 | 0.169562 | 2,229 | 68,046 | 5.050695 | 0.083894 | 0.052585 | 0.065553 | 0.056671 | 0.849352 | 0.829277 | 0.802985 | 0.77758 | 0.740451 | 0.727572 | 0 | 0.04951 | 0.767584 | 68,046 | 942 | 160 | 72.235669 | 0.662346 | 0.001719 | 0 | 0.636964 | 0 | 0.035204 | 0.91923 | 0.019629 | 0 | 0 | 0 | 0 | 0.006601 | 1 | 0.013201 | false | 0 | 0.008801 | 0 | 0.023102 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
260c6cec7aef768b4767a91da6c81d572acae7f8 | 245 | py | Python | Ex0016/ex0016.py | Rodrigo-Antonio-Silva/ExerciciosPythonCursoemVideo | 3b2d68094dd5d60f0e45a75590eb2be9be030640 | [
"MIT"
] | null | null | null | Ex0016/ex0016.py | Rodrigo-Antonio-Silva/ExerciciosPythonCursoemVideo | 3b2d68094dd5d60f0e45a75590eb2be9be030640 | [
"MIT"
] | null | null | null | Ex0016/ex0016.py | Rodrigo-Antonio-Silva/ExerciciosPythonCursoemVideo | 3b2d68094dd5d60f0e45a75590eb2be9be030640 | [
"MIT"
] | null | null | null | """from math import trunc
r = float(input('Insira um número:'))
print('A porção inteira de {} é: {}'.format(r, trunc(r)))
print ('Fim')"""
n = float(input('Insira um número: '))
print('A porção inteira de {} é {}. '.format(n, int(n)))
| 27.222222 | 58 | 0.591837 | 38 | 245 | 3.815789 | 0.526316 | 0.082759 | 0.22069 | 0.248276 | 0.717241 | 0.717241 | 0.717241 | 0.717241 | 0.717241 | 0.717241 | 0 | 0 | 0.179592 | 245 | 8 | 59 | 30.625 | 0.721393 | 0.538776 | 0 | 0 | 0 | 0 | 0.479592 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 9 |
26166e7899f0ad87afd5dfcfac12063ddbf61837 | 140 | py | Python | tests/chainerx_tests/unit_tests/test_core_module.py | zaltoprofen/chainer | 3b03f9afc80fd67f65d5e0395ef199e9506b6ee1 | [
"MIT"
] | 3,705 | 2017-06-01T07:36:12.000Z | 2022-03-30T10:46:15.000Z | tests/chainerx_tests/unit_tests/test_core_module.py | nolfwin/chainer | 8d776fcc1e848cb9d3800a6aab356eb91ae9d088 | [
"MIT"
] | 5,998 | 2017-06-01T06:40:17.000Z | 2022-03-08T01:42:44.000Z | tests/chainerx_tests/unit_tests/test_core_module.py | nolfwin/chainer | 8d776fcc1e848cb9d3800a6aab356eb91ae9d088 | [
"MIT"
] | 1,150 | 2017-06-02T03:39:46.000Z | 2022-03-29T02:29:32.000Z | import chainerx
def test_core():
assert chainerx.__name__ == 'chainerx'
def test_is_available():
assert chainerx.is_available()
| 14 | 42 | 0.728571 | 17 | 140 | 5.529412 | 0.529412 | 0.234043 | 0.319149 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 140 | 9 | 43 | 15.555556 | 0.810345 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 0 | 0 | 0 | 0 | 0 | 0.4 | 1 | 0.4 | true | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
262e83be56cfa4c706565120e2c795a4e3e33dc3 | 141 | py | Python | app/dummy.py | tyty999/py3startapp | 199766f7b7a96fd16cb04a58eb381946337b568f | [
"MIT"
] | 13 | 2018-10-10T03:31:13.000Z | 2022-03-27T22:44:37.000Z | app/dummy.py | tyty999/py3startapp | 199766f7b7a96fd16cb04a58eb381946337b568f | [
"MIT"
] | 6 | 2019-11-16T17:11:27.000Z | 2021-05-30T12:33:11.000Z | app/dummy.py | tyty999/py3startapp | 199766f7b7a96fd16cb04a58eb381946337b568f | [
"MIT"
] | 2 | 2020-10-02T07:01:12.000Z | 2021-08-06T08:21:31.000Z | """Dummy module.
Dummy module to exercise unit test code. Replace this with actual application
logic.
"""
def dummy():
return 'dummy'
| 14.1 | 77 | 0.70922 | 19 | 141 | 5.263158 | 0.789474 | 0.22 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.191489 | 141 | 9 | 78 | 15.666667 | 0.877193 | 0.702128 | 0 | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
26763e021ca6a1e35de4963bebcc8ee0296a2212 | 144 | py | Python | wsgiproxy/__init__.py | vlcinsky/WSGIProxy2 | dfabea6f9e8d3a026bbbe8f7c30b39dc6d3d26f6 | [
"MIT"
] | 6 | 2015-06-13T21:54:30.000Z | 2020-08-14T17:54:25.000Z | wsgiproxy/__init__.py | vlcinsky/WSGIProxy2 | dfabea6f9e8d3a026bbbe8f7c30b39dc6d3d26f6 | [
"MIT"
] | 14 | 2015-07-10T18:33:57.000Z | 2021-08-19T17:18:26.000Z | wsgiproxy/__init__.py | vlcinsky/WSGIProxy2 | dfabea6f9e8d3a026bbbe8f7c30b39dc6d3d26f6 | [
"MIT"
] | 15 | 2016-02-14T03:03:52.000Z | 2021-08-19T06:44:27.000Z | # -*- coding: utf-8 -*-
from .proxies import Proxy # NOQA
from .proxies import HostProxy # NOQA
from .proxies import TransparentProxy # NOQA
| 28.8 | 45 | 0.715278 | 18 | 144 | 5.722222 | 0.555556 | 0.320388 | 0.495146 | 0.407767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008475 | 0.180556 | 144 | 4 | 46 | 36 | 0.864407 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
cd11b3d492ccad25718074f25cdb8d75b861f9dd | 15,404 | py | Python | sdk/python/pulumi_azure/mssql/job_agent.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2018-06-18T00:19:44.000Z | 2022-02-20T05:32:57.000Z | sdk/python/pulumi_azure/mssql/job_agent.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 663 | 2018-06-18T21:08:46.000Z | 2022-03-31T20:10:11.000Z | sdk/python/pulumi_azure/mssql/job_agent.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 41 | 2018-07-19T22:37:38.000Z | 2022-03-14T10:56:26.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['JobAgentArgs', 'JobAgent']
@pulumi.input_type
class JobAgentArgs:
def __init__(__self__, *,
database_id: pulumi.Input[str],
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a JobAgent resource.
:param pulumi.Input[str] database_id: The ID of the database to store metadata for the Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] location: The Azure Region where the Elastic Job Agent should exist. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] name: The name which should be used for this Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags which should be assigned to the Database.
"""
pulumi.set(__self__, "database_id", database_id)
if location is not None:
pulumi.set(__self__, "location", location)
if name is not None:
pulumi.set(__self__, "name", name)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="databaseId")
def database_id(self) -> pulumi.Input[str]:
"""
The ID of the database to store metadata for the Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "database_id")
@database_id.setter
def database_id(self, value: pulumi.Input[str]):
pulumi.set(self, "database_id", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
The Azure Region where the Elastic Job Agent should exist. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name which should be used for this Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags which should be assigned to the Database.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _JobAgentState:
def __init__(__self__, *,
database_id: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
Input properties used for looking up and filtering JobAgent resources.
:param pulumi.Input[str] database_id: The ID of the database to store metadata for the Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] location: The Azure Region where the Elastic Job Agent should exist. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] name: The name which should be used for this Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags which should be assigned to the Database.
"""
if database_id is not None:
pulumi.set(__self__, "database_id", database_id)
if location is not None:
pulumi.set(__self__, "location", location)
if name is not None:
pulumi.set(__self__, "name", name)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="databaseId")
def database_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the database to store metadata for the Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "database_id")
@database_id.setter
def database_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "database_id", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
The Azure Region where the Elastic Job Agent should exist. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name which should be used for this Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags which should be assigned to the Database.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
class JobAgent(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
database_id: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
"""
Manages an Elastic Job Agent.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="northeurope")
example_server = azure.mssql.Server("exampleServer",
resource_group_name=example_resource_group.name,
location=example_resource_group.location,
version="12.0",
administrator_login="4dm1n157r470r",
administrator_login_password="4-v3ry-53cr37-p455w0rd")
example_database = azure.mssql.Database("exampleDatabase",
server_id=example_server.id,
collation="SQL_Latin1_General_CP1_CI_AS",
sku_name="S1")
example_job_agent = azure.mssql.JobAgent("exampleJobAgent",
location=example_resource_group.location,
database_id=example_database.id)
```
## Import
Elastic Job Agents can be imported using the `id`, e.g.
```sh
$ pulumi import azure:mssql/jobAgent:JobAgent example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Sql/servers/myserver1/jobAgents/myjobagent1
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] database_id: The ID of the database to store metadata for the Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] location: The Azure Region where the Elastic Job Agent should exist. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] name: The name which should be used for this Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags which should be assigned to the Database.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: JobAgentArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages an Elastic Job Agent.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="northeurope")
example_server = azure.mssql.Server("exampleServer",
resource_group_name=example_resource_group.name,
location=example_resource_group.location,
version="12.0",
administrator_login="4dm1n157r470r",
administrator_login_password="4-v3ry-53cr37-p455w0rd")
example_database = azure.mssql.Database("exampleDatabase",
server_id=example_server.id,
collation="SQL_Latin1_General_CP1_CI_AS",
sku_name="S1")
example_job_agent = azure.mssql.JobAgent("exampleJobAgent",
location=example_resource_group.location,
database_id=example_database.id)
```
## Import
Elastic Job Agents can be imported using the `id`, e.g.
```sh
$ pulumi import azure:mssql/jobAgent:JobAgent example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Sql/servers/myserver1/jobAgents/myjobagent1
```
:param str resource_name: The name of the resource.
:param JobAgentArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(JobAgentArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
database_id: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = JobAgentArgs.__new__(JobAgentArgs)
if database_id is None and not opts.urn:
raise TypeError("Missing required property 'database_id'")
__props__.__dict__["database_id"] = database_id
__props__.__dict__["location"] = location
__props__.__dict__["name"] = name
__props__.__dict__["tags"] = tags
super(JobAgent, __self__).__init__(
'azure:mssql/jobAgent:JobAgent',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
database_id: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None) -> 'JobAgent':
"""
Get an existing JobAgent resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] database_id: The ID of the database to store metadata for the Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] location: The Azure Region where the Elastic Job Agent should exist. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[str] name: The name which should be used for this Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags which should be assigned to the Database.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _JobAgentState.__new__(_JobAgentState)
__props__.__dict__["database_id"] = database_id
__props__.__dict__["location"] = location
__props__.__dict__["name"] = name
__props__.__dict__["tags"] = tags
return JobAgent(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="databaseId")
def database_id(self) -> pulumi.Output[str]:
"""
The ID of the database to store metadata for the Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "database_id")
@property
@pulumi.getter
def location(self) -> pulumi.Output[str]:
"""
The Azure Region where the Elastic Job Agent should exist. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "location")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name which should be used for this Elastic Job Agent. Changing this forces a new Elastic Job Agent to be created.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A mapping of tags which should be assigned to the Database.
"""
return pulumi.get(self, "tags")
| 44.011429 | 202 | 0.647429 | 1,902 | 15,404 | 5.070452 | 0.101472 | 0.078702 | 0.078391 | 0.054749 | 0.832849 | 0.812007 | 0.800187 | 0.791995 | 0.786707 | 0.785877 | 0 | 0.010402 | 0.257336 | 15,404 | 349 | 203 | 44.137536 | 0.832605 | 0.434303 | 0 | 0.701149 | 1 | 0 | 0.068003 | 0.003749 | 0 | 0 | 0 | 0 | 0 | 1 | 0.155172 | false | 0.005747 | 0.028736 | 0 | 0.275862 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
26e4ab674b7f6b44d484d06c661267ed7ce69d56 | 28,423 | py | Python | imaginaire/model_utils/gancraft/camctl.py | hw07216/imaginaire | 87c774114622e39488a5ea8a7728b1a20896afb9 | [
"RSA-MD"
] | 3,308 | 2020-07-15T17:50:13.000Z | 2022-03-31T14:53:31.000Z | imaginaire/model_utils/gancraft/camctl.py | hw07216/imaginaire | 87c774114622e39488a5ea8a7728b1a20896afb9 | [
"RSA-MD"
] | 132 | 2020-09-20T17:36:28.000Z | 2022-03-28T12:40:03.000Z | src/imaginaire/model_utils/gancraft/camctl.py | livingbio/imaginaire-fsvid2vid | d82c87aced50afd44fd162491ba5b59056b74034 | [
"RSA-MD"
] | 370 | 2020-09-29T00:34:08.000Z | 2022-03-30T04:12:48.000Z | # Copyright (C) 2021 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
#
# This work is made available under the Nvidia Source Code License-NC.
# To view a copy of this license, check out LICENSE.md
import numpy as np
import torch
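# Typical usage (sketch; assumes a voxel world object exposing voxel_t,
# heightmap and world2local, as used below):
#   ctl = EvalCameraController(voxel, maxstep=128, pattern=0)
#   cam_ori, cam_dir, cam_up, cam_f = ctl[0]
# Poses are returned in the voxel's local coordinate frame.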
class EvalCameraController:
def __init__(self, voxel, maxstep=128, pattern=0, cam_ang=73, smooth_decay_multiplier=1.0):
self.voxel = voxel
self.maxstep = maxstep
self.camera_poses = [] # ori, dir, up, f
circle = torch.linspace(0, 2*np.pi, steps=maxstep)
size = min(voxel.voxel_t.size(1), voxel.voxel_t.size(2)) / 2
# Shrink the circle a bit.
shift = size * 0.2
size = size * 0.8
if pattern == 0:
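# Pattern 0: constant-radius orbit at smoothed terrain height, looking inward and ahead.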
height_history = []
# Calculate smooth height.
for i in range(maxstep):
farpoint = torch.tensor([
70,
torch.sin(circle[i])*size + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size + voxel.voxel_t.size(2)/2 + shift])
height_history.append(self._get_height(farpoint[1], farpoint[2], farpoint[0]))
# Smooth the sampled heights with the two-pass peak-hold filter below.
height_history = self.filtfilt(height_history, decay=0.2*smooth_decay_multiplier)
for i in range(maxstep):
farpoint = torch.tensor([
70,
torch.sin(circle[i])*size + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = height_history[i]
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.5 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.5 + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(cam_ang/2)) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
elif pattern == 1:
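# Pattern 1: constant-radius orbit with a progressive zoom-in (the fov narrows about 4x over the sequence).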
zoom = torch.linspace(1.0, 0.25, steps=maxstep)
height_history = []
for i in range(maxstep):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size + voxel.voxel_t.size(2)/2 + shift])
height_history.append(self._get_height(farpoint[1], farpoint[2], farpoint[0]))
height_history = self.filtfilt(height_history, decay=0.2*smooth_decay_multiplier)
for i in range(maxstep):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = height_history[i]
nearpoint = torch.tensor([
60,
torch.sin(circle[i]-0.3*np.pi)*size*0.3 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]-0.3*np.pi)*size*0.3 + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(cam_ang/2)*zoom[i]) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
elif pattern == 2:
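# Pattern 2: inward spiral; the orbit radius shrinks from 1.0 to 0.2 of its initial value.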
move = torch.linspace(1.0, 0.2, steps=maxstep)
height_history = []
for i in range(maxstep):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
height_history.append(self._get_height(farpoint[1], farpoint[2], farpoint[0]))
height_history = self.filtfilt(height_history, decay=0.2*smooth_decay_multiplier)
for i in range(maxstep):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = height_history[i]
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(cam_ang/2)) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
elif pattern == 3:
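# Pattern 3: reversed inward spiral at lower altitude; the look-at target stays close to the orbit radius.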
move = torch.linspace(0.75, 0.2, steps=maxstep)
height_history = []
for i in range(maxstep):
farpoint = torch.tensor([
70,
torch.sin(-circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(-circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
height_history.append(self._get_height(farpoint[1], farpoint[2], farpoint[0]))
height_history = self.filtfilt(height_history, decay=0.2*smooth_decay_multiplier)
for i in range(maxstep):
farpoint = torch.tensor([
70,
torch.sin(-circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(-circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = height_history[i]
nearpoint = torch.tensor([
60,
torch.sin(-circle[i]-0.4*np.pi)*size*0.9*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(-circle[i]-0.4*np.pi)*size*0.9*move[i] + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(cam_ang/2)) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
elif pattern == 4:
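# Pattern 4: gentle inward spiral; the radius shrinks from 1.0 to 0.5.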
move = torch.linspace(1.0, 0.5, steps=maxstep)
height_history = []
for i in range(maxstep):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
height_history.append(self._get_height(farpoint[1], farpoint[2], farpoint[0]))
height_history = self.filtfilt(height_history, decay=0.2*smooth_decay_multiplier)
for i in range(maxstep):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = height_history[i]
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(cam_ang/2)) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
# look outward
elif pattern == 5:
move = torch.linspace(1.0, 0.5, steps=maxstep)
height_history = []
for i in range(maxstep):
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(2)/2 + shift])
height_history.append(self._get_height(nearpoint[1], nearpoint[2], nearpoint[0]))
height_history = self.filtfilt(height_history, decay=0.2*smooth_decay_multiplier)
for i in range(maxstep):
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(2)/2 + shift])
nearpoint[0] = height_history[i]
farpoint = torch.tensor([
60,
torch.sin(circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(nearpoint)
cam_dir = self.voxel.world2local(farpoint - nearpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(cam_ang/2)) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
# Rise
elif pattern == 6:
shift = 0
lift = torch.linspace(0.0, 200.0, steps=maxstep)
zoom = torch.linspace(0.8, 1.6, steps=maxstep)
for i in range(maxstep):
farpoint = torch.tensor([
80+lift[i],
torch.sin(circle[i]/4)*size*0.2 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]/4)*size*0.2 + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = self._get_height(farpoint[1], farpoint[2], farpoint[0])
nearpoint = torch.tensor([
65,
torch.sin(circle[i]/4+0.5*np.pi)*size*0.1 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]/4+0.5*np.pi)*size*0.1 + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(73/2)*zoom[i]) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
# 45deg
elif pattern == 7:
rad = torch.tensor([np.deg2rad(45).astype(np.float32)])
size = 1536
for i in range(maxstep):
farpoint = torch.tensor([
61+size,
torch.sin(rad)*size + voxel.voxel_t.size(1)/2,
torch.cos(rad)*size + voxel.voxel_t.size(2)/2])
nearpoint = torch.tensor([
61,
voxel.voxel_t.size(1)/2,
voxel.voxel_t.size(2)/2])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(19.5/2)) # about 50mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
def _get_height(self, loc0, loc1, minheight):
loc0 = int(loc0)
loc1 = int(loc1)
height = minheight
for dx in range(-3, 4):
for dy in range(-3, 4):
if (loc0+dx) < 0 or (loc0+dx) >= self.voxel.heightmap.shape[0] or (loc1+dy) < 0 or \
(loc1+dy) >= self.voxel.heightmap.shape[1]:
height = max(height, minheight)
else:
height = max(height, self.voxel.heightmap[loc0+dx, loc1+dy] + 2)
return height
def filtfilt(self, height_history, decay=0.2):
# Two-pass peak-hold filter: the output never drops below the input and
# falls by at most 'decay' per step on either side of a peak.
height_history2 = []
maxstep = len(height_history)
prev_height = height_history[0]
for i in range(maxstep):
prev_height = prev_height - decay
if prev_height < height_history[i]:
prev_height = height_history[i]
height_history2.append(prev_height)
prev_height = height_history[-1]
for i in range(maxstep-1, -1, -1):
prev_height = prev_height - decay
if prev_height < height_history[i]:
prev_height = height_history[i]
height_history2[i] = max(prev_height, height_history2[i])
return height_history2
def __len__(self):
return len(self.camera_poses)
def __getitem__(self, idx):
return self.camera_poses[idx]
class TourCameraController:
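# Four-phase tour: plain orbit, zoom-in orbit, inward spiral, then a rising shot;
# maxstep is split evenly across the four phases.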
def __init__(self, voxel, maxstep=128):
self.voxel = voxel
self.maxstep = maxstep
self.camera_poses = [] # ori, dir, up, f
circle = torch.linspace(0, 2*np.pi, steps=maxstep//4)
size = min(voxel.voxel_t.size(1), voxel.voxel_t.size(2)) / 2
# Shrink the circle a bit
shift = size * 0.2
size = size * 0.8
for i in range(maxstep//4):
farpoint = torch.tensor([
70,
torch.sin(circle[i])*size + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = self._get_height(farpoint[1], farpoint[2], farpoint[0])
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.5 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.5 + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(73/2)) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
zoom = torch.linspace(1.0, 0.25, steps=maxstep//4)
for i in range(maxstep//4):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = self._get_height(farpoint[1], farpoint[2], farpoint[0])
nearpoint = torch.tensor([
60,
torch.sin(circle[i]-0.3*np.pi)*size*0.3 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]-0.3*np.pi)*size*0.3 + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(73/2)*zoom[i]) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
move = torch.linspace(1.0, 0.2, steps=maxstep//4)
for i in range(maxstep//4):
farpoint = torch.tensor([
90,
torch.sin(circle[i])*size*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size*move[i] + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = self._get_height(farpoint[1], farpoint[2], farpoint[0])
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.3*move[i] + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(73/2)) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
lift = torch.linspace(0.0, 200.0, steps=maxstep//4)
zoom = torch.linspace(0.6, 1.2, steps=maxstep//4)
for i in range(maxstep//4):
farpoint = torch.tensor([
80+lift[i],
torch.sin(circle[i])*size*0.2 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i])*size*0.2 + voxel.voxel_t.size(2)/2 + shift])
farpoint[0] = self._get_height(farpoint[1], farpoint[2], farpoint[0])
nearpoint = torch.tensor([
60,
torch.sin(circle[i]+0.5*np.pi)*size*0.1 + voxel.voxel_t.size(1)/2 + shift,
torch.cos(circle[i]+0.5*np.pi)*size*0.1 + voxel.voxel_t.size(2)/2 + shift])
cam_ori = self.voxel.world2local(farpoint)
cam_dir = self.voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = self.voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(73/2)*zoom[i]) # about 24mm fov
self.camera_poses.append((cam_ori, cam_dir, cam_up, cam_f))
def _get_height(self, loc0, loc1, minheight):
loc0 = int(loc0)
loc1 = int(loc1)
height = minheight
for dx in range(-3, 4):
for dy in range(-3, 4):
if (loc0+dx) < 0 or (loc0+dx) >= self.voxel.heightmap.shape[0] or (loc1+dy) < 0 or \
(loc1+dy) >= self.voxel.heightmap.shape[1]:
height = max(height, minheight)
else:
height = max(height, self.voxel.heightmap[loc0+dx, loc1+dy] + 2)
return height
def __len__(self):
return len(self.camera_poses)
def __getitem__(self, idx):
return self.camera_poses[idx]
def rand_camera_pose_birdseye(voxel, border=128):
r"""Generating random camera pose in the upper hemisphere, in the format of origin-direction-up
Assuming [Y X Z] coordinate. Y is negative gravity direction.
The camera pose is converted into the voxel coordinate system so that it can be used directly for rendering
1. Uniformly sample a point on the upper hemisphere of a unit sphere, as cam_ori.
2. Set cam_dir to be from cam_ori to the origin
3. cam_up is always pointing towards sky
4. move cam_ori to random place according to voxel size
"""
cam_dir = torch.randn(3, dtype=torch.float32)
cam_dir = cam_dir / torch.sqrt(torch.sum(cam_dir*cam_dir))
cam_dir[0] = -torch.abs(cam_dir[0])
cam_up = torch.tensor([1, 0, 0], dtype=torch.float32)
# generate camera lookat target
r = np.random.rand(2)
r[0] *= voxel.voxel_t.size(1)-border-border
r[1] *= voxel.voxel_t.size(2)-border-border
r = r + border
y = voxel.heightmap[int(r[0]+0.5), int(r[1]+0.5)] + (np.random.rand(1)-0.5) * 5
cam_target = torch.tensor([y, r[0], r[1]], dtype=torch.float32)
cam_ori = cam_target - cam_dir * (np.random.rand(1).item() * 100)
cam_ori[0] = max(voxel.heightmap[int(cam_ori[1]+0.5), int(cam_ori[2]+0.5)]+2, cam_ori[0])
# Translate to voxel coordinate
cam_ori = voxel.world2local(cam_ori)
cam_dir = voxel.world2local(cam_dir, is_vec=True)
cam_up = voxel.world2local(cam_up, is_vec=True)
return cam_ori, cam_dir, cam_up
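# Maximum terrain height (plus 2 voxels of clearance) over a square neighborhood, floored at minheight.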
def get_neighbor_height(heightmap, loc0, loc1, minheight, neighbor_size=7):
loc0 = int(loc0)
loc1 = int(loc1)
height = minheight
for dx in range(-neighbor_size//2, neighbor_size//2+1):
for dy in range(-neighbor_size//2, neighbor_size//2+1):
if (loc0+dx) < 0 or (loc0+dx) >= heightmap.shape[0] or (loc1+dy) < 0 or (loc1+dy) >= heightmap.shape[1]:
height = max(height, minheight)
else:
height = max(height, heightmap[loc0+dx, loc1+dy] + 2)
return height
def rand_camera_pose_firstperson(voxel, border=128):
r"""Generating random camera pose in the upper hemisphere, in the format of origin-direction-up
"""
r = np.random.rand(5)
r[0] *= voxel.voxel_t.size(1)-border-border
r[1] *= voxel.voxel_t.size(2)-border-border
r[0] = r[0] + border
r[1] = r[1] + border
y = get_neighbor_height(voxel.heightmap, r[0], r[1], 0) + np.random.rand(1) * 15
cam_ori = torch.tensor([y, r[0], r[1]], dtype=torch.float32)
rand_ang_h = r[2] * 2 * np.pi
cam_target = torch.tensor([0, cam_ori[1]+np.sin(rand_ang_h)*border*r[4], cam_ori[2] +
np.cos(rand_ang_h)*border*r[4]], dtype=torch.float32)
cam_target[0] = get_neighbor_height(voxel.heightmap, cam_target[1],
cam_target[2], 0, neighbor_size=1) - 2 + r[3] * 10
cam_dir = cam_target - cam_ori
cam_up = torch.tensor([1, 0, 0], dtype=torch.float32)
cam_ori = voxel.world2local(cam_ori)
cam_dir = voxel.world2local(cam_dir, is_vec=True)
cam_up = voxel.world2local(cam_up, is_vec=True)
return cam_ori, cam_dir, cam_up
def rand_camera_pose_thridperson(voxel, border=96):
r = torch.rand(2)
r[0] *= voxel.voxel_t.size(1)
r[1] *= voxel.voxel_t.size(2)
rand_height = 60 + torch.rand(1) * 40
rand_height = get_neighbor_height(voxel.heightmap, r[0], r[1], rand_height, neighbor_size=5)
farpoint = torch.tensor([rand_height, r[0], r[1]], dtype=torch.float32)
r = torch.rand(2)
r[0] *= voxel.voxel_t.size(1) - border - border
r[1] *= voxel.voxel_t.size(2) - border - border
r[0] = r[0] + border
r[1] = r[1] + border
rand_height = get_neighbor_height(voxel.heightmap, r[0], r[1], 65, neighbor_size=1) - 5
nearpoint = torch.tensor([rand_height, r[0], r[1]], dtype=torch.float32)
cam_ori = voxel.world2local(farpoint)
cam_dir = voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = voxel.world2local(torch.tensor([1, 0, 0], dtype=torch.float32), is_vec=True)
return cam_ori, cam_dir, cam_up
def rand_camera_pose_thridperson2(voxel, border=48):
r = torch.rand(2)
r[0] *= voxel.voxel_t.size(1) - border - border
r[1] *= voxel.voxel_t.size(2) - border - border
r[0] = r[0] + border
r[1] = r[1] + border
rand_height = 60 + torch.rand(1) * 40
rand_height = get_neighbor_height(voxel.heightmap, r[0], r[1], rand_height, neighbor_size=5)
farpoint = torch.tensor([rand_height, r[0], r[1]], dtype=torch.float32)
r = torch.rand(2)
r[0] *= voxel.voxel_t.size(1) - border - border
r[1] *= voxel.voxel_t.size(2) - border - border
r[0] = r[0] + border
r[1] = r[1] + border
rand_height = get_neighbor_height(voxel.heightmap, r[0], r[1], 65, neighbor_size=1) - 5
nearpoint = torch.tensor([rand_height, r[0], r[1]], dtype=torch.float32)
# Random Up vector (tilt a little bit)
# up = torch.randn(3) * 0.05 # cutoff +-0.1, Tan(10deg) = 0.176
up = torch.randn(3) * 0.02
up[0] = 1.0
up = up / up.norm(p=2)
cam_ori = voxel.world2local(farpoint)
cam_dir = voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = voxel.world2local(up, is_vec=True)
return cam_ori, cam_dir, cam_up
def rand_camera_pose_thridperson3(voxel, border=64):
r"""Attempting to solve the camera too close to wall problem and the lack of aerial poses."""
r = torch.rand(2)
r[0] *= voxel.voxel_t.size(1) - border - border
r[1] *= voxel.voxel_t.size(2) - border - border
r[0] = r[0] + border
r[1] = r[1] + border
rand_height = 60 + torch.rand(1) * 40
if torch.rand(1) > 0.8:
rand_height = 60 + torch.rand(1) * 60
rand_height = get_neighbor_height(voxel.heightmap, r[0], r[1], rand_height, neighbor_size=7)
farpoint = torch.tensor([rand_height, r[0], r[1]], dtype=torch.float32)
r = torch.rand(2)
r[0] *= voxel.voxel_t.size(1) - border - border
r[1] *= voxel.voxel_t.size(2) - border - border
r[0] = r[0] + border
r[1] = r[1] + border
rand_height = get_neighbor_height(voxel.heightmap, r[0], r[1], 65, neighbor_size=3) - 5
nearpoint = torch.tensor([rand_height, r[0], r[1]], dtype=torch.float32)
# Random Up vector (tilt a little bit)
# up = torch.randn(3) * 0.05 # cutoff +-0.1, Tan(10deg) = 0.176
up = torch.randn(3) * 0.02
up[0] = 1.0
up = up / up.norm(p=2)
cam_ori = voxel.world2local(farpoint)
cam_dir = voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = voxel.world2local(up, is_vec=True)
return cam_ori, cam_dir, cam_up
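# Orbit-style pose: a far point on a random ring looks toward a nearer point, with a randomized fov.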
def rand_camera_pose_tour(voxel):
size = min(voxel.voxel_t.size(1), voxel.voxel_t.size(2)) / 2
center = [voxel.voxel_t.size(1)/2, voxel.voxel_t.size(2)/2]
rnd = torch.rand(8)
rnd_deg = torch.rand(1) * 2 * np.pi
far_radius = rnd[0]*0.8+0.2
far_height = rnd[1]*30 + 60
farpoint = torch.tensor([
far_height,
torch.sin(rnd_deg)*size*far_radius + center[0],
torch.cos(rnd_deg)*size*far_radius + center[1]])
farpoint[0] = get_neighbor_height(voxel.heightmap, farpoint[1], farpoint[2], farpoint[0], neighbor_size=7)
near_radius = far_radius * rnd[2]
near_shift_rad = np.pi*(rnd[3]-0.5)
near_height = 60 + rnd[4] * 10
nearpoint = torch.tensor([
near_height,
torch.sin(rnd_deg+near_shift_rad)*size*near_radius + center[0],
torch.cos(rnd_deg+near_shift_rad)*size*near_radius + center[1]])
# Random Up vector (tilt a little bit)
# up = torch.randn(3) * 0.05 # cutoff +-0.1, Tan(10deg) = 0.176
up = torch.randn(3) * 0.02
up[0] = 1.0
up = up / up.norm(p=2)
cam_ori = voxel.world2local(farpoint)
cam_dir = voxel.world2local(nearpoint - farpoint, is_vec=True)
cam_up = voxel.world2local(up, is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(73/2)*(rnd[5]*0.75+0.25)) # about 24mm fov
return cam_ori, cam_dir, cam_up, cam_f
# Look from center to outward
def rand_camera_pose_insideout(voxel):
size = min(voxel.voxel_t.size(1), voxel.voxel_t.size(2)) / 2
center = [voxel.voxel_t.size(1)/2, voxel.voxel_t.size(2)/2]
rnd = torch.rand(8)
rnd_deg = torch.rand(1) * 2 * np.pi
far_radius = rnd[0]*0.8+0.2
far_height = rnd[1]*10 + 60
farpoint = torch.tensor([
far_height,
torch.sin(rnd_deg)*size*far_radius + center[0],
torch.cos(rnd_deg)*size*far_radius + center[1]])
near_radius = far_radius * rnd[2]
near_shift_rad = np.pi*(rnd[3]-0.5)
near_height = 60 + rnd[4] * 30
nearpoint = torch.tensor([
near_height,
torch.sin(rnd_deg+near_shift_rad)*size*near_radius + center[0],
torch.cos(rnd_deg+near_shift_rad)*size*near_radius + center[1]])
nearpoint[0] = get_neighbor_height(voxel.heightmap, nearpoint[1], nearpoint[2], nearpoint[0], neighbor_size=7)
# Random Up vector (tilt a little bit)
# up = torch.randn(3) * 0.05 # cutoff +-0.1, Tan(10deg) = 0.176
up = torch.randn(3) * 0.02
up[0] = 1.0
up = up / up.norm(p=2)
cam_ori = voxel.world2local(nearpoint)
cam_dir = voxel.world2local(farpoint-nearpoint, is_vec=True)
cam_up = voxel.world2local(up, is_vec=True)
cam_f = 0.5/np.tan(np.deg2rad(73/2)*(rnd[5]*0.75+0.25)) # about 24mm fov
return cam_ori, cam_dir, cam_up, cam_f
| 44.341654 | 116 | 0.577138 | 4,241 | 28,423 | 3.732139 | 0.055647 | 0.056861 | 0.061157 | 0.083397 | 0.875474 | 0.869282 | 0.850708 | 0.843252 | 0.839778 | 0.815011 | 0 | 0.056339 | 0.278718 | 28,423 | 640 | 117 | 44.410938 | 0.715721 | 0.060409 | 0 | 0.776181 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034908 | false | 0 | 0.004107 | 0.008214 | 0.073922 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3e357e83e2f151efde9f4e9dc28165490eb87b5a | 377 | py | Python | pytorch_image_classification/datasets/__init__.py | doulemint/pytorch_image_classification | 3553295218b30775272027b8234bb8a2276af30f | [
"MIT"
] | 1 | 2021-08-25T03:07:48.000Z | 2021-08-25T03:07:48.000Z | pytorch_image_classification/datasets/__init__.py | doulemint/pytorch_image_classification | 3553295218b30775272027b8234bb8a2276af30f | [
"MIT"
] | null | null | null | pytorch_image_classification/datasets/__init__.py | doulemint/pytorch_image_classification | 3553295218b30775272027b8234bb8a2276af30f | [
"MIT"
] | null | null | null | from .datasets import create_dataset, MyDataset
from .dataloader import create_dataloader
from .dataloader import prepare_dataloader,worker_init_fn
from .datasets import get_files
from .datasets import create_dataset, MyDataset,pesudoMyDataset
from .dataloader import create_dataloader
from .dataloader import prepare_dataloader, worker_init_fn
from .datasets import get_files
| 41.888889 | 63 | 0.872679 | 49 | 377 | 6.469388 | 0.285714 | 0.15142 | 0.227129 | 0.15142 | 0.952681 | 0.952681 | 0.700315 | 0.700315 | 0.700315 | 0.700315 | 0 | 0 | 0.092838 | 377 | 8 | 64 | 47.125 | 0.926901 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 12 |