hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
54689a8f70204ed3ce3be420eefa51d103851d6f | 21,932 | py | Python | pynab/openapi/api/payee_locations_api.py | idwagner/pynab | 26e0f2d2adc86374ff5caf5a614778fafa1c19d6 | [
"Apache-2.0"
] | null | null | null | pynab/openapi/api/payee_locations_api.py | idwagner/pynab | 26e0f2d2adc86374ff5caf5a614778fafa1c19d6 | [
"Apache-2.0"
] | null | null | null | pynab/openapi/api/payee_locations_api.py | idwagner/pynab | 26e0f2d2adc86374ff5caf5a614778fafa1c19d6 | [
"Apache-2.0"
] | 1 | 2020-11-05T22:28:13.000Z | 2020-11-05T22:28:13.000Z | # coding: utf-8
"""
YNAB API Endpoints
    Our API uses a REST-based design, leverages the JSON data format, and relies upon HTTPS for transport. We respond with meaningful HTTP response codes and if an error occurs, we include error details in the response body. API Documentation is at https://api.youneedabudget.com # noqa: E501
The version of the OpenAPI document: 1.0.0
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from pynab.openapi.api_client import ApiClient
from pynab.openapi.exceptions import ( # noqa: F401
ApiTypeError,
ApiValueError
)
class PayeeLocationsApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def get_payee_location_by_id(self, budget_id, payee_location_id, **kwargs): # noqa: E501
"""Single payee location # noqa: E501
Returns a single payee location # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_payee_location_by_id(budget_id, payee_location_id, async_req=True)
>>> result = thread.get()
:param budget_id: The id of the budget. \"last-used\" can be used to specify the last used budget and \"default\" can be used if default budget selection is enabled (see: https://api.youneedabudget.com/#oauth-default-budget). (required)
:type budget_id: str
:param payee_location_id: id of payee location (required)
:type payee_location_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PayeeLocationResponse
"""
kwargs['_return_http_data_only'] = True
return self.get_payee_location_by_id_with_http_info(budget_id, payee_location_id, **kwargs) # noqa: E501
def get_payee_location_by_id_with_http_info(self, budget_id, payee_location_id, **kwargs): # noqa: E501
"""Single payee location # noqa: E501
Returns a single payee location # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_payee_location_by_id_with_http_info(budget_id, payee_location_id, async_req=True)
>>> result = thread.get()
:param budget_id: The id of the budget. \"last-used\" can be used to specify the last used budget and \"default\" can be used if default budget selection is enabled (see: https://api.youneedabudget.com/#oauth-default-budget). (required)
:type budget_id: str
:param payee_location_id: id of payee location (required)
:type payee_location_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PayeeLocationResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'budget_id',
'payee_location_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_payee_location_by_id" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'budget_id' is set
if self.api_client.client_side_validation and ('budget_id' not in local_var_params or # noqa: E501
local_var_params['budget_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `budget_id` when calling `get_payee_location_by_id`") # noqa: E501
# verify the required parameter 'payee_location_id' is set
if self.api_client.client_side_validation and ('payee_location_id' not in local_var_params or # noqa: E501
local_var_params['payee_location_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `payee_location_id` when calling `get_payee_location_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'budget_id' in local_var_params:
path_params['budget_id'] = local_var_params['budget_id'] # noqa: E501
if 'payee_location_id' in local_var_params:
path_params['payee_location_id'] = local_var_params['payee_location_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['bearer'] # noqa: E501
response_types_map = {
200: "PayeeLocationResponse",
404: "ErrorResponse",
}
return self.api_client.call_api(
'/budgets/{budget_id}/payee_locations/{payee_location_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_payee_locations(self, budget_id, **kwargs): # noqa: E501
"""List payee locations # noqa: E501
Returns all payee locations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_payee_locations(budget_id, async_req=True)
>>> result = thread.get()
:param budget_id: The id of the budget. \"last-used\" can be used to specify the last used budget and \"default\" can be used if default budget selection is enabled (see: https://api.youneedabudget.com/#oauth-default-budget). (required)
:type budget_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PayeeLocationsResponse
"""
kwargs['_return_http_data_only'] = True
return self.get_payee_locations_with_http_info(budget_id, **kwargs) # noqa: E501
def get_payee_locations_with_http_info(self, budget_id, **kwargs): # noqa: E501
"""List payee locations # noqa: E501
Returns all payee locations # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_payee_locations_with_http_info(budget_id, async_req=True)
>>> result = thread.get()
:param budget_id: The id of the budget. \"last-used\" can be used to specify the last used budget and \"default\" can be used if default budget selection is enabled (see: https://api.youneedabudget.com/#oauth-default-budget). (required)
:type budget_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PayeeLocationsResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'budget_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_payee_locations" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'budget_id' is set
if self.api_client.client_side_validation and ('budget_id' not in local_var_params or # noqa: E501
local_var_params['budget_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `budget_id` when calling `get_payee_locations`") # noqa: E501
collection_formats = {}
path_params = {}
if 'budget_id' in local_var_params:
path_params['budget_id'] = local_var_params['budget_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['bearer'] # noqa: E501
response_types_map = {
200: "PayeeLocationsResponse",
404: "ErrorResponse",
}
return self.api_client.call_api(
'/budgets/{budget_id}/payee_locations', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
def get_payee_locations_by_payee(self, budget_id, payee_id, **kwargs): # noqa: E501
"""List locations for a payee # noqa: E501
Returns all payee locations for a specified payee # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_payee_locations_by_payee(budget_id, payee_id, async_req=True)
>>> result = thread.get()
:param budget_id: The id of the budget. \"last-used\" can be used to specify the last used budget and \"default\" can be used if default budget selection is enabled (see: https://api.youneedabudget.com/#oauth-default-budget). (required)
:type budget_id: str
:param payee_id: id of payee (required)
:type payee_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: PayeeLocationsResponse
"""
kwargs['_return_http_data_only'] = True
return self.get_payee_locations_by_payee_with_http_info(budget_id, payee_id, **kwargs) # noqa: E501
def get_payee_locations_by_payee_with_http_info(self, budget_id, payee_id, **kwargs): # noqa: E501
"""List locations for a payee # noqa: E501
Returns all payee locations for a specified payee # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_payee_locations_by_payee_with_http_info(budget_id, payee_id, async_req=True)
>>> result = thread.get()
:param budget_id: The id of the budget. \"last-used\" can be used to specify the last used budget and \"default\" can be used if default budget selection is enabled (see: https://api.youneedabudget.com/#oauth-default-budget). (required)
:type budget_id: str
:param payee_id: id of payee (required)
:type payee_id: str
:param async_req: Whether to execute the request asynchronously.
:type async_req: bool, optional
        :param _return_http_data_only: response data without HTTP status code
and headers
:type _return_http_data_only: bool, optional
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:type _preload_content: bool, optional
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
        :param _request_auth: set to override the auth_settings for a single
request; this effectively ignores the authentication
in the spec for a single request.
:type _request_auth: dict, optional
:return: Returns the result object.
If the method is called asynchronously,
returns the request thread.
:rtype: tuple(PayeeLocationsResponse, status_code(int), headers(HTTPHeaderDict))
"""
local_var_params = locals()
all_params = [
'budget_id',
'payee_id'
]
all_params.extend(
[
'async_req',
'_return_http_data_only',
'_preload_content',
'_request_timeout',
'_request_auth'
]
)
for key, val in six.iteritems(local_var_params['kwargs']):
if key not in all_params:
raise ApiTypeError(
"Got an unexpected keyword argument '%s'"
" to method get_payee_locations_by_payee" % key
)
local_var_params[key] = val
del local_var_params['kwargs']
# verify the required parameter 'budget_id' is set
if self.api_client.client_side_validation and ('budget_id' not in local_var_params or # noqa: E501
local_var_params['budget_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `budget_id` when calling `get_payee_locations_by_payee`") # noqa: E501
# verify the required parameter 'payee_id' is set
if self.api_client.client_side_validation and ('payee_id' not in local_var_params or # noqa: E501
local_var_params['payee_id'] is None): # noqa: E501
raise ApiValueError("Missing the required parameter `payee_id` when calling `get_payee_locations_by_payee`") # noqa: E501
collection_formats = {}
path_params = {}
if 'budget_id' in local_var_params:
path_params['budget_id'] = local_var_params['budget_id'] # noqa: E501
if 'payee_id' in local_var_params:
path_params['payee_id'] = local_var_params['payee_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['bearer'] # noqa: E501
response_types_map = {
200: "PayeeLocationsResponse",
404: "ErrorResponse",
}
return self.api_client.call_api(
'/budgets/{budget_id}/payees/{payee_id}/payee_locations', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_types_map=response_types_map,
auth_settings=auth_settings,
async_req=local_var_params.get('async_req'),
_return_http_data_only=local_var_params.get('_return_http_data_only'), # noqa: E501
_preload_content=local_var_params.get('_preload_content', True),
_request_timeout=local_var_params.get('_request_timeout'),
collection_formats=collection_formats,
_request_auth=local_var_params.get('_request_auth'))
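The generated `*_with_http_info` methods above all follow one validation pattern: snapshot `locals()`, fold `**kwargs` into the snapshot while rejecting unknown keys, then check that required parameters are set. A minimal standalone sketch of that same pattern, independent of the generated client (the names `fetch` and `ALLOWED` are illustrative, not part of pynab):

```python
ALLOWED = {'async_req', '_preload_content', '_request_timeout'}

def fetch(budget_id, **kwargs):
    # Snapshot the arguments, then merge **kwargs into the snapshot,
    # rejecting unknown keys -- the same scheme the generated
    # *_with_http_info methods use before building the request.
    params = locals()
    for key, val in params['kwargs'].items():
        if key not in ALLOWED:
            raise TypeError(
                "Got an unexpected keyword argument '%s' to method fetch" % key
            )
        params[key] = val
    del params['kwargs']
    # Required-parameter check, mirroring the client_side_validation branch.
    if params['budget_id'] is None:
        raise ValueError("Missing the required parameter `budget_id`")
    return params
```

`fetch("last-used", async_req=True)` returns the merged parameter dict, while any keyword outside `ALLOWED` raises `TypeError` before a request would be made.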
| 46.863248 | 294 | 0.611846 | 2,559 | 21,932 | 4.983587 | 0.086362 | 0.033874 | 0.051596 | 0.025406 | 0.933819 | 0.929899 | 0.922606 | 0.91751 | 0.899396 | 0.899396 | 0 | 0.012613 | 0.316752 | 21,932 | 467 | 295 | 46.963597 | 0.838438 | 0.497766 | 0 | 0.692683 | 0 | 0 | 0.19367 | 0.06205 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034146 | false | 0 | 0.02439 | 0 | 0.092683 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
548390e9bec27ab9cd5d973da63a866ca6137155 | 4,463 | py | Python | bench/rand.py | pskopnik/htc-cache-simulator | ee502db3f1c2b99ffe05ee609a18069b583798da | [
"MIT"
] | 1 | 2020-12-15T16:09:31.000Z | 2020-12-15T16:09:31.000Z | bench/rand.py | pskopnik/htc-cache-system-simulator | ee502db3f1c2b99ffe05ee609a18069b583798da | [
"MIT"
] | null | null | null | bench/rand.py | pskopnik/htc-cache-system-simulator | ee502db3f1c2b99ffe05ee609a18069b583798da | [
"MIT"
] | null | null | null | import itertools
import timeit
import random
from .utils import StringSource
from typing import Callable, Deque, List, Set, Tuple
"""Benchmark for data structures + operations for the RAND cache eviction
algorithm.
The operations performed for each replacement are:
1. Choose a random file from the list (algorithm state) and remove it (also
evict from storage).
2. Append the file to be placed to the list (algorithm state).
"""
base_length = 10000
def time_list_choice() -> Tuple[int, float]:
col: List[str] = []
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
el = random.choice(col)
col.remove(el)
col.append(next(s))
timer = timeit.Timer(do)
return timer.autorange()
def time_list_ind_del() -> Tuple[int, float]:
col: List[str] = []
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
ind = random.randrange(len(col))
del col[ind]
col.append(next(s))
timer = timeit.Timer(do)
return timer.autorange()
def time_list_ind_del_check() -> Tuple[int, float]:
col: List[str] = []
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
ind = random.randrange(len(col))
del col[ind]
el = next(s)
if el not in col:
col.append(el)
timer = timeit.Timer(do)
return timer.autorange()
def time_list_ind_pop() -> Tuple[int, float]:
col: List[str] = []
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
ind = random.randrange(len(col))
col[ind] = col[-1]
col.pop()
col.append(next(s))
timer = timeit.Timer(do)
return timer.autorange()
def time_list_ind_pop_check() -> Tuple[int, float]:
col: List[str] = []
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
ind = random.randrange(len(col))
col[ind] = col[-1]
col.pop()
el = next(s)
if el not in col:
col.append(el)
timer = timeit.Timer(do)
return timer.autorange()
def time_list_set_del() -> Tuple[int, float]:
col: List[str] = []
col_set: Set[str] = set()
s = StringSource()
for el in s.take(base_length):
col.append(el)
col_set.add(el)
def do() -> None:
ind = random.randrange(len(col))
el = col[ind]
del col[ind]
col_set.remove(el)
el = next(s)
if el not in col_set:
col.append(el)
col_set.add(el)
timer = timeit.Timer(do)
return timer.autorange()
def time_list_set_pop() -> Tuple[int, float]:
col: List[str] = []
col_set: Set[str] = set()
s = StringSource()
for el in s.take(base_length):
col.append(el)
col_set.add(el)
def do() -> None:
ind = random.randrange(len(col))
el = col[ind]
col[ind] = col[-1]
col.pop()
col_set.remove(el)
el = next(s)
if el not in col_set:
col.append(el)
col_set.add(el)
timer = timeit.Timer(do)
return timer.autorange()
def time_deque() -> Tuple[int, float]:
col: Deque[str] = Deque()
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
el = random.choice(col)
col.remove(el)
col.append(next(s))
timer = timeit.Timer(do)
return timer.autorange()
def time_deque_range() -> Tuple[int, float]:
col: Deque[str] = Deque()
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
ind = random.randrange(len(col))
del col[ind]
col.append(next(s))
timer = timeit.Timer(do)
return timer.autorange()
def time_deque_ind_check() -> Tuple[int, float]:
col: Deque[str] = Deque()
s = StringSource()
col.extend(s.take(base_length))
def do() -> None:
ind = random.randrange(len(col))
del col[ind]
el = next(s)
if el not in col:
col.append(el)
timer = timeit.Timer(do)
return timer.autorange()
def time_set_iter() -> Tuple[int, float]:
col: Set[str] = set()
s = StringSource()
col.update(s.take(base_length))
def do() -> None:
ind = random.randrange(len(col))
el_to_remove = list(itertools.islice(col, ind, ind + 1))[0]
col.remove(el_to_remove)
col.add(next(s))
timer = timeit.Timer(do)
return timer.autorange()
BenchmarkFunction = Callable[[], Tuple[int, float]]
benchmark_functions: List[BenchmarkFunction] = [
time_list_choice,
time_list_ind_del,
time_list_ind_del_check,
time_list_ind_pop,
time_list_ind_pop_check,
time_list_set_del,
time_list_set_pop,
time_deque,
time_deque_range,
time_deque_ind_check,
time_set_iter,
]
def main() -> None:
for f in benchmark_functions:
rep, dur = f()
print(f.__name__, (dur / rep, rep, dur))
if __name__ == '__main__':
main()
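Each benchmark function above returns the `(loop count, total seconds)` pair from `timeit.Timer.autorange()`, which `main` turns into per-operation time via `dur / rep`. A self-contained illustration of that contract:

```python
import timeit

# autorange() repeats the callable until the total time reaches at least
# 0.2 seconds and returns (number_of_loops, total_seconds_taken).
rep, dur = timeit.Timer(lambda: sum(range(100))).autorange()
per_op = dur / rep  # seconds per single call, as printed by main()
```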
| 18.290984 | 76 | 0.666144 | 704 | 4,463 | 4.080966 | 0.122159 | 0.038984 | 0.054299 | 0.06126 | 0.738949 | 0.716324 | 0.716324 | 0.708667 | 0.708667 | 0.680125 | 0 | 0.003273 | 0.178579 | 4,463 | 243 | 77 | 18.366255 | 0.780415 | 0 | 0 | 0.739394 | 0 | 0 | 0.001926 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139394 | false | 0 | 0.030303 | 0 | 0.236364 | 0.006061 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
49d2ed6066cc56a87632114b0bc00af34fbe2877 | 43 | py | Python | tools/deepke/name_entity_re/standard/__init__.py | dfface/DoctorKG | 6bd6ebec8244a9ce0a2c8c278a704f02b9afaaf8 | [
"MIT"
] | 1 | 2022-03-26T16:08:08.000Z | 2022-03-26T16:08:08.000Z | tools/deepke/name_entity_re/standard/__init__.py | dfface/DoctorKG | 6bd6ebec8244a9ce0a2c8c278a704f02b9afaaf8 | [
"MIT"
] | null | null | null | tools/deepke/name_entity_re/standard/__init__.py | dfface/DoctorKG | 6bd6ebec8244a9ce0a2c8c278a704f02b9afaaf8 | [
"MIT"
] | null | null | null | from .models import *
from .tools import * | 21.5 | 22 | 0.72093 | 6 | 43 | 5.166667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186047 | 43 | 2 | 23 | 21.5 | 0.885714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b718f24e7816394140c4c96868d0e31c144b52dc | 145 | py | Python | my_functions.py | Hejinzefinance/Learn_git | 83903bd1da8493d6855c959dddec6793ff29d89e | [
"MIT"
] | null | null | null | my_functions.py | Hejinzefinance/Learn_git | 83903bd1da8493d6855c959dddec6793ff29d89e | [
"MIT"
] | null | null | null | my_functions.py | Hejinzefinance/Learn_git | 83903bd1da8493d6855c959dddec6793ff29d89e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Mon Mar 15 21:21:17 2021
@author: Hejizne
"""
def add(x, y):
    return x + y
def minus(x, y):
    return x - y
b7371c852221804191c75e6e40e06d13acff645e | 41 | py | Python | umysqldb/constants.py | byaka/umysqldb_portable | 0485cd9c623d169280de0c0d595858b04b03d080 | [
"BSD-3-Clause"
] | null | null | null | umysqldb/constants.py | byaka/umysqldb_portable | 0485cd9c623d169280de0c0d595858b04b03d080 | [
"BSD-3-Clause"
] | 1 | 2017-02-22T16:31:24.000Z | 2017-02-22T16:31:24.000Z | umysqldb/constants.py | byaka/umysqldb_portable | 0485cd9c623d169280de0c0d595858b04b03d080 | [
"BSD-3-Clause"
] | null | null | null | from pymysql_portable.constants import *
| 20.5 | 40 | 0.853659 | 5 | 41 | 6.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 41 | 1 | 41 | 41 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f91d023ed9da6d3e6745fa9158ac328f7b2fb1b | 155 | py | Python | gehomesdk/exception/ge_needs_reauthentication_error.py | willhayslett/gehome | 7e407a1d31cede1453656eaef948332e808484ea | [
"MIT"
] | 17 | 2021-05-18T01:58:06.000Z | 2022-03-22T20:49:32.000Z | gehomesdk/exception/ge_needs_reauthentication_error.py | willhayslett/gehome | 7e407a1d31cede1453656eaef948332e808484ea | [
"MIT"
] | 29 | 2021-05-17T21:43:16.000Z | 2022-02-28T22:50:48.000Z | gehomesdk/exception/ge_needs_reauthentication_error.py | willhayslett/gehome | 7e407a1d31cede1453656eaef948332e808484ea | [
"MIT"
] | 9 | 2021-05-17T04:40:58.000Z | 2022-02-02T17:26:13.000Z | from .ge_exception import GeException
class GeNeedsReauthenticationError(GeException):
"""Error raised when the reauthentication is needed"""
pass | 31 | 58 | 0.793548 | 16 | 155 | 7.625 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141935 | 155 | 5 | 59 | 31 | 0.917293 | 0.309677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
3f957aae7d3424f49398ea82640d72d107428ded | 33 | py | Python | root/main.py | smysnk/led-controller | 9f74d644ba4b5841806483ed245f075f61a4f719 | [
"MIT"
] | null | null | null | root/main.py | smysnk/led-controller | 9f74d644ba4b5841806483ed245f075f61a4f719 | [
"MIT"
] | null | null | null | root/main.py | smysnk/led-controller | 9f74d644ba4b5841806483ed245f075f61a4f719 | [
"MIT"
] | null | null | null | import src.main
src.main.start()
| 11 | 16 | 0.757576 | 6 | 33 | 4.166667 | 0.666667 | 0.56 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 33 | 2 | 17 | 16.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
b7477a1d5b995b4b193cd54073cdd5bf7bd04da1 | 91,724 | py | Python | datasets/col_loc_scale_dataset.py | tim-henry/experiment-framework | ff54b315e5c584269f473c5eb8959fb40926a59c | [
"MIT"
] | 1 | 2021-01-25T11:40:19.000Z | 2021-01-25T11:40:19.000Z | datasets/col_loc_scale_dataset.py | tim-henry/experiment-framework | ff54b315e5c584269f473c5eb8959fb40926a59c | [
"MIT"
] | null | null | null | datasets/col_loc_scale_dataset.py | tim-henry/experiment-framework | ff54b315e5c584269f473c5eb8959fb40926a59c | [
"MIT"
] | 2 | 2020-04-12T02:44:01.000Z | 2021-12-27T02:13:38.000Z | import numpy as np
from PIL import Image
import random
import torch
import matplotlib.pyplot as plt
from torchvision import datasets, transforms
class ColLocScaleMNIST(datasets.MNIST):
# Color classes
color_name_map = ["red", "green", "blue", "yellow", "magenta", "cyan", "purple", "lime", "orange", "white"]
color_map = [np.array([1, 0.1, 0.1]), np.array([0.1, 1, 0.1]), np.array([0.1, 0.1, 1]),
np.array([1, 1, 0.1]), np.array([1, 0.1, 1]), np.array([0.1, 1, 1]),
np.array([0.57, 0.12, 0.71]), np.array([0.72, 0.96, 0.24]), np.array([0.96, 0.51, 0.19]),
np.array([1, 1, 1])]
# Gaussian noise arguments
mu = 0
sigma = 50
# pct_to_keep: percentage of possible combinations to keep between 0 and 1, rounded down to nearest multiple of 1/9
def __init__(self, root, train=True, transform=None, target_transform=None, download=False, pct_to_keep=1, color_indices=np.arange(9)):
super().__init__(root, train, transform, target_transform, download)
pct = pct_to_keep
self.max_left_dist = int(pct / 2)
self.max_right_dist = int(pct / 2) if pct % 2 == 0 else int(pct / 2) + 1
self.held_out = [(0, 4), (1, 5), (2, 6), (3, 7), (4, 8), (5, 0), (6, 1), (7, 2), (8, 3)]
self.control = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6), (7, 7), (8, 8)]
self.combination_space_shape = (9, 9)
self.class_names = ("shape", "color", "location", "scale")
self.name = "col_loc_scale_mnist"
np.random.seed(17)
self.color_indices = np.arange(9)
self.loc_indices = np.arange(9)
self.scale_indices = np.arange(9)
np.random.shuffle(self.color_indices)
np.random.shuffle(self.loc_indices)
np.random.shuffle(self.scale_indices)
self.pct_to_combs = [[(0, 0, 0, 0),(1, 1, 0, 0),(2, 2, 0, 0),(3, 3, 0, 0),(4, 4, 0, 0),(5, 5, 0, 0),(6, 6, 0, 0),(7, 7, 0, 0),(8, 8, 0, 0),(1, 0, 1, 0),(2, 1, 1, 0),(3, 2, 1, 0),(4, 3, 1, 0),(5, 4, 1, 0),(6, 5, 1, 0),(7, 6, 1, 0),(8, 7, 1, 0),(0, 8, 1, 0),(2, 0, 2, 0),(3, 1, 2, 0),(4, 2, 2, 0),(5, 3, 2, 0),(6, 4, 2, 0),(7, 5, 2, 0),(8, 6, 2, 0),(0, 7, 2, 0),(1, 8, 2, 0),(3, 0, 3, 0),(4, 1, 3, 0),(5, 2, 3, 0),(6, 3, 3, 0),(7, 4, 3, 0),(8, 5, 3, 0),(0, 6, 3, 0),(1, 7, 3, 0),(2, 8, 3, 0),(4, 0, 4, 0),(5, 1, 4, 0),(6, 2, 4, 0),(7, 3, 4, 0),(8, 4, 4, 0),(0, 5, 4, 0),(1, 6, 4, 0),(2, 7, 4, 0),(3, 8, 4, 0),(5, 0, 5, 0),(6, 1, 5, 0),(7, 2, 5, 0),(8, 3, 5, 0),(0, 4, 5, 0),(1, 5, 5, 0),(2, 6, 5, 0),(3, 7, 5, 0),(4, 8, 5, 0),(6, 0, 6, 0),(7, 1, 6, 0),(8, 2, 6, 0),(0, 3, 6, 0),(1, 4, 6, 0),(2, 5, 6, 0),(3, 6, 6, 0),(4, 7, 6, 0),(5, 8, 6, 0),(7, 0, 7, 0),(8, 1, 7, 0),(0, 2, 7, 0),(1, 3, 7, 0),(2, 4, 7, 0),(3, 5, 7, 0),(4, 6, 7, 0),(5, 7, 7, 0),(6, 8, 7, 0),(8, 0, 8, 0),(0, 1, 8, 0),(1, 2, 8, 0),(2, 3, 8, 0),(3, 4, 8, 0),(4, 5, 8, 0),(5, 6, 8, 0),(6, 7, 8, 0),(7, 8, 8, 0),(1, 0, 0, 1),(2, 1, 0, 1),(3, 2, 0, 1),(4, 3, 0, 1),(5, 4, 0, 1),(6, 5, 0, 1),(7, 6, 0, 1),(8, 7, 0, 1),(0, 8, 0, 1),(2, 0, 1, 1),(3, 1, 1, 1),(4, 2, 1, 1),(5, 3, 1, 1),(6, 4, 1, 1),(7, 5, 1, 1),(8, 6, 1, 1),(0, 7, 1, 1),(1, 8, 1, 1),(3, 0, 2, 1),(4, 1, 2, 1),(5, 2, 2, 1),(6, 3, 2, 1),(7, 4, 2, 1),(8, 5, 2, 1),(0, 6, 2, 1),(1, 7, 2, 1),(2, 8, 2, 1),(4, 0, 3, 1),(5, 1, 3, 1),(6, 2, 3, 1),(7, 3, 3, 1),(8, 4, 3, 1),(0, 5, 3, 1),(1, 6, 3, 1),(2, 7, 3, 1),(3, 8, 3, 1),(5, 0, 4, 1),(6, 1, 4, 1),(7, 2, 4, 1),(8, 3, 4, 1),(0, 4, 4, 1),(1, 5, 4, 1),(2, 6, 4, 1),(3, 7, 4, 1),(4, 8, 4, 1),(6, 0, 5, 1),(7, 1, 5, 1),(8, 2, 5, 1),(0, 3, 5, 1),(1, 4, 5, 1),(2, 5, 5, 1),(3, 6, 5, 1),(4, 7, 5, 1),(5, 8, 5, 1),(7, 0, 6, 1),(8, 1, 6, 1),(0, 2, 6, 1),(1, 3, 6, 1),(2, 4, 6, 1),(3, 5, 6, 1),(4, 6, 6, 1),(5, 7, 6, 1),(6, 8, 6, 1),(8, 0, 7, 1),(0, 1, 7, 1),(1, 2, 7, 1),(2, 3, 7, 1),(3, 4, 7, 1),(4, 5, 7, 1),(5, 6, 7, 1),(6, 7, 7, 
1),(7, 8, 7, 1),(0, 0, 8, 1),(1, 1, 8, 1),(2, 2, 8, 1),(3, 3, 8, 1),(4, 4, 8, 1),(5, 5, 8, 1),(6, 6, 8, 1),(7, 7, 8, 1),(8, 8, 8, 1),(2, 0, 0, 2),(3, 1, 0, 2),(4, 2, 0, 2),(5, 3, 0, 2),(6, 4, 0, 2),(7, 5, 0, 2),(8, 6, 0, 2),(0, 7, 0, 2),(1, 8, 0, 2),(3, 0, 1, 2),(4, 1, 1, 2),(5, 2, 1, 2),(6, 3, 1, 2),(7, 4, 1, 2),(8, 5, 1, 2),(0, 6, 1, 2),(1, 7, 1, 2),(2, 8, 1, 2),(4, 0, 2, 2),(5, 1, 2, 2),(6, 2, 2, 2),(7, 3, 2, 2),(8, 4, 2, 2),(0, 5, 2, 2),(1, 6, 2, 2),(2, 7, 2, 2),(3, 8, 2, 2),(5, 0, 3, 2),(6, 1, 3, 2),(7, 2, 3, 2),(8, 3, 3, 2),(0, 4, 3, 2),(1, 5, 3, 2),(2, 6, 3, 2),(3, 7, 3, 2),(4, 8, 3, 2),(6, 0, 4, 2),(7, 1, 4, 2),(8, 2, 4, 2),(0, 3, 4, 2),(1, 4, 4, 2),(2, 5, 4, 2),(3, 6, 4, 2),(4, 7, 4, 2),(5, 8, 4, 2),(7, 0, 5, 2),(8, 1, 5, 2),(0, 2, 5, 2),(1, 3, 5, 2),(2, 4, 5, 2),(3, 5, 5, 2),(4, 6, 5, 2),(5, 7, 5, 2),(6, 8, 5, 2),(8, 0, 6, 2),(0, 1, 6, 2),(1, 2, 6, 2),(2, 3, 6, 2),(3, 4, 6, 2),(4, 5, 6, 2),(5, 6, 6, 2),(6, 7, 6, 2),(7, 8, 6, 2),(0, 0, 7, 2),(1, 1, 7, 2),(2, 2, 7, 2),(3, 3, 7, 2),(4, 4, 7, 2),(5, 5, 7, 2),(6, 6, 7, 2),(7, 7, 7, 2),(8, 8, 7, 2),(1, 0, 8, 2),(2, 1, 8, 2),(3, 2, 8, 2),(4, 3, 8, 2),(5, 4, 8, 2),(6, 5, 8, 2),(7, 6, 8, 2),(8, 7, 8, 2),(0, 8, 8, 2),(3, 0, 0, 3),(4, 1, 0, 3),(5, 2, 0, 3),(6, 3, 0, 3),(7, 4, 0, 3),(8, 5, 0, 3),(0, 6, 0, 3),(1, 7, 0, 3),(2, 8, 0, 3),(4, 0, 1, 3),(5, 1, 1, 3),(6, 2, 1, 3),(7, 3, 1, 3),(8, 4, 1, 3),(0, 5, 1, 3),(1, 6, 1, 3),(2, 7, 1, 3),(3, 8, 1, 3),(5, 0, 2, 3),(6, 1, 2, 3),(7, 2, 2, 3),(8, 3, 2, 3),(0, 4, 2, 3),(1, 5, 2, 3),(2, 6, 2, 3),(3, 7, 2, 3),(4, 8, 2, 3),(6, 0, 3, 3),(7, 1, 3, 3),(8, 2, 3, 3),(0, 3, 3, 3),(1, 4, 3, 3),(2, 5, 3, 3),(3, 6, 3, 3),(4, 7, 3, 3),(5, 8, 3, 3),(7, 0, 4, 3),(8, 1, 4, 3),(0, 2, 4, 3),(1, 3, 4, 3),(2, 4, 4, 3),(3, 5, 4, 3),(4, 6, 4, 3),(5, 7, 4, 3),(6, 8, 4, 3),(8, 0, 5, 3),(0, 1, 5, 3),(1, 2, 5, 3),(2, 3, 5, 3),(3, 4, 5, 3),(4, 5, 5, 3),(5, 6, 5, 3),(6, 7, 5, 3),(7, 8, 5, 3),(0, 0, 6, 3),(1, 1, 6, 3),(2, 2, 6, 3),(3, 3, 6, 3),(4, 4, 6, 3),(5, 5, 6, 3),(6, 6, 6, 3),(7, 7, 6, 3),(8, 8, 
6, 3),(1, 0, 7, 3),(2, 1, 7, 3),(3, 2, 7, 3),(4, 3, 7, 3),(5, 4, 7, 3),(6, 5, 7, 3),(7, 6, 7, 3),(8, 7, 7, 3),(0, 8, 7, 3),(2, 0, 8, 3),(3, 1, 8, 3),(4, 2, 8, 3),(5, 3, 8, 3),(6, 4, 8, 3),(7, 5, 8, 3),(8, 6, 8, 3),(0, 7, 8, 3),(1, 8, 8, 3),(4, 0, 0, 4),(5, 1, 0, 4),(6, 2, 0, 4),(7, 3, 0, 4),(8, 4, 0, 4),(0, 5, 0, 4),(1, 6, 0, 4),(2, 7, 0, 4),(3, 8, 0, 4),(5, 0, 1, 4),(6, 1, 1, 4),(7, 2, 1, 4),(8, 3, 1, 4),(0, 4, 1, 4),(1, 5, 1, 4),(2, 6, 1, 4),(3, 7, 1, 4),(4, 8, 1, 4),(6, 0, 2, 4),(7, 1, 2, 4),(8, 2, 2, 4),(0, 3, 2, 4),(1, 4, 2, 4),(2, 5, 2, 4),(3, 6, 2, 4),(4, 7, 2, 4),(5, 8, 2, 4),(7, 0, 3, 4),(8, 1, 3, 4),(0, 2, 3, 4),(1, 3, 3, 4),(2, 4, 3, 4),(3, 5, 3, 4),(4, 6, 3, 4),(5, 7, 3, 4),(6, 8, 3, 4),(8, 0, 4, 4),(0, 1, 4, 4),(1, 2, 4, 4),(2, 3, 4, 4),(3, 4, 4, 4),(4, 5, 4, 4),(5, 6, 4, 4),(6, 7, 4, 4),(7, 8, 4, 4),(0, 0, 5, 4),(1, 1, 5, 4),(2, 2, 5, 4),(3, 3, 5, 4),(4, 4, 5, 4),(5, 5, 5, 4),(6, 6, 5, 4),(7, 7, 5, 4),(8, 8, 5, 4),(1, 0, 6, 4),(2, 1, 6, 4),(3, 2, 6, 4),(4, 3, 6, 4),(5, 4, 6, 4),(6, 5, 6, 4),(7, 6, 6, 4),(8, 7, 6, 4),(0, 8, 6, 4),(2, 0, 7, 4),(3, 1, 7, 4),(4, 2, 7, 4),(5, 3, 7, 4),(6, 4, 7, 4),(7, 5, 7, 4),(8, 6, 7, 4),(0, 7, 7, 4),(1, 8, 7, 4),(3, 0, 8, 4),(4, 1, 8, 4),(5, 2, 8, 4),(6, 3, 8, 4),(7, 4, 8, 4),(8, 5, 8, 4),(0, 6, 8, 4),(1, 7, 8, 4),(2, 8, 8, 4),(5, 0, 0, 5),(6, 1, 0, 5),(7, 2, 0, 5),(8, 3, 0, 5),(0, 4, 0, 5),(1, 5, 0, 5),(2, 6, 0, 5),(3, 7, 0, 5),(4, 8, 0, 5),(6, 0, 1, 5),(7, 1, 1, 5),(8, 2, 1, 5),(0, 3, 1, 5),(1, 4, 1, 5),(2, 5, 1, 5),(3, 6, 1, 5),(4, 7, 1, 5),(5, 8, 1, 5),(7, 0, 2, 5),(8, 1, 2, 5),(0, 2, 2, 5),(1, 3, 2, 5),(2, 4, 2, 5),(3, 5, 2, 5),(4, 6, 2, 5),(5, 7, 2, 5),(6, 8, 2, 5),(8, 0, 3, 5),(0, 1, 3, 5),(1, 2, 3, 5),(2, 3, 3, 5),(3, 4, 3, 5),(4, 5, 3, 5),(5, 6, 3, 5),(6, 7, 3, 5),(7, 8, 3, 5),(0, 0, 4, 5),(1, 1, 4, 5),(2, 2, 4, 5),(3, 3, 4, 5),(4, 4, 4, 5),(5, 5, 4, 5),(6, 6, 4, 5),(7, 7, 4, 5),(8, 8, 4, 5),(1, 0, 5, 5),(2, 1, 5, 5),(3, 2, 5, 5),(4, 3, 5, 5),(5, 4, 5, 5),(6, 5, 5, 5),(7, 6, 5, 5),(8, 7, 5, 5),(0, 8, 5, 5),(2, 
0, 6, 5),(3, 1, 6, 5),(4, 2, 6, 5),(5, 3, 6, 5),(6, 4, 6, 5),(7, 5, 6, 5),(8, 6, 6, 5),(0, 7, 6, 5),(1, 8, 6, 5),(3, 0, 7, 5),(4, 1, 7, 5),(5, 2, 7, 5),(6, 3, 7, 5),(7, 4, 7, 5),(8, 5, 7, 5),(0, 6, 7, 5),(1, 7, 7, 5),(2, 8, 7, 5),(4, 0, 8, 5),(5, 1, 8, 5),(6, 2, 8, 5),(7, 3, 8, 5),(8, 4, 8, 5),(0, 5, 8, 5),(1, 6, 8, 5),(2, 7, 8, 5),(3, 8, 8, 5),(6, 0, 0, 6),(7, 1, 0, 6),(8, 2, 0, 6),(0, 3, 0, 6),(1, 4, 0, 6),(2, 5, 0, 6),(3, 6, 0, 6),(4, 7, 0, 6),(5, 8, 0, 6),(7, 0, 1, 6),(8, 1, 1, 6),(0, 2, 1, 6),(1, 3, 1, 6),(2, 4, 1, 6),(3, 5, 1, 6),(4, 6, 1, 6),(5, 7, 1, 6),(6, 8, 1, 6),(8, 0, 2, 6),(0, 1, 2, 6),(1, 2, 2, 6),(2, 3, 2, 6),(3, 4, 2, 6),(4, 5, 2, 6),(5, 6, 2, 6),(6, 7, 2, 6),(7, 8, 2, 6),(0, 0, 3, 6),(1, 1, 3, 6),(2, 2, 3, 6),(3, 3, 3, 6),(4, 4, 3, 6),(5, 5, 3, 6),(6, 6, 3, 6),(7, 7, 3, 6),(8, 8, 3, 6),(1, 0, 4, 6),(2, 1, 4, 6),(3, 2, 4, 6),(4, 3, 4, 6),(5, 4, 4, 6),(6, 5, 4, 6),(7, 6, 4, 6),(8, 7, 4, 6),(0, 8, 4, 6),(2, 0, 5, 6),(3, 1, 5, 6),(4, 2, 5, 6),(5, 3, 5, 6),(6, 4, 5, 6),(7, 5, 5, 6),(8, 6, 5, 6),(0, 7, 5, 6),(1, 8, 5, 6),(3, 0, 6, 6),(4, 1, 6, 6),(5, 2, 6, 6),(6, 3, 6, 6),(7, 4, 6, 6),(8, 5, 6, 6),(0, 6, 6, 6),(1, 7, 6, 6),(2, 8, 6, 6),(4, 0, 7, 6),(5, 1, 7, 6),(6, 2, 7, 6),(7, 3, 7, 6),(8, 4, 7, 6),(0, 5, 7, 6),(1, 6, 7, 6),(2, 7, 7, 6),(3, 8, 7, 6),(5, 0, 8, 6),(6, 1, 8, 6),(7, 2, 8, 6),(8, 3, 8, 6),(0, 4, 8, 6),(1, 5, 8, 6),(2, 6, 8, 6),(3, 7, 8, 6),(4, 8, 8, 6),(7, 0, 0, 7),(8, 1, 0, 7),(0, 2, 0, 7),(1, 3, 0, 7),(2, 4, 0, 7),(3, 5, 0, 7),(4, 6, 0, 7),(5, 7, 0, 7),(6, 8, 0, 7),(8, 0, 1, 7),(0, 1, 1, 7),(1, 2, 1, 7),(2, 3, 1, 7),(3, 4, 1, 7),(4, 5, 1, 7),(5, 6, 1, 7),(6, 7, 1, 7),(7, 8, 1, 7),(0, 0, 2, 7),(1, 1, 2, 7),(2, 2, 2, 7),(3, 3, 2, 7),(4, 4, 2, 7),(5, 5, 2, 7),(6, 6, 2, 7),(7, 7, 2, 7),(8, 8, 2, 7),(1, 0, 3, 7),(2, 1, 3, 7),(3, 2, 3, 7),(4, 3, 3, 7),(5, 4, 3, 7),(6, 5, 3, 7),(7, 6, 3, 7),(8, 7, 3, 7),(0, 8, 3, 7),(2, 0, 4, 7),(3, 1, 4, 7),(4, 2, 4, 7),(5, 3, 4, 7),(6, 4, 4, 7),(7, 5, 4, 7),(8, 6, 4, 7),(0, 7, 4, 7),(1, 8, 4, 7),(3, 0, 5, 
7),(4, 1, 5, 7),(5, 2, 5, 7),(6, 3, 5, 7),(7, 4, 5, 7),(8, 5, 5, 7),(0, 6, 5, 7),(1, 7, 5, 7),(2, 8, 5, 7),(4, 0, 6, 7),(5, 1, 6, 7),(6, 2, 6, 7),(7, 3, 6, 7),(8, 4, 6, 7),(0, 5, 6, 7),(1, 6, 6, 7),(2, 7, 6, 7),(3, 8, 6, 7),(5, 0, 7, 7),(6, 1, 7, 7),(7, 2, 7, 7),(8, 3, 7, 7),(0, 4, 7, 7),(1, 5, 7, 7),(2, 6, 7, 7),(3, 7, 7, 7),(4, 8, 7, 7),(6, 0, 8, 7),(7, 1, 8, 7),(8, 2, 8, 7),(0, 3, 8, 7),(1, 4, 8, 7),(2, 5, 8, 7),(3, 6, 8, 7),(4, 7, 8, 7),(5, 8, 8, 7),(8, 0, 0, 8),(0, 1, 0, 8),(1, 2, 0, 8),(2, 3, 0, 8),(3, 4, 0, 8),(4, 5, 0, 8),(5, 6, 0, 8),(6, 7, 0, 8),(7, 8, 0, 8),(0, 0, 1, 8),(1, 1, 1, 8),(2, 2, 1, 8),(3, 3, 1, 8),(4, 4, 1, 8),(5, 5, 1, 8),(6, 6, 1, 8),(7, 7, 1, 8),(8, 8, 1, 8),(1, 0, 2, 8),(2, 1, 2, 8),(3, 2, 2, 8),(4, 3, 2, 8),(5, 4, 2, 8),(6, 5, 2, 8),(7, 6, 2, 8),(8, 7, 2, 8),(0, 8, 2, 8),(2, 0, 3, 8),(3, 1, 3, 8),(4, 2, 3, 8),(5, 3, 3, 8),(6, 4, 3, 8),(7, 5, 3, 8),(8, 6, 3, 8),(0, 7, 3, 8),(1, 8, 3, 8),(3, 0, 4, 8),(4, 1, 4, 8),(5, 2, 4, 8),(6, 3, 4, 8),(7, 4, 4, 8),(8, 5, 4, 8),(0, 6, 4, 8),(1, 7, 4, 8),(2, 8, 4, 8),(4, 0, 5, 8),(5, 1, 5, 8),(6, 2, 5, 8),(7, 3, 5, 8),(8, 4, 5, 8),(0, 5, 5, 8),(1, 6, 5, 8),(2, 7, 5, 8),(3, 8, 5, 8),(5, 0, 6, 8),(6, 1, 6, 8),(7, 2, 6, 8),(8, 3, 6, 8),(0, 4, 6, 8),(1, 5, 6, 8),(2, 6, 6, 8),(3, 7, 6, 8),(4, 8, 6, 8),(6, 0, 7, 8),(7, 1, 7, 8),(8, 2, 7, 8),(0, 3, 7, 8),(1, 4, 7, 8),(2, 5, 7, 8),(3, 6, 7, 8),(4, 7, 7, 8),(5, 8, 7, 8),(7, 0, 8, 8),(8, 1, 8, 8),(0, 2, 8, 8),(1, 3, 8, 8),(2, 4, 8, 8),(3, 5, 8, 8),(4, 6, 8, 8),(5, 7, 8, 8),(6, 8, 8, 8)],
[(1, 0, 0, 0),(2, 1, 0, 0),(3, 2, 0, 0),(4, 3, 0, 0),(5, 4, 0, 0),(6, 5, 0, 0),(7, 6, 0, 0),(8, 7, 0, 0),(0, 8, 0, 0),(2, 0, 1, 0),(3, 1, 1, 0),(4, 2, 1, 0),(5, 3, 1, 0),(6, 4, 1, 0),(7, 5, 1, 0),(8, 6, 1, 0),(0, 7, 1, 0),(1, 8, 1, 0),(3, 0, 2, 0),(4, 1, 2, 0),(5, 2, 2, 0),(6, 3, 2, 0),(7, 4, 2, 0),(8, 5, 2, 0),(0, 6, 2, 0),(1, 7, 2, 0),(2, 8, 2, 0),(4, 0, 3, 0),(5, 1, 3, 0),(6, 2, 3, 0),(7, 3, 3, 0),(8, 4, 3, 0),(0, 5, 3, 0),(1, 6, 3, 0),(2, 7, 3, 0),(3, 8, 3, 0),(5, 0, 4, 0),(6, 1, 4, 0),(7, 2, 4, 0),(8, 3, 4, 0),(0, 4, 4, 0),(1, 5, 4, 0),(2, 6, 4, 0),(3, 7, 4, 0),(4, 8, 4, 0),(6, 0, 5, 0),(7, 1, 5, 0),(8, 2, 5, 0),(0, 3, 5, 0),(1, 4, 5, 0),(2, 5, 5, 0),(3, 6, 5, 0),(4, 7, 5, 0),(5, 8, 5, 0),(7, 0, 6, 0),(8, 1, 6, 0),(0, 2, 6, 0),(1, 3, 6, 0),(2, 4, 6, 0),(3, 5, 6, 0),(4, 6, 6, 0),(5, 7, 6, 0),(6, 8, 6, 0),(8, 0, 7, 0),(0, 1, 7, 0),(1, 2, 7, 0),(2, 3, 7, 0),(3, 4, 7, 0),(4, 5, 7, 0),(5, 6, 7, 0),(6, 7, 7, 0),(7, 8, 7, 0),(0, 0, 8, 0),(1, 1, 8, 0),(2, 2, 8, 0),(3, 3, 8, 0),(4, 4, 8, 0),(5, 5, 8, 0),(6, 6, 8, 0),(7, 7, 8, 0),(8, 8, 8, 0),(2, 0, 0, 1),(3, 1, 0, 1),(4, 2, 0, 1),(5, 3, 0, 1),(6, 4, 0, 1),(7, 5, 0, 1),(8, 6, 0, 1),(0, 7, 0, 1),(1, 8, 0, 1),(3, 0, 1, 1),(4, 1, 1, 1),(5, 2, 1, 1),(6, 3, 1, 1),(7, 4, 1, 1),(8, 5, 1, 1),(0, 6, 1, 1),(1, 7, 1, 1),(2, 8, 1, 1),(4, 0, 2, 1),(5, 1, 2, 1),(6, 2, 2, 1),(7, 3, 2, 1),(8, 4, 2, 1),(0, 5, 2, 1),(1, 6, 2, 1),(2, 7, 2, 1),(3, 8, 2, 1),(5, 0, 3, 1),(6, 1, 3, 1),(7, 2, 3, 1),(8, 3, 3, 1),(0, 4, 3, 1),(1, 5, 3, 1),(2, 6, 3, 1),(3, 7, 3, 1),(4, 8, 3, 1),(6, 0, 4, 1),(7, 1, 4, 1),(8, 2, 4, 1),(0, 3, 4, 1),(1, 4, 4, 1),(2, 5, 4, 1),(3, 6, 4, 1),(4, 7, 4, 1),(5, 8, 4, 1),(7, 0, 5, 1),(8, 1, 5, 1),(0, 2, 5, 1),(1, 3, 5, 1),(2, 4, 5, 1),(3, 5, 5, 1),(4, 6, 5, 1),(5, 7, 5, 1),(6, 8, 5, 1),(8, 0, 6, 1),(0, 1, 6, 1),(1, 2, 6, 1),(2, 3, 6, 1),(3, 4, 6, 1),(4, 5, 6, 1),(5, 6, 6, 1),(6, 7, 6, 1),(7, 8, 6, 1),(0, 0, 7, 1),(1, 1, 7, 1),(2, 2, 7, 1),(3, 3, 7, 1),(4, 4, 7, 1),(5, 5, 7, 1),(6, 6, 7, 1),(7, 7, 7, 1),(8, 8, 7, 1),(1, 0, 8, 
1),(2, 1, 8, 1),(3, 2, 8, 1),(4, 3, 8, 1),(5, 4, 8, 1),(6, 5, 8, 1),(7, 6, 8, 1),(8, 7, 8, 1),(0, 8, 8, 1),(3, 0, 0, 2),(4, 1, 0, 2),(5, 2, 0, 2),(6, 3, 0, 2),(7, 4, 0, 2),(8, 5, 0, 2),(0, 6, 0, 2),(1, 7, 0, 2),(2, 8, 0, 2),(4, 0, 1, 2),(5, 1, 1, 2),(6, 2, 1, 2),(7, 3, 1, 2),(8, 4, 1, 2),(0, 5, 1, 2),(1, 6, 1, 2),(2, 7, 1, 2),(3, 8, 1, 2),(5, 0, 2, 2),(6, 1, 2, 2),(7, 2, 2, 2),(8, 3, 2, 2),(0, 4, 2, 2),(1, 5, 2, 2),(2, 6, 2, 2),(3, 7, 2, 2),(4, 8, 2, 2),(6, 0, 3, 2),(7, 1, 3, 2),(8, 2, 3, 2),(0, 3, 3, 2),(1, 4, 3, 2),(2, 5, 3, 2),(3, 6, 3, 2),(4, 7, 3, 2),(5, 8, 3, 2),(7, 0, 4, 2),(8, 1, 4, 2),(0, 2, 4, 2),(1, 3, 4, 2),(2, 4, 4, 2),(3, 5, 4, 2),(4, 6, 4, 2),(5, 7, 4, 2),(6, 8, 4, 2),(8, 0, 5, 2),(0, 1, 5, 2),(1, 2, 5, 2),(2, 3, 5, 2),(3, 4, 5, 2),(4, 5, 5, 2),(5, 6, 5, 2),(6, 7, 5, 2),(7, 8, 5, 2),(0, 0, 6, 2),(1, 1, 6, 2),(2, 2, 6, 2),(3, 3, 6, 2),(4, 4, 6, 2),(5, 5, 6, 2),(6, 6, 6, 2),(7, 7, 6, 2),(8, 8, 6, 2),(1, 0, 7, 2),(2, 1, 7, 2),(3, 2, 7, 2),(4, 3, 7, 2),(5, 4, 7, 2),(6, 5, 7, 2),(7, 6, 7, 2),(8, 7, 7, 2),(0, 8, 7, 2),(2, 0, 8, 2),(3, 1, 8, 2),(4, 2, 8, 2),(5, 3, 8, 2),(6, 4, 8, 2),(7, 5, 8, 2),(8, 6, 8, 2),(0, 7, 8, 2),(1, 8, 8, 2),(4, 0, 0, 3),(5, 1, 0, 3),(6, 2, 0, 3),(7, 3, 0, 3),(8, 4, 0, 3),(0, 5, 0, 3),(1, 6, 0, 3),(2, 7, 0, 3),(3, 8, 0, 3),(5, 0, 1, 3),(6, 1, 1, 3),(7, 2, 1, 3),(8, 3, 1, 3),(0, 4, 1, 3),(1, 5, 1, 3),(2, 6, 1, 3),(3, 7, 1, 3),(4, 8, 1, 3),(6, 0, 2, 3),(7, 1, 2, 3),(8, 2, 2, 3),(0, 3, 2, 3),(1, 4, 2, 3),(2, 5, 2, 3),(3, 6, 2, 3),(4, 7, 2, 3),(5, 8, 2, 3),(7, 0, 3, 3),(8, 1, 3, 3),(0, 2, 3, 3),(1, 3, 3, 3),(2, 4, 3, 3),(3, 5, 3, 3),(4, 6, 3, 3),(5, 7, 3, 3),(6, 8, 3, 3),(8, 0, 4, 3),(0, 1, 4, 3),(1, 2, 4, 3),(2, 3, 4, 3),(3, 4, 4, 3),(4, 5, 4, 3),(5, 6, 4, 3),(6, 7, 4, 3),(7, 8, 4, 3),(0, 0, 5, 3),(1, 1, 5, 3),(2, 2, 5, 3),(3, 3, 5, 3),(4, 4, 5, 3),(5, 5, 5, 3),(6, 6, 5, 3),(7, 7, 5, 3),(8, 8, 5, 3),(1, 0, 6, 3),(2, 1, 6, 3),(3, 2, 6, 3),(4, 3, 6, 3),(5, 4, 6, 3),(6, 5, 6, 3),(7, 6, 6, 3),(8, 7, 6, 3),(0, 8, 6, 3),(2, 0, 7, 3),(3, 1, 
7, 3),(4, 2, 7, 3),(5, 3, 7, 3),(6, 4, 7, 3),(7, 5, 7, 3),(8, 6, 7, 3),(0, 7, 7, 3),(1, 8, 7, 3),(3, 0, 8, 3),(4, 1, 8, 3),(5, 2, 8, 3),(6, 3, 8, 3),(7, 4, 8, 3),(8, 5, 8, 3),(0, 6, 8, 3),(1, 7, 8, 3),(2, 8, 8, 3),(5, 0, 0, 4),(6, 1, 0, 4),(7, 2, 0, 4),(8, 3, 0, 4),(0, 4, 0, 4),(1, 5, 0, 4),(2, 6, 0, 4),(3, 7, 0, 4),(4, 8, 0, 4),(6, 0, 1, 4),(7, 1, 1, 4),(8, 2, 1, 4),(0, 3, 1, 4),(1, 4, 1, 4),(2, 5, 1, 4),(3, 6, 1, 4),(4, 7, 1, 4),(5, 8, 1, 4),(7, 0, 2, 4),(8, 1, 2, 4),(0, 2, 2, 4),(1, 3, 2, 4),(2, 4, 2, 4),(3, 5, 2, 4),(4, 6, 2, 4),(5, 7, 2, 4),(6, 8, 2, 4),(8, 0, 3, 4),(0, 1, 3, 4),(1, 2, 3, 4),(2, 3, 3, 4),(3, 4, 3, 4),(4, 5, 3, 4),(5, 6, 3, 4),(6, 7, 3, 4),(7, 8, 3, 4),(0, 0, 4, 4),(1, 1, 4, 4),(2, 2, 4, 4),(3, 3, 4, 4),(4, 4, 4, 4),(5, 5, 4, 4),(6, 6, 4, 4),(7, 7, 4, 4),(8, 8, 4, 4),(1, 0, 5, 4),(2, 1, 5, 4),(3, 2, 5, 4),(4, 3, 5, 4),(5, 4, 5, 4),(6, 5, 5, 4),(7, 6, 5, 4),(8, 7, 5, 4),(0, 8, 5, 4),(2, 0, 6, 4),(3, 1, 6, 4),(4, 2, 6, 4),(5, 3, 6, 4),(6, 4, 6, 4),(7, 5, 6, 4),(8, 6, 6, 4),(0, 7, 6, 4),(1, 8, 6, 4),(3, 0, 7, 4),(4, 1, 7, 4),(5, 2, 7, 4),(6, 3, 7, 4),(7, 4, 7, 4),(8, 5, 7, 4),(0, 6, 7, 4),(1, 7, 7, 4),(2, 8, 7, 4),(4, 0, 8, 4),(5, 1, 8, 4),(6, 2, 8, 4),(7, 3, 8, 4),(8, 4, 8, 4),(0, 5, 8, 4),(1, 6, 8, 4),(2, 7, 8, 4),(3, 8, 8, 4),(6, 0, 0, 5),(7, 1, 0, 5),(8, 2, 0, 5),(0, 3, 0, 5),(1, 4, 0, 5),(2, 5, 0, 5),(3, 6, 0, 5),(4, 7, 0, 5),(5, 8, 0, 5),(7, 0, 1, 5),(8, 1, 1, 5),(0, 2, 1, 5),(1, 3, 1, 5),(2, 4, 1, 5),(3, 5, 1, 5),(4, 6, 1, 5),(5, 7, 1, 5),(6, 8, 1, 5),(8, 0, 2, 5),(0, 1, 2, 5),(1, 2, 2, 5),(2, 3, 2, 5),(3, 4, 2, 5),(4, 5, 2, 5),(5, 6, 2, 5),(6, 7, 2, 5),(7, 8, 2, 5),(0, 0, 3, 5),(1, 1, 3, 5),(2, 2, 3, 5),(3, 3, 3, 5),(4, 4, 3, 5),(5, 5, 3, 5),(6, 6, 3, 5),(7, 7, 3, 5),(8, 8, 3, 5),(1, 0, 4, 5),(2, 1, 4, 5),(3, 2, 4, 5),(4, 3, 4, 5),(5, 4, 4, 5),(6, 5, 4, 5),(7, 6, 4, 5),(8, 7, 4, 5),(0, 8, 4, 5),(2, 0, 5, 5),(3, 1, 5, 5),(4, 2, 5, 5),(5, 3, 5, 5),(6, 4, 5, 5),(7, 5, 5, 5),(8, 6, 5, 5),(0, 7, 5, 5),(1, 8, 5, 5),(3, 0, 6, 5),(4, 1, 6, 5),(5, 
2, 6, 5),(6, 3, 6, 5),(7, 4, 6, 5),(8, 5, 6, 5),(0, 6, 6, 5),(1, 7, 6, 5),(2, 8, 6, 5),(4, 0, 7, 5),(5, 1, 7, 5),(6, 2, 7, 5),(7, 3, 7, 5),(8, 4, 7, 5),(0, 5, 7, 5),(1, 6, 7, 5),(2, 7, 7, 5),(3, 8, 7, 5),(5, 0, 8, 5),(6, 1, 8, 5),(7, 2, 8, 5),(8, 3, 8, 5),(0, 4, 8, 5),(1, 5, 8, 5),(2, 6, 8, 5),(3, 7, 8, 5),(4, 8, 8, 5),(7, 0, 0, 6),(8, 1, 0, 6),(0, 2, 0, 6),(1, 3, 0, 6),(2, 4, 0, 6),(3, 5, 0, 6),(4, 6, 0, 6),(5, 7, 0, 6),(6, 8, 0, 6),(8, 0, 1, 6),(0, 1, 1, 6),(1, 2, 1, 6),(2, 3, 1, 6),(3, 4, 1, 6),(4, 5, 1, 6),(5, 6, 1, 6),(6, 7, 1, 6),(7, 8, 1, 6),(0, 0, 2, 6),(1, 1, 2, 6),(2, 2, 2, 6),(3, 3, 2, 6),(4, 4, 2, 6),(5, 5, 2, 6),(6, 6, 2, 6),(7, 7, 2, 6),(8, 8, 2, 6),(1, 0, 3, 6),(2, 1, 3, 6),(3, 2, 3, 6),(4, 3, 3, 6),(5, 4, 3, 6),(6, 5, 3, 6),(7, 6, 3, 6),(8, 7, 3, 6),(0, 8, 3, 6),(2, 0, 4, 6),(3, 1, 4, 6),(4, 2, 4, 6),(5, 3, 4, 6),(6, 4, 4, 6),(7, 5, 4, 6),(8, 6, 4, 6),(0, 7, 4, 6),(1, 8, 4, 6),(3, 0, 5, 6),(4, 1, 5, 6),(5, 2, 5, 6),(6, 3, 5, 6),(7, 4, 5, 6),(8, 5, 5, 6),(0, 6, 5, 6),(1, 7, 5, 6),(2, 8, 5, 6),(4, 0, 6, 6),(5, 1, 6, 6),(6, 2, 6, 6),(7, 3, 6, 6),(8, 4, 6, 6),(0, 5, 6, 6),(1, 6, 6, 6),(2, 7, 6, 6),(3, 8, 6, 6),(5, 0, 7, 6),(6, 1, 7, 6),(7, 2, 7, 6),(8, 3, 7, 6),(0, 4, 7, 6),(1, 5, 7, 6),(2, 6, 7, 6),(3, 7, 7, 6),(4, 8, 7, 6),(6, 0, 8, 6),(7, 1, 8, 6),(8, 2, 8, 6),(0, 3, 8, 6),(1, 4, 8, 6),(2, 5, 8, 6),(3, 6, 8, 6),(4, 7, 8, 6),(5, 8, 8, 6),(8, 0, 0, 7),(0, 1, 0, 7),(1, 2, 0, 7),(2, 3, 0, 7),(3, 4, 0, 7),(4, 5, 0, 7),(5, 6, 0, 7),(6, 7, 0, 7),(7, 8, 0, 7),(0, 0, 1, 7),(1, 1, 1, 7),(2, 2, 1, 7),(3, 3, 1, 7),(4, 4, 1, 7),(5, 5, 1, 7),(6, 6, 1, 7),(7, 7, 1, 7),(8, 8, 1, 7),(1, 0, 2, 7),(2, 1, 2, 7),(3, 2, 2, 7),(4, 3, 2, 7),(5, 4, 2, 7),(6, 5, 2, 7),(7, 6, 2, 7),(8, 7, 2, 7),(0, 8, 2, 7),(2, 0, 3, 7),(3, 1, 3, 7),(4, 2, 3, 7),(5, 3, 3, 7),(6, 4, 3, 7),(7, 5, 3, 7),(8, 6, 3, 7),(0, 7, 3, 7),(1, 8, 3, 7),(3, 0, 4, 7),(4, 1, 4, 7),(5, 2, 4, 7),(6, 3, 4, 7),(7, 4, 4, 7),(8, 5, 4, 7),(0, 6, 4, 7),(1, 7, 4, 7),(2, 8, 4, 7),(4, 0, 5, 7),(5, 1, 5, 7),(6, 2, 5, 
7),(7, 3, 5, 7),(8, 4, 5, 7),(0, 5, 5, 7),(1, 6, 5, 7),(2, 7, 5, 7),(3, 8, 5, 7),(5, 0, 6, 7),(6, 1, 6, 7),(7, 2, 6, 7),(8, 3, 6, 7),(0, 4, 6, 7),(1, 5, 6, 7),(2, 6, 6, 7),(3, 7, 6, 7),(4, 8, 6, 7),(6, 0, 7, 7),(7, 1, 7, 7),(8, 2, 7, 7),(0, 3, 7, 7),(1, 4, 7, 7),(2, 5, 7, 7),(3, 6, 7, 7),(4, 7, 7, 7),(5, 8, 7, 7),(7, 0, 8, 7),(8, 1, 8, 7),(0, 2, 8, 7),(1, 3, 8, 7),(2, 4, 8, 7),(3, 5, 8, 7),(4, 6, 8, 7),(5, 7, 8, 7),(6, 8, 8, 7),(0, 0, 0, 8),(1, 1, 0, 8),(2, 2, 0, 8),(3, 3, 0, 8),(4, 4, 0, 8),(5, 5, 0, 8),(6, 6, 0, 8),(7, 7, 0, 8),(8, 8, 0, 8),(1, 0, 1, 8),(2, 1, 1, 8),(3, 2, 1, 8),(4, 3, 1, 8),(5, 4, 1, 8),(6, 5, 1, 8),(7, 6, 1, 8),(8, 7, 1, 8),(0, 8, 1, 8),(2, 0, 2, 8),(3, 1, 2, 8),(4, 2, 2, 8),(5, 3, 2, 8),(6, 4, 2, 8),(7, 5, 2, 8),(8, 6, 2, 8),(0, 7, 2, 8),(1, 8, 2, 8),(3, 0, 3, 8),(4, 1, 3, 8),(5, 2, 3, 8),(6, 3, 3, 8),(7, 4, 3, 8),(8, 5, 3, 8),(0, 6, 3, 8),(1, 7, 3, 8),(2, 8, 3, 8),(4, 0, 4, 8),(5, 1, 4, 8),(6, 2, 4, 8),(7, 3, 4, 8),(8, 4, 4, 8),(0, 5, 4, 8),(1, 6, 4, 8),(2, 7, 4, 8),(3, 8, 4, 8),(5, 0, 5, 8),(6, 1, 5, 8),(7, 2, 5, 8),(8, 3, 5, 8),(0, 4, 5, 8),(1, 5, 5, 8),(2, 6, 5, 8),(3, 7, 5, 8),(4, 8, 5, 8),(6, 0, 6, 8),(7, 1, 6, 8),(8, 2, 6, 8),(0, 3, 6, 8),(1, 4, 6, 8),(2, 5, 6, 8),(3, 6, 6, 8),(4, 7, 6, 8),(5, 8, 6, 8),(7, 0, 7, 8),(8, 1, 7, 8),(0, 2, 7, 8),(1, 3, 7, 8),(2, 4, 7, 8),(3, 5, 7, 8),(4, 6, 7, 8),(5, 7, 7, 8),(6, 8, 7, 8),(8, 0, 8, 8),(0, 1, 8, 8),(1, 2, 8, 8),(2, 3, 8, 8),(3, 4, 8, 8),(4, 5, 8, 8),(5, 6, 8, 8),(6, 7, 8, 8),(7, 8, 8, 8)],
[(2, 0, 0, 0),(3, 1, 0, 0),(4, 2, 0, 0),(5, 3, 0, 0),(6, 4, 0, 0),(7, 5, 0, 0),(8, 6, 0, 0),(0, 7, 0, 0),(1, 8, 0, 0),(3, 0, 1, 0),(4, 1, 1, 0),(5, 2, 1, 0),(6, 3, 1, 0),(7, 4, 1, 0),(8, 5, 1, 0),(0, 6, 1, 0),(1, 7, 1, 0),(2, 8, 1, 0),(4, 0, 2, 0),(5, 1, 2, 0),(6, 2, 2, 0),(7, 3, 2, 0),(8, 4, 2, 0),(0, 5, 2, 0),(1, 6, 2, 0),(2, 7, 2, 0),(3, 8, 2, 0),(5, 0, 3, 0),(6, 1, 3, 0),(7, 2, 3, 0),(8, 3, 3, 0),(0, 4, 3, 0),(1, 5, 3, 0),(2, 6, 3, 0),(3, 7, 3, 0),(4, 8, 3, 0),(6, 0, 4, 0),(7, 1, 4, 0),(8, 2, 4, 0),(0, 3, 4, 0),(1, 4, 4, 0),(2, 5, 4, 0),(3, 6, 4, 0),(4, 7, 4, 0),(5, 8, 4, 0),(7, 0, 5, 0),(8, 1, 5, 0),(0, 2, 5, 0),(1, 3, 5, 0),(2, 4, 5, 0),(3, 5, 5, 0),(4, 6, 5, 0),(5, 7, 5, 0),(6, 8, 5, 0),(8, 0, 6, 0),(0, 1, 6, 0),(1, 2, 6, 0),(2, 3, 6, 0),(3, 4, 6, 0),(4, 5, 6, 0),(5, 6, 6, 0),(6, 7, 6, 0),(7, 8, 6, 0),(0, 0, 7, 0),(1, 1, 7, 0),(2, 2, 7, 0),(3, 3, 7, 0),(4, 4, 7, 0),(5, 5, 7, 0),(6, 6, 7, 0),(7, 7, 7, 0),(8, 8, 7, 0),(1, 0, 8, 0),(2, 1, 8, 0),(3, 2, 8, 0),(4, 3, 8, 0),(5, 4, 8, 0),(6, 5, 8, 0),(7, 6, 8, 0),(8, 7, 8, 0),(0, 8, 8, 0),(3, 0, 0, 1),(4, 1, 0, 1),(5, 2, 0, 1),(6, 3, 0, 1),(7, 4, 0, 1),(8, 5, 0, 1),(0, 6, 0, 1),(1, 7, 0, 1),(2, 8, 0, 1),(4, 0, 1, 1),(5, 1, 1, 1),(6, 2, 1, 1),(7, 3, 1, 1),(8, 4, 1, 1),(0, 5, 1, 1),(1, 6, 1, 1),(2, 7, 1, 1),(3, 8, 1, 1),(5, 0, 2, 1),(6, 1, 2, 1),(7, 2, 2, 1),(8, 3, 2, 1),(0, 4, 2, 1),(1, 5, 2, 1),(2, 6, 2, 1),(3, 7, 2, 1),(4, 8, 2, 1),(6, 0, 3, 1),(7, 1, 3, 1),(8, 2, 3, 1),(0, 3, 3, 1),(1, 4, 3, 1),(2, 5, 3, 1),(3, 6, 3, 1),(4, 7, 3, 1),(5, 8, 3, 1),(7, 0, 4, 1),(8, 1, 4, 1),(0, 2, 4, 1),(1, 3, 4, 1),(2, 4, 4, 1),(3, 5, 4, 1),(4, 6, 4, 1),(5, 7, 4, 1),(6, 8, 4, 1),(8, 0, 5, 1),(0, 1, 5, 1),(1, 2, 5, 1),(2, 3, 5, 1),(3, 4, 5, 1),(4, 5, 5, 1),(5, 6, 5, 1),(6, 7, 5, 1),(7, 8, 5, 1),(0, 0, 6, 1),(1, 1, 6, 1),(2, 2, 6, 1),(3, 3, 6, 1),(4, 4, 6, 1),(5, 5, 6, 1),(6, 6, 6, 1),(7, 7, 6, 1),(8, 8, 6, 1),(1, 0, 7, 1),(2, 1, 7, 1),(3, 2, 7, 1),(4, 3, 7, 1),(5, 4, 7, 1),(6, 5, 7, 1),(7, 6, 7, 1),(8, 7, 7, 1),(0, 8, 7, 1),(2, 0, 8, 
1),(3, 1, 8, 1),(4, 2, 8, 1),(5, 3, 8, 1),(6, 4, 8, 1),(7, 5, 8, 1),(8, 6, 8, 1),(0, 7, 8, 1),(1, 8, 8, 1),(4, 0, 0, 2),(5, 1, 0, 2),(6, 2, 0, 2),(7, 3, 0, 2),(8, 4, 0, 2),(0, 5, 0, 2),(1, 6, 0, 2),(2, 7, 0, 2),(3, 8, 0, 2),(5, 0, 1, 2),(6, 1, 1, 2),(7, 2, 1, 2),(8, 3, 1, 2),(0, 4, 1, 2),(1, 5, 1, 2),(2, 6, 1, 2),(3, 7, 1, 2),(4, 8, 1, 2),(6, 0, 2, 2),(7, 1, 2, 2),(8, 2, 2, 2),(0, 3, 2, 2),(1, 4, 2, 2),(2, 5, 2, 2),(3, 6, 2, 2),(4, 7, 2, 2),(5, 8, 2, 2),(7, 0, 3, 2),(8, 1, 3, 2),(0, 2, 3, 2),(1, 3, 3, 2),(2, 4, 3, 2),(3, 5, 3, 2),(4, 6, 3, 2),(5, 7, 3, 2),(6, 8, 3, 2),(8, 0, 4, 2),(0, 1, 4, 2),(1, 2, 4, 2),(2, 3, 4, 2),(3, 4, 4, 2),(4, 5, 4, 2),(5, 6, 4, 2),(6, 7, 4, 2),(7, 8, 4, 2),(0, 0, 5, 2),(1, 1, 5, 2),(2, 2, 5, 2),(3, 3, 5, 2),(4, 4, 5, 2),(5, 5, 5, 2),(6, 6, 5, 2),(7, 7, 5, 2),(8, 8, 5, 2),(1, 0, 6, 2),(2, 1, 6, 2),(3, 2, 6, 2),(4, 3, 6, 2),(5, 4, 6, 2),(6, 5, 6, 2),(7, 6, 6, 2),(8, 7, 6, 2),(0, 8, 6, 2),(2, 0, 7, 2),(3, 1, 7, 2),(4, 2, 7, 2),(5, 3, 7, 2),(6, 4, 7, 2),(7, 5, 7, 2),(8, 6, 7, 2),(0, 7, 7, 2),(1, 8, 7, 2),(3, 0, 8, 2),(4, 1, 8, 2),(5, 2, 8, 2),(6, 3, 8, 2),(7, 4, 8, 2),(8, 5, 8, 2),(0, 6, 8, 2),(1, 7, 8, 2),(2, 8, 8, 2),(5, 0, 0, 3),(6, 1, 0, 3),(7, 2, 0, 3),(8, 3, 0, 3),(0, 4, 0, 3),(1, 5, 0, 3),(2, 6, 0, 3),(3, 7, 0, 3),(4, 8, 0, 3),(6, 0, 1, 3),(7, 1, 1, 3),(8, 2, 1, 3),(0, 3, 1, 3),(1, 4, 1, 3),(2, 5, 1, 3),(3, 6, 1, 3),(4, 7, 1, 3),(5, 8, 1, 3),(7, 0, 2, 3),(8, 1, 2, 3),(0, 2, 2, 3),(1, 3, 2, 3),(2, 4, 2, 3),(3, 5, 2, 3),(4, 6, 2, 3),(5, 7, 2, 3),(6, 8, 2, 3),(8, 0, 3, 3),(0, 1, 3, 3),(1, 2, 3, 3),(2, 3, 3, 3),(3, 4, 3, 3),(4, 5, 3, 3),(5, 6, 3, 3),(6, 7, 3, 3),(7, 8, 3, 3),(0, 0, 4, 3),(1, 1, 4, 3),(2, 2, 4, 3),(3, 3, 4, 3),(4, 4, 4, 3),(5, 5, 4, 3),(6, 6, 4, 3),(7, 7, 4, 3),(8, 8, 4, 3),(1, 0, 5, 3),(2, 1, 5, 3),(3, 2, 5, 3),(4, 3, 5, 3),(5, 4, 5, 3),(6, 5, 5, 3),(7, 6, 5, 3),(8, 7, 5, 3),(0, 8, 5, 3),(2, 0, 6, 3),(3, 1, 6, 3),(4, 2, 6, 3),(5, 3, 6, 3),(6, 4, 6, 3),(7, 5, 6, 3),(8, 6, 6, 3),(0, 7, 6, 3),(1, 8, 6, 3),(3, 0, 7, 3),(4, 1, 
7, 3),(5, 2, 7, 3),(6, 3, 7, 3),(7, 4, 7, 3),(8, 5, 7, 3),(0, 6, 7, 3),(1, 7, 7, 3),(2, 8, 7, 3),(4, 0, 8, 3),(5, 1, 8, 3),(6, 2, 8, 3),(7, 3, 8, 3),(8, 4, 8, 3),(0, 5, 8, 3),(1, 6, 8, 3),(2, 7, 8, 3),(3, 8, 8, 3),(6, 0, 0, 4),(7, 1, 0, 4),(8, 2, 0, 4),(0, 3, 0, 4),(1, 4, 0, 4),(2, 5, 0, 4),(3, 6, 0, 4),(4, 7, 0, 4),(5, 8, 0, 4),(7, 0, 1, 4),(8, 1, 1, 4),(0, 2, 1, 4),(1, 3, 1, 4),(2, 4, 1, 4),(3, 5, 1, 4),(4, 6, 1, 4),(5, 7, 1, 4),(6, 8, 1, 4),(8, 0, 2, 4),(0, 1, 2, 4),(1, 2, 2, 4),(2, 3, 2, 4),(3, 4, 2, 4),(4, 5, 2, 4),(5, 6, 2, 4),(6, 7, 2, 4),(7, 8, 2, 4),(0, 0, 3, 4),(1, 1, 3, 4),(2, 2, 3, 4),(3, 3, 3, 4),(4, 4, 3, 4),(5, 5, 3, 4),(6, 6, 3, 4),(7, 7, 3, 4),(8, 8, 3, 4),(1, 0, 4, 4),(2, 1, 4, 4),(3, 2, 4, 4),(4, 3, 4, 4),(5, 4, 4, 4),(6, 5, 4, 4),(7, 6, 4, 4),(8, 7, 4, 4),(0, 8, 4, 4),(2, 0, 5, 4),(3, 1, 5, 4),(4, 2, 5, 4),(5, 3, 5, 4),(6, 4, 5, 4),(7, 5, 5, 4),(8, 6, 5, 4),(0, 7, 5, 4),(1, 8, 5, 4),(3, 0, 6, 4),(4, 1, 6, 4),(5, 2, 6, 4),(6, 3, 6, 4),(7, 4, 6, 4),(8, 5, 6, 4),(0, 6, 6, 4),(1, 7, 6, 4),(2, 8, 6, 4),(4, 0, 7, 4),(5, 1, 7, 4),(6, 2, 7, 4),(7, 3, 7, 4),(8, 4, 7, 4),(0, 5, 7, 4),(1, 6, 7, 4),(2, 7, 7, 4),(3, 8, 7, 4),(5, 0, 8, 4),(6, 1, 8, 4),(7, 2, 8, 4),(8, 3, 8, 4),(0, 4, 8, 4),(1, 5, 8, 4),(2, 6, 8, 4),(3, 7, 8, 4),(4, 8, 8, 4),(7, 0, 0, 5),(8, 1, 0, 5),(0, 2, 0, 5),(1, 3, 0, 5),(2, 4, 0, 5),(3, 5, 0, 5),(4, 6, 0, 5),(5, 7, 0, 5),(6, 8, 0, 5),(8, 0, 1, 5),(0, 1, 1, 5),(1, 2, 1, 5),(2, 3, 1, 5),(3, 4, 1, 5),(4, 5, 1, 5),(5, 6, 1, 5),(6, 7, 1, 5),(7, 8, 1, 5),(0, 0, 2, 5),(1, 1, 2, 5),(2, 2, 2, 5),(3, 3, 2, 5),(4, 4, 2, 5),(5, 5, 2, 5),(6, 6, 2, 5),(7, 7, 2, 5),(8, 8, 2, 5),(1, 0, 3, 5),(2, 1, 3, 5),(3, 2, 3, 5),(4, 3, 3, 5),(5, 4, 3, 5),(6, 5, 3, 5),(7, 6, 3, 5),(8, 7, 3, 5),(0, 8, 3, 5),(2, 0, 4, 5),(3, 1, 4, 5),(4, 2, 4, 5),(5, 3, 4, 5),(6, 4, 4, 5),(7, 5, 4, 5),(8, 6, 4, 5),(0, 7, 4, 5),(1, 8, 4, 5),(3, 0, 5, 5),(4, 1, 5, 5),(5, 2, 5, 5),(6, 3, 5, 5),(7, 4, 5, 5),(8, 5, 5, 5),(0, 6, 5, 5),(1, 7, 5, 5),(2, 8, 5, 5),(4, 0, 6, 5),(5, 1, 6, 5),(6, 
2, 6, 5),(7, 3, 6, 5),(8, 4, 6, 5),(0, 5, 6, 5),(1, 6, 6, 5),(2, 7, 6, 5),(3, 8, 6, 5),(5, 0, 7, 5),(6, 1, 7, 5),(7, 2, 7, 5),(8, 3, 7, 5),(0, 4, 7, 5),(1, 5, 7, 5),(2, 6, 7, 5),(3, 7, 7, 5),(4, 8, 7, 5),(6, 0, 8, 5),(7, 1, 8, 5),(8, 2, 8, 5),(0, 3, 8, 5),(1, 4, 8, 5),(2, 5, 8, 5),(3, 6, 8, 5),(4, 7, 8, 5),(5, 8, 8, 5),(8, 0, 0, 6),(0, 1, 0, 6),(1, 2, 0, 6),(2, 3, 0, 6),(3, 4, 0, 6),(4, 5, 0, 6),(5, 6, 0, 6),(6, 7, 0, 6),(7, 8, 0, 6),(0, 0, 1, 6),(1, 1, 1, 6),(2, 2, 1, 6),(3, 3, 1, 6),(4, 4, 1, 6),(5, 5, 1, 6),(6, 6, 1, 6),(7, 7, 1, 6),(8, 8, 1, 6),(1, 0, 2, 6),(2, 1, 2, 6),(3, 2, 2, 6),(4, 3, 2, 6),(5, 4, 2, 6),(6, 5, 2, 6),(7, 6, 2, 6),(8, 7, 2, 6),(0, 8, 2, 6),(2, 0, 3, 6),(3, 1, 3, 6),(4, 2, 3, 6),(5, 3, 3, 6),(6, 4, 3, 6),(7, 5, 3, 6),(8, 6, 3, 6),(0, 7, 3, 6),(1, 8, 3, 6),(3, 0, 4, 6),(4, 1, 4, 6),(5, 2, 4, 6),(6, 3, 4, 6),(7, 4, 4, 6),(8, 5, 4, 6),(0, 6, 4, 6),(1, 7, 4, 6),(2, 8, 4, 6),(4, 0, 5, 6),(5, 1, 5, 6),(6, 2, 5, 6),(7, 3, 5, 6),(8, 4, 5, 6),(0, 5, 5, 6),(1, 6, 5, 6),(2, 7, 5, 6),(3, 8, 5, 6),(5, 0, 6, 6),(6, 1, 6, 6),(7, 2, 6, 6),(8, 3, 6, 6),(0, 4, 6, 6),(1, 5, 6, 6),(2, 6, 6, 6),(3, 7, 6, 6),(4, 8, 6, 6),(6, 0, 7, 6),(7, 1, 7, 6),(8, 2, 7, 6),(0, 3, 7, 6),(1, 4, 7, 6),(2, 5, 7, 6),(3, 6, 7, 6),(4, 7, 7, 6),(5, 8, 7, 6),(7, 0, 8, 6),(8, 1, 8, 6),(0, 2, 8, 6),(1, 3, 8, 6),(2, 4, 8, 6),(3, 5, 8, 6),(4, 6, 8, 6),(5, 7, 8, 6),(6, 8, 8, 6),(0, 0, 0, 7),(1, 1, 0, 7),(2, 2, 0, 7),(3, 3, 0, 7),(4, 4, 0, 7),(5, 5, 0, 7),(6, 6, 0, 7),(7, 7, 0, 7),(8, 8, 0, 7),(1, 0, 1, 7),(2, 1, 1, 7),(3, 2, 1, 7),(4, 3, 1, 7),(5, 4, 1, 7),(6, 5, 1, 7),(7, 6, 1, 7),(8, 7, 1, 7),(0, 8, 1, 7),(2, 0, 2, 7),(3, 1, 2, 7),(4, 2, 2, 7),(5, 3, 2, 7),(6, 4, 2, 7),(7, 5, 2, 7),(8, 6, 2, 7),(0, 7, 2, 7),(1, 8, 2, 7),(3, 0, 3, 7),(4, 1, 3, 7),(5, 2, 3, 7),(6, 3, 3, 7),(7, 4, 3, 7),(8, 5, 3, 7),(0, 6, 3, 7),(1, 7, 3, 7),(2, 8, 3, 7),(4, 0, 4, 7),(5, 1, 4, 7),(6, 2, 4, 7),(7, 3, 4, 7),(8, 4, 4, 7),(0, 5, 4, 7),(1, 6, 4, 7),(2, 7, 4, 7),(3, 8, 4, 7),(5, 0, 5, 7),(6, 1, 5, 7),(7, 2, 5, 
7),(8, 3, 5, 7),(0, 4, 5, 7),(1, 5, 5, 7),(2, 6, 5, 7),(3, 7, 5, 7),(4, 8, 5, 7),(6, 0, 6, 7),(7, 1, 6, 7),(8, 2, 6, 7),(0, 3, 6, 7),(1, 4, 6, 7),(2, 5, 6, 7),(3, 6, 6, 7),(4, 7, 6, 7),(5, 8, 6, 7),(7, 0, 7, 7),(8, 1, 7, 7),(0, 2, 7, 7),(1, 3, 7, 7),(2, 4, 7, 7),(3, 5, 7, 7),(4, 6, 7, 7),(5, 7, 7, 7),(6, 8, 7, 7),(8, 0, 8, 7),(0, 1, 8, 7),(1, 2, 8, 7),(2, 3, 8, 7),(3, 4, 8, 7),(4, 5, 8, 7),(5, 6, 8, 7),(6, 7, 8, 7),(7, 8, 8, 7),(1, 0, 0, 8),(2, 1, 0, 8),(3, 2, 0, 8),(4, 3, 0, 8),(5, 4, 0, 8),(6, 5, 0, 8),(7, 6, 0, 8),(8, 7, 0, 8),(0, 8, 0, 8),(2, 0, 1, 8),(3, 1, 1, 8),(4, 2, 1, 8),(5, 3, 1, 8),(6, 4, 1, 8),(7, 5, 1, 8),(8, 6, 1, 8),(0, 7, 1, 8),(1, 8, 1, 8),(3, 0, 2, 8),(4, 1, 2, 8),(5, 2, 2, 8),(6, 3, 2, 8),(7, 4, 2, 8),(8, 5, 2, 8),(0, 6, 2, 8),(1, 7, 2, 8),(2, 8, 2, 8),(4, 0, 3, 8),(5, 1, 3, 8),(6, 2, 3, 8),(7, 3, 3, 8),(8, 4, 3, 8),(0, 5, 3, 8),(1, 6, 3, 8),(2, 7, 3, 8),(3, 8, 3, 8),(5, 0, 4, 8),(6, 1, 4, 8),(7, 2, 4, 8),(8, 3, 4, 8),(0, 4, 4, 8),(1, 5, 4, 8),(2, 6, 4, 8),(3, 7, 4, 8),(4, 8, 4, 8),(6, 0, 5, 8),(7, 1, 5, 8),(8, 2, 5, 8),(0, 3, 5, 8),(1, 4, 5, 8),(2, 5, 5, 8),(3, 6, 5, 8),(4, 7, 5, 8),(5, 8, 5, 8),(7, 0, 6, 8),(8, 1, 6, 8),(0, 2, 6, 8),(1, 3, 6, 8),(2, 4, 6, 8),(3, 5, 6, 8),(4, 6, 6, 8),(5, 7, 6, 8),(6, 8, 6, 8),(8, 0, 7, 8),(0, 1, 7, 8),(1, 2, 7, 8),(2, 3, 7, 8),(3, 4, 7, 8),(4, 5, 7, 8),(5, 6, 7, 8),(6, 7, 7, 8),(7, 8, 7, 8),(0, 0, 8, 8),(1, 1, 8, 8),(2, 2, 8, 8),(3, 3, 8, 8),(4, 4, 8, 8),(5, 5, 8, 8),(6, 6, 8, 8),(7, 7, 8, 8),(8, 8, 8, 8)],
[(3, 0, 0, 0),(4, 1, 0, 0),(5, 2, 0, 0),(6, 3, 0, 0),(7, 4, 0, 0),(8, 5, 0, 0),(0, 6, 0, 0),(1, 7, 0, 0),(2, 8, 0, 0),(4, 0, 1, 0),(5, 1, 1, 0),(6, 2, 1, 0),(7, 3, 1, 0),(8, 4, 1, 0),(0, 5, 1, 0),(1, 6, 1, 0),(2, 7, 1, 0),(3, 8, 1, 0),(5, 0, 2, 0),(6, 1, 2, 0),(7, 2, 2, 0),(8, 3, 2, 0),(0, 4, 2, 0),(1, 5, 2, 0),(2, 6, 2, 0),(3, 7, 2, 0),(4, 8, 2, 0),(6, 0, 3, 0),(7, 1, 3, 0),(8, 2, 3, 0),(0, 3, 3, 0),(1, 4, 3, 0),(2, 5, 3, 0),(3, 6, 3, 0),(4, 7, 3, 0),(5, 8, 3, 0),(7, 0, 4, 0),(8, 1, 4, 0),(0, 2, 4, 0),(1, 3, 4, 0),(2, 4, 4, 0),(3, 5, 4, 0),(4, 6, 4, 0),(5, 7, 4, 0),(6, 8, 4, 0),(8, 0, 5, 0),(0, 1, 5, 0),(1, 2, 5, 0),(2, 3, 5, 0),(3, 4, 5, 0),(4, 5, 5, 0),(5, 6, 5, 0),(6, 7, 5, 0),(7, 8, 5, 0),(0, 0, 6, 0),(1, 1, 6, 0),(2, 2, 6, 0),(3, 3, 6, 0),(4, 4, 6, 0),(5, 5, 6, 0),(6, 6, 6, 0),(7, 7, 6, 0),(8, 8, 6, 0),(1, 0, 7, 0),(2, 1, 7, 0),(3, 2, 7, 0),(4, 3, 7, 0),(5, 4, 7, 0),(6, 5, 7, 0),(7, 6, 7, 0),(8, 7, 7, 0),(0, 8, 7, 0),(2, 0, 8, 0),(3, 1, 8, 0),(4, 2, 8, 0),(5, 3, 8, 0),(6, 4, 8, 0),(7, 5, 8, 0),(8, 6, 8, 0),(0, 7, 8, 0),(1, 8, 8, 0),(4, 0, 0, 1),(5, 1, 0, 1),(6, 2, 0, 1),(7, 3, 0, 1),(8, 4, 0, 1),(0, 5, 0, 1),(1, 6, 0, 1),(2, 7, 0, 1),(3, 8, 0, 1),(5, 0, 1, 1),(6, 1, 1, 1),(7, 2, 1, 1),(8, 3, 1, 1),(0, 4, 1, 1),(1, 5, 1, 1),(2, 6, 1, 1),(3, 7, 1, 1),(4, 8, 1, 1),(6, 0, 2, 1),(7, 1, 2, 1),(8, 2, 2, 1),(0, 3, 2, 1),(1, 4, 2, 1),(2, 5, 2, 1),(3, 6, 2, 1),(4, 7, 2, 1),(5, 8, 2, 1),(7, 0, 3, 1),(8, 1, 3, 1),(0, 2, 3, 1),(1, 3, 3, 1),(2, 4, 3, 1),(3, 5, 3, 1),(4, 6, 3, 1),(5, 7, 3, 1),(6, 8, 3, 1),(8, 0, 4, 1),(0, 1, 4, 1),(1, 2, 4, 1),(2, 3, 4, 1),(3, 4, 4, 1),(4, 5, 4, 1),(5, 6, 4, 1),(6, 7, 4, 1),(7, 8, 4, 1),(0, 0, 5, 1),(1, 1, 5, 1),(2, 2, 5, 1),(3, 3, 5, 1),(4, 4, 5, 1),(5, 5, 5, 1),(6, 6, 5, 1),(7, 7, 5, 1),(8, 8, 5, 1),(1, 0, 6, 1),(2, 1, 6, 1),(3, 2, 6, 1),(4, 3, 6, 1),(5, 4, 6, 1),(6, 5, 6, 1),(7, 6, 6, 1),(8, 7, 6, 1),(0, 8, 6, 1),(2, 0, 7, 1),(3, 1, 7, 1),(4, 2, 7, 1),(5, 3, 7, 1),(6, 4, 7, 1),(7, 5, 7, 1),(8, 6, 7, 1),(0, 7, 7, 1),(1, 8, 7, 1),(3, 0, 8, 
1),(4, 1, 8, 1),(5, 2, 8, 1),(6, 3, 8, 1),(7, 4, 8, 1),(8, 5, 8, 1),(0, 6, 8, 1),(1, 7, 8, 1),(2, 8, 8, 1),(5, 0, 0, 2),(6, 1, 0, 2),(7, 2, 0, 2),(8, 3, 0, 2),(0, 4, 0, 2),(1, 5, 0, 2),(2, 6, 0, 2),(3, 7, 0, 2),(4, 8, 0, 2),(6, 0, 1, 2),(7, 1, 1, 2),(8, 2, 1, 2),(0, 3, 1, 2),(1, 4, 1, 2),(2, 5, 1, 2),(3, 6, 1, 2),(4, 7, 1, 2),(5, 8, 1, 2),(7, 0, 2, 2),(8, 1, 2, 2),(0, 2, 2, 2),(1, 3, 2, 2),(2, 4, 2, 2),(3, 5, 2, 2),(4, 6, 2, 2),(5, 7, 2, 2),(6, 8, 2, 2),(8, 0, 3, 2),(0, 1, 3, 2),(1, 2, 3, 2),(2, 3, 3, 2),(3, 4, 3, 2),(4, 5, 3, 2),(5, 6, 3, 2),(6, 7, 3, 2),(7, 8, 3, 2),(0, 0, 4, 2),(1, 1, 4, 2),(2, 2, 4, 2),(3, 3, 4, 2),(4, 4, 4, 2),(5, 5, 4, 2),(6, 6, 4, 2),(7, 7, 4, 2),(8, 8, 4, 2),(1, 0, 5, 2),(2, 1, 5, 2),(3, 2, 5, 2),(4, 3, 5, 2),(5, 4, 5, 2),(6, 5, 5, 2),(7, 6, 5, 2),(8, 7, 5, 2),(0, 8, 5, 2),(2, 0, 6, 2),(3, 1, 6, 2),(4, 2, 6, 2),(5, 3, 6, 2),(6, 4, 6, 2),(7, 5, 6, 2),(8, 6, 6, 2),(0, 7, 6, 2),(1, 8, 6, 2),(3, 0, 7, 2),(4, 1, 7, 2),(5, 2, 7, 2),(6, 3, 7, 2),(7, 4, 7, 2),(8, 5, 7, 2),(0, 6, 7, 2),(1, 7, 7, 2),(2, 8, 7, 2),(4, 0, 8, 2),(5, 1, 8, 2),(6, 2, 8, 2),(7, 3, 8, 2),(8, 4, 8, 2),(0, 5, 8, 2),(1, 6, 8, 2),(2, 7, 8, 2),(3, 8, 8, 2),(6, 0, 0, 3),(7, 1, 0, 3),(8, 2, 0, 3),(0, 3, 0, 3),(1, 4, 0, 3),(2, 5, 0, 3),(3, 6, 0, 3),(4, 7, 0, 3),(5, 8, 0, 3),(7, 0, 1, 3),(8, 1, 1, 3),(0, 2, 1, 3),(1, 3, 1, 3),(2, 4, 1, 3),(3, 5, 1, 3),(4, 6, 1, 3),(5, 7, 1, 3),(6, 8, 1, 3),(8, 0, 2, 3),(0, 1, 2, 3),(1, 2, 2, 3),(2, 3, 2, 3),(3, 4, 2, 3),(4, 5, 2, 3),(5, 6, 2, 3),(6, 7, 2, 3),(7, 8, 2, 3),(0, 0, 3, 3),(1, 1, 3, 3),(2, 2, 3, 3),(3, 3, 3, 3),(4, 4, 3, 3),(5, 5, 3, 3),(6, 6, 3, 3),(7, 7, 3, 3),(8, 8, 3, 3),(1, 0, 4, 3),(2, 1, 4, 3),(3, 2, 4, 3),(4, 3, 4, 3),(5, 4, 4, 3),(6, 5, 4, 3),(7, 6, 4, 3),(8, 7, 4, 3),(0, 8, 4, 3),(2, 0, 5, 3),(3, 1, 5, 3),(4, 2, 5, 3),(5, 3, 5, 3),(6, 4, 5, 3),(7, 5, 5, 3),(8, 6, 5, 3),(0, 7, 5, 3),(1, 8, 5, 3),(3, 0, 6, 3),(4, 1, 6, 3),(5, 2, 6, 3),(6, 3, 6, 3),(7, 4, 6, 3),(8, 5, 6, 3),(0, 6, 6, 3),(1, 7, 6, 3),(2, 8, 6, 3),(4, 0, 7, 3),(5, 1, 
7, 3),(6, 2, 7, 3),(7, 3, 7, 3),(8, 4, 7, 3),(0, 5, 7, 3),(1, 6, 7, 3),(2, 7, 7, 3),(3, 8, 7, 3),(5, 0, 8, 3),(6, 1, 8, 3),(7, 2, 8, 3),(8, 3, 8, 3),(0, 4, 8, 3),(1, 5, 8, 3),(2, 6, 8, 3),(3, 7, 8, 3),(4, 8, 8, 3),(7, 0, 0, 4),(8, 1, 0, 4),(0, 2, 0, 4),(1, 3, 0, 4),(2, 4, 0, 4),(3, 5, 0, 4),(4, 6, 0, 4),(5, 7, 0, 4),(6, 8, 0, 4),(8, 0, 1, 4),(0, 1, 1, 4),(1, 2, 1, 4),(2, 3, 1, 4),(3, 4, 1, 4),(4, 5, 1, 4),(5, 6, 1, 4),(6, 7, 1, 4),(7, 8, 1, 4),(0, 0, 2, 4),(1, 1, 2, 4),(2, 2, 2, 4),(3, 3, 2, 4),(4, 4, 2, 4),(5, 5, 2, 4),(6, 6, 2, 4),(7, 7, 2, 4),(8, 8, 2, 4),(1, 0, 3, 4),(2, 1, 3, 4),(3, 2, 3, 4),(4, 3, 3, 4),(5, 4, 3, 4),(6, 5, 3, 4),(7, 6, 3, 4),(8, 7, 3, 4),(0, 8, 3, 4),(2, 0, 4, 4),(3, 1, 4, 4),(4, 2, 4, 4),(5, 3, 4, 4),(6, 4, 4, 4),(7, 5, 4, 4),(8, 6, 4, 4),(0, 7, 4, 4),(1, 8, 4, 4),(3, 0, 5, 4),(4, 1, 5, 4),(5, 2, 5, 4),(6, 3, 5, 4),(7, 4, 5, 4),(8, 5, 5, 4),(0, 6, 5, 4),(1, 7, 5, 4),(2, 8, 5, 4),(4, 0, 6, 4),(5, 1, 6, 4),(6, 2, 6, 4),(7, 3, 6, 4),(8, 4, 6, 4),(0, 5, 6, 4),(1, 6, 6, 4),(2, 7, 6, 4),(3, 8, 6, 4),(5, 0, 7, 4),(6, 1, 7, 4),(7, 2, 7, 4),(8, 3, 7, 4),(0, 4, 7, 4),(1, 5, 7, 4),(2, 6, 7, 4),(3, 7, 7, 4),(4, 8, 7, 4),(6, 0, 8, 4),(7, 1, 8, 4),(8, 2, 8, 4),(0, 3, 8, 4),(1, 4, 8, 4),(2, 5, 8, 4),(3, 6, 8, 4),(4, 7, 8, 4),(5, 8, 8, 4),(8, 0, 0, 5),(0, 1, 0, 5),(1, 2, 0, 5),(2, 3, 0, 5),(3, 4, 0, 5),(4, 5, 0, 5),(5, 6, 0, 5),(6, 7, 0, 5),(7, 8, 0, 5),(0, 0, 1, 5),(1, 1, 1, 5),(2, 2, 1, 5),(3, 3, 1, 5),(4, 4, 1, 5),(5, 5, 1, 5),(6, 6, 1, 5),(7, 7, 1, 5),(8, 8, 1, 5),(1, 0, 2, 5),(2, 1, 2, 5),(3, 2, 2, 5),(4, 3, 2, 5),(5, 4, 2, 5),(6, 5, 2, 5),(7, 6, 2, 5),(8, 7, 2, 5),(0, 8, 2, 5),(2, 0, 3, 5),(3, 1, 3, 5),(4, 2, 3, 5),(5, 3, 3, 5),(6, 4, 3, 5),(7, 5, 3, 5),(8, 6, 3, 5),(0, 7, 3, 5),(1, 8, 3, 5),(3, 0, 4, 5),(4, 1, 4, 5),(5, 2, 4, 5),(6, 3, 4, 5),(7, 4, 4, 5),(8, 5, 4, 5),(0, 6, 4, 5),(1, 7, 4, 5),(2, 8, 4, 5),(4, 0, 5, 5),(5, 1, 5, 5),(6, 2, 5, 5),(7, 3, 5, 5),(8, 4, 5, 5),(0, 5, 5, 5),(1, 6, 5, 5),(2, 7, 5, 5),(3, 8, 5, 5),(5, 0, 6, 5),(6, 1, 6, 5),(7, 
2, 6, 5),(8, 3, 6, 5),(0, 4, 6, 5),(1, 5, 6, 5),(2, 6, 6, 5),(3, 7, 6, 5),(4, 8, 6, 5),(6, 0, 7, 5),(7, 1, 7, 5),(8, 2, 7, 5),(0, 3, 7, 5),(1, 4, 7, 5),(2, 5, 7, 5),(3, 6, 7, 5),(4, 7, 7, 5),(5, 8, 7, 5),(7, 0, 8, 5),(8, 1, 8, 5),(0, 2, 8, 5),(1, 3, 8, 5),(2, 4, 8, 5),(3, 5, 8, 5),(4, 6, 8, 5),(5, 7, 8, 5),(6, 8, 8, 5),(0, 0, 0, 6),(1, 1, 0, 6),(2, 2, 0, 6),(3, 3, 0, 6),(4, 4, 0, 6),(5, 5, 0, 6),(6, 6, 0, 6),(7, 7, 0, 6),(8, 8, 0, 6),(1, 0, 1, 6),(2, 1, 1, 6),(3, 2, 1, 6),(4, 3, 1, 6),(5, 4, 1, 6),(6, 5, 1, 6),(7, 6, 1, 6),(8, 7, 1, 6),(0, 8, 1, 6),(2, 0, 2, 6),(3, 1, 2, 6),(4, 2, 2, 6),(5, 3, 2, 6),(6, 4, 2, 6),(7, 5, 2, 6),(8, 6, 2, 6),(0, 7, 2, 6),(1, 8, 2, 6),(3, 0, 3, 6),(4, 1, 3, 6),(5, 2, 3, 6),(6, 3, 3, 6),(7, 4, 3, 6),(8, 5, 3, 6),(0, 6, 3, 6),(1, 7, 3, 6),(2, 8, 3, 6),(4, 0, 4, 6),(5, 1, 4, 6),(6, 2, 4, 6),(7, 3, 4, 6),(8, 4, 4, 6),(0, 5, 4, 6),(1, 6, 4, 6),(2, 7, 4, 6),(3, 8, 4, 6),(5, 0, 5, 6),(6, 1, 5, 6),(7, 2, 5, 6),(8, 3, 5, 6),(0, 4, 5, 6),(1, 5, 5, 6),(2, 6, 5, 6),(3, 7, 5, 6),(4, 8, 5, 6),(6, 0, 6, 6),(7, 1, 6, 6),(8, 2, 6, 6),(0, 3, 6, 6),(1, 4, 6, 6),(2, 5, 6, 6),(3, 6, 6, 6),(4, 7, 6, 6),(5, 8, 6, 6),(7, 0, 7, 6),(8, 1, 7, 6),(0, 2, 7, 6),(1, 3, 7, 6),(2, 4, 7, 6),(3, 5, 7, 6),(4, 6, 7, 6),(5, 7, 7, 6),(6, 8, 7, 6),(8, 0, 8, 6),(0, 1, 8, 6),(1, 2, 8, 6),(2, 3, 8, 6),(3, 4, 8, 6),(4, 5, 8, 6),(5, 6, 8, 6),(6, 7, 8, 6),(7, 8, 8, 6),(1, 0, 0, 7),(2, 1, 0, 7),(3, 2, 0, 7),(4, 3, 0, 7),(5, 4, 0, 7),(6, 5, 0, 7),(7, 6, 0, 7),(8, 7, 0, 7),(0, 8, 0, 7),(2, 0, 1, 7),(3, 1, 1, 7),(4, 2, 1, 7),(5, 3, 1, 7),(6, 4, 1, 7),(7, 5, 1, 7),(8, 6, 1, 7),(0, 7, 1, 7),(1, 8, 1, 7),(3, 0, 2, 7),(4, 1, 2, 7),(5, 2, 2, 7),(6, 3, 2, 7),(7, 4, 2, 7),(8, 5, 2, 7),(0, 6, 2, 7),(1, 7, 2, 7),(2, 8, 2, 7),(4, 0, 3, 7),(5, 1, 3, 7),(6, 2, 3, 7),(7, 3, 3, 7),(8, 4, 3, 7),(0, 5, 3, 7),(1, 6, 3, 7),(2, 7, 3, 7),(3, 8, 3, 7),(5, 0, 4, 7),(6, 1, 4, 7),(7, 2, 4, 7),(8, 3, 4, 7),(0, 4, 4, 7),(1, 5, 4, 7),(2, 6, 4, 7),(3, 7, 4, 7),(4, 8, 4, 7),(6, 0, 5, 7),(7, 1, 5, 7),(8, 2, 5, 
7),(0, 3, 5, 7),(1, 4, 5, 7),(2, 5, 5, 7),(3, 6, 5, 7),(4, 7, 5, 7),(5, 8, 5, 7),(7, 0, 6, 7),(8, 1, 6, 7),(0, 2, 6, 7),(1, 3, 6, 7),(2, 4, 6, 7),(3, 5, 6, 7),(4, 6, 6, 7),(5, 7, 6, 7),(6, 8, 6, 7),(8, 0, 7, 7),(0, 1, 7, 7),(1, 2, 7, 7),(2, 3, 7, 7),(3, 4, 7, 7),(4, 5, 7, 7),(5, 6, 7, 7),(6, 7, 7, 7),(7, 8, 7, 7),(0, 0, 8, 7),(1, 1, 8, 7),(2, 2, 8, 7),(3, 3, 8, 7),(4, 4, 8, 7),(5, 5, 8, 7),(6, 6, 8, 7),(7, 7, 8, 7),(8, 8, 8, 7),(2, 0, 0, 8),(3, 1, 0, 8),(4, 2, 0, 8),(5, 3, 0, 8),(6, 4, 0, 8),(7, 5, 0, 8),(8, 6, 0, 8),(0, 7, 0, 8),(1, 8, 0, 8),(3, 0, 1, 8),(4, 1, 1, 8),(5, 2, 1, 8),(6, 3, 1, 8),(7, 4, 1, 8),(8, 5, 1, 8),(0, 6, 1, 8),(1, 7, 1, 8),(2, 8, 1, 8),(4, 0, 2, 8),(5, 1, 2, 8),(6, 2, 2, 8),(7, 3, 2, 8),(8, 4, 2, 8),(0, 5, 2, 8),(1, 6, 2, 8),(2, 7, 2, 8),(3, 8, 2, 8),(5, 0, 3, 8),(6, 1, 3, 8),(7, 2, 3, 8),(8, 3, 3, 8),(0, 4, 3, 8),(1, 5, 3, 8),(2, 6, 3, 8),(3, 7, 3, 8),(4, 8, 3, 8),(6, 0, 4, 8),(7, 1, 4, 8),(8, 2, 4, 8),(0, 3, 4, 8),(1, 4, 4, 8),(2, 5, 4, 8),(3, 6, 4, 8),(4, 7, 4, 8),(5, 8, 4, 8),(7, 0, 5, 8),(8, 1, 5, 8),(0, 2, 5, 8),(1, 3, 5, 8),(2, 4, 5, 8),(3, 5, 5, 8),(4, 6, 5, 8),(5, 7, 5, 8),(6, 8, 5, 8),(8, 0, 6, 8),(0, 1, 6, 8),(1, 2, 6, 8),(2, 3, 6, 8),(3, 4, 6, 8),(4, 5, 6, 8),(5, 6, 6, 8),(6, 7, 6, 8),(7, 8, 6, 8),(0, 0, 7, 8),(1, 1, 7, 8),(2, 2, 7, 8),(3, 3, 7, 8),(4, 4, 7, 8),(5, 5, 7, 8),(6, 6, 7, 8),(7, 7, 7, 8),(8, 8, 7, 8),(1, 0, 8, 8),(2, 1, 8, 8),(3, 2, 8, 8),(4, 3, 8, 8),(5, 4, 8, 8),(6, 5, 8, 8),(7, 6, 8, 8),(8, 7, 8, 8),(0, 8, 8, 8)],
[(4, 0, 0, 0),(5, 1, 0, 0),(6, 2, 0, 0),(7, 3, 0, 0),(8, 4, 0, 0),(0, 5, 0, 0),(1, 6, 0, 0),(2, 7, 0, 0),(3, 8, 0, 0),(5, 0, 1, 0),(6, 1, 1, 0),(7, 2, 1, 0),(8, 3, 1, 0),(0, 4, 1, 0),(1, 5, 1, 0),(2, 6, 1, 0),(3, 7, 1, 0),(4, 8, 1, 0),(6, 0, 2, 0),(7, 1, 2, 0),(8, 2, 2, 0),(0, 3, 2, 0),(1, 4, 2, 0),(2, 5, 2, 0),(3, 6, 2, 0),(4, 7, 2, 0),(5, 8, 2, 0),(7, 0, 3, 0),(8, 1, 3, 0),(0, 2, 3, 0),(1, 3, 3, 0),(2, 4, 3, 0),(3, 5, 3, 0),(4, 6, 3, 0),(5, 7, 3, 0),(6, 8, 3, 0),(8, 0, 4, 0),(0, 1, 4, 0),(1, 2, 4, 0),(2, 3, 4, 0),(3, 4, 4, 0),(4, 5, 4, 0),(5, 6, 4, 0),(6, 7, 4, 0),(7, 8, 4, 0),(0, 0, 5, 0),(1, 1, 5, 0),(2, 2, 5, 0),(3, 3, 5, 0),(4, 4, 5, 0),(5, 5, 5, 0),(6, 6, 5, 0),(7, 7, 5, 0),(8, 8, 5, 0),(1, 0, 6, 0),(2, 1, 6, 0),(3, 2, 6, 0),(4, 3, 6, 0),(5, 4, 6, 0),(6, 5, 6, 0),(7, 6, 6, 0),(8, 7, 6, 0),(0, 8, 6, 0),(2, 0, 7, 0),(3, 1, 7, 0),(4, 2, 7, 0),(5, 3, 7, 0),(6, 4, 7, 0),(7, 5, 7, 0),(8, 6, 7, 0),(0, 7, 7, 0),(1, 8, 7, 0),(3, 0, 8, 0),(4, 1, 8, 0),(5, 2, 8, 0),(6, 3, 8, 0),(7, 4, 8, 0),(8, 5, 8, 0),(0, 6, 8, 0),(1, 7, 8, 0),(2, 8, 8, 0),(5, 0, 0, 1),(6, 1, 0, 1),(7, 2, 0, 1),(8, 3, 0, 1),(0, 4, 0, 1),(1, 5, 0, 1),(2, 6, 0, 1),(3, 7, 0, 1),(4, 8, 0, 1),(6, 0, 1, 1),(7, 1, 1, 1),(8, 2, 1, 1),(0, 3, 1, 1),(1, 4, 1, 1),(2, 5, 1, 1),(3, 6, 1, 1),(4, 7, 1, 1),(5, 8, 1, 1),(7, 0, 2, 1),(8, 1, 2, 1),(0, 2, 2, 1),(1, 3, 2, 1),(2, 4, 2, 1),(3, 5, 2, 1),(4, 6, 2, 1),(5, 7, 2, 1),(6, 8, 2, 1),(8, 0, 3, 1),(0, 1, 3, 1),(1, 2, 3, 1),(2, 3, 3, 1),(3, 4, 3, 1),(4, 5, 3, 1),(5, 6, 3, 1),(6, 7, 3, 1),(7, 8, 3, 1),(0, 0, 4, 1),(1, 1, 4, 1),(2, 2, 4, 1),(3, 3, 4, 1),(4, 4, 4, 1),(5, 5, 4, 1),(6, 6, 4, 1),(7, 7, 4, 1),(8, 8, 4, 1),(1, 0, 5, 1),(2, 1, 5, 1),(3, 2, 5, 1),(4, 3, 5, 1),(5, 4, 5, 1),(6, 5, 5, 1),(7, 6, 5, 1),(8, 7, 5, 1),(0, 8, 5, 1),(2, 0, 6, 1),(3, 1, 6, 1),(4, 2, 6, 1),(5, 3, 6, 1),(6, 4, 6, 1),(7, 5, 6, 1),(8, 6, 6, 1),(0, 7, 6, 1),(1, 8, 6, 1),(3, 0, 7, 1),(4, 1, 7, 1),(5, 2, 7, 1),(6, 3, 7, 1),(7, 4, 7, 1),(8, 5, 7, 1),(0, 6, 7, 1),(1, 7, 7, 1),(2, 8, 7, 1),(4, 0, 8, 
1),(5, 1, 8, 1),(6, 2, 8, 1),(7, 3, 8, 1),(8, 4, 8, 1),(0, 5, 8, 1),(1, 6, 8, 1),(2, 7, 8, 1),(3, 8, 8, 1),(6, 0, 0, 2),(7, 1, 0, 2),(8, 2, 0, 2),(0, 3, 0, 2),(1, 4, 0, 2),(2, 5, 0, 2),(3, 6, 0, 2),(4, 7, 0, 2),(5, 8, 0, 2),(7, 0, 1, 2),(8, 1, 1, 2),(0, 2, 1, 2),(1, 3, 1, 2),(2, 4, 1, 2),(3, 5, 1, 2),(4, 6, 1, 2),(5, 7, 1, 2),(6, 8, 1, 2),(8, 0, 2, 2),(0, 1, 2, 2),(1, 2, 2, 2),(2, 3, 2, 2),(3, 4, 2, 2),(4, 5, 2, 2),(5, 6, 2, 2),(6, 7, 2, 2),(7, 8, 2, 2),(0, 0, 3, 2),(1, 1, 3, 2),(2, 2, 3, 2),(3, 3, 3, 2),(4, 4, 3, 2),(5, 5, 3, 2),(6, 6, 3, 2),(7, 7, 3, 2),(8, 8, 3, 2),(1, 0, 4, 2),(2, 1, 4, 2),(3, 2, 4, 2),(4, 3, 4, 2),(5, 4, 4, 2),(6, 5, 4, 2),(7, 6, 4, 2),(8, 7, 4, 2),(0, 8, 4, 2),(2, 0, 5, 2),(3, 1, 5, 2),(4, 2, 5, 2),(5, 3, 5, 2),(6, 4, 5, 2),(7, 5, 5, 2),(8, 6, 5, 2),(0, 7, 5, 2),(1, 8, 5, 2),(3, 0, 6, 2),(4, 1, 6, 2),(5, 2, 6, 2),(6, 3, 6, 2),(7, 4, 6, 2),(8, 5, 6, 2),(0, 6, 6, 2),(1, 7, 6, 2),(2, 8, 6, 2),(4, 0, 7, 2),(5, 1, 7, 2),(6, 2, 7, 2),(7, 3, 7, 2),(8, 4, 7, 2),(0, 5, 7, 2),(1, 6, 7, 2),(2, 7, 7, 2),(3, 8, 7, 2),(5, 0, 8, 2),(6, 1, 8, 2),(7, 2, 8, 2),(8, 3, 8, 2),(0, 4, 8, 2),(1, 5, 8, 2),(2, 6, 8, 2),(3, 7, 8, 2),(4, 8, 8, 2),(7, 0, 0, 3),(8, 1, 0, 3),(0, 2, 0, 3),(1, 3, 0, 3),(2, 4, 0, 3),(3, 5, 0, 3),(4, 6, 0, 3),(5, 7, 0, 3),(6, 8, 0, 3),(8, 0, 1, 3),(0, 1, 1, 3),(1, 2, 1, 3),(2, 3, 1, 3),(3, 4, 1, 3),(4, 5, 1, 3),(5, 6, 1, 3),(6, 7, 1, 3),(7, 8, 1, 3),(0, 0, 2, 3),(1, 1, 2, 3),(2, 2, 2, 3),(3, 3, 2, 3),(4, 4, 2, 3),(5, 5, 2, 3),(6, 6, 2, 3),(7, 7, 2, 3),(8, 8, 2, 3),(1, 0, 3, 3),(2, 1, 3, 3),(3, 2, 3, 3),(4, 3, 3, 3),(5, 4, 3, 3),(6, 5, 3, 3),(7, 6, 3, 3),(8, 7, 3, 3),(0, 8, 3, 3),(2, 0, 4, 3),(3, 1, 4, 3),(4, 2, 4, 3),(5, 3, 4, 3),(6, 4, 4, 3),(7, 5, 4, 3),(8, 6, 4, 3),(0, 7, 4, 3),(1, 8, 4, 3),(3, 0, 5, 3),(4, 1, 5, 3),(5, 2, 5, 3),(6, 3, 5, 3),(7, 4, 5, 3),(8, 5, 5, 3),(0, 6, 5, 3),(1, 7, 5, 3),(2, 8, 5, 3),(4, 0, 6, 3),(5, 1, 6, 3),(6, 2, 6, 3),(7, 3, 6, 3),(8, 4, 6, 3),(0, 5, 6, 3),(1, 6, 6, 3),(2, 7, 6, 3),(3, 8, 6, 3),(5, 0, 7, 3),(6, 1, 
7, 3),(7, 2, 7, 3),(8, 3, 7, 3),(0, 4, 7, 3),(1, 5, 7, 3),(2, 6, 7, 3),(3, 7, 7, 3),(4, 8, 7, 3),(6, 0, 8, 3),(7, 1, 8, 3),(8, 2, 8, 3),(0, 3, 8, 3),(1, 4, 8, 3),(2, 5, 8, 3),(3, 6, 8, 3),(4, 7, 8, 3),(5, 8, 8, 3),(8, 0, 0, 4),(0, 1, 0, 4),(1, 2, 0, 4),(2, 3, 0, 4),(3, 4, 0, 4),(4, 5, 0, 4),(5, 6, 0, 4),(6, 7, 0, 4),(7, 8, 0, 4),(0, 0, 1, 4),(1, 1, 1, 4),(2, 2, 1, 4),(3, 3, 1, 4),(4, 4, 1, 4),(5, 5, 1, 4),(6, 6, 1, 4),(7, 7, 1, 4),(8, 8, 1, 4),(1, 0, 2, 4),(2, 1, 2, 4),(3, 2, 2, 4),(4, 3, 2, 4),(5, 4, 2, 4),(6, 5, 2, 4),(7, 6, 2, 4),(8, 7, 2, 4),(0, 8, 2, 4),(2, 0, 3, 4),(3, 1, 3, 4),(4, 2, 3, 4),(5, 3, 3, 4),(6, 4, 3, 4),(7, 5, 3, 4),(8, 6, 3, 4),(0, 7, 3, 4),(1, 8, 3, 4),(3, 0, 4, 4),(4, 1, 4, 4),(5, 2, 4, 4),(6, 3, 4, 4),(7, 4, 4, 4),(8, 5, 4, 4),(0, 6, 4, 4),(1, 7, 4, 4),(2, 8, 4, 4),(4, 0, 5, 4),(5, 1, 5, 4),(6, 2, 5, 4),(7, 3, 5, 4),(8, 4, 5, 4),(0, 5, 5, 4),(1, 6, 5, 4),(2, 7, 5, 4),(3, 8, 5, 4),(5, 0, 6, 4),(6, 1, 6, 4),(7, 2, 6, 4),(8, 3, 6, 4),(0, 4, 6, 4),(1, 5, 6, 4),(2, 6, 6, 4),(3, 7, 6, 4),(4, 8, 6, 4),(6, 0, 7, 4),(7, 1, 7, 4),(8, 2, 7, 4),(0, 3, 7, 4),(1, 4, 7, 4),(2, 5, 7, 4),(3, 6, 7, 4),(4, 7, 7, 4),(5, 8, 7, 4),(7, 0, 8, 4),(8, 1, 8, 4),(0, 2, 8, 4),(1, 3, 8, 4),(2, 4, 8, 4),(3, 5, 8, 4),(4, 6, 8, 4),(5, 7, 8, 4),(6, 8, 8, 4),(0, 0, 0, 5),(1, 1, 0, 5),(2, 2, 0, 5),(3, 3, 0, 5),(4, 4, 0, 5),(5, 5, 0, 5),(6, 6, 0, 5),(7, 7, 0, 5),(8, 8, 0, 5),(1, 0, 1, 5),(2, 1, 1, 5),(3, 2, 1, 5),(4, 3, 1, 5),(5, 4, 1, 5),(6, 5, 1, 5),(7, 6, 1, 5),(8, 7, 1, 5),(0, 8, 1, 5),(2, 0, 2, 5),(3, 1, 2, 5),(4, 2, 2, 5),(5, 3, 2, 5),(6, 4, 2, 5),(7, 5, 2, 5),(8, 6, 2, 5),(0, 7, 2, 5),(1, 8, 2, 5),(3, 0, 3, 5),(4, 1, 3, 5),(5, 2, 3, 5),(6, 3, 3, 5),(7, 4, 3, 5),(8, 5, 3, 5),(0, 6, 3, 5),(1, 7, 3, 5),(2, 8, 3, 5),(4, 0, 4, 5),(5, 1, 4, 5),(6, 2, 4, 5),(7, 3, 4, 5),(8, 4, 4, 5),(0, 5, 4, 5),(1, 6, 4, 5),(2, 7, 4, 5),(3, 8, 4, 5),(5, 0, 5, 5),(6, 1, 5, 5),(7, 2, 5, 5),(8, 3, 5, 5),(0, 4, 5, 5),(1, 5, 5, 5),(2, 6, 5, 5),(3, 7, 5, 5),(4, 8, 5, 5),(6, 0, 6, 5),(7, 1, 6, 5),(8, 
2, 6, 5),(0, 3, 6, 5),(1, 4, 6, 5),(2, 5, 6, 5),(3, 6, 6, 5),(4, 7, 6, 5),(5, 8, 6, 5),(7, 0, 7, 5),(8, 1, 7, 5),(0, 2, 7, 5),(1, 3, 7, 5),(2, 4, 7, 5),(3, 5, 7, 5),(4, 6, 7, 5),(5, 7, 7, 5),(6, 8, 7, 5),(8, 0, 8, 5),(0, 1, 8, 5),(1, 2, 8, 5),(2, 3, 8, 5),(3, 4, 8, 5),(4, 5, 8, 5),(5, 6, 8, 5),(6, 7, 8, 5),(7, 8, 8, 5),(1, 0, 0, 6),(2, 1, 0, 6),(3, 2, 0, 6),(4, 3, 0, 6),(5, 4, 0, 6),(6, 5, 0, 6),(7, 6, 0, 6),(8, 7, 0, 6),(0, 8, 0, 6),(2, 0, 1, 6),(3, 1, 1, 6),(4, 2, 1, 6),(5, 3, 1, 6),(6, 4, 1, 6),(7, 5, 1, 6),(8, 6, 1, 6),(0, 7, 1, 6),(1, 8, 1, 6),(3, 0, 2, 6),(4, 1, 2, 6),(5, 2, 2, 6),(6, 3, 2, 6),(7, 4, 2, 6),(8, 5, 2, 6),(0, 6, 2, 6),(1, 7, 2, 6),(2, 8, 2, 6),(4, 0, 3, 6),(5, 1, 3, 6),(6, 2, 3, 6),(7, 3, 3, 6),(8, 4, 3, 6),(0, 5, 3, 6),(1, 6, 3, 6),(2, 7, 3, 6),(3, 8, 3, 6),(5, 0, 4, 6),(6, 1, 4, 6),(7, 2, 4, 6),(8, 3, 4, 6),(0, 4, 4, 6),(1, 5, 4, 6),(2, 6, 4, 6),(3, 7, 4, 6),(4, 8, 4, 6),(6, 0, 5, 6),(7, 1, 5, 6),(8, 2, 5, 6),(0, 3, 5, 6),(1, 4, 5, 6),(2, 5, 5, 6),(3, 6, 5, 6),(4, 7, 5, 6),(5, 8, 5, 6),(7, 0, 6, 6),(8, 1, 6, 6),(0, 2, 6, 6),(1, 3, 6, 6),(2, 4, 6, 6),(3, 5, 6, 6),(4, 6, 6, 6),(5, 7, 6, 6),(6, 8, 6, 6),(8, 0, 7, 6),(0, 1, 7, 6),(1, 2, 7, 6),(2, 3, 7, 6),(3, 4, 7, 6),(4, 5, 7, 6),(5, 6, 7, 6),(6, 7, 7, 6),(7, 8, 7, 6),(0, 0, 8, 6),(1, 1, 8, 6),(2, 2, 8, 6),(3, 3, 8, 6),(4, 4, 8, 6),(5, 5, 8, 6),(6, 6, 8, 6),(7, 7, 8, 6),(8, 8, 8, 6),(2, 0, 0, 7),(3, 1, 0, 7),(4, 2, 0, 7),(5, 3, 0, 7),(6, 4, 0, 7),(7, 5, 0, 7),(8, 6, 0, 7),(0, 7, 0, 7),(1, 8, 0, 7),(3, 0, 1, 7),(4, 1, 1, 7),(5, 2, 1, 7),(6, 3, 1, 7),(7, 4, 1, 7),(8, 5, 1, 7),(0, 6, 1, 7),(1, 7, 1, 7),(2, 8, 1, 7),(4, 0, 2, 7),(5, 1, 2, 7),(6, 2, 2, 7),(7, 3, 2, 7),(8, 4, 2, 7),(0, 5, 2, 7),(1, 6, 2, 7),(2, 7, 2, 7),(3, 8, 2, 7),(5, 0, 3, 7),(6, 1, 3, 7),(7, 2, 3, 7),(8, 3, 3, 7),(0, 4, 3, 7),(1, 5, 3, 7),(2, 6, 3, 7),(3, 7, 3, 7),(4, 8, 3, 7),(6, 0, 4, 7),(7, 1, 4, 7),(8, 2, 4, 7),(0, 3, 4, 7),(1, 4, 4, 7),(2, 5, 4, 7),(3, 6, 4, 7),(4, 7, 4, 7),(5, 8, 4, 7),(7, 0, 5, 7),(8, 1, 5, 7),(0, 2, 5, 
7),(1, 3, 5, 7),(2, 4, 5, 7),(3, 5, 5, 7),(4, 6, 5, 7),(5, 7, 5, 7),(6, 8, 5, 7),(8, 0, 6, 7),(0, 1, 6, 7),(1, 2, 6, 7),(2, 3, 6, 7),(3, 4, 6, 7),(4, 5, 6, 7),(5, 6, 6, 7),(6, 7, 6, 7),(7, 8, 6, 7),(0, 0, 7, 7),(1, 1, 7, 7),(2, 2, 7, 7),(3, 3, 7, 7),(4, 4, 7, 7),(5, 5, 7, 7),(6, 6, 7, 7),(7, 7, 7, 7),(8, 8, 7, 7),(1, 0, 8, 7),(2, 1, 8, 7),(3, 2, 8, 7),(4, 3, 8, 7),(5, 4, 8, 7),(6, 5, 8, 7),(7, 6, 8, 7),(8, 7, 8, 7),(0, 8, 8, 7),(3, 0, 0, 8),(4, 1, 0, 8),(5, 2, 0, 8),(6, 3, 0, 8),(7, 4, 0, 8),(8, 5, 0, 8),(0, 6, 0, 8),(1, 7, 0, 8),(2, 8, 0, 8),(4, 0, 1, 8),(5, 1, 1, 8),(6, 2, 1, 8),(7, 3, 1, 8),(8, 4, 1, 8),(0, 5, 1, 8),(1, 6, 1, 8),(2, 7, 1, 8),(3, 8, 1, 8),(5, 0, 2, 8),(6, 1, 2, 8),(7, 2, 2, 8),(8, 3, 2, 8),(0, 4, 2, 8),(1, 5, 2, 8),(2, 6, 2, 8),(3, 7, 2, 8),(4, 8, 2, 8),(6, 0, 3, 8),(7, 1, 3, 8),(8, 2, 3, 8),(0, 3, 3, 8),(1, 4, 3, 8),(2, 5, 3, 8),(3, 6, 3, 8),(4, 7, 3, 8),(5, 8, 3, 8),(7, 0, 4, 8),(8, 1, 4, 8),(0, 2, 4, 8),(1, 3, 4, 8),(2, 4, 4, 8),(3, 5, 4, 8),(4, 6, 4, 8),(5, 7, 4, 8),(6, 8, 4, 8),(8, 0, 5, 8),(0, 1, 5, 8),(1, 2, 5, 8),(2, 3, 5, 8),(3, 4, 5, 8),(4, 5, 5, 8),(5, 6, 5, 8),(6, 7, 5, 8),(7, 8, 5, 8),(0, 0, 6, 8),(1, 1, 6, 8),(2, 2, 6, 8),(3, 3, 6, 8),(4, 4, 6, 8),(5, 5, 6, 8),(6, 6, 6, 8),(7, 7, 6, 8),(8, 8, 6, 8),(1, 0, 7, 8),(2, 1, 7, 8),(3, 2, 7, 8),(4, 3, 7, 8),(5, 4, 7, 8),(6, 5, 7, 8),(7, 6, 7, 8),(8, 7, 7, 8),(0, 8, 7, 8),(2, 0, 8, 8),(3, 1, 8, 8),(4, 2, 8, 8),(5, 3, 8, 8),(6, 4, 8, 8),(7, 5, 8, 8),(8, 6, 8, 8),(0, 7, 8, 8),(1, 8, 8, 8)],
[(5, 0, 0, 0),(6, 1, 0, 0),(7, 2, 0, 0),(8, 3, 0, 0),(0, 4, 0, 0),(1, 5, 0, 0),(2, 6, 0, 0),(3, 7, 0, 0),(4, 8, 0, 0),(6, 0, 1, 0),(7, 1, 1, 0),(8, 2, 1, 0),(0, 3, 1, 0),(1, 4, 1, 0),(2, 5, 1, 0),(3, 6, 1, 0),(4, 7, 1, 0),(5, 8, 1, 0),(7, 0, 2, 0),(8, 1, 2, 0),(0, 2, 2, 0),(1, 3, 2, 0),(2, 4, 2, 0),(3, 5, 2, 0),(4, 6, 2, 0),(5, 7, 2, 0),(6, 8, 2, 0),(8, 0, 3, 0),(0, 1, 3, 0),(1, 2, 3, 0),(2, 3, 3, 0),(3, 4, 3, 0),(4, 5, 3, 0),(5, 6, 3, 0),(6, 7, 3, 0),(7, 8, 3, 0),(0, 0, 4, 0),(1, 1, 4, 0),(2, 2, 4, 0),(3, 3, 4, 0),(4, 4, 4, 0),(5, 5, 4, 0),(6, 6, 4, 0),(7, 7, 4, 0),(8, 8, 4, 0),(1, 0, 5, 0),(2, 1, 5, 0),(3, 2, 5, 0),(4, 3, 5, 0),(5, 4, 5, 0),(6, 5, 5, 0),(7, 6, 5, 0),(8, 7, 5, 0),(0, 8, 5, 0),(2, 0, 6, 0),(3, 1, 6, 0),(4, 2, 6, 0),(5, 3, 6, 0),(6, 4, 6, 0),(7, 5, 6, 0),(8, 6, 6, 0),(0, 7, 6, 0),(1, 8, 6, 0),(3, 0, 7, 0),(4, 1, 7, 0),(5, 2, 7, 0),(6, 3, 7, 0),(7, 4, 7, 0),(8, 5, 7, 0),(0, 6, 7, 0),(1, 7, 7, 0),(2, 8, 7, 0),(4, 0, 8, 0),(5, 1, 8, 0),(6, 2, 8, 0),(7, 3, 8, 0),(8, 4, 8, 0),(0, 5, 8, 0),(1, 6, 8, 0),(2, 7, 8, 0),(3, 8, 8, 0),(6, 0, 0, 1),(7, 1, 0, 1),(8, 2, 0, 1),(0, 3, 0, 1),(1, 4, 0, 1),(2, 5, 0, 1),(3, 6, 0, 1),(4, 7, 0, 1),(5, 8, 0, 1),(7, 0, 1, 1),(8, 1, 1, 1),(0, 2, 1, 1),(1, 3, 1, 1),(2, 4, 1, 1),(3, 5, 1, 1),(4, 6, 1, 1),(5, 7, 1, 1),(6, 8, 1, 1),(8, 0, 2, 1),(0, 1, 2, 1),(1, 2, 2, 1),(2, 3, 2, 1),(3, 4, 2, 1),(4, 5, 2, 1),(5, 6, 2, 1),(6, 7, 2, 1),(7, 8, 2, 1),(0, 0, 3, 1),(1, 1, 3, 1),(2, 2, 3, 1),(3, 3, 3, 1),(4, 4, 3, 1),(5, 5, 3, 1),(6, 6, 3, 1),(7, 7, 3, 1),(8, 8, 3, 1),(1, 0, 4, 1),(2, 1, 4, 1),(3, 2, 4, 1),(4, 3, 4, 1),(5, 4, 4, 1),(6, 5, 4, 1),(7, 6, 4, 1),(8, 7, 4, 1),(0, 8, 4, 1),(2, 0, 5, 1),(3, 1, 5, 1),(4, 2, 5, 1),(5, 3, 5, 1),(6, 4, 5, 1),(7, 5, 5, 1),(8, 6, 5, 1),(0, 7, 5, 1),(1, 8, 5, 1),(3, 0, 6, 1),(4, 1, 6, 1),(5, 2, 6, 1),(6, 3, 6, 1),(7, 4, 6, 1),(8, 5, 6, 1),(0, 6, 6, 1),(1, 7, 6, 1),(2, 8, 6, 1),(4, 0, 7, 1),(5, 1, 7, 1),(6, 2, 7, 1),(7, 3, 7, 1),(8, 4, 7, 1),(0, 5, 7, 1),(1, 6, 7, 1),(2, 7, 7, 1),(3, 8, 7, 1),(5, 0, 8, 
1),(6, 1, 8, 1),(7, 2, 8, 1),(8, 3, 8, 1),(0, 4, 8, 1),(1, 5, 8, 1),(2, 6, 8, 1),(3, 7, 8, 1),(4, 8, 8, 1),(7, 0, 0, 2),(8, 1, 0, 2),(0, 2, 0, 2),(1, 3, 0, 2),(2, 4, 0, 2),(3, 5, 0, 2),(4, 6, 0, 2),(5, 7, 0, 2),(6, 8, 0, 2),(8, 0, 1, 2),(0, 1, 1, 2),(1, 2, 1, 2),(2, 3, 1, 2),(3, 4, 1, 2),(4, 5, 1, 2),(5, 6, 1, 2),(6, 7, 1, 2),(7, 8, 1, 2),(0, 0, 2, 2),(1, 1, 2, 2),(2, 2, 2, 2),(3, 3, 2, 2),(4, 4, 2, 2),(5, 5, 2, 2),(6, 6, 2, 2),(7, 7, 2, 2),(8, 8, 2, 2),(1, 0, 3, 2),(2, 1, 3, 2),(3, 2, 3, 2),(4, 3, 3, 2),(5, 4, 3, 2),(6, 5, 3, 2),(7, 6, 3, 2),(8, 7, 3, 2),(0, 8, 3, 2),(2, 0, 4, 2),(3, 1, 4, 2),(4, 2, 4, 2),(5, 3, 4, 2),(6, 4, 4, 2),(7, 5, 4, 2),(8, 6, 4, 2),(0, 7, 4, 2),(1, 8, 4, 2),(3, 0, 5, 2),(4, 1, 5, 2),(5, 2, 5, 2),(6, 3, 5, 2),(7, 4, 5, 2),(8, 5, 5, 2),(0, 6, 5, 2),(1, 7, 5, 2),(2, 8, 5, 2),(4, 0, 6, 2),(5, 1, 6, 2),(6, 2, 6, 2),(7, 3, 6, 2),(8, 4, 6, 2),(0, 5, 6, 2),(1, 6, 6, 2),(2, 7, 6, 2),(3, 8, 6, 2),(5, 0, 7, 2),(6, 1, 7, 2),(7, 2, 7, 2),(8, 3, 7, 2),(0, 4, 7, 2),(1, 5, 7, 2),(2, 6, 7, 2),(3, 7, 7, 2),(4, 8, 7, 2),(6, 0, 8, 2),(7, 1, 8, 2),(8, 2, 8, 2),(0, 3, 8, 2),(1, 4, 8, 2),(2, 5, 8, 2),(3, 6, 8, 2),(4, 7, 8, 2),(5, 8, 8, 2),(8, 0, 0, 3),(0, 1, 0, 3),(1, 2, 0, 3),(2, 3, 0, 3),(3, 4, 0, 3),(4, 5, 0, 3),(5, 6, 0, 3),(6, 7, 0, 3),(7, 8, 0, 3),(0, 0, 1, 3),(1, 1, 1, 3),(2, 2, 1, 3),(3, 3, 1, 3),(4, 4, 1, 3),(5, 5, 1, 3),(6, 6, 1, 3),(7, 7, 1, 3),(8, 8, 1, 3),(1, 0, 2, 3),(2, 1, 2, 3),(3, 2, 2, 3),(4, 3, 2, 3),(5, 4, 2, 3),(6, 5, 2, 3),(7, 6, 2, 3),(8, 7, 2, 3),(0, 8, 2, 3),(2, 0, 3, 3),(3, 1, 3, 3),(4, 2, 3, 3),(5, 3, 3, 3),(6, 4, 3, 3),(7, 5, 3, 3),(8, 6, 3, 3),(0, 7, 3, 3),(1, 8, 3, 3),(3, 0, 4, 3),(4, 1, 4, 3),(5, 2, 4, 3),(6, 3, 4, 3),(7, 4, 4, 3),(8, 5, 4, 3),(0, 6, 4, 3),(1, 7, 4, 3),(2, 8, 4, 3),(4, 0, 5, 3),(5, 1, 5, 3),(6, 2, 5, 3),(7, 3, 5, 3),(8, 4, 5, 3),(0, 5, 5, 3),(1, 6, 5, 3),(2, 7, 5, 3),(3, 8, 5, 3),(5, 0, 6, 3),(6, 1, 6, 3),(7, 2, 6, 3),(8, 3, 6, 3),(0, 4, 6, 3),(1, 5, 6, 3),(2, 6, 6, 3),(3, 7, 6, 3),(4, 8, 6, 3),(6, 0, 7, 3),(7, 1, 
7, 3),(8, 2, 7, 3),(0, 3, 7, 3),(1, 4, 7, 3),(2, 5, 7, 3),(3, 6, 7, 3),(4, 7, 7, 3),(5, 8, 7, 3),(7, 0, 8, 3),(8, 1, 8, 3),(0, 2, 8, 3),(1, 3, 8, 3),(2, 4, 8, 3),(3, 5, 8, 3),(4, 6, 8, 3),(5, 7, 8, 3),(6, 8, 8, 3),(0, 0, 0, 4),(1, 1, 0, 4),(2, 2, 0, 4),(3, 3, 0, 4),(4, 4, 0, 4),(5, 5, 0, 4),(6, 6, 0, 4),(7, 7, 0, 4),(8, 8, 0, 4),(1, 0, 1, 4),(2, 1, 1, 4),(3, 2, 1, 4),(4, 3, 1, 4),(5, 4, 1, 4),(6, 5, 1, 4),(7, 6, 1, 4),(8, 7, 1, 4),(0, 8, 1, 4),(2, 0, 2, 4),(3, 1, 2, 4),(4, 2, 2, 4),(5, 3, 2, 4),(6, 4, 2, 4),(7, 5, 2, 4),(8, 6, 2, 4),(0, 7, 2, 4),(1, 8, 2, 4),(3, 0, 3, 4),(4, 1, 3, 4),(5, 2, 3, 4),(6, 3, 3, 4),(7, 4, 3, 4),(8, 5, 3, 4),(0, 6, 3, 4),(1, 7, 3, 4),(2, 8, 3, 4),(4, 0, 4, 4),(5, 1, 4, 4),(6, 2, 4, 4),(7, 3, 4, 4),(8, 4, 4, 4),(0, 5, 4, 4),(1, 6, 4, 4),(2, 7, 4, 4),(3, 8, 4, 4),(5, 0, 5, 4),(6, 1, 5, 4),(7, 2, 5, 4),(8, 3, 5, 4),(0, 4, 5, 4),(1, 5, 5, 4),(2, 6, 5, 4),(3, 7, 5, 4),(4, 8, 5, 4),(6, 0, 6, 4),(7, 1, 6, 4),(8, 2, 6, 4),(0, 3, 6, 4),(1, 4, 6, 4),(2, 5, 6, 4),(3, 6, 6, 4),(4, 7, 6, 4),(5, 8, 6, 4),(7, 0, 7, 4),(8, 1, 7, 4),(0, 2, 7, 4),(1, 3, 7, 4),(2, 4, 7, 4),(3, 5, 7, 4),(4, 6, 7, 4),(5, 7, 7, 4),(6, 8, 7, 4),(8, 0, 8, 4),(0, 1, 8, 4),(1, 2, 8, 4),(2, 3, 8, 4),(3, 4, 8, 4),(4, 5, 8, 4),(5, 6, 8, 4),(6, 7, 8, 4),(7, 8, 8, 4),(1, 0, 0, 5),(2, 1, 0, 5),(3, 2, 0, 5),(4, 3, 0, 5),(5, 4, 0, 5),(6, 5, 0, 5),(7, 6, 0, 5),(8, 7, 0, 5),(0, 8, 0, 5),(2, 0, 1, 5),(3, 1, 1, 5),(4, 2, 1, 5),(5, 3, 1, 5),(6, 4, 1, 5),(7, 5, 1, 5),(8, 6, 1, 5),(0, 7, 1, 5),(1, 8, 1, 5),(3, 0, 2, 5),(4, 1, 2, 5),(5, 2, 2, 5),(6, 3, 2, 5),(7, 4, 2, 5),(8, 5, 2, 5),(0, 6, 2, 5),(1, 7, 2, 5),(2, 8, 2, 5),(4, 0, 3, 5),(5, 1, 3, 5),(6, 2, 3, 5),(7, 3, 3, 5),(8, 4, 3, 5),(0, 5, 3, 5),(1, 6, 3, 5),(2, 7, 3, 5),(3, 8, 3, 5),(5, 0, 4, 5),(6, 1, 4, 5),(7, 2, 4, 5),(8, 3, 4, 5),(0, 4, 4, 5),(1, 5, 4, 5),(2, 6, 4, 5),(3, 7, 4, 5),(4, 8, 4, 5),(6, 0, 5, 5),(7, 1, 5, 5),(8, 2, 5, 5),(0, 3, 5, 5),(1, 4, 5, 5),(2, 5, 5, 5),(3, 6, 5, 5),(4, 7, 5, 5),(5, 8, 5, 5),(7, 0, 6, 5),(8, 1, 6, 5),(0, 
2, 6, 5),(1, 3, 6, 5),(2, 4, 6, 5),(3, 5, 6, 5),(4, 6, 6, 5),(5, 7, 6, 5),(6, 8, 6, 5),(8, 0, 7, 5),(0, 1, 7, 5),(1, 2, 7, 5),(2, 3, 7, 5),(3, 4, 7, 5),(4, 5, 7, 5),(5, 6, 7, 5),(6, 7, 7, 5),(7, 8, 7, 5),(0, 0, 8, 5),(1, 1, 8, 5),(2, 2, 8, 5),(3, 3, 8, 5),(4, 4, 8, 5),(5, 5, 8, 5),(6, 6, 8, 5),(7, 7, 8, 5),(8, 8, 8, 5),(2, 0, 0, 6),(3, 1, 0, 6),(4, 2, 0, 6),(5, 3, 0, 6),(6, 4, 0, 6),(7, 5, 0, 6),(8, 6, 0, 6),(0, 7, 0, 6),(1, 8, 0, 6),(3, 0, 1, 6),(4, 1, 1, 6),(5, 2, 1, 6),(6, 3, 1, 6),(7, 4, 1, 6),(8, 5, 1, 6),(0, 6, 1, 6),(1, 7, 1, 6),(2, 8, 1, 6),(4, 0, 2, 6),(5, 1, 2, 6),(6, 2, 2, 6),(7, 3, 2, 6),(8, 4, 2, 6),(0, 5, 2, 6),(1, 6, 2, 6),(2, 7, 2, 6),(3, 8, 2, 6),(5, 0, 3, 6),(6, 1, 3, 6),(7, 2, 3, 6),(8, 3, 3, 6),(0, 4, 3, 6),(1, 5, 3, 6),(2, 6, 3, 6),(3, 7, 3, 6),(4, 8, 3, 6),(6, 0, 4, 6),(7, 1, 4, 6),(8, 2, 4, 6),(0, 3, 4, 6),(1, 4, 4, 6),(2, 5, 4, 6),(3, 6, 4, 6),(4, 7, 4, 6),(5, 8, 4, 6),(7, 0, 5, 6),(8, 1, 5, 6),(0, 2, 5, 6),(1, 3, 5, 6),(2, 4, 5, 6),(3, 5, 5, 6),(4, 6, 5, 6),(5, 7, 5, 6),(6, 8, 5, 6),(8, 0, 6, 6),(0, 1, 6, 6),(1, 2, 6, 6),(2, 3, 6, 6),(3, 4, 6, 6),(4, 5, 6, 6),(5, 6, 6, 6),(6, 7, 6, 6),(7, 8, 6, 6),(0, 0, 7, 6),(1, 1, 7, 6),(2, 2, 7, 6),(3, 3, 7, 6),(4, 4, 7, 6),(5, 5, 7, 6),(6, 6, 7, 6),(7, 7, 7, 6),(8, 8, 7, 6),(1, 0, 8, 6),(2, 1, 8, 6),(3, 2, 8, 6),(4, 3, 8, 6),(5, 4, 8, 6),(6, 5, 8, 6),(7, 6, 8, 6),(8, 7, 8, 6),(0, 8, 8, 6),(3, 0, 0, 7),(4, 1, 0, 7),(5, 2, 0, 7),(6, 3, 0, 7),(7, 4, 0, 7),(8, 5, 0, 7),(0, 6, 0, 7),(1, 7, 0, 7),(2, 8, 0, 7),(4, 0, 1, 7),(5, 1, 1, 7),(6, 2, 1, 7),(7, 3, 1, 7),(8, 4, 1, 7),(0, 5, 1, 7),(1, 6, 1, 7),(2, 7, 1, 7),(3, 8, 1, 7),(5, 0, 2, 7),(6, 1, 2, 7),(7, 2, 2, 7),(8, 3, 2, 7),(0, 4, 2, 7),(1, 5, 2, 7),(2, 6, 2, 7),(3, 7, 2, 7),(4, 8, 2, 7),(6, 0, 3, 7),(7, 1, 3, 7),(8, 2, 3, 7),(0, 3, 3, 7),(1, 4, 3, 7),(2, 5, 3, 7),(3, 6, 3, 7),(4, 7, 3, 7),(5, 8, 3, 7),(7, 0, 4, 7),(8, 1, 4, 7),(0, 2, 4, 7),(1, 3, 4, 7),(2, 4, 4, 7),(3, 5, 4, 7),(4, 6, 4, 7),(5, 7, 4, 7),(6, 8, 4, 7),(8, 0, 5, 7),(0, 1, 5, 7),(1, 2, 5, 
7),(2, 3, 5, 7),(3, 4, 5, 7),(4, 5, 5, 7),(5, 6, 5, 7),(6, 7, 5, 7),(7, 8, 5, 7),(0, 0, 6, 7),(1, 1, 6, 7),(2, 2, 6, 7),(3, 3, 6, 7),(4, 4, 6, 7),(5, 5, 6, 7),(6, 6, 6, 7),(7, 7, 6, 7),(8, 8, 6, 7),(1, 0, 7, 7),(2, 1, 7, 7),(3, 2, 7, 7),(4, 3, 7, 7),(5, 4, 7, 7),(6, 5, 7, 7),(7, 6, 7, 7),(8, 7, 7, 7),(0, 8, 7, 7),(2, 0, 8, 7),(3, 1, 8, 7),(4, 2, 8, 7),(5, 3, 8, 7),(6, 4, 8, 7),(7, 5, 8, 7),(8, 6, 8, 7),(0, 7, 8, 7),(1, 8, 8, 7),(4, 0, 0, 8),(5, 1, 0, 8),(6, 2, 0, 8),(7, 3, 0, 8),(8, 4, 0, 8),(0, 5, 0, 8),(1, 6, 0, 8),(2, 7, 0, 8),(3, 8, 0, 8),(5, 0, 1, 8),(6, 1, 1, 8),(7, 2, 1, 8),(8, 3, 1, 8),(0, 4, 1, 8),(1, 5, 1, 8),(2, 6, 1, 8),(3, 7, 1, 8),(4, 8, 1, 8),(6, 0, 2, 8),(7, 1, 2, 8),(8, 2, 2, 8),(0, 3, 2, 8),(1, 4, 2, 8),(2, 5, 2, 8),(3, 6, 2, 8),(4, 7, 2, 8),(5, 8, 2, 8),(7, 0, 3, 8),(8, 1, 3, 8),(0, 2, 3, 8),(1, 3, 3, 8),(2, 4, 3, 8),(3, 5, 3, 8),(4, 6, 3, 8),(5, 7, 3, 8),(6, 8, 3, 8),(8, 0, 4, 8),(0, 1, 4, 8),(1, 2, 4, 8),(2, 3, 4, 8),(3, 4, 4, 8),(4, 5, 4, 8),(5, 6, 4, 8),(6, 7, 4, 8),(7, 8, 4, 8),(0, 0, 5, 8),(1, 1, 5, 8),(2, 2, 5, 8),(3, 3, 5, 8),(4, 4, 5, 8),(5, 5, 5, 8),(6, 6, 5, 8),(7, 7, 5, 8),(8, 8, 5, 8),(1, 0, 6, 8),(2, 1, 6, 8),(3, 2, 6, 8),(4, 3, 6, 8),(5, 4, 6, 8),(6, 5, 6, 8),(7, 6, 6, 8),(8, 7, 6, 8),(0, 8, 6, 8),(2, 0, 7, 8),(3, 1, 7, 8),(4, 2, 7, 8),(5, 3, 7, 8),(6, 4, 7, 8),(7, 5, 7, 8),(8, 6, 7, 8),(0, 7, 7, 8),(1, 8, 7, 8),(3, 0, 8, 8),(4, 1, 8, 8),(5, 2, 8, 8),(6, 3, 8, 8),(7, 4, 8, 8),(8, 5, 8, 8),(0, 6, 8, 8),(1, 7, 8, 8),(2, 8, 8, 8)],
[(6, 0, 0, 0),(7, 1, 0, 0),(8, 2, 0, 0),(0, 3, 0, 0),(1, 4, 0, 0),(2, 5, 0, 0),(3, 6, 0, 0),(4, 7, 0, 0),(5, 8, 0, 0),(7, 0, 1, 0),(8, 1, 1, 0),(0, 2, 1, 0),(1, 3, 1, 0),(2, 4, 1, 0),(3, 5, 1, 0),(4, 6, 1, 0),(5, 7, 1, 0),(6, 8, 1, 0),(8, 0, 2, 0),(0, 1, 2, 0),(1, 2, 2, 0),(2, 3, 2, 0),(3, 4, 2, 0),(4, 5, 2, 0),(5, 6, 2, 0),(6, 7, 2, 0),(7, 8, 2, 0),(0, 0, 3, 0),(1, 1, 3, 0),(2, 2, 3, 0),(3, 3, 3, 0),(4, 4, 3, 0),(5, 5, 3, 0),(6, 6, 3, 0),(7, 7, 3, 0),(8, 8, 3, 0),(1, 0, 4, 0),(2, 1, 4, 0),(3, 2, 4, 0),(4, 3, 4, 0),(5, 4, 4, 0),(6, 5, 4, 0),(7, 6, 4, 0),(8, 7, 4, 0),(0, 8, 4, 0),(2, 0, 5, 0),(3, 1, 5, 0),(4, 2, 5, 0),(5, 3, 5, 0),(6, 4, 5, 0),(7, 5, 5, 0),(8, 6, 5, 0),(0, 7, 5, 0),(1, 8, 5, 0),(3, 0, 6, 0),(4, 1, 6, 0),(5, 2, 6, 0),(6, 3, 6, 0),(7, 4, 6, 0),(8, 5, 6, 0),(0, 6, 6, 0),(1, 7, 6, 0),(2, 8, 6, 0),(4, 0, 7, 0),(5, 1, 7, 0),(6, 2, 7, 0),(7, 3, 7, 0),(8, 4, 7, 0),(0, 5, 7, 0),(1, 6, 7, 0),(2, 7, 7, 0),(3, 8, 7, 0),(5, 0, 8, 0),(6, 1, 8, 0),(7, 2, 8, 0),(8, 3, 8, 0),(0, 4, 8, 0),(1, 5, 8, 0),(2, 6, 8, 0),(3, 7, 8, 0),(4, 8, 8, 0),(7, 0, 0, 1),(8, 1, 0, 1),(0, 2, 0, 1),(1, 3, 0, 1),(2, 4, 0, 1),(3, 5, 0, 1),(4, 6, 0, 1),(5, 7, 0, 1),(6, 8, 0, 1),(8, 0, 1, 1),(0, 1, 1, 1),(1, 2, 1, 1),(2, 3, 1, 1),(3, 4, 1, 1),(4, 5, 1, 1),(5, 6, 1, 1),(6, 7, 1, 1),(7, 8, 1, 1),(0, 0, 2, 1),(1, 1, 2, 1),(2, 2, 2, 1),(3, 3, 2, 1),(4, 4, 2, 1),(5, 5, 2, 1),(6, 6, 2, 1),(7, 7, 2, 1),(8, 8, 2, 1),(1, 0, 3, 1),(2, 1, 3, 1),(3, 2, 3, 1),(4, 3, 3, 1),(5, 4, 3, 1),(6, 5, 3, 1),(7, 6, 3, 1),(8, 7, 3, 1),(0, 8, 3, 1),(2, 0, 4, 1),(3, 1, 4, 1),(4, 2, 4, 1),(5, 3, 4, 1),(6, 4, 4, 1),(7, 5, 4, 1),(8, 6, 4, 1),(0, 7, 4, 1),(1, 8, 4, 1),(3, 0, 5, 1),(4, 1, 5, 1),(5, 2, 5, 1),(6, 3, 5, 1),(7, 4, 5, 1),(8, 5, 5, 1),(0, 6, 5, 1),(1, 7, 5, 1),(2, 8, 5, 1),(4, 0, 6, 1),(5, 1, 6, 1),(6, 2, 6, 1),(7, 3, 6, 1),(8, 4, 6, 1),(0, 5, 6, 1),(1, 6, 6, 1),(2, 7, 6, 1),(3, 8, 6, 1),(5, 0, 7, 1),(6, 1, 7, 1),(7, 2, 7, 1),(8, 3, 7, 1),(0, 4, 7, 1),(1, 5, 7, 1),(2, 6, 7, 1),(3, 7, 7, 1),(4, 8, 7, 1),(6, 0, 8, 
1),(7, 1, 8, 1),(8, 2, 8, 1),(0, 3, 8, 1),(1, 4, 8, 1),(2, 5, 8, 1),(3, 6, 8, 1),(4, 7, 8, 1),(5, 8, 8, 1),(8, 0, 0, 2),(0, 1, 0, 2),(1, 2, 0, 2),(2, 3, 0, 2),(3, 4, 0, 2),(4, 5, 0, 2),(5, 6, 0, 2),(6, 7, 0, 2),(7, 8, 0, 2),(0, 0, 1, 2),(1, 1, 1, 2),(2, 2, 1, 2),(3, 3, 1, 2),(4, 4, 1, 2),(5, 5, 1, 2),(6, 6, 1, 2),(7, 7, 1, 2),(8, 8, 1, 2),(1, 0, 2, 2),(2, 1, 2, 2),(3, 2, 2, 2),(4, 3, 2, 2),(5, 4, 2, 2),(6, 5, 2, 2),(7, 6, 2, 2),(8, 7, 2, 2),(0, 8, 2, 2),(2, 0, 3, 2),(3, 1, 3, 2),(4, 2, 3, 2),(5, 3, 3, 2),(6, 4, 3, 2),(7, 5, 3, 2),(8, 6, 3, 2),(0, 7, 3, 2),(1, 8, 3, 2),(3, 0, 4, 2),(4, 1, 4, 2),(5, 2, 4, 2),(6, 3, 4, 2),(7, 4, 4, 2),(8, 5, 4, 2),(0, 6, 4, 2),(1, 7, 4, 2),(2, 8, 4, 2),(4, 0, 5, 2),(5, 1, 5, 2),(6, 2, 5, 2),(7, 3, 5, 2),(8, 4, 5, 2),(0, 5, 5, 2),(1, 6, 5, 2),(2, 7, 5, 2),(3, 8, 5, 2),(5, 0, 6, 2),(6, 1, 6, 2),(7, 2, 6, 2),(8, 3, 6, 2),(0, 4, 6, 2),(1, 5, 6, 2),(2, 6, 6, 2),(3, 7, 6, 2),(4, 8, 6, 2),(6, 0, 7, 2),(7, 1, 7, 2),(8, 2, 7, 2),(0, 3, 7, 2),(1, 4, 7, 2),(2, 5, 7, 2),(3, 6, 7, 2),(4, 7, 7, 2),(5, 8, 7, 2),(7, 0, 8, 2),(8, 1, 8, 2),(0, 2, 8, 2),(1, 3, 8, 2),(2, 4, 8, 2),(3, 5, 8, 2),(4, 6, 8, 2),(5, 7, 8, 2),(6, 8, 8, 2),(0, 0, 0, 3),(1, 1, 0, 3),(2, 2, 0, 3),(3, 3, 0, 3),(4, 4, 0, 3),(5, 5, 0, 3),(6, 6, 0, 3),(7, 7, 0, 3),(8, 8, 0, 3),(1, 0, 1, 3),(2, 1, 1, 3),(3, 2, 1, 3),(4, 3, 1, 3),(5, 4, 1, 3),(6, 5, 1, 3),(7, 6, 1, 3),(8, 7, 1, 3),(0, 8, 1, 3),(2, 0, 2, 3),(3, 1, 2, 3),(4, 2, 2, 3),(5, 3, 2, 3),(6, 4, 2, 3),(7, 5, 2, 3),(8, 6, 2, 3),(0, 7, 2, 3),(1, 8, 2, 3),(3, 0, 3, 3),(4, 1, 3, 3),(5, 2, 3, 3),(6, 3, 3, 3),(7, 4, 3, 3),(8, 5, 3, 3),(0, 6, 3, 3),(1, 7, 3, 3),(2, 8, 3, 3),(4, 0, 4, 3),(5, 1, 4, 3),(6, 2, 4, 3),(7, 3, 4, 3),(8, 4, 4, 3),(0, 5, 4, 3),(1, 6, 4, 3),(2, 7, 4, 3),(3, 8, 4, 3),(5, 0, 5, 3),(6, 1, 5, 3),(7, 2, 5, 3),(8, 3, 5, 3),(0, 4, 5, 3),(1, 5, 5, 3),(2, 6, 5, 3),(3, 7, 5, 3),(4, 8, 5, 3),(6, 0, 6, 3),(7, 1, 6, 3),(8, 2, 6, 3),(0, 3, 6, 3),(1, 4, 6, 3),(2, 5, 6, 3),(3, 6, 6, 3),(4, 7, 6, 3),(5, 8, 6, 3),(7, 0, 7, 3),(8, 1, 
7, 3),(0, 2, 7, 3),(1, 3, 7, 3),(2, 4, 7, 3),(3, 5, 7, 3),(4, 6, 7, 3),(5, 7, 7, 3),(6, 8, 7, 3),(8, 0, 8, 3),(0, 1, 8, 3),(1, 2, 8, 3),(2, 3, 8, 3),(3, 4, 8, 3),(4, 5, 8, 3),(5, 6, 8, 3),(6, 7, 8, 3),(7, 8, 8, 3),(1, 0, 0, 4),(2, 1, 0, 4),(3, 2, 0, 4),(4, 3, 0, 4),(5, 4, 0, 4),(6, 5, 0, 4),(7, 6, 0, 4),(8, 7, 0, 4),(0, 8, 0, 4),(2, 0, 1, 4),(3, 1, 1, 4),(4, 2, 1, 4),(5, 3, 1, 4),(6, 4, 1, 4),(7, 5, 1, 4),(8, 6, 1, 4),(0, 7, 1, 4),(1, 8, 1, 4),(3, 0, 2, 4),(4, 1, 2, 4),(5, 2, 2, 4),(6, 3, 2, 4),(7, 4, 2, 4),(8, 5, 2, 4),(0, 6, 2, 4),(1, 7, 2, 4),(2, 8, 2, 4),(4, 0, 3, 4),(5, 1, 3, 4),(6, 2, 3, 4),(7, 3, 3, 4),(8, 4, 3, 4),(0, 5, 3, 4),(1, 6, 3, 4),(2, 7, 3, 4),(3, 8, 3, 4),(5, 0, 4, 4),(6, 1, 4, 4),(7, 2, 4, 4),(8, 3, 4, 4),(0, 4, 4, 4),(1, 5, 4, 4),(2, 6, 4, 4),(3, 7, 4, 4),(4, 8, 4, 4),(6, 0, 5, 4),(7, 1, 5, 4),(8, 2, 5, 4),(0, 3, 5, 4),(1, 4, 5, 4),(2, 5, 5, 4),(3, 6, 5, 4),(4, 7, 5, 4),(5, 8, 5, 4),(7, 0, 6, 4),(8, 1, 6, 4),(0, 2, 6, 4),(1, 3, 6, 4),(2, 4, 6, 4),(3, 5, 6, 4),(4, 6, 6, 4),(5, 7, 6, 4),(6, 8, 6, 4),(8, 0, 7, 4),(0, 1, 7, 4),(1, 2, 7, 4),(2, 3, 7, 4),(3, 4, 7, 4),(4, 5, 7, 4),(5, 6, 7, 4),(6, 7, 7, 4),(7, 8, 7, 4),(0, 0, 8, 4),(1, 1, 8, 4),(2, 2, 8, 4),(3, 3, 8, 4),(4, 4, 8, 4),(5, 5, 8, 4),(6, 6, 8, 4),(7, 7, 8, 4),(8, 8, 8, 4),(2, 0, 0, 5),(3, 1, 0, 5),(4, 2, 0, 5),(5, 3, 0, 5),(6, 4, 0, 5),(7, 5, 0, 5),(8, 6, 0, 5),(0, 7, 0, 5),(1, 8, 0, 5),(3, 0, 1, 5),(4, 1, 1, 5),(5, 2, 1, 5),(6, 3, 1, 5),(7, 4, 1, 5),(8, 5, 1, 5),(0, 6, 1, 5),(1, 7, 1, 5),(2, 8, 1, 5),(4, 0, 2, 5),(5, 1, 2, 5),(6, 2, 2, 5),(7, 3, 2, 5),(8, 4, 2, 5),(0, 5, 2, 5),(1, 6, 2, 5),(2, 7, 2, 5),(3, 8, 2, 5),(5, 0, 3, 5),(6, 1, 3, 5),(7, 2, 3, 5),(8, 3, 3, 5),(0, 4, 3, 5),(1, 5, 3, 5),(2, 6, 3, 5),(3, 7, 3, 5),(4, 8, 3, 5),(6, 0, 4, 5),(7, 1, 4, 5),(8, 2, 4, 5),(0, 3, 4, 5),(1, 4, 4, 5),(2, 5, 4, 5),(3, 6, 4, 5),(4, 7, 4, 5),(5, 8, 4, 5),(7, 0, 5, 5),(8, 1, 5, 5),(0, 2, 5, 5),(1, 3, 5, 5),(2, 4, 5, 5),(3, 5, 5, 5),(4, 6, 5, 5),(5, 7, 5, 5),(6, 8, 5, 5),(8, 0, 6, 5),(0, 1, 6, 5),(1, 
2, 6, 5),(2, 3, 6, 5),(3, 4, 6, 5),(4, 5, 6, 5),(5, 6, 6, 5),(6, 7, 6, 5),(7, 8, 6, 5),(0, 0, 7, 5),(1, 1, 7, 5),(2, 2, 7, 5),(3, 3, 7, 5),(4, 4, 7, 5),(5, 5, 7, 5),(6, 6, 7, 5),(7, 7, 7, 5),(8, 8, 7, 5),(1, 0, 8, 5),(2, 1, 8, 5),(3, 2, 8, 5),(4, 3, 8, 5),(5, 4, 8, 5),(6, 5, 8, 5),(7, 6, 8, 5),(8, 7, 8, 5),(0, 8, 8, 5),(3, 0, 0, 6),(4, 1, 0, 6),(5, 2, 0, 6),(6, 3, 0, 6),(7, 4, 0, 6),(8, 5, 0, 6),(0, 6, 0, 6),(1, 7, 0, 6),(2, 8, 0, 6),(4, 0, 1, 6),(5, 1, 1, 6),(6, 2, 1, 6),(7, 3, 1, 6),(8, 4, 1, 6),(0, 5, 1, 6),(1, 6, 1, 6),(2, 7, 1, 6),(3, 8, 1, 6),(5, 0, 2, 6),(6, 1, 2, 6),(7, 2, 2, 6),(8, 3, 2, 6),(0, 4, 2, 6),(1, 5, 2, 6),(2, 6, 2, 6),(3, 7, 2, 6),(4, 8, 2, 6),(6, 0, 3, 6),(7, 1, 3, 6),(8, 2, 3, 6),(0, 3, 3, 6),(1, 4, 3, 6),(2, 5, 3, 6),(3, 6, 3, 6),(4, 7, 3, 6),(5, 8, 3, 6),(7, 0, 4, 6),(8, 1, 4, 6),(0, 2, 4, 6),(1, 3, 4, 6),(2, 4, 4, 6),(3, 5, 4, 6),(4, 6, 4, 6),(5, 7, 4, 6),(6, 8, 4, 6),(8, 0, 5, 6),(0, 1, 5, 6),(1, 2, 5, 6),(2, 3, 5, 6),(3, 4, 5, 6),(4, 5, 5, 6),(5, 6, 5, 6),(6, 7, 5, 6),(7, 8, 5, 6),(0, 0, 6, 6),(1, 1, 6, 6),(2, 2, 6, 6),(3, 3, 6, 6),(4, 4, 6, 6),(5, 5, 6, 6),(6, 6, 6, 6),(7, 7, 6, 6),(8, 8, 6, 6),(1, 0, 7, 6),(2, 1, 7, 6),(3, 2, 7, 6),(4, 3, 7, 6),(5, 4, 7, 6),(6, 5, 7, 6),(7, 6, 7, 6),(8, 7, 7, 6),(0, 8, 7, 6),(2, 0, 8, 6),(3, 1, 8, 6),(4, 2, 8, 6),(5, 3, 8, 6),(6, 4, 8, 6),(7, 5, 8, 6),(8, 6, 8, 6),(0, 7, 8, 6),(1, 8, 8, 6),(4, 0, 0, 7),(5, 1, 0, 7),(6, 2, 0, 7),(7, 3, 0, 7),(8, 4, 0, 7),(0, 5, 0, 7),(1, 6, 0, 7),(2, 7, 0, 7),(3, 8, 0, 7),(5, 0, 1, 7),(6, 1, 1, 7),(7, 2, 1, 7),(8, 3, 1, 7),(0, 4, 1, 7),(1, 5, 1, 7),(2, 6, 1, 7),(3, 7, 1, 7),(4, 8, 1, 7),(6, 0, 2, 7),(7, 1, 2, 7),(8, 2, 2, 7),(0, 3, 2, 7),(1, 4, 2, 7),(2, 5, 2, 7),(3, 6, 2, 7),(4, 7, 2, 7),(5, 8, 2, 7),(7, 0, 3, 7),(8, 1, 3, 7),(0, 2, 3, 7),(1, 3, 3, 7),(2, 4, 3, 7),(3, 5, 3, 7),(4, 6, 3, 7),(5, 7, 3, 7),(6, 8, 3, 7),(8, 0, 4, 7),(0, 1, 4, 7),(1, 2, 4, 7),(2, 3, 4, 7),(3, 4, 4, 7),(4, 5, 4, 7),(5, 6, 4, 7),(6, 7, 4, 7),(7, 8, 4, 7),(0, 0, 5, 7),(1, 1, 5, 7),(2, 2, 5, 
7),(3, 3, 5, 7),(4, 4, 5, 7),(5, 5, 5, 7),(6, 6, 5, 7),(7, 7, 5, 7),(8, 8, 5, 7),(1, 0, 6, 7),(2, 1, 6, 7),(3, 2, 6, 7),(4, 3, 6, 7),(5, 4, 6, 7),(6, 5, 6, 7),(7, 6, 6, 7),(8, 7, 6, 7),(0, 8, 6, 7),(2, 0, 7, 7),(3, 1, 7, 7),(4, 2, 7, 7),(5, 3, 7, 7),(6, 4, 7, 7),(7, 5, 7, 7),(8, 6, 7, 7),(0, 7, 7, 7),(1, 8, 7, 7),(3, 0, 8, 7),(4, 1, 8, 7),(5, 2, 8, 7),(6, 3, 8, 7),(7, 4, 8, 7),(8, 5, 8, 7),(0, 6, 8, 7),(1, 7, 8, 7),(2, 8, 8, 7),(5, 0, 0, 8),(6, 1, 0, 8),(7, 2, 0, 8),(8, 3, 0, 8),(0, 4, 0, 8),(1, 5, 0, 8),(2, 6, 0, 8),(3, 7, 0, 8),(4, 8, 0, 8),(6, 0, 1, 8),(7, 1, 1, 8),(8, 2, 1, 8),(0, 3, 1, 8),(1, 4, 1, 8),(2, 5, 1, 8),(3, 6, 1, 8),(4, 7, 1, 8),(5, 8, 1, 8),(7, 0, 2, 8),(8, 1, 2, 8),(0, 2, 2, 8),(1, 3, 2, 8),(2, 4, 2, 8),(3, 5, 2, 8),(4, 6, 2, 8),(5, 7, 2, 8),(6, 8, 2, 8),(8, 0, 3, 8),(0, 1, 3, 8),(1, 2, 3, 8),(2, 3, 3, 8),(3, 4, 3, 8),(4, 5, 3, 8),(5, 6, 3, 8),(6, 7, 3, 8),(7, 8, 3, 8),(0, 0, 4, 8),(1, 1, 4, 8),(2, 2, 4, 8),(3, 3, 4, 8),(4, 4, 4, 8),(5, 5, 4, 8),(6, 6, 4, 8),(7, 7, 4, 8),(8, 8, 4, 8),(1, 0, 5, 8),(2, 1, 5, 8),(3, 2, 5, 8),(4, 3, 5, 8),(5, 4, 5, 8),(6, 5, 5, 8),(7, 6, 5, 8),(8, 7, 5, 8),(0, 8, 5, 8),(2, 0, 6, 8),(3, 1, 6, 8),(4, 2, 6, 8),(5, 3, 6, 8),(6, 4, 6, 8),(7, 5, 6, 8),(8, 6, 6, 8),(0, 7, 6, 8),(1, 8, 6, 8),(3, 0, 7, 8),(4, 1, 7, 8),(5, 2, 7, 8),(6, 3, 7, 8),(7, 4, 7, 8),(8, 5, 7, 8),(0, 6, 7, 8),(1, 7, 7, 8),(2, 8, 7, 8),(4, 0, 8, 8),(5, 1, 8, 8),(6, 2, 8, 8),(7, 3, 8, 8),(8, 4, 8, 8),(0, 5, 8, 8),(1, 6, 8, 8),(2, 7, 8, 8),(3, 8, 8, 8)],
[(7, 0, 0, 0),(8, 1, 0, 0),(0, 2, 0, 0),(1, 3, 0, 0),(2, 4, 0, 0),(3, 5, 0, 0),(4, 6, 0, 0),(5, 7, 0, 0),(6, 8, 0, 0),(8, 0, 1, 0),(0, 1, 1, 0),(1, 2, 1, 0),(2, 3, 1, 0),(3, 4, 1, 0),(4, 5, 1, 0),(5, 6, 1, 0),(6, 7, 1, 0),(7, 8, 1, 0),(0, 0, 2, 0),(1, 1, 2, 0),(2, 2, 2, 0),(3, 3, 2, 0),(4, 4, 2, 0),(5, 5, 2, 0),(6, 6, 2, 0),(7, 7, 2, 0),(8, 8, 2, 0),(1, 0, 3, 0),(2, 1, 3, 0),(3, 2, 3, 0),(4, 3, 3, 0),(5, 4, 3, 0),(6, 5, 3, 0),(7, 6, 3, 0),(8, 7, 3, 0),(0, 8, 3, 0),(2, 0, 4, 0),(3, 1, 4, 0),(4, 2, 4, 0),(5, 3, 4, 0),(6, 4, 4, 0),(7, 5, 4, 0),(8, 6, 4, 0),(0, 7, 4, 0),(1, 8, 4, 0),(3, 0, 5, 0),(4, 1, 5, 0),(5, 2, 5, 0),(6, 3, 5, 0),(7, 4, 5, 0),(8, 5, 5, 0),(0, 6, 5, 0),(1, 7, 5, 0),(2, 8, 5, 0),(4, 0, 6, 0),(5, 1, 6, 0),(6, 2, 6, 0),(7, 3, 6, 0),(8, 4, 6, 0),(0, 5, 6, 0),(1, 6, 6, 0),(2, 7, 6, 0),(3, 8, 6, 0),(5, 0, 7, 0),(6, 1, 7, 0),(7, 2, 7, 0),(8, 3, 7, 0),(0, 4, 7, 0),(1, 5, 7, 0),(2, 6, 7, 0),(3, 7, 7, 0),(4, 8, 7, 0),(6, 0, 8, 0),(7, 1, 8, 0),(8, 2, 8, 0),(0, 3, 8, 0),(1, 4, 8, 0),(2, 5, 8, 0),(3, 6, 8, 0),(4, 7, 8, 0),(5, 8, 8, 0),(8, 0, 0, 1),(0, 1, 0, 1),(1, 2, 0, 1),(2, 3, 0, 1),(3, 4, 0, 1),(4, 5, 0, 1),(5, 6, 0, 1),(6, 7, 0, 1),(7, 8, 0, 1),(0, 0, 1, 1),(1, 1, 1, 1),(2, 2, 1, 1),(3, 3, 1, 1),(4, 4, 1, 1),(5, 5, 1, 1),(6, 6, 1, 1),(7, 7, 1, 1),(8, 8, 1, 1),(1, 0, 2, 1),(2, 1, 2, 1),(3, 2, 2, 1),(4, 3, 2, 1),(5, 4, 2, 1),(6, 5, 2, 1),(7, 6, 2, 1),(8, 7, 2, 1),(0, 8, 2, 1),(2, 0, 3, 1),(3, 1, 3, 1),(4, 2, 3, 1),(5, 3, 3, 1),(6, 4, 3, 1),(7, 5, 3, 1),(8, 6, 3, 1),(0, 7, 3, 1),(1, 8, 3, 1),(3, 0, 4, 1),(4, 1, 4, 1),(5, 2, 4, 1),(6, 3, 4, 1),(7, 4, 4, 1),(8, 5, 4, 1),(0, 6, 4, 1),(1, 7, 4, 1),(2, 8, 4, 1),(4, 0, 5, 1),(5, 1, 5, 1),(6, 2, 5, 1),(7, 3, 5, 1),(8, 4, 5, 1),(0, 5, 5, 1),(1, 6, 5, 1),(2, 7, 5, 1),(3, 8, 5, 1),(5, 0, 6, 1),(6, 1, 6, 1),(7, 2, 6, 1),(8, 3, 6, 1),(0, 4, 6, 1),(1, 5, 6, 1),(2, 6, 6, 1),(3, 7, 6, 1),(4, 8, 6, 1),(6, 0, 7, 1),(7, 1, 7, 1),(8, 2, 7, 1),(0, 3, 7, 1),(1, 4, 7, 1),(2, 5, 7, 1),(3, 6, 7, 1),(4, 7, 7, 1),(5, 8, 7, 1),(7, 0, 8, 
1),(8, 1, 8, 1),(0, 2, 8, 1),(1, 3, 8, 1),(2, 4, 8, 1),(3, 5, 8, 1),(4, 6, 8, 1),(5, 7, 8, 1),(6, 8, 8, 1),(0, 0, 0, 2),(1, 1, 0, 2),(2, 2, 0, 2),(3, 3, 0, 2),(4, 4, 0, 2),(5, 5, 0, 2),(6, 6, 0, 2),(7, 7, 0, 2),(8, 8, 0, 2),(1, 0, 1, 2),(2, 1, 1, 2),(3, 2, 1, 2),(4, 3, 1, 2),(5, 4, 1, 2),(6, 5, 1, 2),(7, 6, 1, 2),(8, 7, 1, 2),(0, 8, 1, 2),(2, 0, 2, 2),(3, 1, 2, 2),(4, 2, 2, 2),(5, 3, 2, 2),(6, 4, 2, 2),(7, 5, 2, 2),(8, 6, 2, 2),(0, 7, 2, 2),(1, 8, 2, 2),(3, 0, 3, 2),(4, 1, 3, 2),(5, 2, 3, 2),(6, 3, 3, 2),(7, 4, 3, 2),(8, 5, 3, 2),(0, 6, 3, 2),(1, 7, 3, 2),(2, 8, 3, 2),(4, 0, 4, 2),(5, 1, 4, 2),(6, 2, 4, 2),(7, 3, 4, 2),(8, 4, 4, 2),(0, 5, 4, 2),(1, 6, 4, 2),(2, 7, 4, 2),(3, 8, 4, 2),(5, 0, 5, 2),(6, 1, 5, 2),(7, 2, 5, 2),(8, 3, 5, 2),(0, 4, 5, 2),(1, 5, 5, 2),(2, 6, 5, 2),(3, 7, 5, 2),(4, 8, 5, 2),(6, 0, 6, 2),(7, 1, 6, 2),(8, 2, 6, 2),(0, 3, 6, 2),(1, 4, 6, 2),(2, 5, 6, 2),(3, 6, 6, 2),(4, 7, 6, 2),(5, 8, 6, 2),(7, 0, 7, 2),(8, 1, 7, 2),(0, 2, 7, 2),(1, 3, 7, 2),(2, 4, 7, 2),(3, 5, 7, 2),(4, 6, 7, 2),(5, 7, 7, 2),(6, 8, 7, 2),(8, 0, 8, 2),(0, 1, 8, 2),(1, 2, 8, 2),(2, 3, 8, 2),(3, 4, 8, 2),(4, 5, 8, 2),(5, 6, 8, 2),(6, 7, 8, 2),(7, 8, 8, 2),(1, 0, 0, 3),(2, 1, 0, 3),(3, 2, 0, 3),(4, 3, 0, 3),(5, 4, 0, 3),(6, 5, 0, 3),(7, 6, 0, 3),(8, 7, 0, 3),(0, 8, 0, 3),(2, 0, 1, 3),(3, 1, 1, 3),(4, 2, 1, 3),(5, 3, 1, 3),(6, 4, 1, 3),(7, 5, 1, 3),(8, 6, 1, 3),(0, 7, 1, 3),(1, 8, 1, 3),(3, 0, 2, 3),(4, 1, 2, 3),(5, 2, 2, 3),(6, 3, 2, 3),(7, 4, 2, 3),(8, 5, 2, 3),(0, 6, 2, 3),(1, 7, 2, 3),(2, 8, 2, 3),(4, 0, 3, 3),(5, 1, 3, 3),(6, 2, 3, 3),(7, 3, 3, 3),(8, 4, 3, 3),(0, 5, 3, 3),(1, 6, 3, 3),(2, 7, 3, 3),(3, 8, 3, 3),(5, 0, 4, 3),(6, 1, 4, 3),(7, 2, 4, 3),(8, 3, 4, 3),(0, 4, 4, 3),(1, 5, 4, 3),(2, 6, 4, 3),(3, 7, 4, 3),(4, 8, 4, 3),(6, 0, 5, 3),(7, 1, 5, 3),(8, 2, 5, 3),(0, 3, 5, 3),(1, 4, 5, 3),(2, 5, 5, 3),(3, 6, 5, 3),(4, 7, 5, 3),(5, 8, 5, 3),(7, 0, 6, 3),(8, 1, 6, 3),(0, 2, 6, 3),(1, 3, 6, 3),(2, 4, 6, 3),(3, 5, 6, 3),(4, 6, 6, 3),(5, 7, 6, 3),(6, 8, 6, 3),(8, 0, 7, 3),(0, 1, 
7, 3),(1, 2, 7, 3),(2, 3, 7, 3),(3, 4, 7, 3),(4, 5, 7, 3),(5, 6, 7, 3),(6, 7, 7, 3),(7, 8, 7, 3),(0, 0, 8, 3),(1, 1, 8, 3),(2, 2, 8, 3),(3, 3, 8, 3),(4, 4, 8, 3),(5, 5, 8, 3),(6, 6, 8, 3),(7, 7, 8, 3),(8, 8, 8, 3),(2, 0, 0, 4),(3, 1, 0, 4),(4, 2, 0, 4),(5, 3, 0, 4),(6, 4, 0, 4),(7, 5, 0, 4),(8, 6, 0, 4),(0, 7, 0, 4),(1, 8, 0, 4),(3, 0, 1, 4),(4, 1, 1, 4),(5, 2, 1, 4),(6, 3, 1, 4),(7, 4, 1, 4),(8, 5, 1, 4),(0, 6, 1, 4),(1, 7, 1, 4),(2, 8, 1, 4),(4, 0, 2, 4),(5, 1, 2, 4),(6, 2, 2, 4),(7, 3, 2, 4),(8, 4, 2, 4),(0, 5, 2, 4),(1, 6, 2, 4),(2, 7, 2, 4),(3, 8, 2, 4),(5, 0, 3, 4),(6, 1, 3, 4),(7, 2, 3, 4),(8, 3, 3, 4),(0, 4, 3, 4),(1, 5, 3, 4),(2, 6, 3, 4),(3, 7, 3, 4),(4, 8, 3, 4),(6, 0, 4, 4),(7, 1, 4, 4),(8, 2, 4, 4),(0, 3, 4, 4),(1, 4, 4, 4),(2, 5, 4, 4),(3, 6, 4, 4),(4, 7, 4, 4),(5, 8, 4, 4),(7, 0, 5, 4),(8, 1, 5, 4),(0, 2, 5, 4),(1, 3, 5, 4),(2, 4, 5, 4),(3, 5, 5, 4),(4, 6, 5, 4),(5, 7, 5, 4),(6, 8, 5, 4),(8, 0, 6, 4),(0, 1, 6, 4),(1, 2, 6, 4),(2, 3, 6, 4),(3, 4, 6, 4),(4, 5, 6, 4),(5, 6, 6, 4),(6, 7, 6, 4),(7, 8, 6, 4),(0, 0, 7, 4),(1, 1, 7, 4),(2, 2, 7, 4),(3, 3, 7, 4),(4, 4, 7, 4),(5, 5, 7, 4),(6, 6, 7, 4),(7, 7, 7, 4),(8, 8, 7, 4),(1, 0, 8, 4),(2, 1, 8, 4),(3, 2, 8, 4),(4, 3, 8, 4),(5, 4, 8, 4),(6, 5, 8, 4),(7, 6, 8, 4),(8, 7, 8, 4),(0, 8, 8, 4),(3, 0, 0, 5),(4, 1, 0, 5),(5, 2, 0, 5),(6, 3, 0, 5),(7, 4, 0, 5),(8, 5, 0, 5),(0, 6, 0, 5),(1, 7, 0, 5),(2, 8, 0, 5),(4, 0, 1, 5),(5, 1, 1, 5),(6, 2, 1, 5),(7, 3, 1, 5),(8, 4, 1, 5),(0, 5, 1, 5),(1, 6, 1, 5),(2, 7, 1, 5),(3, 8, 1, 5),(5, 0, 2, 5),(6, 1, 2, 5),(7, 2, 2, 5),(8, 3, 2, 5),(0, 4, 2, 5),(1, 5, 2, 5),(2, 6, 2, 5),(3, 7, 2, 5),(4, 8, 2, 5),(6, 0, 3, 5),(7, 1, 3, 5),(8, 2, 3, 5),(0, 3, 3, 5),(1, 4, 3, 5),(2, 5, 3, 5),(3, 6, 3, 5),(4, 7, 3, 5),(5, 8, 3, 5),(7, 0, 4, 5),(8, 1, 4, 5),(0, 2, 4, 5),(1, 3, 4, 5),(2, 4, 4, 5),(3, 5, 4, 5),(4, 6, 4, 5),(5, 7, 4, 5),(6, 8, 4, 5),(8, 0, 5, 5),(0, 1, 5, 5),(1, 2, 5, 5),(2, 3, 5, 5),(3, 4, 5, 5),(4, 5, 5, 5),(5, 6, 5, 5),(6, 7, 5, 5),(7, 8, 5, 5),(0, 0, 6, 5),(1, 1, 6, 5),(2, 
2, 6, 5),(3, 3, 6, 5),(4, 4, 6, 5),(5, 5, 6, 5),(6, 6, 6, 5),(7, 7, 6, 5),(8, 8, 6, 5),(1, 0, 7, 5),(2, 1, 7, 5),(3, 2, 7, 5),(4, 3, 7, 5),(5, 4, 7, 5),(6, 5, 7, 5),(7, 6, 7, 5),(8, 7, 7, 5),(0, 8, 7, 5),(2, 0, 8, 5),(3, 1, 8, 5),(4, 2, 8, 5),(5, 3, 8, 5),(6, 4, 8, 5),(7, 5, 8, 5),(8, 6, 8, 5),(0, 7, 8, 5),(1, 8, 8, 5),(4, 0, 0, 6),(5, 1, 0, 6),(6, 2, 0, 6),(7, 3, 0, 6),(8, 4, 0, 6),(0, 5, 0, 6),(1, 6, 0, 6),(2, 7, 0, 6),(3, 8, 0, 6),(5, 0, 1, 6),(6, 1, 1, 6),(7, 2, 1, 6),(8, 3, 1, 6),(0, 4, 1, 6),(1, 5, 1, 6),(2, 6, 1, 6),(3, 7, 1, 6),(4, 8, 1, 6),(6, 0, 2, 6),(7, 1, 2, 6),(8, 2, 2, 6),(0, 3, 2, 6),(1, 4, 2, 6),(2, 5, 2, 6),(3, 6, 2, 6),(4, 7, 2, 6),(5, 8, 2, 6),(7, 0, 3, 6),(8, 1, 3, 6),(0, 2, 3, 6),(1, 3, 3, 6),(2, 4, 3, 6),(3, 5, 3, 6),(4, 6, 3, 6),(5, 7, 3, 6),(6, 8, 3, 6),(8, 0, 4, 6),(0, 1, 4, 6),(1, 2, 4, 6),(2, 3, 4, 6),(3, 4, 4, 6),(4, 5, 4, 6),(5, 6, 4, 6),(6, 7, 4, 6),(7, 8, 4, 6),(0, 0, 5, 6),(1, 1, 5, 6),(2, 2, 5, 6),(3, 3, 5, 6),(4, 4, 5, 6),(5, 5, 5, 6),(6, 6, 5, 6),(7, 7, 5, 6),(8, 8, 5, 6),(1, 0, 6, 6),(2, 1, 6, 6),(3, 2, 6, 6),(4, 3, 6, 6),(5, 4, 6, 6),(6, 5, 6, 6),(7, 6, 6, 6),(8, 7, 6, 6),(0, 8, 6, 6),(2, 0, 7, 6),(3, 1, 7, 6),(4, 2, 7, 6),(5, 3, 7, 6),(6, 4, 7, 6),(7, 5, 7, 6),(8, 6, 7, 6),(0, 7, 7, 6),(1, 8, 7, 6),(3, 0, 8, 6),(4, 1, 8, 6),(5, 2, 8, 6),(6, 3, 8, 6),(7, 4, 8, 6),(8, 5, 8, 6),(0, 6, 8, 6),(1, 7, 8, 6),(2, 8, 8, 6),(5, 0, 0, 7),(6, 1, 0, 7),(7, 2, 0, 7),(8, 3, 0, 7),(0, 4, 0, 7),(1, 5, 0, 7),(2, 6, 0, 7),(3, 7, 0, 7),(4, 8, 0, 7),(6, 0, 1, 7),(7, 1, 1, 7),(8, 2, 1, 7),(0, 3, 1, 7),(1, 4, 1, 7),(2, 5, 1, 7),(3, 6, 1, 7),(4, 7, 1, 7),(5, 8, 1, 7),(7, 0, 2, 7),(8, 1, 2, 7),(0, 2, 2, 7),(1, 3, 2, 7),(2, 4, 2, 7),(3, 5, 2, 7),(4, 6, 2, 7),(5, 7, 2, 7),(6, 8, 2, 7),(8, 0, 3, 7),(0, 1, 3, 7),(1, 2, 3, 7),(2, 3, 3, 7),(3, 4, 3, 7),(4, 5, 3, 7),(5, 6, 3, 7),(6, 7, 3, 7),(7, 8, 3, 7),(0, 0, 4, 7),(1, 1, 4, 7),(2, 2, 4, 7),(3, 3, 4, 7),(4, 4, 4, 7),(5, 5, 4, 7),(6, 6, 4, 7),(7, 7, 4, 7),(8, 8, 4, 7),(1, 0, 5, 7),(2, 1, 5, 7),(3, 2, 5, 
7),(4, 3, 5, 7),(5, 4, 5, 7),(6, 5, 5, 7),(7, 6, 5, 7),(8, 7, 5, 7),(0, 8, 5, 7),(2, 0, 6, 7),(3, 1, 6, 7),(4, 2, 6, 7),(5, 3, 6, 7),(6, 4, 6, 7),(7, 5, 6, 7),(8, 6, 6, 7),(0, 7, 6, 7),(1, 8, 6, 7),(3, 0, 7, 7),(4, 1, 7, 7),(5, 2, 7, 7),(6, 3, 7, 7),(7, 4, 7, 7),(8, 5, 7, 7),(0, 6, 7, 7),(1, 7, 7, 7),(2, 8, 7, 7),(4, 0, 8, 7),(5, 1, 8, 7),(6, 2, 8, 7),(7, 3, 8, 7),(8, 4, 8, 7),(0, 5, 8, 7),(1, 6, 8, 7),(2, 7, 8, 7),(3, 8, 8, 7),(6, 0, 0, 8),(7, 1, 0, 8),(8, 2, 0, 8),(0, 3, 0, 8),(1, 4, 0, 8),(2, 5, 0, 8),(3, 6, 0, 8),(4, 7, 0, 8),(5, 8, 0, 8),(7, 0, 1, 8),(8, 1, 1, 8),(0, 2, 1, 8),(1, 3, 1, 8),(2, 4, 1, 8),(3, 5, 1, 8),(4, 6, 1, 8),(5, 7, 1, 8),(6, 8, 1, 8),(8, 0, 2, 8),(0, 1, 2, 8),(1, 2, 2, 8),(2, 3, 2, 8),(3, 4, 2, 8),(4, 5, 2, 8),(5, 6, 2, 8),(6, 7, 2, 8),(7, 8, 2, 8),(0, 0, 3, 8),(1, 1, 3, 8),(2, 2, 3, 8),(3, 3, 3, 8),(4, 4, 3, 8),(5, 5, 3, 8),(6, 6, 3, 8),(7, 7, 3, 8),(8, 8, 3, 8),(1, 0, 4, 8),(2, 1, 4, 8),(3, 2, 4, 8),(4, 3, 4, 8),(5, 4, 4, 8),(6, 5, 4, 8),(7, 6, 4, 8),(8, 7, 4, 8),(0, 8, 4, 8),(2, 0, 5, 8),(3, 1, 5, 8),(4, 2, 5, 8),(5, 3, 5, 8),(6, 4, 5, 8),(7, 5, 5, 8),(8, 6, 5, 8),(0, 7, 5, 8),(1, 8, 5, 8),(3, 0, 6, 8),(4, 1, 6, 8),(5, 2, 6, 8),(6, 3, 6, 8),(7, 4, 6, 8),(8, 5, 6, 8),(0, 6, 6, 8),(1, 7, 6, 8),(2, 8, 6, 8),(4, 0, 7, 8),(5, 1, 7, 8),(6, 2, 7, 8),(7, 3, 7, 8),(8, 4, 7, 8),(0, 5, 7, 8),(1, 6, 7, 8),(2, 7, 7, 8),(3, 8, 7, 8),(5, 0, 8, 8),(6, 1, 8, 8),(7, 2, 8, 8),(8, 3, 8, 8),(0, 4, 8, 8),(1, 5, 8, 8),(2, 6, 8, 8),(3, 7, 8, 8),(4, 8, 8, 8)],
[(8, 0, 0, 0),(0, 1, 0, 0),(1, 2, 0, 0),(2, 3, 0, 0),(3, 4, 0, 0),(4, 5, 0, 0),(5, 6, 0, 0),(6, 7, 0, 0),(7, 8, 0, 0),(0, 0, 1, 0),(1, 1, 1, 0),(2, 2, 1, 0),(3, 3, 1, 0),(4, 4, 1, 0),(5, 5, 1, 0),(6, 6, 1, 0),(7, 7, 1, 0),(8, 8, 1, 0),(1, 0, 2, 0),(2, 1, 2, 0),(3, 2, 2, 0),(4, 3, 2, 0),(5, 4, 2, 0),(6, 5, 2, 0),(7, 6, 2, 0),(8, 7, 2, 0),(0, 8, 2, 0),(2, 0, 3, 0),(3, 1, 3, 0),(4, 2, 3, 0),(5, 3, 3, 0),(6, 4, 3, 0),(7, 5, 3, 0),(8, 6, 3, 0),(0, 7, 3, 0),(1, 8, 3, 0),(3, 0, 4, 0),(4, 1, 4, 0),(5, 2, 4, 0),(6, 3, 4, 0),(7, 4, 4, 0),(8, 5, 4, 0),(0, 6, 4, 0),(1, 7, 4, 0),(2, 8, 4, 0),(4, 0, 5, 0),(5, 1, 5, 0),(6, 2, 5, 0),(7, 3, 5, 0),(8, 4, 5, 0),(0, 5, 5, 0),(1, 6, 5, 0),(2, 7, 5, 0),(3, 8, 5, 0),(5, 0, 6, 0),(6, 1, 6, 0),(7, 2, 6, 0),(8, 3, 6, 0),(0, 4, 6, 0),(1, 5, 6, 0),(2, 6, 6, 0),(3, 7, 6, 0),(4, 8, 6, 0),(6, 0, 7, 0),(7, 1, 7, 0),(8, 2, 7, 0),(0, 3, 7, 0),(1, 4, 7, 0),(2, 5, 7, 0),(3, 6, 7, 0),(4, 7, 7, 0),(5, 8, 7, 0),(7, 0, 8, 0),(8, 1, 8, 0),(0, 2, 8, 0),(1, 3, 8, 0),(2, 4, 8, 0),(3, 5, 8, 0),(4, 6, 8, 0),(5, 7, 8, 0),(6, 8, 8, 0),(0, 0, 0, 1),(1, 1, 0, 1),(2, 2, 0, 1),(3, 3, 0, 1),(4, 4, 0, 1),(5, 5, 0, 1),(6, 6, 0, 1),(7, 7, 0, 1),(8, 8, 0, 1),(1, 0, 1, 1),(2, 1, 1, 1),(3, 2, 1, 1),(4, 3, 1, 1),(5, 4, 1, 1),(6, 5, 1, 1),(7, 6, 1, 1),(8, 7, 1, 1),(0, 8, 1, 1),(2, 0, 2, 1),(3, 1, 2, 1),(4, 2, 2, 1),(5, 3, 2, 1),(6, 4, 2, 1),(7, 5, 2, 1),(8, 6, 2, 1),(0, 7, 2, 1),(1, 8, 2, 1),(3, 0, 3, 1),(4, 1, 3, 1),(5, 2, 3, 1),(6, 3, 3, 1),(7, 4, 3, 1),(8, 5, 3, 1),(0, 6, 3, 1),(1, 7, 3, 1),(2, 8, 3, 1),(4, 0, 4, 1),(5, 1, 4, 1),(6, 2, 4, 1),(7, 3, 4, 1),(8, 4, 4, 1),(0, 5, 4, 1),(1, 6, 4, 1),(2, 7, 4, 1),(3, 8, 4, 1),(5, 0, 5, 1),(6, 1, 5, 1),(7, 2, 5, 1),(8, 3, 5, 1),(0, 4, 5, 1),(1, 5, 5, 1),(2, 6, 5, 1),(3, 7, 5, 1),(4, 8, 5, 1),(6, 0, 6, 1),(7, 1, 6, 1),(8, 2, 6, 1),(0, 3, 6, 1),(1, 4, 6, 1),(2, 5, 6, 1),(3, 6, 6, 1),(4, 7, 6, 1),(5, 8, 6, 1),(7, 0, 7, 1),(8, 1, 7, 1),(0, 2, 7, 1),(1, 3, 7, 1),(2, 4, 7, 1),(3, 5, 7, 1),(4, 6, 7, 1),(5, 7, 7, 1),(6, 8, 7, 1),(8, 0, 8, 
1),(0, 1, 8, 1),(1, 2, 8, 1),(2, 3, 8, 1),(3, 4, 8, 1),(4, 5, 8, 1),(5, 6, 8, 1),(6, 7, 8, 1),(7, 8, 8, 1),(1, 0, 0, 2),(2, 1, 0, 2),(3, 2, 0, 2),(4, 3, 0, 2),(5, 4, 0, 2),(6, 5, 0, 2),(7, 6, 0, 2),(8, 7, 0, 2),(0, 8, 0, 2),(2, 0, 1, 2),(3, 1, 1, 2),(4, 2, 1, 2),(5, 3, 1, 2),(6, 4, 1, 2),(7, 5, 1, 2),(8, 6, 1, 2),(0, 7, 1, 2),(1, 8, 1, 2),(3, 0, 2, 2),(4, 1, 2, 2),(5, 2, 2, 2),(6, 3, 2, 2),(7, 4, 2, 2),(8, 5, 2, 2),(0, 6, 2, 2),(1, 7, 2, 2),(2, 8, 2, 2),(4, 0, 3, 2),(5, 1, 3, 2),(6, 2, 3, 2),(7, 3, 3, 2),(8, 4, 3, 2),(0, 5, 3, 2),(1, 6, 3, 2),(2, 7, 3, 2),(3, 8, 3, 2),(5, 0, 4, 2),(6, 1, 4, 2),(7, 2, 4, 2),(8, 3, 4, 2),(0, 4, 4, 2),(1, 5, 4, 2),(2, 6, 4, 2),(3, 7, 4, 2),(4, 8, 4, 2),(6, 0, 5, 2),(7, 1, 5, 2),(8, 2, 5, 2),(0, 3, 5, 2),(1, 4, 5, 2),(2, 5, 5, 2),(3, 6, 5, 2),(4, 7, 5, 2),(5, 8, 5, 2),(7, 0, 6, 2),(8, 1, 6, 2),(0, 2, 6, 2),(1, 3, 6, 2),(2, 4, 6, 2),(3, 5, 6, 2),(4, 6, 6, 2),(5, 7, 6, 2),(6, 8, 6, 2),(8, 0, 7, 2),(0, 1, 7, 2),(1, 2, 7, 2),(2, 3, 7, 2),(3, 4, 7, 2),(4, 5, 7, 2),(5, 6, 7, 2),(6, 7, 7, 2),(7, 8, 7, 2),(0, 0, 8, 2),(1, 1, 8, 2),(2, 2, 8, 2),(3, 3, 8, 2),(4, 4, 8, 2),(5, 5, 8, 2),(6, 6, 8, 2),(7, 7, 8, 2),(8, 8, 8, 2),(2, 0, 0, 3),(3, 1, 0, 3),(4, 2, 0, 3),(5, 3, 0, 3),(6, 4, 0, 3),(7, 5, 0, 3),(8, 6, 0, 3),(0, 7, 0, 3),(1, 8, 0, 3),(3, 0, 1, 3),(4, 1, 1, 3),(5, 2, 1, 3),(6, 3, 1, 3),(7, 4, 1, 3),(8, 5, 1, 3),(0, 6, 1, 3),(1, 7, 1, 3),(2, 8, 1, 3),(4, 0, 2, 3),(5, 1, 2, 3),(6, 2, 2, 3),(7, 3, 2, 3),(8, 4, 2, 3),(0, 5, 2, 3),(1, 6, 2, 3),(2, 7, 2, 3),(3, 8, 2, 3),(5, 0, 3, 3),(6, 1, 3, 3),(7, 2, 3, 3),(8, 3, 3, 3),(0, 4, 3, 3),(1, 5, 3, 3),(2, 6, 3, 3),(3, 7, 3, 3),(4, 8, 3, 3),(6, 0, 4, 3),(7, 1, 4, 3),(8, 2, 4, 3),(0, 3, 4, 3),(1, 4, 4, 3),(2, 5, 4, 3),(3, 6, 4, 3),(4, 7, 4, 3),(5, 8, 4, 3),(7, 0, 5, 3),(8, 1, 5, 3),(0, 2, 5, 3),(1, 3, 5, 3),(2, 4, 5, 3),(3, 5, 5, 3),(4, 6, 5, 3),(5, 7, 5, 3),(6, 8, 5, 3),(8, 0, 6, 3),(0, 1, 6, 3),(1, 2, 6, 3),(2, 3, 6, 3),(3, 4, 6, 3),(4, 5, 6, 3),(5, 6, 6, 3),(6, 7, 6, 3),(7, 8, 6, 3),(0, 0, 7, 3),(1, 1, 
7, 3),(2, 2, 7, 3),(3, 3, 7, 3),(4, 4, 7, 3),(5, 5, 7, 3),(6, 6, 7, 3),(7, 7, 7, 3),(8, 8, 7, 3),(1, 0, 8, 3),(2, 1, 8, 3),(3, 2, 8, 3),(4, 3, 8, 3),(5, 4, 8, 3),(6, 5, 8, 3),(7, 6, 8, 3),(8, 7, 8, 3),(0, 8, 8, 3),(3, 0, 0, 4),(4, 1, 0, 4),(5, 2, 0, 4),(6, 3, 0, 4),(7, 4, 0, 4),(8, 5, 0, 4),(0, 6, 0, 4),(1, 7, 0, 4),(2, 8, 0, 4),(4, 0, 1, 4),(5, 1, 1, 4),(6, 2, 1, 4),(7, 3, 1, 4),(8, 4, 1, 4),(0, 5, 1, 4),(1, 6, 1, 4),(2, 7, 1, 4),(3, 8, 1, 4),(5, 0, 2, 4),(6, 1, 2, 4),(7, 2, 2, 4),(8, 3, 2, 4),(0, 4, 2, 4),(1, 5, 2, 4),(2, 6, 2, 4),(3, 7, 2, 4),(4, 8, 2, 4),(6, 0, 3, 4),(7, 1, 3, 4),(8, 2, 3, 4),(0, 3, 3, 4),(1, 4, 3, 4),(2, 5, 3, 4),(3, 6, 3, 4),(4, 7, 3, 4),(5, 8, 3, 4),(7, 0, 4, 4),(8, 1, 4, 4),(0, 2, 4, 4),(1, 3, 4, 4),(2, 4, 4, 4),(3, 5, 4, 4),(4, 6, 4, 4),(5, 7, 4, 4),(6, 8, 4, 4),(8, 0, 5, 4),(0, 1, 5, 4),(1, 2, 5, 4),(2, 3, 5, 4),(3, 4, 5, 4),(4, 5, 5, 4),(5, 6, 5, 4),(6, 7, 5, 4),(7, 8, 5, 4),(0, 0, 6, 4),(1, 1, 6, 4),(2, 2, 6, 4),(3, 3, 6, 4),(4, 4, 6, 4),(5, 5, 6, 4),(6, 6, 6, 4),(7, 7, 6, 4),(8, 8, 6, 4),(1, 0, 7, 4),(2, 1, 7, 4),(3, 2, 7, 4),(4, 3, 7, 4),(5, 4, 7, 4),(6, 5, 7, 4),(7, 6, 7, 4),(8, 7, 7, 4),(0, 8, 7, 4),(2, 0, 8, 4),(3, 1, 8, 4),(4, 2, 8, 4),(5, 3, 8, 4),(6, 4, 8, 4),(7, 5, 8, 4),(8, 6, 8, 4),(0, 7, 8, 4),(1, 8, 8, 4),(4, 0, 0, 5),(5, 1, 0, 5),(6, 2, 0, 5),(7, 3, 0, 5),(8, 4, 0, 5),(0, 5, 0, 5),(1, 6, 0, 5),(2, 7, 0, 5),(3, 8, 0, 5),(5, 0, 1, 5),(6, 1, 1, 5),(7, 2, 1, 5),(8, 3, 1, 5),(0, 4, 1, 5),(1, 5, 1, 5),(2, 6, 1, 5),(3, 7, 1, 5),(4, 8, 1, 5),(6, 0, 2, 5),(7, 1, 2, 5),(8, 2, 2, 5),(0, 3, 2, 5),(1, 4, 2, 5),(2, 5, 2, 5),(3, 6, 2, 5),(4, 7, 2, 5),(5, 8, 2, 5),(7, 0, 3, 5),(8, 1, 3, 5),(0, 2, 3, 5),(1, 3, 3, 5),(2, 4, 3, 5),(3, 5, 3, 5),(4, 6, 3, 5),(5, 7, 3, 5),(6, 8, 3, 5),(8, 0, 4, 5),(0, 1, 4, 5),(1, 2, 4, 5),(2, 3, 4, 5),(3, 4, 4, 5),(4, 5, 4, 5),(5, 6, 4, 5),(6, 7, 4, 5),(7, 8, 4, 5),(0, 0, 5, 5),(1, 1, 5, 5),(2, 2, 5, 5),(3, 3, 5, 5),(4, 4, 5, 5),(5, 5, 5, 5),(6, 6, 5, 5),(7, 7, 5, 5),(8, 8, 5, 5),(1, 0, 6, 5),(2, 1, 6, 5),(3, 
2, 6, 5),(4, 3, 6, 5),(5, 4, 6, 5),(6, 5, 6, 5),(7, 6, 6, 5),(8, 7, 6, 5),(0, 8, 6, 5),(2, 0, 7, 5),(3, 1, 7, 5),(4, 2, 7, 5),(5, 3, 7, 5),(6, 4, 7, 5),(7, 5, 7, 5),(8, 6, 7, 5),(0, 7, 7, 5),(1, 8, 7, 5),(3, 0, 8, 5),(4, 1, 8, 5),(5, 2, 8, 5),(6, 3, 8, 5),(7, 4, 8, 5),(8, 5, 8, 5),(0, 6, 8, 5),(1, 7, 8, 5),(2, 8, 8, 5),(5, 0, 0, 6),(6, 1, 0, 6),(7, 2, 0, 6),(8, 3, 0, 6),(0, 4, 0, 6),(1, 5, 0, 6),(2, 6, 0, 6),(3, 7, 0, 6),(4, 8, 0, 6),(6, 0, 1, 6),(7, 1, 1, 6),(8, 2, 1, 6),(0, 3, 1, 6),(1, 4, 1, 6),(2, 5, 1, 6),(3, 6, 1, 6),(4, 7, 1, 6),(5, 8, 1, 6),(7, 0, 2, 6),(8, 1, 2, 6),(0, 2, 2, 6),(1, 3, 2, 6),(2, 4, 2, 6),(3, 5, 2, 6),(4, 6, 2, 6),(5, 7, 2, 6),(6, 8, 2, 6),(8, 0, 3, 6),(0, 1, 3, 6),(1, 2, 3, 6),(2, 3, 3, 6),(3, 4, 3, 6),(4, 5, 3, 6),(5, 6, 3, 6),(6, 7, 3, 6),(7, 8, 3, 6),(0, 0, 4, 6),(1, 1, 4, 6),(2, 2, 4, 6),(3, 3, 4, 6),(4, 4, 4, 6),(5, 5, 4, 6),(6, 6, 4, 6),(7, 7, 4, 6),(8, 8, 4, 6),(1, 0, 5, 6),(2, 1, 5, 6),(3, 2, 5, 6),(4, 3, 5, 6),(5, 4, 5, 6),(6, 5, 5, 6),(7, 6, 5, 6),(8, 7, 5, 6),(0, 8, 5, 6),(2, 0, 6, 6),(3, 1, 6, 6),(4, 2, 6, 6),(5, 3, 6, 6),(6, 4, 6, 6),(7, 5, 6, 6),(8, 6, 6, 6),(0, 7, 6, 6),(1, 8, 6, 6),(3, 0, 7, 6),(4, 1, 7, 6),(5, 2, 7, 6),(6, 3, 7, 6),(7, 4, 7, 6),(8, 5, 7, 6),(0, 6, 7, 6),(1, 7, 7, 6),(2, 8, 7, 6),(4, 0, 8, 6),(5, 1, 8, 6),(6, 2, 8, 6),(7, 3, 8, 6),(8, 4, 8, 6),(0, 5, 8, 6),(1, 6, 8, 6),(2, 7, 8, 6),(3, 8, 8, 6),(6, 0, 0, 7),(7, 1, 0, 7),(8, 2, 0, 7),(0, 3, 0, 7),(1, 4, 0, 7),(2, 5, 0, 7),(3, 6, 0, 7),(4, 7, 0, 7),(5, 8, 0, 7),(7, 0, 1, 7),(8, 1, 1, 7),(0, 2, 1, 7),(1, 3, 1, 7),(2, 4, 1, 7),(3, 5, 1, 7),(4, 6, 1, 7),(5, 7, 1, 7),(6, 8, 1, 7),(8, 0, 2, 7),(0, 1, 2, 7),(1, 2, 2, 7),(2, 3, 2, 7),(3, 4, 2, 7),(4, 5, 2, 7),(5, 6, 2, 7),(6, 7, 2, 7),(7, 8, 2, 7),(0, 0, 3, 7),(1, 1, 3, 7),(2, 2, 3, 7),(3, 3, 3, 7),(4, 4, 3, 7),(5, 5, 3, 7),(6, 6, 3, 7),(7, 7, 3, 7),(8, 8, 3, 7),(1, 0, 4, 7),(2, 1, 4, 7),(3, 2, 4, 7),(4, 3, 4, 7),(5, 4, 4, 7),(6, 5, 4, 7),(7, 6, 4, 7),(8, 7, 4, 7),(0, 8, 4, 7),(2, 0, 5, 7),(3, 1, 5, 7),(4, 2, 5, 
7),(5, 3, 5, 7),(6, 4, 5, 7),(7, 5, 5, 7),(8, 6, 5, 7),(0, 7, 5, 7),(1, 8, 5, 7),(3, 0, 6, 7),(4, 1, 6, 7),(5, 2, 6, 7),(6, 3, 6, 7),(7, 4, 6, 7),(8, 5, 6, 7),(0, 6, 6, 7),(1, 7, 6, 7),(2, 8, 6, 7),(4, 0, 7, 7),(5, 1, 7, 7),(6, 2, 7, 7),(7, 3, 7, 7),(8, 4, 7, 7),(0, 5, 7, 7),(1, 6, 7, 7),(2, 7, 7, 7),(3, 8, 7, 7),(5, 0, 8, 7),(6, 1, 8, 7),(7, 2, 8, 7),(8, 3, 8, 7),(0, 4, 8, 7),(1, 5, 8, 7),(2, 6, 8, 7),(3, 7, 8, 7),(4, 8, 8, 7),(7, 0, 0, 8),(8, 1, 0, 8),(0, 2, 0, 8),(1, 3, 0, 8),(2, 4, 0, 8),(3, 5, 0, 8),(4, 6, 0, 8),(5, 7, 0, 8),(6, 8, 0, 8),(8, 0, 1, 8),(0, 1, 1, 8),(1, 2, 1, 8),(2, 3, 1, 8),(3, 4, 1, 8),(4, 5, 1, 8),(5, 6, 1, 8),(6, 7, 1, 8),(7, 8, 1, 8),(0, 0, 2, 8),(1, 1, 2, 8),(2, 2, 2, 8),(3, 3, 2, 8),(4, 4, 2, 8),(5, 5, 2, 8),(6, 6, 2, 8),(7, 7, 2, 8),(8, 8, 2, 8),(1, 0, 3, 8),(2, 1, 3, 8),(3, 2, 3, 8),(4, 3, 3, 8),(5, 4, 3, 8),(6, 5, 3, 8),(7, 6, 3, 8),(8, 7, 3, 8),(0, 8, 3, 8),(2, 0, 4, 8),(3, 1, 4, 8),(4, 2, 4, 8),(5, 3, 4, 8),(6, 4, 4, 8),(7, 5, 4, 8),(8, 6, 4, 8),(0, 7, 4, 8),(1, 8, 4, 8),(3, 0, 5, 8),(4, 1, 5, 8),(5, 2, 5, 8),(6, 3, 5, 8),(7, 4, 5, 8),(8, 5, 5, 8),(0, 6, 5, 8),(1, 7, 5, 8),(2, 8, 5, 8),(4, 0, 6, 8),(5, 1, 6, 8),(6, 2, 6, 8),(7, 3, 6, 8),(8, 4, 6, 8),(0, 5, 6, 8),(1, 6, 6, 8),(2, 7, 6, 8),(3, 8, 6, 8),(5, 0, 7, 8),(6, 1, 7, 8),(7, 2, 7, 8),(8, 3, 7, 8),(0, 4, 7, 8),(1, 5, 7, 8),(2, 6, 7, 8),(3, 7, 7, 8),(4, 8, 7, 8),(6, 0, 8, 8),(7, 1, 8, 8),(8, 2, 8, 8),(0, 3, 8, 8),(1, 4, 8, 8),(2, 5, 8, 8),(3, 6, 8, 8),(4, 7, 8, 8),(5, 8, 8, 8)]]
        self.valid_combs = {}
        print(pct)
        for keep_pct in range(pct):
            for shape, color, loc, scale in self.pct_to_combs[keep_pct]:
                if shape not in self.valid_combs:
                    self.valid_combs[shape] = {}
                if color not in self.valid_combs[shape]:
                    self.valid_combs[shape][color] = {}
                if loc not in self.valid_combs[shape][color]:
                    self.valid_combs[shape][color][loc] = []
                self.valid_combs[shape][color][loc].append(scale)
        self.synth_ct = {}
        for keep_pct in range(pct):
            for shape, color, loc, scale in self.pct_to_combs[keep_pct]:
                if shape not in self.synth_ct:
                    self.synth_ct[shape] = {}
                if color not in self.synth_ct[shape]:
                    self.synth_ct[shape][color] = {}
                if loc not in self.synth_ct[shape][color]:
                    self.synth_ct[shape][color][loc] = {}
                self.synth_ct[shape][color][loc][scale] = 0
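The chained membership checks above can be written more compactly with `dict.setdefault`. The sketch below is an added illustration, not part of the original file (the `nest` helper is hypothetical); it builds the same shape → color → loc → list-of-scales mapping as `self.valid_combs`:

```python
# Hypothetical helper (not in the original file) mirroring the
# construction of self.valid_combs: shape -> color -> loc -> [scales].
def nest(combs):
    valid = {}
    for shape, color, loc, scale in combs:
        # setdefault returns the existing entry or inserts the default,
        # collapsing the three "if key not in ..." checks into one chain.
        valid.setdefault(shape, {}).setdefault(color, {}).setdefault(loc, []).append(scale)
    return valid

print(nest([(0, 1, 2, 3), (0, 1, 2, 4), (5, 1, 2, 3)]))
# -> {0: {1: {2: [3, 4]}}, 5: {1: {2: [3]}}}
```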
    def __getitem__(self, index):
        """
        Args:
            index (int): Index

        Returns:
            tuple: (image, target) where target is index of the target class.
        """
        if self.train:
            img, target = self.train_data[index], self.train_labels[index]
        else:
            img, target = self.test_data[index], self.test_labels[index]
        assert target.item() < 9
        # Sample labels
        shape_class = target.item()
        color_class = random.choice(list(self.valid_combs[shape_class].keys()))
        loc_class = random.choice(list(self.valid_combs[shape_class][color_class].keys()))
        scale_class = random.choice(list(self.valid_combs[shape_class][color_class][loc_class]))
        while self.synth_ct[shape_class][color_class][loc_class][scale_class] < \
                max(self.synth_ct[shape_class][color_class][loc_class].values()):
            color_class = random.choice(list(self.valid_combs[shape_class].keys()))
            loc_class = random.choice(list(self.valid_combs[shape_class][color_class].keys()))
            scale_class = random.choice(list(self.valid_combs[shape_class][color_class][loc_class]))
        self.synth_ct[shape_class][color_class][loc_class][scale_class] += 1
        # Put grayscale image in RGB space
        img_array = np.stack((img[0] * 255,) * 3, axis=-1).astype(np.uint8)
        # Determine image size
        img_size = 28
        box_size = 32
        total_size = 32 * 3
        size = range(12, 29, 2)[self.scale_indices[scale_class]]
        img = Image.fromarray(img_array, 'RGB')
        img.thumbnail((size, size))
        local_vert_offset = int((img_size - size) / 2)
        local_horiz_offset = int((img_size - size) / 2)
        # Determine image location
        loc_idx = self.loc_indices[loc_class]
        vert_loc_class = int(loc_idx / 3)
        horiz_loc_class = loc_idx % 3
        global_vert_offset = vert_loc_class * box_size
        global_horiz_offset = horiz_loc_class * box_size
        vert_offset = local_vert_offset + global_vert_offset
        horiz_offset = local_horiz_offset + global_horiz_offset
        # Render
        img_array = np.zeros((total_size, total_size, 3))
        img_array[vert_offset:(vert_offset + size), horiz_offset:(horiz_offset + size), :] = np.array(img)
        img_array = np.clip(img_array, 0, 254)
        img_array = img_array.astype(dtype=np.uint8)
        # Color image
        img_array = img_array * self.color_map[self.color_indices[color_class]]
        # Add Gaussian noise
        noise = np.reshape(np.random.normal(self.mu, self.sigma, img_array.size), img_array.shape)
        mask = (img_array != 0).astype("uint8")
        img_array = img_array + np.multiply(mask, noise)
        img_array = np.clip(img_array, 0, 255)
        img_array = img_array.astype("uint8")
        img = Image.fromarray(img_array)
        # Perform additional transformations
        if self.transform is not None:
            img = self.transform(img)
        if self.target_transform is not None:
            target = self.target_transform(target)
        return transforms.ToTensor()(img), torch.tensor([target.item(), color_class, loc_class, scale_class])
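The placement arithmetic in `__getitem__` reduces to centering a glyph in a 28-px cell and picking one of nine 32-px boxes in a 3×3 grid. The standalone sketch below is an added illustration, not part of the original file (the `offsets` helper is hypothetical; it uses integer `//` where the original writes `int(x / y)`, equivalent here since the operands are non-negative):

```python
# Hypothetical helper (not in the original file) reproducing the offset
# computation above: local centering within a 28-px cell plus a global
# 3x3 grid offset of 32-px boxes, indexed row-major by loc_idx.
def offsets(loc_idx, size, img_size=28, box_size=32):
    local = (img_size - size) // 2            # center glyph within its cell
    vert = (loc_idx // 3) * box_size + local  # grid row -> vertical pixels
    horiz = (loc_idx % 3) * box_size + local  # grid column -> horizontal pixels
    return vert, horiz

print(offsets(4, 20))  # center box, 20-px glyph -> (36, 36)
print(offsets(0, 28))  # top-left box, full-size glyph -> (0, 0)
```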
| 591.767742 | 9,508 | 0.326643 | 27,116 | 91,724 | 1.098134 | 0.006417 | 0.022433 | 0.004231 | 0.007657 | 0.927461 | 0.350573 | 0.320382 | 0.026094 | 0.022333 | 0.021023 | 0 | 0.376222 | 0.235031 | 91,724 | 154 | 9,509 | 595.61039 | 0.048157 | 0.004666 | 0 | 0.091743 | 0 | 0 | 0.001151 | 0 | 0 | 0 | 0 | 0 | 0.009174 | 1 | 0.018349 | false | 0 | 0.055046 | 0 | 0.12844 | 0.009174 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b7568ca217d1e1cf743150556abb235d178156c5 | 325 | py | Python | start.py | moenova/Human-Memory-Manager | 02b2d8ffd78a3ddcd693f0d991bd4c5d0ef54c80 | [
"MIT"
] | 6 | 2019-09-20T01:15:05.000Z | 2020-05-20T20:14:39.000Z | start.py | moenova/Memory-Helper | 02b2d8ffd78a3ddcd693f0d991bd4c5d0ef54c80 | [
"MIT"
] | null | null | null | start.py | moenova/Memory-Helper | 02b2d8ffd78a3ddcd693f0d991bd4c5d0ef54c80 | [
"MIT"
] | null | null | null | from src.memory import startTable,startNode
#startTable("genki test","res/","eng","hiragana",lesson=13)
#startTable("genki test","res/","eng","hiragana")
#startNode("test","res/","第一章节")
#startTable("genki 2 volcab","res/","eng","hiragana",lesson=13)
startTable("genki 2 kanji","res/","hiragana","kanji",lesson=13)
| 36.111111 | 64 | 0.68 | 42 | 325 | 5.261905 | 0.404762 | 0.271493 | 0.190045 | 0.199095 | 0.502262 | 0.502262 | 0.334842 | 0 | 0 | 0 | 0 | 0.026667 | 0.076923 | 325 | 8 | 65 | 40.625 | 0.71 | 0.612308 | 0 | 0 | 0 | 0 | 0.263158 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
b75f8069a886743e26f1ac9cfdb82b1ea13e45ff | 38 | py | Python | tests/test_project.py | buuren/project | e115a85cd5bf2c6d797ab10a61462837b1adb35e | [
"MIT"
] | null | null | null | tests/test_project.py | buuren/project | e115a85cd5bf2c6d797ab10a61462837b1adb35e | [
"MIT"
] | 1 | 2021-06-01T21:54:28.000Z | 2021-06-01T21:54:28.000Z | tests/test_project.py | buuren/project | e115a85cd5bf2c6d797ab10a61462837b1adb35e | [
"MIT"
] | null | null | null | def test_project():
    assert 0 == 0
| 12.666667 | 19 | 0.605263 | 6 | 38 | 3.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0.263158 | 38 | 2 | 20 | 19 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b782012597a99e32099d7094fab85f68e6d862fe | 33 | py | Python | stko/utilities/__init__.py | SFin94/stko | 7a913c7f0c4b616ddc52fef7eeb44c539176c351 | [
"MIT"
] | 21 | 2018-04-12T16:25:24.000Z | 2022-02-14T23:05:43.000Z | stko/utilities/__init__.py | SFin94/stko | 7a913c7f0c4b616ddc52fef7eeb44c539176c351 | [
"MIT"
] | 60 | 2020-05-22T13:38:54.000Z | 2022-03-25T09:34:22.000Z | stko/utilities/__init__.py | SFin94/stko | 7a913c7f0c4b616ddc52fef7eeb44c539176c351 | [
"MIT"
] | 5 | 2018-08-07T13:00:16.000Z | 2021-11-01T00:55:10.000Z | from .utilities import * # noqa
| 16.5 | 32 | 0.69697 | 4 | 33 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212121 | 33 | 1 | 33 | 33 | 0.884615 | 0.121212 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b7b778adb80cc53c8973743429065b7cf0cd1ca9 | 175 | py | Python | filey/__init__.py | kendfss/filey | 890126f197cee060b3df2bf8660f9401dab629dc | [
"MIT"
] | null | null | null | filey/__init__.py | kendfss/filey | 890126f197cee060b3df2bf8660f9401dab629dc | [
"MIT"
] | null | null | null | filey/__init__.py | kendfss/filey | 890126f197cee060b3df2bf8660f9401dab629dc | [
"MIT"
] | null | null | null | from .handles import *
from .shell import *
from .walking import *
from .persistence import *
from .shortcuts import *
if __name__ == "__main__":
    pass | 12.5 | 27 | 0.634286 | 19 | 175 | 5.421053 | 0.578947 | 0.38835 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.28 | 175 | 14 | 28 | 12.5 | 0.81746 | 0 | 0 | 0 | 0 | 0 | 0.04908 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.142857 | 0.714286 | 0 | 0.714286 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6
4d4417123d0ca0c9a30b103f83b57c04ec4a12fa | 2,613 | py | Python | tests/test_reprs.py | amangoel185/hist | 040872b978ecfd98e9836ad0a7e5c27d8dd44d09 | [
"BSD-3-Clause"
] | 84 | 2020-02-12T02:02:58.000Z | 2022-03-23T10:50:03.000Z | tests/test_reprs.py | amangoel185/hist | 040872b978ecfd98e9836ad0a7e5c27d8dd44d09 | [
"BSD-3-Clause"
] | 213 | 2020-03-09T02:38:25.000Z | 2022-03-16T19:22:31.000Z | tests/test_reprs.py | amangoel185/hist | 040872b978ecfd98e9836ad0a7e5c27d8dd44d09 | [
"BSD-3-Clause"
] | 15 | 2020-03-14T12:05:18.000Z | 2021-11-12T14:25:07.000Z | from __future__ import annotations
from hist import Hist, Stack, axis
def test_1D_empty_repr(named_hist):
h = named_hist.new.Reg(10, -1, 1, name="x", label="y").Double()
html = h._repr_html_()
assert html
assert "name='x'" in repr(h)
assert "label='y'" in repr(h)
def test_1D_var_empty_repr(named_hist):
h = named_hist.new.Var(range(10), name="x", label="y").Double()
html = h._repr_html_()
assert html
assert "name='x'" in repr(h)
assert "label='y'" in repr(h)
def test_1D_int_empty_repr(named_hist):
h = named_hist.new.Int(-9, 9, name="x", label="y").Double()
html = h._repr_html_()
assert html
assert "name='x'" in repr(h)
assert "label='y'" in repr(h)
def test_1D_intcat_empty_repr(named_hist):
h = named_hist.new.IntCat([1, 3, 5], name="x", label="y").Double()
html = h._repr_html_()
assert html
assert "name='x'" in repr(h)
assert "label='y'" in repr(h)
def test_1D_strcat_empty_repr(named_hist):
h = named_hist.new.StrCat(["1", "3", "5"], name="x", label="y").Double()
html = h._repr_html_()
assert html
assert "name='x'" in repr(h)
assert "label='y'" in repr(h)
def test_2D_empty_repr(named_hist):
h = (
named_hist.new.Reg(10, -1, 1, name="x", label="y")
.Int(0, 15, name="p", label="q")
.Double()
)
html = h._repr_html_()
assert html
assert "name='x'" in repr(h)
assert "name='p'" in repr(h)
assert "label='y'" in repr(h)
assert "label='q'" in repr(h)
def test_1D_circ_empty_repr(named_hist):
h = named_hist.new.Reg(10, -1, 1, circular=True, name="R", label="r").Double()
html = h._repr_html_()
assert html
assert "name='R'" in repr(h)
assert "label='r'" in repr(h)
def test_ND_empty_repr(named_hist):
h = (
named_hist.new.Reg(10, -1, 1, name="x", label="y")
.Reg(12, -3, 3, name="p", label="q")
.Reg(15, -2, 4, name="a", label="b")
.Double()
)
html = h._repr_html_()
assert html
assert "name='x'" in repr(h)
assert "name='p'" in repr(h)
assert "name='a'" in repr(h)
assert "label='y'" in repr(h)
assert "label='q'" in repr(h)
assert "label='b'" in repr(h)
def test_stack_repr(named_hist):
a1 = axis.Regular(
50, -5, 5, name="A", label="a [unit]", underflow=False, overflow=False
)
a2 = axis.Regular(
50, -5, 5, name="A", label="a [unit]", underflow=False, overflow=False
)
assert "name='A'" in repr(Stack(Hist(a1), Hist(a2)))
assert "label='a [unit]'" in repr(Stack(Hist(a1), Hist(a2)))

# blog/admin.py (shivansh1698/Blog_yourself, BSD-3-Clause)
from django.contrib import admin
from . import models
admin.site.register(models.Post)
admin.site.register(models.Comment)

# app/moment/__init__.py (MultyXu/Islandr, MIT)
from flask import Blueprint
moment = Blueprint('moment', __name__)
from . import views

# __init__.py (keenJMS/JMS_OIM, Apache-2.0)
from modeling import *

# tests/components/intellifire/test_config_flow.py (MrDelik/core, Apache-2.0)
"""Test the IntelliFire config flow."""
from unittest.mock import AsyncMock, MagicMock, patch
from homeassistant import config_entries
from homeassistant.components import dhcp
from homeassistant.components.intellifire.config_flow import MANUAL_ENTRY_STRING
from homeassistant.components.intellifire.const import DOMAIN
from homeassistant.const import CONF_HOST
from homeassistant.core import HomeAssistant
from homeassistant.data_entry_flow import RESULT_TYPE_CREATE_ENTRY, RESULT_TYPE_FORM
from tests.common import MockConfigEntry
async def test_no_discovery(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_intellifire_config_flow: MagicMock,
) -> None:
"""Test we should get the manual discovery form - because no discovered fireplaces."""
with patch(
"homeassistant.components.intellifire.config_flow.AsyncUDPFireplaceFinder.search_fireplace",
return_value=[],
):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == RESULT_TYPE_FORM
assert result["errors"] == {}
assert result["step_id"] == "manual_device_entry"
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"],
{
CONF_HOST: "1.1.1.1",
},
)
await hass.async_block_till_done()
assert result2["type"] == RESULT_TYPE_CREATE_ENTRY
assert result2["title"] == "Fireplace 12345"
assert result2["data"] == {CONF_HOST: "1.1.1.1"}
assert len(mock_setup_entry.mock_calls) == 1
async def test_single_discovery(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_intellifire_config_flow: MagicMock,
) -> None:
"""Test single fireplace UDP discovery."""
with patch(
"homeassistant.components.intellifire.config_flow.AsyncUDPFireplaceFinder.search_fireplace",
return_value=["192.168.1.69"],
):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], {CONF_HOST: "192.168.1.69"}
)
await hass.async_block_till_done()
print("Result:", result)
assert result2["type"] == RESULT_TYPE_CREATE_ENTRY
assert result2["title"] == "Fireplace 12345"
assert result2["data"] == {CONF_HOST: "192.168.1.69"}
assert len(mock_setup_entry.mock_calls) == 1
async def test_manual_entry(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_intellifire_config_flow: MagicMock,
) -> None:
"""Test for multiple firepalce discovery - involing a pick_device step."""
with patch(
"homeassistant.components.intellifire.config_flow.AsyncUDPFireplaceFinder.search_fireplace",
return_value=["192.168.1.69", "192.168.1.33", "192.168.169"],
):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["step_id"] == "pick_device"
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input={CONF_HOST: MANUAL_ENTRY_STRING}
)
await hass.async_block_till_done()
assert result2["step_id"] == "manual_device_entry"
async def test_multi_discovery(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_intellifire_config_flow: MagicMock,
) -> None:
"""Test for multiple fireplace discovery - involving a pick_device step."""
with patch(
"homeassistant.components.intellifire.config_flow.AsyncUDPFireplaceFinder.search_fireplace",
return_value=["192.168.1.69", "192.168.1.33", "192.168.169"],
):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["step_id"] == "pick_device"
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input={CONF_HOST: "192.168.1.33"}
)
await hass.async_block_till_done()
assert result["step_id"] == "pick_device"
assert result2["type"] == RESULT_TYPE_CREATE_ENTRY
async def test_multi_discovery_cannot_connect(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_intellifire_config_flow: MagicMock,
) -> None:
"""Test for multiple fireplace discovery - involving a pick_device step."""
with patch(
"homeassistant.components.intellifire.config_flow.AsyncUDPFireplaceFinder.search_fireplace",
return_value=["192.168.1.69", "192.168.1.33", "192.168.169"],
):
mock_intellifire_config_flow.poll.side_effect = ConnectionError
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == RESULT_TYPE_FORM
assert result["step_id"] == "pick_device"
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input={CONF_HOST: "192.168.1.33"}
)
await hass.async_block_till_done()
assert result2["type"] == RESULT_TYPE_FORM
assert result2["errors"] == {"base": "cannot_connect"}
async def test_form_cannot_connect_manual_entry(
hass: HomeAssistant,
mock_intellifire_config_flow: MagicMock,
mock_fireplace_finder_single: AsyncMock,
) -> None:
"""Test we handle cannot connect error."""
mock_intellifire_config_flow.poll.side_effect = ConnectionError
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
assert result["type"] == RESULT_TYPE_FORM
assert result["step_id"] == "manual_device_entry"
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"],
{
CONF_HOST: "1.1.1.1",
},
)
assert result2["type"] == RESULT_TYPE_FORM
assert result2["errors"] == {"base": "cannot_connect"}
async def test_picker_already_discovered(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_intellifire_config_flow: MagicMock,
) -> None:
"""Test single fireplace UDP discovery."""
entry = MockConfigEntry(
domain=DOMAIN,
data={
"host": "192.168.1.3",
},
title="Fireplace",
unique_id=44444,
)
entry.add_to_hass(hass)
with patch(
"homeassistant.components.intellifire.config_flow.AsyncUDPFireplaceFinder.search_fireplace",
return_value=["192.168.1.3"],
):
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)
await hass.async_block_till_done()
result2 = await hass.config_entries.flow.async_configure(
result["flow_id"],
{
CONF_HOST: "192.168.1.4",
},
)
assert result2["type"] == RESULT_TYPE_CREATE_ENTRY
assert result2["title"] == "Fireplace 12345"
assert result2["data"] == {CONF_HOST: "192.168.1.4"}
assert len(mock_setup_entry.mock_calls) == 2
async def test_dhcp_discovery_intellifire_device(
hass: HomeAssistant,
mock_setup_entry: AsyncMock,
mock_intellifire_config_flow: MagicMock,
) -> None:
"""Test successful DHCP Discovery."""
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": config_entries.SOURCE_DHCP},
data=dhcp.DhcpServiceInfo(
ip="1.1.1.1",
macaddress="AA:BB:CC:DD:EE:FF",
hostname="zentrios-Test",
),
)
assert result["type"] == RESULT_TYPE_FORM
assert result["step_id"] == "dhcp_confirm"
result2 = await hass.config_entries.flow.async_configure(result["flow_id"])
assert result2["type"] == RESULT_TYPE_FORM
assert result2["step_id"] == "dhcp_confirm"
result3 = await hass.config_entries.flow.async_configure(
result2["flow_id"], user_input={}
)
assert result3["title"] == "Fireplace 12345"
assert result3["data"] == {"host": "1.1.1.1"}
async def test_dhcp_discovery_non_intellifire_device(
hass: HomeAssistant,
mock_intellifire_config_flow: MagicMock,
mock_setup_entry: AsyncMock,
) -> None:
"""Test failed DHCP Discovery."""
mock_intellifire_config_flow.poll.side_effect = ConnectionError
result = await hass.config_entries.flow.async_init(
DOMAIN,
context={"source": config_entries.SOURCE_DHCP},
data=dhcp.DhcpServiceInfo(
ip="1.1.1.1",
macaddress="AA:BB:CC:DD:EE:FF",
hostname="zentrios-Evil",
),
)
assert result["type"] == "abort"
assert result["reason"] == "not_intellifire_device"

# public_interface/forms.py (danmcelroy/VoSeq, BSD-3-Clause)
import datetime
from functools import partial
from typing import Dict, Tuple
from django import forms
from haystack.forms import ModelSearchForm
from haystack.query import SearchQuerySet
from Bio.Alphabet import IUPAC
from public_interface.models import Genes, Vouchers, Sequences
DateInput = partial(forms.DateInput, {'class': 'datepicker form-control',
'placeholder': 'Type or pick a date'})
class SequencesAdminForm(forms.ModelForm):
def clean_sequences(self) -> str:
valid_letters = set(IUPAC.ambiguous_dna.letters.upper() + 'N?-')
sequence = str(self.cleaned_data['sequences'])
for nucleotide in sequence:
if nucleotide == ' ':
self.add_error('sequences', "White spaces are not valid. "
"Cannot save this sequence")
elif not valid_letters.issuperset(nucleotide.upper()):
self.add_error('sequences',
"The character {} is not valid. "
"Cannot save this sequence".format(nucleotide))
return sequence
class AdvancedSearchForm(ModelSearchForm):
NULL = 'Select'
MALE = 'male'
FEMALE = 'female'
LARVA = 'larva'
WORKER = 'worker'
QUEEN = 'queen'
UNKNOWN = 'unknown'
SEX_CHOICES = (
(NULL, 'Select'),
(MALE, 'male'),
(FEMALE, 'female'),
(LARVA, 'larva'),
(WORKER, 'worker'),
(QUEEN, 'queen'),
(UNKNOWN, 'unknown'),
)
NULL = 'Select'
DONT_KNOW = 'unknown'
YES = 'yes'
NO = 'not'
TYPE_SPECIES_CHOICES = (
(NULL, 'Select'),
(DONT_KNOW, 'unknown'),
(YES, 'yes'),
(NO, 'not'),
)
NULL = 'Select'
SPREAD = 'spread'
ENVELOPE = 'in envelope'
PHOTO = 'only photo'
NONE = 'no voucher'
DESTROYED = 'destroyed'
LOST = 'lost'
VOUCHER_CHOICES = (
(NULL, 'Select'),
(SPREAD, 'spread'),
(ENVELOPE, 'in envelope'),
(PHOTO, 'only photo'),
(NONE, 'no voucher'),
(DESTROYED, 'destroyed'),
(LOST, 'lost'),
(UNKNOWN, 'unknown'),
)
code = forms.CharField(label="Code in voseq", max_length=100, required=False)
orden = forms.CharField(label="Order", max_length=100, required=False)
superfamily = forms.CharField(label="Superfamily", max_length=100, required=False)
family = forms.CharField(label="Family", max_length=100, required=False)
subfamily = forms.CharField(label="Subfamily", max_length=100, required=False)
tribe = forms.CharField(label="Tribe", max_length=100, required=False)
subtribe = forms.CharField(label="Subtribe", max_length=100, required=False)
genus = forms.CharField(label="Genus", max_length=100, required=False)
species = forms.CharField(label="Species", max_length=100, required=False)
subspecies = forms.CharField(label="Subspecies", max_length=100, required=False)
country = forms.CharField(label="Country", max_length=100, required=False)
specific_locality = forms.CharField(
label="Specific Locality",
max_length=250,
required=False)
type_species = forms.ChoiceField(
label="Type species",
choices=TYPE_SPECIES_CHOICES,
widget=forms.Select(attrs={'class': 'form-control'}),
required=False)
latitude = forms.FloatField(label="Latitude", required=False)
longitude = forms.FloatField(label="Longitude", required=False)
max_altitude = forms.IntegerField(label="Maximum altitude", required=False)
min_altitude = forms.IntegerField(label="Minimum altitude", required=False)
collector = forms.CharField(label="Collector", max_length=100, required=False)
date_collection = forms.DateField(
label="Date of collection start",
required=False,
widget=DateInput(),
error_messages={'invalid': 'Enter valid date: YYYY-mm-dd'},
)
date_collection_end = forms.DateField(
label="Date of collection end",
required=False,
widget=DateInput(),
error_messages={'invalid': 'Enter valid date: YYYY-mm-dd'},
)
extraction = forms.CharField(
label="Extraction",
max_length=50,
help_text="Number of extraction event.",
required=False)
extraction_tube = forms.CharField(
label="Extraction tube",
max_length=50,
help_text="Tube containing DNA extract.",
required=False)
date_extraction = forms.DateField(
label="Date of extraction",
required=False,
widget=DateInput(),
error_messages={'invalid': 'Enter valid date: YYYY-mm-dd'})
extractor = forms.CharField(label="Extractor", max_length=100, required=False)
voucher_locality = forms.CharField(
label="Voucher locality",
max_length=200,
required=False)
published_in = forms.CharField(label="Published in", required=False)
notes = forms.CharField(label="Notes", required=False)
latest_editor = forms.CharField(label="Latest editor", required=False)
hostorg = forms.CharField(
label="Host organism",
max_length=200,
help_text="Hostplant or other host.",
required=False)
sex = forms.ChoiceField(label="Sex", choices=SEX_CHOICES, required=False,
widget=forms.Select(attrs={'class': 'form-control'}))
voucher = forms.ChoiceField(
label="Voucher",
choices=VOUCHER_CHOICES,
required=False,
widget=forms.Select(attrs={'class': 'form-control'}))
voucher_code = forms.CharField(
label="Alternative voucher code",
max_length=100,
help_text="Original code of voucher specimen.",
required=False)
code_bold = forms.CharField(
label="Code in BOLD database",
max_length=100,
help_text="Optional code for specimens kept in the BOLD database.",
required=False)
determined_by = forms.CharField(
label="Determined by",
max_length=100,
help_text="Person that identified the taxon for this specimen.",
required=False)
author = forms.CharField(
label="Author",
max_length=100,
help_text="Person that described this taxon.",
required=False)
# Sequences model
YES = 'y'
NO = 'n'
GENBANK_CHOICES = (
(YES, 'Yes'),
(NO, 'No'),
)
gene_code = forms.ModelChoiceField(
Genes.objects.all().order_by('gene_code'),
required=False,
widget=forms.Select(attrs={'class': 'form-control'}),
empty_label='Select',
)
genbank = forms.ChoiceField(
widget=forms.RadioSelect,
choices=GENBANK_CHOICES,
required=False)
accession = forms.CharField(max_length=100, required=False)
lab_person = forms.CharField(max_length=100, required=False)
def no_query_found(self):
sqs = SearchQuerySet.none
return sqs
def search(self):
keywords, sequence_keywords = self.clean_search_keywords()
sqs = ''
if keywords and not sequence_keywords:
sqs = Vouchers.objects.filter(**keywords).distinct("code")
elif sequence_keywords and not keywords:
sqs = Sequences.objects.filter(**sequence_keywords).distinct("code")
elif sequence_keywords and keywords:
sqs = Vouchers.objects.filter(**keywords).distinct("code")
if sqs:
voucher_list = sqs
sqs = Sequences.objects.filter(**sequence_keywords).filter(
code__in=voucher_list).distinct("code")
else:
self.no_query_found()
if not sqs:
self.no_query_found()
return sqs
def clean_search_keywords(self) -> Tuple[Dict[str, str], Dict[str, str]]:
keywords = {}
sequence_keywords = {}
for key, value in self.cleaned_data.items():
if value in ['', None, 'Select', 'models'] or key == "models":
continue
if key in ['date_collection', 'date_collection_end', 'date_extraction']:
value = datetime.date.strftime(value, "%Y-%m-%d")
if key in ['lab_person', 'accession']:
new_key = "{}__icontains".format(key)
sequence_keywords[new_key] = value
if key == 'gene_code':
new_key = "gene__gene_code"
sequence_keywords[new_key] = value.gene_code
if key == 'genbank' and value == 'y':
sequence_keywords[key] = True
elif key == 'genbank':
sequence_keywords[key] = False
if key not in ['lab_person', 'accession', 'genbank', 'gene_code']:
new_key = "{}__icontains".format(key)
keywords[new_key] = value
return keywords, sequence_keywords
# The following form is for the admin site batch_changes action
# It would be nice not to repeat the lines from the previous
# model, but...
class BatchChangesForm(forms.Form):
NULL = None
MALE = 'male'
FEMALE = 'female'
LARVA = 'larva'
WORKER = 'worker'
QUEEN = 'queen'
SEX_CHOICES = (
(NULL, 'Select'),
(MALE, 'male'),
(FEMALE, 'female'),
(LARVA, 'larva'),
(WORKER, 'worker'),
(QUEEN, 'queen'),
)
NULL = None
DONT_KNOW = 'unknown'
YES = 'yes'
NO = 'not'
TYPE_SPECIES_CHOICES = (
(NULL, 'Select'),
(DONT_KNOW, 'unknown'),
(YES, 'yes'),
(NO, 'not'),
)
NULL = None
SPREAD = 'spread'
ENVELOPE = 'in envelope'
PHOTO = 'only photo'
NONE = 'no voucher'
DESTROYED = 'destroyed'
LOST = 'lost'
VOUCHER_CHOICES = (
(NULL, 'Select'),
(SPREAD, 'spread'),
(ENVELOPE, 'in envelope'),
(PHOTO, 'only photo'),
(NONE, 'no voucher'),
(DESTROYED, 'destroyed'),
(LOST, 'lost'),
)
code = forms.CharField(label="Code in voseq", max_length=100, required=False)
orden = forms.CharField(label="Order", max_length=100, required=False)
superfamily = forms.CharField(label="Superfamily", max_length=100, required=False)
family = forms.CharField(label="Family", max_length=100, required=False)
subfamily = forms.CharField(label="Subfamily", max_length=100, required=False)
tribe = forms.CharField(label="Tribe", max_length=100, required=False)
subtribe = forms.CharField(label="Subtribe", max_length=100, required=False)
genus = forms.CharField(label="Genus", max_length=100, required=False)
species = forms.CharField(label="Species", max_length=100, required=False)
subspecies = forms.CharField(label="Subspecies", max_length=100, required=False)
country = forms.CharField(label="Country", max_length=100, required=False)
specific_locality = forms.CharField(
label="Specific Locality",
max_length=250,
required=False)
type_species = forms.ChoiceField(
label="Type species",
choices=TYPE_SPECIES_CHOICES,
widget=forms.Select,
required=False)
latitude = forms.FloatField(label="Latitude", required=False)
longitude = forms.FloatField(label="Longitude", required=False)
max_altitude = forms.IntegerField(label="Maximum altitude", required=False)
min_altitude = forms.IntegerField(label="Minimum altitude", required=False)
collector = forms.CharField(label="Collector", max_length=100, required=False)
date_collection = forms.DateField(label="Date of collection start", required=False)
date_collection_end = forms.DateField(label="Date of collection end", required=False)
extraction = forms.CharField(
label="Extraction",
max_length=50,
help_text="Number of extraction event.",
required=False)
extraction_tube = forms.CharField(
label="Extraction tube",
max_length=50,
help_text="Tube containing DNA extract.",
required=False)
date_extraction = forms.DateField(label="Date extraction", required=False)
extractor = forms.CharField(label="Extractor", max_length=100, required=False)
voucher_locality = forms.CharField(
label="Voucher locality",
max_length=200,
required=False)
published_in = forms.CharField(label="Published in", required=False)
notes = forms.CharField(label="Notes", required=False)
latest_editor = forms.CharField(label="Latest editor", required=False)
hostorg = forms.CharField(
label="Host organism",
max_length=200,
help_text="Hostplant or other host.",
required=False)
sex = forms.ChoiceField(
label="Sex",
choices=SEX_CHOICES,
required=False)
voucher = forms.ChoiceField(
label="Voucher",
choices=VOUCHER_CHOICES,
required=False)
voucher_code = forms.CharField(
label="Alternative voucher code",
max_length=100,
help_text="Original code of voucher specimen.",
required=False)
code_bold = forms.CharField(
label="Code in BOLD database",
max_length=100,
help_text="Optional code for specimens kept in the BOLD database.",
required=False)
determined_by = forms.CharField(
label="Determined by",
max_length=100,
help_text="Person that identified the taxon for this specimen.",
required=False)
author = forms.CharField(
label="Author",
max_length=100,
help_text="Person that described this taxon.",
required=False)

# tools/upgrade/codemods.py (tahmidbintaslim/pyre-check, MIT)
# Copyright (c) 2016-present, Facebook, Inc.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
import logging
import pathlib
import re
from logging import Logger
from .commands.command import Command
from .errors import Errors
LOG: Logger = logging.getLogger(__name__)
class MissingOverrideReturnAnnotations(Command):
def run(self) -> None:
errors = Errors.from_stdin(self._arguments.only_fix_error_code)
for path, errors in errors:
LOG.info("Patching errors in `%s`.", path)
errors = sorted(errors, key=lambda error: error["line"], reverse=True)
# pyre-fixme[6]: Expected `Union[_PathLike[str], str]` for 1st param but got
# `Union[typing.Iterator[typing.Dict[str, typing.Any]], str]`.
path = pathlib.Path(path)
lines = path.read_text().split("\n")
for error in errors:
# pyre-fixme[6]: Expected `Union[int, slice]` for 1st param but got
# `str`.
if error["code"] != 15:
continue
# pyre-fixme[6]: Expected `int` for 1st param but got `str`.
# pyre-fixme[6]: Expected `Union[int, slice]` for 1st param but got
# `str`.
line = error["line"] - 1
# pyre-fixme[6]: Expected `Union[int, slice]` for 1st param but got
# `str`.
match = re.match(r".*`(.*)`\.", error["description"])
if not match:
continue
annotation = match.groups()[0]
# Find last closing parenthesis in after line.
LOG.info("Looking at %d: %s", line, lines[line])
while True:
if "):" in lines[line]:
lines[line] = lines[line].replace("):", ") -> %s:" % annotation)
LOG.info("%d: %s", line, lines[line])
break
else:
line = line + 1
LOG.warn("Writing patched %s", str(path))
path.write_text("\n".join(lines))
class MissingGlobalAnnotations(Command):
def run(self) -> None:
errors = Errors.from_stdin(self._arguments.only_fix_error_code)
for path, errors in errors:
LOG.info("Patching errors in `%s`", path)
errors = sorted(errors, key=lambda error: error["line"], reverse=True)
# pyre-fixme[6]: Expected `Union[_PathLike[str], str]` for 1st param but got
# `Union[typing.Iterator[typing.Dict[str, typing.Any]], str]`.
path = pathlib.Path(path)
lines = path.read_text().split("\n")
for error in errors:
# pyre-fixme[6]: Expected `Union[int, slice]` for 1st param but got
# `str`.
if error["code"] != 5:
continue
# pyre-fixme[6]: Expected `int` for 1st param but got `str`.
# pyre-fixme[6]: Expected `Union[int, slice]` for 1st param but got
# `str`.
line = error["line"] - 1
# pyre-fixme[6]: Expected `Union[int, slice]` for 1st param but got
# `str`.
match = re.match(r".*`.*`.*`(.*)`.*", error["description"])
if not match:
continue
annotation = match.groups()[0]
LOG.info("Looking at %d: %s", line, lines[line])
if " =" in lines[line]:
lines[line] = lines[line].replace(" =", ": %s =" % annotation)
LOG.info("%d: %s", line, lines[line])
path.write_text("\n".join(lines))

# fem/gui/vtk_widget/vtk_graphics/picking/__init__.py (mjredmond/FEMApp, MIT)
from __future__ import print_function, absolute_import
from .box_picker import BoxPicker
from .picking_manager import PickingManager
from .poly_picker import PolyPicker
from .single_picker import SinglePicker

# wavepy3/__init__.py (alexrkaufman/WavePy3, Apache-2.0)
from .atmos import Atmos
from .constraint_analysis import constraint_analysis
from .prop import split_step
from . import analytic
from . import sources
4227ef29c97efee894395460ce5d433594bc2e47 | 57 | py | Python | glosysnet/__init__.py | NareshAtnPLUS/glosysnet | e85df44727b8784766be7e728267e5699997e226 | [
"MIT"
] | null | null | null | glosysnet/__init__.py | NareshAtnPLUS/glosysnet | e85df44727b8784766be7e728267e5699997e226 | [
"MIT"
] | null | null | null | glosysnet/__init__.py | NareshAtnPLUS/glosysnet | e85df44727b8784766be7e728267e5699997e226 | [
"MIT"
] | null | null | null | from glosysnet.nn import *
from glosysnet.vision import * | 28.5 | 30 | 0.807018 | 8 | 57 | 5.75 | 0.625 | 0.565217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122807 | 57 | 2 | 30 | 28.5 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4251d7d35a8d2025ac9f3b8bb1cf8cbe119a4e70 | 30 | py | Python | misc/import_this.py | fcracker79/python_misc | be661a09a806010e440015b8caeb5e06773bb533 | [
"MIT"
] | null | null | null | misc/import_this.py | fcracker79/python_misc | be661a09a806010e440015b8caeb5e06773bb533 | [
"MIT"
] | null | null | null | misc/import_this.py | fcracker79/python_misc | be661a09a806010e440015b8caeb5e06773bb533 | [
"MIT"
] | null | null | null | import accrocchio
import this
| 10 | 17 | 0.866667 | 4 | 30 | 6.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 2 | 18 | 15 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
425bc0fabcf3d6d276612f3489768f96d21c4598 | 23,001 | py | Python | tests/unit/test_spatial_methods/test_spectral_volume.py | katiezzzzz/PyBaMM | 0b0fc47125c0f078da99a58f497e0700eb25225a | [
"BSD-3-Clause"
] | 1 | 2021-03-06T15:10:34.000Z | 2021-03-06T15:10:34.000Z | tests/unit/test_spatial_methods/test_spectral_volume.py | katiezzzzz/PyBaMM | 0b0fc47125c0f078da99a58f497e0700eb25225a | [
"BSD-3-Clause"
] | null | null | null | tests/unit/test_spatial_methods/test_spectral_volume.py | katiezzzzz/PyBaMM | 0b0fc47125c0f078da99a58f497e0700eb25225a | [
"BSD-3-Clause"
] | null | null | null | #
# Test for the operator class
#
import pybamm
import numpy as np
import unittest
def get_mesh_for_testing(
xpts=None, rpts=10, ypts=15, zpts=15, geometry=None, cc_submesh=None,
order=2
):
param = pybamm.ParameterValues(
values={
"Electrode width [m]": 0.4,
"Electrode height [m]": 0.5,
"Negative tab width [m]": 0.1,
"Negative tab centre y-coordinate [m]": 0.1,
"Negative tab centre z-coordinate [m]": 0.0,
"Positive tab width [m]": 0.1,
"Positive tab centre y-coordinate [m]": 0.3,
"Positive tab centre z-coordinate [m]": 0.5,
"Negative electrode thickness [m]": 0.3,
"Separator thickness [m]": 0.3,
"Positive electrode thickness [m]": 0.3,
}
)
if geometry is None:
geometry = pybamm.battery_geometry()
param.process_geometry(geometry)
submesh_types = {
"negative electrode": pybamm.MeshGenerator(
pybamm.SpectralVolume1DSubMesh,
{"order": order}
),
"separator": pybamm.MeshGenerator(pybamm.SpectralVolume1DSubMesh,
{"order": order}),
"positive electrode": pybamm.MeshGenerator(
pybamm.SpectralVolume1DSubMesh,
{"order": order}
),
"negative particle": pybamm.MeshGenerator(
pybamm.SpectralVolume1DSubMesh,
{"order": order}
),
"positive particle": pybamm.MeshGenerator(
pybamm.SpectralVolume1DSubMesh,
{"order": order}
),
"current collector": pybamm.MeshGenerator(pybamm.SubMesh0D),
}
if cc_submesh:
submesh_types["current collector"] = cc_submesh
if xpts is None:
xn_pts, xs_pts, xp_pts = 40, 25, 35
else:
xn_pts, xs_pts, xp_pts = xpts, xpts, xpts
var = pybamm.standard_spatial_vars
var_pts = {
var.x_n: xn_pts,
var.x_s: xs_pts,
var.x_p: xp_pts,
var.r_n: rpts,
var.r_p: rpts,
var.y: ypts,
var.z: zpts,
}
return pybamm.Mesh(geometry, submesh_types, var_pts)
def get_p2d_mesh_for_testing(xpts=None, rpts=10):
geometry = pybamm.battery_geometry()
return get_mesh_for_testing(xpts=xpts, rpts=rpts, geometry=geometry)
def get_1p1d_mesh_for_testing(
xpts=None,
rpts=10,
zpts=15,
cc_submesh=pybamm.MeshGenerator(pybamm.Uniform1DSubMesh),
):
geometry = pybamm.battery_geometry(current_collector_dimension=1)
return get_mesh_for_testing(
xpts=xpts, rpts=rpts, zpts=zpts, geometry=geometry, cc_submesh=cc_submesh
)
class TestSpectralVolume(unittest.TestCase):
def test_exceptions(self):
sp_meth = pybamm.SpectralVolume()
with self.assertRaises(ValueError):
sp_meth.chebyshev_differentiation_matrices(3, 3)
mesh = get_mesh_for_testing()
spatial_methods = {"macroscale": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
whole_cell = ["negative electrode", "separator", "positive electrode"]
var = pybamm.Variable("var", domain=whole_cell)
disc.set_variable_slices([var])
discretised_symbol = pybamm.StateVector(*disc.y_slices[var.id])
sp_meth.build(mesh)
bcs = {"left": (pybamm.Scalar(0), "x"), "right": (pybamm.Scalar(3), "Neumann")}
with self.assertRaisesRegex(ValueError, "boundary condition must be"):
sp_meth.replace_dirichlet_values(var, discretised_symbol, bcs)
with self.assertRaisesRegex(ValueError, "boundary condition must be"):
sp_meth.replace_neumann_values(var, discretised_symbol, bcs)
bcs = {"left": (pybamm.Scalar(0), "Neumann"), "right": (pybamm.Scalar(3), "x")}
with self.assertRaisesRegex(ValueError, "boundary condition must be"):
sp_meth.replace_dirichlet_values(var, discretised_symbol, bcs)
with self.assertRaisesRegex(ValueError, "boundary condition must be"):
sp_meth.replace_neumann_values(var, discretised_symbol, bcs)
def test_grad_div_shapes_Dirichlet_bcs(self):
"""
Test grad and div with Dirichlet boundary conditions (applied by grad on var)
and also test the case where only one Spectral Volume is discretised
"""
whole_cell = ["negative electrode", "separator", "positive electrode"]
# create discretisation
mesh = get_mesh_for_testing(1)
spatial_methods = {"macroscale": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
combined_submesh = mesh.combine_submeshes(*whole_cell)
# grad
var = pybamm.Variable("var", domain=whole_cell)
grad_eqn = pybamm.grad(var)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(1), "Dirichlet"),
"right": (pybamm.Scalar(1), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(grad_eqn)
constant_y = np.ones_like(combined_submesh.nodes[:, np.newaxis])
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, constant_y),
np.zeros_like(combined_submesh.edges[:, np.newaxis]),
)
# div: test on linear y (should have laplacian zero) so change bcs
linear_y = combined_submesh.nodes
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(0), "Dirichlet"),
"right": (pybamm.Scalar(1), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
grad_eqn_disc = disc.process_symbol(grad_eqn)
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, linear_y),
np.ones_like(combined_submesh.edges[:, np.newaxis]),
)
div_eqn_disc = disc.process_symbol(div_eqn)
np.testing.assert_array_almost_equal(
div_eqn_disc.evaluate(None, linear_y),
np.zeros_like(combined_submesh.nodes[:, np.newaxis]),
)
def test_grad_1plus1d(self):
mesh = get_1p1d_mesh_for_testing()
spatial_methods = {"macroscale": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
a = pybamm.Variable(
"a",
domain=["negative electrode"],
auxiliary_domains={"secondary": "current collector"},
)
b = pybamm.Variable(
"b",
domain=["separator"],
auxiliary_domains={"secondary": "current collector"},
)
c = pybamm.Variable(
"c",
domain=["positive electrode"],
auxiliary_domains={"secondary": "current collector"},
)
var = pybamm.Concatenation(a, b, c)
boundary_conditions = {
var.id: {
"left": (pybamm.Vector(np.linspace(0, 1, 15)), "Neumann"),
"right": (pybamm.Vector(np.linspace(0, 1, 15)), "Neumann"),
}
}
disc.bcs = boundary_conditions
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(pybamm.grad(var))
        # Evaluate
combined_submesh = mesh.combine_submeshes(*var.domain)
linear_y = np.outer(np.linspace(0, 1, 15), combined_submesh.nodes).reshape(
-1, 1
)
expected = np.outer(
np.linspace(0, 1, 15), np.ones_like(combined_submesh.edges)
).reshape(-1, 1)
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, linear_y), expected
)
def test_spherical_grad_div_shapes_Dirichlet_bcs(self):
"""
Test grad and div with Dirichlet boundary conditions (applied by grad on var)
"""
# create discretisation
mesh = get_1p1d_mesh_for_testing()
spatial_methods = {"negative particle": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
submesh = mesh["negative particle"]
# grad
# grad(r) == 1
var = pybamm.Variable(
"var",
domain=["negative particle"],
auxiliary_domains={
"secondary": "negative electrode",
"tertiary": "current collector",
},
)
grad_eqn = pybamm.grad(var)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(1), "Dirichlet"),
"right": (pybamm.Scalar(1), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(grad_eqn)
total_npts = (
submesh.npts
* mesh["negative electrode"].npts
* mesh["current collector"].npts
)
total_npts_edges = (
(submesh.npts + 1)
* mesh["negative electrode"].npts
* mesh["current collector"].npts
)
constant_y = np.ones((total_npts, 1))
np.testing.assert_array_equal(
grad_eqn_disc.evaluate(None, constant_y), np.zeros((total_npts_edges, 1))
)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(0), "Dirichlet"),
"right": (pybamm.Scalar(1), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
y_linear = np.tile(
submesh.nodes,
mesh["negative electrode"].npts * mesh["current collector"].npts,
)
grad_eqn_disc = disc.process_symbol(grad_eqn)
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, y_linear), np.ones((total_npts_edges, 1))
)
# div: test on linear r^2
# div (grad r^2) = 6
const = 6 * np.ones((total_npts, 1))
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(6), "Dirichlet"),
"right": (pybamm.Scalar(6), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
div_eqn_disc = disc.process_symbol(div_eqn)
np.testing.assert_array_almost_equal(
div_eqn_disc.evaluate(None, const),
np.zeros(
(
submesh.npts
* mesh["negative electrode"].npts
* mesh["current collector"].npts,
1,
)
),
)
def test_p2d_spherical_grad_div_shapes_Dirichlet_bcs(self):
"""
Test grad and div with Dirichlet boundary conditions (applied by grad on var)
in the pseudo 2-dimensional case
"""
mesh = get_p2d_mesh_for_testing()
spatial_methods = {
"macroscale": pybamm.SpectralVolume(),
"negative particle": pybamm.SpectralVolume(),
"positive particle": pybamm.SpectralVolume(),
}
disc = pybamm.Discretisation(mesh, spatial_methods)
n_mesh = mesh["negative particle"]
mesh.add_ghost_meshes()
disc.mesh.add_ghost_meshes()
var = pybamm.Variable(
"var",
domain=["negative particle"],
auxiliary_domains={"secondary": "negative electrode"},
)
grad_eqn = pybamm.grad(var)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(1), "Dirichlet"),
"right": (pybamm.Scalar(1), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(grad_eqn)
prim_pts = n_mesh.npts
sec_pts = mesh["negative electrode"].npts
constant_y = np.kron(np.ones(sec_pts), np.ones(prim_pts))
grad_eval = grad_eqn_disc.evaluate(None, constant_y)
grad_eval = np.reshape(grad_eval, [sec_pts, prim_pts + 1])
np.testing.assert_array_equal(grad_eval, np.zeros([sec_pts, prim_pts + 1]))
# div
# div (grad r^2) = 6, N_left = N_right = 0
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
bc_var = disc.process_symbol(
pybamm.SpatialVariable("x_n", domain="negative electrode")
)
boundary_conditions = {
var.id: {"left": (bc_var, "Neumann"), "right": (bc_var, "Neumann")}
}
disc.bcs = boundary_conditions
div_eqn_disc = disc.process_symbol(div_eqn)
const = 6 * np.ones(sec_pts * prim_pts)
div_eval = div_eqn_disc.evaluate(None, const)
div_eval = np.reshape(div_eval, [sec_pts, prim_pts])
np.testing.assert_array_almost_equal(
div_eval[:, :-1], np.zeros([sec_pts, prim_pts - 1])
)
def test_grad_div_shapes_Neumann_bcs(self):
"""Test grad and div with Neumann boundary conditions (applied by div on N)"""
whole_cell = ["negative electrode", "separator", "positive electrode"]
# create discretisation
mesh = get_mesh_for_testing()
spatial_methods = {"macroscale": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
combined_submesh = mesh.combine_submeshes(*whole_cell)
# grad
var = pybamm.Variable("var", domain=whole_cell)
grad_eqn = pybamm.grad(var)
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(grad_eqn)
constant_y = np.ones_like(combined_submesh.nodes[:, np.newaxis])
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, constant_y),
            np.zeros_like(combined_submesh.edges[:, np.newaxis]),
)
# div
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(1), "Neumann"),
"right": (pybamm.Scalar(1), "Neumann"),
}
}
disc.bcs = boundary_conditions
div_eqn_disc = disc.process_symbol(div_eqn)
# Linear y should have laplacian zero
linear_y = combined_submesh.nodes
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, linear_y),
            np.ones_like(combined_submesh.edges[:, np.newaxis]),
)
np.testing.assert_array_almost_equal(
div_eqn_disc.evaluate(None, linear_y),
np.zeros_like(combined_submesh.nodes[:, np.newaxis]),
)
def test_grad_div_shapes_Dirichlet_and_Neumann_bcs(self):
"""
Test grad and div with Dirichlet boundary conditions (applied by grad on c) on
one side and Neumann boundary conditions (applied by div on N) on the other
"""
whole_cell = ["negative electrode", "separator", "positive electrode"]
# create discretisation
mesh = get_mesh_for_testing()
spatial_methods = {"macroscale": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
combined_submesh = mesh.combine_submeshes(*whole_cell)
# grad
var = pybamm.Variable("var", domain=whole_cell)
grad_eqn = pybamm.grad(var)
disc.set_variable_slices([var])
# div
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(1), "Dirichlet"),
"right": (pybamm.Scalar(0), "Neumann"),
}
}
disc.bcs = boundary_conditions
grad_eqn_disc = disc.process_symbol(grad_eqn)
div_eqn_disc = disc.process_symbol(div_eqn)
# Constant y should have gradient and laplacian zero
constant_y = np.ones_like(combined_submesh.nodes[:, np.newaxis])
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, constant_y),
np.zeros_like(combined_submesh.edges[:, np.newaxis]),
)
np.testing.assert_array_almost_equal(
div_eqn_disc.evaluate(None, constant_y),
np.zeros_like(combined_submesh.nodes[:, np.newaxis]),
)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(1), "Neumann"),
"right": (pybamm.Scalar(1), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
grad_eqn_disc = disc.process_symbol(grad_eqn)
div_eqn_disc = disc.process_symbol(div_eqn)
# Linear y should have gradient one and laplacian zero
linear_y = combined_submesh.nodes
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, linear_y),
np.ones_like(combined_submesh.edges[:, np.newaxis]),
)
np.testing.assert_array_almost_equal(
div_eqn_disc.evaluate(None, linear_y),
np.zeros_like(combined_submesh.nodes[:, np.newaxis]),
)
def test_spherical_grad_div_shapes_Neumann_bcs(self):
"""Test grad and div with Neumann boundary conditions (applied by div on N)"""
# create discretisation
mesh = get_mesh_for_testing()
spatial_methods = {"negative particle": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
combined_submesh = mesh.combine_submeshes("negative particle")
# grad
var = pybamm.Variable("var", domain="negative particle")
grad_eqn = pybamm.grad(var)
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(grad_eqn)
constant_y = np.ones_like(combined_submesh.nodes[:, np.newaxis])
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, constant_y),
            np.zeros_like(combined_submesh.edges[:, np.newaxis]),
)
linear_y = combined_submesh.nodes
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, linear_y),
            np.ones_like(combined_submesh.edges[:, np.newaxis]),
)
# div
# div ( grad(r^2) ) == 6 , N_left = N_right = 0
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(0), "Neumann"),
"right": (pybamm.Scalar(0), "Neumann"),
}
}
disc.bcs = boundary_conditions
div_eqn_disc = disc.process_symbol(div_eqn)
const = 6 * np.ones(combined_submesh.npts)
np.testing.assert_array_almost_equal(
div_eqn_disc.evaluate(None, const), np.zeros((combined_submesh.npts, 1))
)
def test_p2d_spherical_grad_div_shapes_Neumann_bcs(self):
"""
        Test grad and div with Neumann boundary conditions (applied by div on N)
        in the pseudo 2-dimensional case
"""
mesh = get_p2d_mesh_for_testing()
spatial_methods = {"negative particle": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
n_mesh = mesh["negative particle"]
mesh.add_ghost_meshes()
disc.mesh.add_ghost_meshes()
# test grad
var = pybamm.Variable(
"var",
domain=["negative particle"],
auxiliary_domains={"secondary": "negative electrode"},
)
grad_eqn = pybamm.grad(var)
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(grad_eqn)
prim_pts = n_mesh.npts
sec_pts = mesh["negative electrode"].npts
constant_y = np.kron(np.ones(sec_pts), np.ones(prim_pts))
grad_eval = grad_eqn_disc.evaluate(None, constant_y)
grad_eval = np.reshape(grad_eval, [sec_pts, prim_pts + 1])
np.testing.assert_array_equal(grad_eval, np.zeros([sec_pts, prim_pts + 1]))
# div
# div (grad r^2) = 6, N_left = N_right = 0
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(0), "Neumann"),
"right": (pybamm.Scalar(0), "Neumann"),
}
}
disc.bcs = boundary_conditions
div_eqn_disc = disc.process_symbol(div_eqn)
const = 6 * np.ones(sec_pts * prim_pts)
div_eval = div_eqn_disc.evaluate(None, const)
div_eval = np.reshape(div_eval, [sec_pts, prim_pts])
np.testing.assert_array_almost_equal(div_eval, np.zeros([sec_pts, prim_pts]))
def test_grad_div_shapes_mixed_domain(self):
"""
Test grad and div with Dirichlet boundary conditions (applied by grad on var)
"""
# create discretisation
mesh = get_mesh_for_testing()
spatial_methods = {"macroscale": pybamm.SpectralVolume()}
disc = pybamm.Discretisation(mesh, spatial_methods)
# grad
var = pybamm.Variable("var", domain=["negative electrode", "separator"])
grad_eqn = pybamm.grad(var)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(1), "Dirichlet"),
"right": (pybamm.Scalar(1), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
disc.set_variable_slices([var])
grad_eqn_disc = disc.process_symbol(grad_eqn)
combined_submesh = mesh.combine_submeshes("negative electrode", "separator")
constant_y = np.ones_like(combined_submesh.nodes[:, np.newaxis])
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, constant_y),
np.zeros_like(combined_submesh.edges[:, np.newaxis]),
)
# div: test on linear y (should have laplacian zero) so change bcs
linear_y = combined_submesh.nodes
N = pybamm.grad(var)
div_eqn = pybamm.div(N)
boundary_conditions = {
var.id: {
"left": (pybamm.Scalar(0), "Dirichlet"),
"right": (pybamm.Scalar(combined_submesh.edges[-1]), "Dirichlet"),
}
}
disc.bcs = boundary_conditions
grad_eqn_disc = disc.process_symbol(grad_eqn)
np.testing.assert_array_almost_equal(
grad_eqn_disc.evaluate(None, linear_y),
np.ones_like(combined_submesh.edges[:, np.newaxis]),
)
div_eqn_disc = disc.process_symbol(div_eqn)
np.testing.assert_array_almost_equal(
div_eqn_disc.evaluate(None, linear_y),
np.zeros_like(combined_submesh.nodes[:, np.newaxis]),
)
if __name__ == "__main__":
print("Add -v for more debug output")
import sys
if "-v" in sys.argv:
debug = True
pybamm.settings.debug_mode = True
unittest.main()
| 35.223583 | 87 | 0.594539 | 2,610 | 23,001 | 4.990421 | 0.080077 | 0.025797 | 0.023647 | 0.036852 | 0.843608 | 0.818656 | 0.789635 | 0.739501 | 0.713397 | 0.701651 | 0 | 0.00905 | 0.29377 | 23,001 | 652 | 88 | 35.277607 | 0.792785 | 0.065606 | 0 | 0.58 | 0 | 0 | 0.093519 | 0 | 0 | 0 | 0 | 0 | 0.058 | 1 | 0.026 | false | 0 | 0.008 | 0 | 0.042 | 0.002 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
426af98da3c39994539ab6283c20b26e411835b7 | 75 | py | Python | pyPackage/__init__.py | 20centcroak/pyPackage | 521cdb8312ebfaf005744e077d783814175a2304 | [
"MIT"
] | null | null | null | pyPackage/__init__.py | 20centcroak/pyPackage | 521cdb8312ebfaf005744e077d783814175a2304 | [
"MIT"
] | null | null | null | pyPackage/__init__.py | 20centcroak/pyPackage | 521cdb8312ebfaf005744e077d783814175a2304 | [
"MIT"
] | null | null | null | from pyPackage.options import Options
from pyPackage.package import Package | 37.5 | 37 | 0.88 | 10 | 75 | 6.6 | 0.5 | 0.393939 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093333 | 75 | 2 | 38 | 37.5 | 0.970588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
42a5478e5e350376fa844117b1942e02b023851d | 104 | py | Python | sumospeak/traciwrapper/__init__.py | NASGregorio/sumospeak | 3ea1415a381a2eb5a97eb0d8842f4f0d9bd7a4d7 | [
"MIT"
] | null | null | null | sumospeak/traciwrapper/__init__.py | NASGregorio/sumospeak | 3ea1415a381a2eb5a97eb0d8842f4f0d9bd7a4d7 | [
"MIT"
] | null | null | null | sumospeak/traciwrapper/__init__.py | NASGregorio/sumospeak | 3ea1415a381a2eb5a97eb0d8842f4f0d9bd7a4d7 | [
"MIT"
] | null | null | null | from .traciwrapper import TraCIWrapper, SUMO_START, SUMO_STEP, SUMO_ARRIVAL, SUMO_CLOSE, SUMO_PHASE_LIST | 104 | 104 | 0.865385 | 15 | 104 | 5.6 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 104 | 1 | 104 | 104 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c420334440fcb7297e6bb09c69b615729cc98d70 | 6,357 | py | Python | py/orbit/diagnostics/TeapotDiagnosticsNode.py | austin-hoover/py-orbit | cd3dbdd5d75a5a4a3fcf2ec079c898d760282b02 | [
"MIT"
] | null | null | null | py/orbit/diagnostics/TeapotDiagnosticsNode.py | austin-hoover/py-orbit | cd3dbdd5d75a5a4a3fcf2ec079c898d760282b02 | [
"MIT"
] | null | null | null | py/orbit/diagnostics/TeapotDiagnosticsNode.py | austin-hoover/py-orbit | cd3dbdd5d75a5a4a3fcf2ec079c898d760282b02 | [
"MIT"
] | null | null | null | """
This module contains diagnostics node classes for TEAPOT lattice
"""
import os
import math
# import the auxiliary classes
from orbit.utils import orbitFinalize, NamedObject, ParamsDictObject
# import general accelerator elements and lattice
from orbit.lattice import AccNode, AccActionsContainer, AccNodeBunchTracker
# import Diagnostics classes
from diagnostics import StatLats, StatLatsSetMember
from diagnostics import Moments, MomentsSetMember, BPMSignal
# import teapot drift class
from orbit.teapot import DriftTEAPOT
#import Bunch diagnostics
from bunch import BunchTuneAnalysis
class TeapotStatLatsNode(DriftTEAPOT):
"""
The statlats node class for TEAPOT lattice
"""
def __init__(self, filename , name = "statlats no name"):
"""
Constructor. Creates the StatLats TEAPOT element.
"""
DriftTEAPOT.__init__(self,name)
self.statlats = StatLats(filename)
self.setType("statlats teapot")
self.setLength(0.0)
self.position = 0.0
self.lattlength = 0.0
self.file_out = open(filename,"w")
def track(self, paramsDict):
"""
The statlats-teapot class implementation of the AccNodeBunchTracker class track(probe) method.
"""
length = self.getLength(self.getActivePartIndex())
bunch = paramsDict["bunch"]
self.statlats.writeStatLats(self.position,bunch,self.lattlength)
def setPosition(self,pos):
self.position = pos
def closeStatLats(self):
self.file_out.close()
def setLatticeLength(self, lattlength):
self.lattlength = lattlength
class TeapotStatLatsNodeSetMember(DriftTEAPOT):
"""
The statlats node class for TEAPOT lattice
"""
def __init__(self, file, name = "statlats no name"):
"""
Constructor. Creates the StatLats TEAPOT element.
"""
DriftTEAPOT.__init__(self,name)
self.statlats = StatLatsSetMember(file)
self.setType("statlats teapot")
self.setLength(0.0)
self.position = 0.0
self.lattlength = 0.0
self.active = True
self.file = file
def track(self, paramsDict):
"""
The statlats-teapot class implementation of the AccNodeBunchTracker class track(probe) method.
"""
if(self.active):
length = self.getLength(self.getActivePartIndex())
bunch = paramsDict["bunch"]
self.statlats.writeStatLats(self.position,bunch,self.lattlength)
def setPosition(self,pos):
self.position = pos
def setLatticeLength(self, lattlength):
self.lattlength = lattlength
def activate(self):
self.active = True
def deactivate(self):
self.active = False
def resetFile(self, file):
self.file = file
self.statlats.resetFile(self.file)
class TeapotMomentsNode(DriftTEAPOT):
"""
The moments node class for TEAPOT lattice
"""
def __init__(self, filename, order, nodispersion = True, emitnorm = False, name = "moments no name"):
"""
		Constructor. Creates the Moments TEAPOT element.
"""
DriftTEAPOT.__init__(self,name)
self.moments = Moments(filename, order, nodispersion, emitnorm)
self.setType("moments teapot")
self.setLength(0.0)
self.position = 0.0
self.lattlength = 0.0
self.file_out = open(filename,"w")
def track(self, paramsDict):
"""
The moments-teapot class implementation of the AccNodeBunchTracker class track(probe) method.
"""
length = self.getLength(self.getActivePartIndex())
bunch = paramsDict["bunch"]
self.moments.writeMoments(self.position,bunch,self.lattlength)
def setPosition(self,pos):
self.position = pos
def closeMoments(self):
self.file_out.close()
def setLatticeLength(self, lattlength):
self.lattlength = lattlength
class TeapotMomentsNodeSetMember(DriftTEAPOT):
"""
The moments node class for TEAPOT lattice
"""
def __init__(self, file, order, nodispersion = True, emitnorm = False, name = "moments no name"):
"""
Constructor. Creates the Moments TEAPOT element.
"""
DriftTEAPOT.__init__(self,str(name))
self.file = file
self.moments = MomentsSetMember(self.file, order, nodispersion, emitnorm)
self.setType("moments teapot")
self.setLength(0.0)
self.position = 0.0
self.lattlength = 0.0
self.active = True
def track(self, paramsDict):
"""
The moments-teapot class implementation of the AccNodeBunchTracker class track(probe) method.
"""
if(self.active):
length = self.getLength(self.getActivePartIndex())
bunch = paramsDict["bunch"]
self.moments.writeMoments(self.position, bunch, self.lattlength)
def setPosition(self,pos):
self.position = pos
def setLatticeLength(self, lattlength):
self.lattlength = lattlength
def activate(self):
self.active = True
def deactivate(self):
self.active = False
def resetFile(self, file):
self.file = file
self.moments.resetFile(self.file)
class TeapotTuneAnalysisNode(DriftTEAPOT):
def __init__(self, name = "tuneanalysis no name"):
"""
		Constructor. Creates the TuneAnalysis TEAPOT element.
"""
DriftTEAPOT.__init__(self,name)
self.bunchtune = BunchTuneAnalysis()
self.setType("tune calculator teapot")
self.lattlength = 0.0
self.setLength(0.0)
self.position = 0.0
def track(self, paramsDict):
"""
The bunchtuneanalysis-teapot class implementation of the AccNodeBunchTracker class track(probe) method.
"""
length = self.getLength(self.getActivePartIndex())
bunch = paramsDict["bunch"]
self.bunchtune.analyzeBunch(bunch)
def setPosition(self,pos):
self.position = pos
def setLatticeLength(self, lattlength):
self.lattlength = lattlength
def assignTwiss(self, betax, alphax, etax, etapx, betay, alphay):
self.bunchtune.assignTwiss(betax, alphax, etax, etapx, betay, alphay)
class TeapotBPMSignalNode(DriftTEAPOT):
def __init__(self, name = "BPMSignal no name"):
"""
		Constructor. Creates the BPMSignal TEAPOT element.
"""
DriftTEAPOT.__init__(self,name)
self.bpm = BPMSignal()
self.setType("BPMSignal")
self.lattlength = 0.0
self.setLength(0.0)
self.position = 0.0
def track(self, paramsDict):
"""
		The bpmsignal-teapot class implementation of the AccNodeBunchTracker class track(probe) method.
"""
length = self.getLength(self.getActivePartIndex())
bunch = paramsDict["bunch"]
self.bpm.analyzeSignal(bunch)
def setPosition(self,pos):
self.position = pos
def setLatticeLength(self, lattlength):
self.lattlength = lattlength
def getSignal(self):
xAvg = self.bpm.getSignalX()
yAvg = self.bpm.getSignalY()
return xAvg, yAvg | 26.160494 | 105 | 0.732264 | 748 | 6,357 | 6.153743 | 0.148396 | 0.066913 | 0.020856 | 0.031284 | 0.772757 | 0.742559 | 0.72909 | 0.72909 | 0.72909 | 0.723876 | 0 | 0.006723 | 0.157622 | 6,357 | 243 | 106 | 26.160494 | 0.852848 | 0.199308 | 0 | 0.698529 | 0 | 0 | 0.044834 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.058824 | 0 | 0.360294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c4332f962837980d336f231677daeea0d7511a01 | 2,841 | py | Python | test/test_gpio_setup.py | olegantonyan/adafruit-beaglebone-io-python | 36c5a136dfb93b40213b61e04b8bf760086015d0 | [
"MIT"
] | null | null | null | test/test_gpio_setup.py | olegantonyan/adafruit-beaglebone-io-python | 36c5a136dfb93b40213b61e04b8bf760086015d0 | [
"MIT"
] | null | null | null | test/test_gpio_setup.py | olegantonyan/adafruit-beaglebone-io-python | 36c5a136dfb93b40213b61e04b8bf760086015d0 | [
"MIT"
] | null | null | null | import pytest
import os
import platform
import Adafruit_BBIO.GPIO as GPIO

kernel = platform.release()


def teardown_module(module):
    GPIO.cleanup()


class TestSetup:
    def test_setup_output_key(self):
        GPIO.setup("P8_10", GPIO.OUT)
        assert os.path.exists('/sys/class/gpio/gpio68')
        direction = open('/sys/class/gpio/gpio68/direction').read()
        assert direction == 'out\n'
        GPIO.cleanup()

    def test_setup_output_name(self):
        GPIO.setup("TIMER6", GPIO.OUT)
        assert os.path.exists('/sys/class/gpio/gpio68')
        direction = open('/sys/class/gpio/gpio68/direction').read()
        assert direction == 'out\n'
        GPIO.cleanup()

    def test_setup_input_key(self):
        GPIO.setup("P8_10", GPIO.IN)
        assert os.path.exists('/sys/class/gpio/gpio68')
        direction = open('/sys/class/gpio/gpio68/direction').read()
        assert direction == 'in\n'
        GPIO.cleanup()

    def test_setup_input_name(self):
        GPIO.setup("TIMER6", GPIO.IN)
        assert os.path.exists('/sys/class/gpio/gpio68')
        direction = open('/sys/class/gpio/gpio68/direction').read()
        assert direction == 'in\n'
        GPIO.cleanup()

    def test_setup_input_pull_up(self):
        GPIO.setup("P8_10", GPIO.IN, pull_up_down=GPIO.PUD_UP)
        assert os.path.exists('/sys/class/gpio/gpio68')
        direction = open('/sys/class/gpio/gpio68/direction').read()
        assert direction == 'in\n'
        GPIO.cleanup()

    def test_setup_input_pull_down(self):
        GPIO.setup("P8_10", GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
        assert os.path.exists('/sys/class/gpio/gpio68')
        direction = open('/sys/class/gpio/gpio68/direction').read()
        assert direction == 'in\n'
        GPIO.cleanup()

    def test_setup_cleanup(self):
        GPIO.setup("P8_10", GPIO.OUT)
        assert os.path.exists('/sys/class/gpio/gpio68')
        GPIO.cleanup()
        if kernel < '4.1.0':
            assert not os.path.exists('/sys/class/gpio/gpio68')
        # for later kernels, the universal capemanager always loads the
        # UARTs.

    def test_setup_failed_type_error(self):
        with pytest.raises(TypeError):
            GPIO.setup("P8_10", "WEIRD")
        GPIO.cleanup()

    def test_setup_failed_value_error(self):
        with pytest.raises(ValueError):
            GPIO.setup("P8_10", 3)
        GPIO.cleanup()

    def test_setup_three_digit_gpio(self):
        GPIO.setup("P9_31", GPIO.OUT)
        assert os.path.exists('/sys/class/gpio/gpio110')
        GPIO.cleanup()
        if kernel < '4.1.0':
            assert not os.path.exists('/sys/class/gpio/gpio110')
        # for later kernels, the universal capemanager always loads the
        # UARTs.
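The version gate `kernel < '4.1.0'` above compares release strings lexicographically, which only behaves numerically while both major versions are single digits (as strings, `'10.0.0' < '4.1.0'` is true, so a kernel 10 would be treated as older than 4.1). A minimal numeric-comparison sketch; `kernel_at_least` is an illustrative helper, not part of Adafruit_BBIO or this test suite:

```python
import platform
import re


def kernel_at_least(minimum, release=None):
    """Compare a kernel release numerically, e.g. '3.8.13-bone70' vs '4.1.0'."""
    release = release or platform.release()
    # keep only the leading dotted-numeric part ('3.8.13-bone70' -> '3.8.13')
    numeric = re.match(r'[\d.]+', release).group(0).rstrip('.')
    as_tuple = lambda v: tuple(int(p) for p in v.split('.'))
    return as_tuple(numeric) >= as_tuple(minimum)
```

Tuple comparison handles multi-digit components correctly, so the sysfs assertions could be gated on `if not kernel_at_least('4.1.0'): ...` instead of a string compare.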
| 33.821429 | 75 | 0.606829 | 367 | 2,841 | 4.561308 | 0.190736 | 0.076464 | 0.114695 | 0.150538 | 0.82736 | 0.770012 | 0.742533 | 0.713859 | 0.713859 | 0.691756 | 0 | 0.031784 | 0.258008 | 2,841 | 83 | 76 | 34.228916 | 0.762334 | 0.048222 | 0 | 0.53125 | 0 | 0 | 0.187847 | 0.15339 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.171875 | false | 0 | 0.0625 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c45ff8e129435a4d8a018225b43a9ec8f181b2b3 | 112 | py | Python | helpers/__init__.py | robot-acceleration/GRiDCodeGenerator | 891490d6a02aa4119f7b7f03d5aa6524177e13c2 | [
"MIT"
] | 2 | 2021-12-25T17:00:07.000Z | 2022-03-18T15:50:24.000Z | helpers/__init__.py | robot-acceleration/GRiDCodeGenerator | 891490d6a02aa4119f7b7f03d5aa6524177e13c2 | [
"MIT"
] | null | null | null | helpers/__init__.py | robot-acceleration/GRiDCodeGenerator | 891490d6a02aa4119f7b7f03d5aa6524177e13c2 | [
"MIT"
] | null | null | null | from ._code_generation_helpers import *
from ._spatial_algebra_helpers import *
from ._topology_helpers import * | 37.333333 | 39 | 0.848214 | 14 | 112 | 6.214286 | 0.571429 | 0.448276 | 0.390805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098214 | 112 | 3 | 40 | 37.333333 | 0.861386 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
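Star imports like the three above re-export whatever each submodule exposes, and a submodule can control that surface with `__all__`. A small self-contained sketch of that mechanism (the module name `helpers_demo` and its functions are made up for the demonstration):

```python
import sys
import types

# build a throwaway module that declares __all__
mod = types.ModuleType("helpers_demo")
exec(
    "__all__ = ['public_fn']\n"
    "def public_fn():\n"
    "    return 'ok'\n"
    "def private_fn():\n"
    "    return 'hidden'\n",
    mod.__dict__,
)
sys.modules["helpers_demo"] = mod

# star import only picks up names listed in __all__
ns = {}
exec("from helpers_demo import *", ns)
```

Without `__all__`, `import *` would take every name not starting with an underscore; with it, only the listed names leak into the importing package's namespace.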
673cd7e7ceea09d7d02571f3f4c1fc7282c4407f | 47 | py | Python | xacro4sdf/__init__.py | ar-mine/xacro4sdf | 9967bbb08007e12fd49838b82b396aa819a35f6d | [
"MIT"
] | 12 | 2020-11-27T08:07:48.000Z | 2022-03-11T04:32:07.000Z | xacro4sdf/__init__.py | ar-mine/xacro4sdf | 9967bbb08007e12fd49838b82b396aa819a35f6d | [
"MIT"
] | null | null | null | xacro4sdf/__init__.py | ar-mine/xacro4sdf | 9967bbb08007e12fd49838b82b396aa819a35f6d | [
"MIT"
] | 2 | 2021-12-28T03:55:56.000Z | 2022-01-12T07:50:36.000Z | from xacro4sdf.xacro4sdf import xacro4sdf_main | 47 | 47 | 0.893617 | 6 | 47 | 6.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 0.085106 | 47 | 1 | 47 | 47 | 0.883721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
675166a269565104b3daa39492381d59b9d6c0bd | 122 | py | Python | ecs/dashboard/views.py | programmierfabrik/ecs | 2389a19453e21b2ea4e40b272552bcbd42b926a9 | [
"Apache-2.0"
] | 9 | 2017-02-13T18:17:13.000Z | 2020-11-21T20:15:54.000Z | ecs/dashboard/views.py | programmierfabrik/ecs | 2389a19453e21b2ea4e40b272552bcbd42b926a9 | [
"Apache-2.0"
] | 2 | 2021-05-20T14:26:47.000Z | 2021-05-20T14:26:48.000Z | ecs/dashboard/views.py | programmierfabrik/ecs | 2389a19453e21b2ea4e40b272552bcbd42b926a9 | [
"Apache-2.0"
] | 4 | 2017-04-02T18:48:59.000Z | 2021-11-23T15:40:35.000Z | from django.shortcuts import render

def view_dashboard(request):
    return render(request, 'dashboard/dashboard.html')
| 20.333333 | 54 | 0.786885 | 15 | 122 | 6.333333 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122951 | 122 | 5 | 55 | 24.4 | 0.88785 | 0 | 0 | 0 | 0 | 0 | 0.196721 | 0.196721 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
679bdce17b55c1a089d8f2f8f54a242375ad4f15 | 199 | py | Python | bridger/websockets/auth.py | intellineers/django-bridger | ed097984a99df7da40a4d01bd00c56e3c6083056 | [
"BSD-3-Clause"
] | 2 | 2020-03-17T00:53:23.000Z | 2020-07-16T07:00:33.000Z | bridger/websockets/auth.py | intellineers/django-bridger | ed097984a99df7da40a4d01bd00c56e3c6083056 | [
"BSD-3-Clause"
] | 76 | 2019-12-05T01:15:57.000Z | 2021-09-07T16:47:27.000Z | bridger/websockets/auth.py | intellineers/django-bridger | ed097984a99df7da40a4d01bd00c56e3c6083056 | [
"BSD-3-Clause"
] | 1 | 2020-02-05T15:09:47.000Z | 2020-02-05T15:09:47.000Z | from channels.sessions import CookieMiddleware
from bridger.websockets.middleware import JWTAuthMiddleware

def JWTAuthMiddlewareStack(inner):
    return CookieMiddleware(JWTAuthMiddleware(inner))
| 24.875 | 59 | 0.854271 | 18 | 199 | 9.444444 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095477 | 199 | 7 | 60 | 28.428571 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
67c22fee65b9d0192579c12f79db21bcca20f5cd | 77,363 | py | Python | resolversrv/tests/unittests/test_db_query.py | golnazads/resolver_service | 782dce5a4daa864aff0b3d171185b2553a254a3f | [
"MIT"
] | 1 | 2021-06-03T15:04:47.000Z | 2021-06-03T15:04:47.000Z | resolversrv/tests/unittests/test_db_query.py | golnazads/resolver_service | 782dce5a4daa864aff0b3d171185b2553a254a3f | [
"MIT"
] | 38 | 2017-11-20T15:53:59.000Z | 2021-11-23T16:17:24.000Z | resolversrv/tests/unittests/test_db_query.py | golnazads/resolver_service | 782dce5a4daa864aff0b3d171185b2553a254a3f | [
"MIT"
] | 10 | 2017-10-06T19:23:20.000Z | 2022-02-19T16:52:47.000Z | import sys, os
project_home = os.path.abspath(os.path.join(os.path.dirname(__file__), '../../../'))
if project_home not in sys.path:
    sys.path.insert(0, project_home)

import unittest
import json

from resolversrv import app
from resolversrv.tests.unittests.base import TestCaseDatabase
from resolversrv.utils import get_records, add_records, get_records_new, add_records_new, del_records_new, get_ids
from resolversrv.views import LinkRequest, PopulateRequest
from adsmsg import DocumentRecords


class TestDatabase(TestCaseDatabase):
    def create_app(self):
        '''Start the wsgi application'''
        a = app.create_app(**{
            'SQLALCHEMY_DATABASE_URI': self.postgresql_url,
            'RESOLVER_GATEWAY_URL': '/{bibcode}/{link_type}/{url}',
        })
        return a
    def add_stub_data(self):
        """
        Add stub data

        :return:
        """
        stub_data = [
            ('2013MNRAS.435.1904M', 'TOC', '', [''], [''], 0),
            ('2013MNRAS.435.1904M', 'ESOURCE', 'EPRINT_HTML', ['http://arxiv.org/abs/1307.6556'], [''], 0),
            ('2013MNRAS.435.1904M', 'ESOURCE', 'EPRINT_PDF', ['http://arxiv.org/pdf/1307.6556'], [''], 0),
            ('2013MNRAS.435.1904M', 'ESOURCE', 'PUB_HTML', ['https://doi.org/10.1093%2Fmnras%2Fstt1379'], [''], 0),
            ('2013MNRAS.435.1904M', 'ESOURCE', 'PUB_PDF', ['http://mnras.oxfordjournals.org/content/435/3/1904.full.pdf'], [''], 0),
            ('2013MNRAS.435.1904M', 'DATA', 'Chandra', ['http://cda.harvard.edu/chaser?obsid=494'], ['Chandra Data Archive ObsIds 494'], 27),
            ('2013MNRAS.435.1904M', 'DATA', 'ESA', ['http://archives.esac.esa.int/ehst/#bibcode=2013MNRAS.435.1904M'], ['European HST References (EHST)'], 1),
            ('2013MNRAS.435.1904M', 'DATA', 'HEASARC', ['http://heasarc.gsfc.nasa.gov/cgi-bin/W3Browse/biblink.pl?code=2013MNRAS.435.1904M'], [], 1),
            ('2013MNRAS.435.1904M', 'DATA', 'Herschel', ['http://herschel.esac.esa.int/hpt/publicationdetailsview.do?bibcode=2013MNRAS.435.1904M'], [], 1),
            ('2013MNRAS.435.1904M', 'DATA', 'MAST', ['http://archive.stsci.edu/mastbibref.php?bibcode=2013MNRAS.435.1904M'], ['MAST References (GALEX EUVE HST)'], 3),
            ('2013MNRAS.435.1904M', 'DATA', 'NED', ['http://$NED$/cgi-bin/nph-objsearch?search_type=Search&refcode=2013MNRAS.435.1904M'], ['NED Objects (1)'], 1),
            ('2013MNRAS.435.1904M', 'DATA', 'SIMBAD', ['http://$SIMBAD$/simbo.pl?bibcode=2013MNRAS.435.1904M'], ['SIMBAD Objects (30)'], 30),
            ('2013MNRAS.435.1904M', 'DATA', 'XMM', ['http://nxsa.esac.esa.int/nxsa-web/#obsid=0097820101'], ['XMM-Newton Observation Number 0097820101'], 1),
            ('2017MNRAS.467.3556B', 'PRESENTATION', '', ['http://www.astro.lu.se/~alexey/animations.html'], [''], 0),
            ('1943RvMP...15....1C', 'INSPIRE', '', ['http://inspirehep.net/search?p=find+j+RMPHA,15,1'], [''], 0),
            ('1971ATsir.615....4D', 'ASSOCIATED', '', ['1971ATsir.615....4D', '1974Afz....10..315D', '1971ATsir.621....7D', '1976Afz....12..665D', '1971ATsir.624....1D', '1983Afz....19..229D', '1983Ap.....19..134D', '1973ATsir.759....6D', '1984Afz....20..525D', '1984Ap.....20..290D', '1974ATsir.809....1D', '1974ATsir.809....2D', '1974ATsir.837....2D'], ['Part 1', 'Part 2', 'Part 3', 'Part 4', 'Part 5', 'Part 6', 'Part 7', 'Part 8', 'Part 9', 'Part 10', 'Part 11', 'Part 12', 'Part 13'], 0),
            ('1514temg.book.....V', 'ASSOCIATED', '', ['1514temg.book.....V', 'https://www.si.edu/object/siris_sil_154413'], ['Main Paper', 'Supplementary Material'], 0),
            ('2007ASPC..368...27R', 'ESOURCE', 'ADS_PDF', ['http://articles.adsabs.harvard.edu/pdf/2007ASPC..368...27R'], [''], 0),
            ('2007ASPC..368...27R', 'ESOURCE', 'ADS_SCAN', ['http://articles.adsabs.harvard.edu/full/2007ASPC..368...27R'], [''], 0),
            ('2007ASPC..368...27R', 'ESOURCE', 'EPRINT_HTML', ['https://arxiv.org/abs/astro-ph/0703637'], [''], 0),
            ('2007ASPC..368...27R', 'ESOURCE', 'EPRINT_PDF', ['https://arxiv.org/pdf/astro-ph/0703637'], [''], 0),
            ('2007ASPC..368...27R', 'ESOURCE', 'PUB_HTML', ['http://aspbooks.org/custom/publications/paper/368-0027.html'], [''], 0),
            ('2007ASPC..368...27R', 'TOC', '', [''], [''], 0),
            ('2004astro.ph..1427R', 'DATA', 'MAST', ['http://archive.stsci.edu/prepds/gems', 'https://archive.stsci.edu/mastbibref.php?bibcode=2004ApJS..152..163R'], ['GEMS: Galaxy Evolution from Morphologies and SEDs (Hans-Walter Rix)', 'MAST References (HST)'], 2),
        ]
        datalinks_list = []
        for record in stub_data:
            datalinks_record = {'bibcode': record[0],
                                'data_links_rows': [{'link_type': record[1], 'link_sub_type': record[2],
                                                     'url': record[3], 'title': record[4],
                                                     'item_count': record[5]}]}
            datalinks_list.append(datalinks_record)
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.put('/update', data=json.dumps(datalinks_list), headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertEqual(response.json['status'], 'updated db with new data successfully')
    def test_link_type_all(self):
        """
        return links for all types of a bibcode

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2013MNRAS.435.1904M', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'display', u'links': {u'count': 17, u'records': [
            {u'url': u'/2013MNRAS.435.1904M/ABSTRACT', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'abstract', u'title': u'ABSTRACT (1)'},
            {u'url': u'/2013MNRAS.435.1904M/CITATIONS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'citations', u'title': u'CITATIONS (1)'},
            {u'url': u'/2013MNRAS.435.1904M/REFERENCES', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'references', u'title': u'REFERENCES (1)'},
            {u'url': u'/2013MNRAS.435.1904M/COREADS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'coreads', u'title': u'COREADS (1)'},
            {u'url': u'/2013MNRAS.435.1904M/TOC', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'toc', u'title': u'TOC (1)'},
            {u'url': u'/2013MNRAS.435.1904M/OPENURL', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'openurl', u'title': u'OPENURL (1)'},
            {u'url': u'/2013MNRAS.435.1904M/GRAPHICS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'graphics', u'title': u'GRAPHICS (1)'},
            {u'url': u'/2013MNRAS.435.1904M/METRICS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'metrics', u'title': u'METRICS (1)'},
            {u'url': u'/2013MNRAS.435.1904M/SIMILAR', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'similar', u'title': u'SIMILAR (1)'},
            {u'url': u'/2013MNRAS.435.1904M/ESOURCE', u'count': 4, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'esource', u'title': u'ESOURCE (4)'},
            {u'url': u'/2013MNRAS.435.1904M/DATA', u'count': 65, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'data', u'title': u'DATA (65)'},
            {u'url': u'/2013MNRAS.435.1904M/DOI', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'doi', u'title': u'DOI (1)'},
            {u'url': u'/2013MNRAS.435.1904M/ARXIV', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'arxiv', u'title': u'ARXIV (1)'}],
            u'link_type': 'all'}, u'service': u''})
    def test_link_inspire(self):
        """
        return a record of link type == inspire

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/1943RvMP...15....1C/INSPIRE', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'http://inspirehep.net/search?p=find+j+RMPHA,15,1',
                                             u'link_type': u'INSPIRE',
                                             u'service': u'http://inspirehep.net/search?p=find+j+RMPHA,15,1'})
    def test_link_presentation(self):
        """
        fetch record of a link_type presentation

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2017MNRAS.467.3556B/PRESENTATION', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'http://www.astro.lu.se/~alexey/animations.html',
                                             u'link_type': u'PRESENTATION',
                                             u'service': u'http://www.astro.lu.se/~alexey/animations.html'})
    def test_link_associated(self):
        """
        returning list of url, title pairs

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/1971ATsir.615....4D/ASSOCIATED', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'display',
                                             u'links': {u'count': 13,
                                                        u'records': [
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1971ATsir.615....4D%2Fabstract', u'bibcode': u'1971ATsir.615....4D', u'title': u'Part 1'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974Afz....10..315D%2Fabstract', u'bibcode': u'1974Afz....10..315D', u'title': u'Part 2'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1971ATsir.621....7D%2Fabstract', u'bibcode': u'1971ATsir.621....7D', u'title': u'Part 3'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1976Afz....12..665D%2Fabstract', u'bibcode': u'1976Afz....12..665D', u'title': u'Part 4'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1971ATsir.624....1D%2Fabstract', u'bibcode': u'1971ATsir.624....1D', u'title': u'Part 5'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1983Afz....19..229D%2Fabstract', u'bibcode': u'1983Afz....19..229D', u'title': u'Part 6'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1983Ap.....19..134D%2Fabstract', u'bibcode': u'1983Ap.....19..134D', u'title': u'Part 7'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1973ATsir.759....6D%2Fabstract', u'bibcode': u'1973ATsir.759....6D', u'title': u'Part 8'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1984Afz....20..525D%2Fabstract', u'bibcode': u'1984Afz....20..525D', u'title': u'Part 9'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1984Ap.....20..290D%2Fabstract', u'bibcode': u'1984Ap.....20..290D', u'title': u'Part 10'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974ATsir.809....1D%2Fabstract', u'bibcode': u'1974ATsir.809....1D', u'title': u'Part 11'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974ATsir.809....2D%2Fabstract', u'bibcode': u'1974ATsir.809....2D', u'title': u'Part 12'},
                                                            {u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974ATsir.837....2D%2Fabstract', u'bibcode': u'1974ATsir.837....2D', u'title': u'Part 13'}],
                                                        u'link_type': u'ASSOCIATED'},
                                             u'service': u'/abs/1971ATsir.615....4D/associated'})
    def test_link_esource_subtype(self):
        """
        check status code for calling process_request for an esource sub type link

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2013MNRAS.435.1904M/EPRINT_HTML', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'http://arxiv.org/abs/1307.6556',
                                             u'link_type': u'ESOURCE|EPRINT_HTML',
                                             u'service': u'http://arxiv.org/abs/1307.6556'})
    def test_link_esource(self):
        """
        returning list of urls

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2013MNRAS.435.1904M/ESOURCE', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'display',
                                             u'links': {u'count': 4,
                                                        u'link_type': u'ESOURCE',
                                                        u'bibcode': u'2013MNRAS.435.1904M',
                                                        u'records': [
                                                            {u'url': u'http://arxiv.org/abs/1307.6556', u'title': u'http://arxiv.org/abs/1307.6556', u'link_type': u'ESOURCE|EPRINT_HTML'},
                                                            {u'url': u'http://arxiv.org/pdf/1307.6556', u'title': u'http://arxiv.org/pdf/1307.6556', u'link_type': u'ESOURCE|EPRINT_PDF'},
                                                            {u'url': u'https://doi.org/10.1093%2Fmnras%2Fstt1379', u'title': u'https://doi.org/10.1093%2Fmnras%2Fstt1379', u'link_type': u'ESOURCE|PUB_HTML'},
                                                            {u'url': u'http://mnras.oxfordjournals.org/content/435/3/1904.full.pdf', u'title': u'http://mnras.oxfordjournals.org/content/435/3/1904.full.pdf', u'link_type': u'ESOURCE|PUB_PDF'}
                                                        ]},
                                             u'service': u''})
    def test_link_data_subtype(self):
        """
        check status code for calling process_request for a data sub type link

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2013MNRAS.435.1904M/ESA', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'http://archives.esac.esa.int/ehst/#bibcode=2013MNRAS.435.1904M',
                                             u'link_type': u'DATA|ESA',
                                             u'service': u'http://archives.esac.esa.int/ehst/#bibcode=2013MNRAS.435.1904M'})
    def test_link_data(self):
        """
        returning list of url, title pairs

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2013MNRAS.435.1904M/DATA', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {
            u'action': u'display',
            u'service': u'',
            u'links': {
                u'count': 8,
                u'records': [
                    {u'url': u'http://cxc.harvard.edu/cda',  # Chandra
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|Chandra/http%3A%2F%2Fcda.harvard.edu%2Fchaser%3Fobsid%3D494',
                                u'title': u'Chandra Data Archive ObsIds 494',
                                u'link_type': u'DATA|Chandra'}],
                     u'title': u'Chandra X-Ray Observatory'},
                    {u'url': u'http://archives.esac.esa.int',  # ESA
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|ESA/http%3A%2F%2Farchives.esac.esa.int%2Fehst%2F%23bibcode%3D2013MNRAS.435.1904M',
                                u'title': u'European HST References (EHST)',
                                u'link_type': u'DATA|ESA'}],
                     u'title': u'ESAC Science Data Center'},
                    {u'url': u'https://heasarc.gsfc.nasa.gov/',  # HEASARC
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|HEASARC/http%3A%2F%2Fheasarc.gsfc.nasa.gov%2Fcgi-bin%2FW3Browse%2Fbiblink.pl%3Fcode%3D2013MNRAS.435.1904M',
                                u'title': u'http://heasarc.gsfc.nasa.gov/cgi-bin/W3Browse/biblink.pl?code=2013MNRAS.435.1904M',
                                u'link_type': u'DATA|HEASARC'}],
                     u'title': u"NASA's High Energy Astrophysics Science Archive Research Center"},
                    {u'url': u'https://www.cosmos.esa.int/web/herschel/home',  # Herschel
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|Herschel/http%3A%2F%2Fherschel.esac.esa.int%2Fhpt%2Fpublicationdetailsview.do%3Fbibcode%3D2013MNRAS.435.1904M',
                                u'title': u'http://herschel.esac.esa.int/hpt/publicationdetailsview.do?bibcode=2013MNRAS.435.1904M',
                                u'link_type': u'DATA|Herschel'}],
                     u'title': u'Herschel Science Center'},
                    {u'url': u'http://archive.stsci.edu',  # MAST
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|MAST/http%3A%2F%2Farchive.stsci.edu%2Fmastbibref.php%3Fbibcode%3D2013MNRAS.435.1904M',
                                u'title': u'MAST References (GALEX EUVE HST)',
                                u'link_type': u'DATA|MAST'}],
                     u'title': u'Mikulski Archive for Space Telescopes'},
                    {u'url': u'https://ned.ipac.caltech.edu',  # NED
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|NED/http%3A%2F%2Fned.ipac.caltech.edu%2Fcgi-bin%2Fnph-objsearch%3Fsearch_type%3DSearch%26refcode%3D2013MNRAS.435.1904M',
                                u'title': u'NED Objects (1)',
                                u'link_type': u'DATA|NED'}],
                     u'title': u'NASA/IPAC Extragalactic Database'},
                    {u'url': u'http://simbad.u-strasbg.fr',  # SIMBAD
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|SIMBAD/http%3A%2F%2Fsimbad.u-strasbg.fr%2Fsimbo.pl%3Fbibcode%3D2013MNRAS.435.1904M',
                                u'title': u'SIMBAD Objects (30)',
                                u'link_type': u'DATA|SIMBAD'}],
                     u'title': u'SIMBAD Database at the CDS'},
                    {u'url': u'http://nxsa.esac.esa.int',  # XMM
                     u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|XMM/http%3A%2F%2Fnxsa.esac.esa.int%2Fnxsa-web%2F%23obsid%3D0097820101',
                                u'title': u'XMM-Newton Observation Number 0097820101',
                                u'link_type': u'DATA|XMM'}],
                     u'title': u'XMM Newton Science Archive'}],
                u'bibcode': u'2013MNRAS.435.1904M'}})
    def test_link_all_error_bibcode(self):
        """
        call get_records to fetch all the records for a non-existing bibcode

        :return:
        """
        results = get_records(bibcode='errorbibcode')
        self.assertEqual(results, None)

    def test_error_with_sub_type(self):
        """
        call get_records to fetch the records for a non-existing bibcode, link_type, and link_sub_type

        :return:
        """
        results = get_records(bibcode='errorbibcode', link_type='errorlinktype', link_sub_type='errorlinksubtype')
        self.assertEqual(results, None)

    def test_link_associated_error_bibcode(self):
        """
        return 404 for not finding any records

        :return:
        """
        results = get_records(bibcode='errorbibcode', link_type='ASSOCIATED')
        response = LinkRequest(bibcode='').request_link_type_associated(results)
        self.assertEqual(response._status_code, 404)

    def test_link_esource_error_bibcode(self):
        """
        return 404 for not finding any records

        :return:
        """
        results = get_records(bibcode='errorbibcode', link_type='ESOURCE')
        response = LinkRequest(bibcode='').request_link_type_esource(results)
        self.assertEqual(response._status_code, 404)

    def test_link_data_error_bibcode(self):
        """
        return 404 for not finding any records

        :return:
        """
        results = get_records(bibcode='errorbibcode', link_type='DATA')
        response = LinkRequest(bibcode='').request_link_type_data(results)
        self.assertEqual(response._status_code, 404)
    def test_process_request_upsert(self):
        """
        return 200 for successful insert/update to db

        :return:
        """
        self.add_stub_data()
        datalinks_list = [{"bibcode": "1513efua.book.....S",
                           "data_links_rows": [{"link_type": "LIBRARYCATALOG", "link_sub_type": "",
                                                "url": ["http://catalog.loc.gov/cgi-bin/Pwebrecon.cgi?v3=1&DB=local&CMD=010a+unk82013020&CNT=10+records+per+page"],
                                                "title": [""],
                                                "item_count": 0}]}]
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        # insert it here
        response = self.client.put('/update', data=json.dumps(datalinks_list), headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertEqual(response.json['status'], 'updated db with new data successfully')
        # select it here
        response = self.client.get('/1513efua.book.....S/LIBRARYCATALOG', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'http://catalog.loc.gov/cgi-bin/Pwebrecon.cgi?v3=1&DB=local&CMD=010a+unk82013020&CNT=10+records+per+page',
                                             u'link_type': u'LIBRARYCATALOG',
                                             u'service': u'http://catalog.loc.gov/cgi-bin/Pwebrecon.cgi?v3=1&DB=local&CMD=010a+unk82013020&CNT=10+records+per+page'})
    def test_process_request_delete(self):
        """
        return 200 for successful deletion from db

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        # delete it here
        bibcodes = {'bibcode': ['2013MNRAS.435.1904M']}
        response = self.client.delete('/delete', data=json.dumps(bibcodes), headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertEqual(response.json['status'], 'removed 13 records of 1 bibcodes')
        # select it here
        response = self.client.get('/2013MNRAS.435.1904M/ESOURCE', headers=headers)
        self.assertEqual(response._status_code, 404)
        self.assertEqual(response.json['error'], 'did not find any records')
    def test_link_toc(self):
        """
        TOC was one of the on-the-fly types; as of 3/27/2019 we should have bibcodes with TOC in db,
        so this link should not be created if the entry does not exist in db

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        # verify that the TOC record does not exist in db and hence an error is returned
        response = self.client.get('/2019AIPC.2081c0032P/TOC', headers=headers)
        self.assertEqual(response._status_code, 404)
        # insert it here
        datalinks_list = [{"bibcode": "2019AIPC.2081c0032P",
                           "data_links_rows": [{"link_type": "TOC", "link_sub_type": "", "url": [""], "title": [""], "item_count": 0}]}]
        response = self.client.put('/update', data=json.dumps(datalinks_list), headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertEqual(response.json['status'], 'updated db with new data successfully')
        # select it here
        response = self.client.get('/2019AIPC.2081c0032P/TOC', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'/abs/2019AIPC.2081c0032P/toc',
                                             u'service': u'/abs/2019AIPC.2081c0032P/toc',
                                             u'link_type': u'TOC'})
        # delete it here
        bibcodes = {'bibcode': ['2019AIPC.2081c0032P']}
        response = self.client.delete('/delete', data=json.dumps(bibcodes), headers=headers)
        self.assertEqual(response._status_code, 200)
        # verify that it is gone
        response = self.client.get('/2019AIPC.2081c0032P/TOC', headers=headers)
        self.assertEqual(response._status_code, 404)
    def test_link_esource_subtype_article(self):
        """
        check status code for calling process_request for an esource sub type link for the legacy type article

        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2007ASPC..368...27R/ARTICLE', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'http://articles.adsabs.harvard.edu/pdf/2007ASPC..368...27R',
                                             u'service': u'http://articles.adsabs.harvard.edu/pdf/2007ASPC..368...27R',
                                             u'link_type': u'ESOURCE|ADS_PDF'})
        response = self.client.get('/2013MNRAS.435.1904M/ARTICLE', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {u'action': u'redirect',
                                             u'link': u'http://mnras.oxfordjournals.org/content/435/3/1904.full.pdf',
                                             u'service': u'http://mnras.oxfordjournals.org/content/435/3/1904.full.pdf',
                                             u'link_type': u'ESOURCE|PUB_PDF'})
    def test_data_subtype_multiple_links(self):
        """
        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/2004astro.ph..1427R/MAST', headers=headers)
        self.assertEqual(response._status_code, 200)
        self.assertDictEqual(response.json, {
            u'action': u'display',
            u'links': {u'bibcode': u'2004astro.ph..1427R',
                       u'count': 1,
                       u'records': [{u'data': [{u'link_type': u'DATA|MAST',
                                                u'title': u'GEMS: Galaxy Evolution from Morphologies and SEDs (Hans-Walter Rix)',
                                                u'url': u'/2004astro.ph..1427R/DATA|MAST/http%3A%2F%2Farchive.stsci.edu%2Fprepds%2Fgems'}],
                                     u'title': u'Mikulski Archive for Space Telescopes',
                                     u'url': u'http://archive.stsci.edu'},
                                    {u'data': [{u'link_type': u'DATA|MAST',
                                                u'title': u'MAST References (HST)',
                                                u'url': u'/2004astro.ph..1427R/DATA|MAST/https%3A%2F%2Farchive.stsci.edu%2Fmastbibref.php%3Fbibcode%3D2004ApJS..152..163R'}],
                                     u'title': u'Mikulski Archive for Space Telescopes',
                                     u'url': u'http://archive.stsci.edu'}]},
            u'service': u''})
    def test_verify_url(self):
        """
        :return:
        """
        self.add_stub_data()
        headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
        response = self.client.get('/1514temg.book.....V/verify_url:https%3A%2F%2Fwww.si.edu%2Fobject%2Fsiris_sil_154413', headers=headers)
        self.assertEqual(response.json, {'link': 'verified'})
        response = self.client.get('/1514temg.book.....V/verify_url:http%3A%2F%2Fwww.google.com', headers=headers)
        self.assertEqual(response.json, {'link': 'not found'})
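Several expected gateway links above embed the target URL as a single, fully percent-encoded path segment under the `RESOLVER_GATEWAY_URL` pattern `/{bibcode}/{link_type}/{url}`. A sketch of producing such a segment with the standard library; `gateway_path` is an illustrative name, not a resolver_service function:

```python
from urllib.parse import quote


def gateway_path(bibcode, link_type, url):
    # quote with safe='' also encodes '/', ':' and '#', so the whole target
    # URL survives as one path segment (http%3A%2F%2F...)
    return '/{}/{}/{}'.format(bibcode, link_type, quote(url, safe=''))


path = gateway_path('2013MNRAS.435.1904M', 'DATA|ESA',
                    'http://archives.esac.esa.int/ehst/#bibcode=2013MNRAS.435.1904M')
```

With these inputs the result matches the `DATA|ESA` gateway path asserted in the tests above.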

class TestDatabaseNew(TestCaseDatabase):
    def create_app(self):
        '''Start the wsgi application'''
        a = app.create_app(**{
            'SQLALCHEMY_DATABASE_URI': self.postgresql_url,
            'RESOLVER_GATEWAY_URL': '/{bibcode}/{link_type}/{url}',
        })
        return a
    def add_stub_data(self):
        """
        Add stub data

        :return:
        """
        stub_data = [
            {
                "bibcode": "2013MNRAS.435.1904M",
                "identifier": ["2013MNRAS.435.1904M", "2013arXiv1307.6556M", "2013MNRAS.tmp.2206M", "10.1093/mnras/stt1379", "arXiv:1307.6556"],
                "links": {
                    "DOI": ["10.1093/mnras/stt1379"],
                    "ARXIV": ["arXiv:1307.6556"],
                    "DATA": {
                        "Chandra": {
                            "url": ["https://cda.harvard.edu/chaser?obsid=494,493,5290,5289,5286,5288,5287,3666,6162,6159,6163,6160,6161,13413,12028,10900,10898,13416,13414,12029,12027,13417,10899,13412,10901,13415,12026"],
                            "title": ["Chandra Data Archive ObsIds 494, 493, 5290, 5289, 5286, 5288, 5287, 3666, 6162, 6159, 6163, 6160, 6161, 13413, 12028, 10900, 10898, 13416, 13414, 12029, 12027, 13417, 10899, 13412, 10901, 13415, 12026"],
                            "count": 1
                        },
                        "ESA": {
                            "url": ["http://archives.esac.esa.int/ehst/#bibcode=2013MNRAS.435.1904M"],
                            "title": ["European HST References (EHST)"],
                            "count": 1
                        },
                        "HEASARC": {
                            "url": ["http://heasarc.gsfc.nasa.gov/cgi-bin/W3Browse/biblink.pl?code=2013MNRAS.435.1904M"],
                            "title": ["http://heasarc.gsfc.nasa.gov/cgi-bin/W3Browse/biblink.pl?code=2013MNRAS.435.1904M"],
                            "count": 1
                        },
                        "Herschel": {
                            "url": ["http://archives.esac.esa.int/hsa/whsa/?ACTION=PUBLICATION&ID=2013MNRAS.435.1904M"],
                            "title": ["http://archives.esac.esa.int/hsa/whsa/?ACTION=PUBLICATION&ID=2013MNRAS.435.1904M"],
                            "count": 1
                        },
                        "MAST": {
                            "url": ["https://archive.stsci.edu/mastbibref.php?bibcode=2013MNRAS.435.1904M"],
                            "title": ["MAST References (HST, EUVE, GALEX)"],
                            "count": 3
                        },
                        "NED": {
                            "url": ["https://$NED$/uri/NED::InRefcode/2013MNRAS.435.1904M"],
                            "title": ["NED Objects (1)"],
                            "count": 1
                        },
                        "SIMBAD": {
                            "url": ["http://$SIMBAD$/simbo.pl?bibcode=2013MNRAS.435.1904M"],
                            "title": ["SIMBAD Objects (30)"],
                            "count": 30
                        },
                        "XMM": {
                            "url": ["https://nxsa.esac.esa.int/nxsa-web/#bibcode=2013MNRAS.435.1904M"],
                            "title": ["XMM data (1 observations)"],
                            "count": 1
                        }
                    },
                    "ESOURCE": {
                        "EPRINT_HTML": {
                            "url": ["https://arxiv.org/abs/1307.6556"],
                            "title": ['']
                        },
                        "EPRINT_PDF": {
                            "url": ["https://arxiv.org/pdf/1307.6556"],
                            "title": ['']
                        },
                        "PUB_HTML": {
                            "url": ["https://doi.org/10.1093%2Fmnras%2Fstt1379"],
                            "title": ['']
                        },
                        "PUB_PDF": {
                            "url": ["https://academic.oup.com/mnras/pdf-lookup/doi/10.1093/mnras/stt1379"],
                            "title": ['']
                        }
                    },
                    "CITATIONS": True,
                    "REFERENCES": True
                }
            },
            {
                "bibcode": "2017MNRAS.467.3556B",
                "identifier": ["2017MNRAS.467.3556B", "2017arXiv170202377B", "10.1093/mnras/stx312", "arXiv:1702.02377"],
                "links": {
                    "DOI": ["10.1093/mnras/stx312"],
                    "ARXIV": ["arXiv:1702.02377"],
                    "DATA": {
                        "SIMBAD": {
                            "url": ["http://$SIMBAD$/simbo.pl?bibcode=2017MNRAS.467.3556B"],
                            "title": ["SIMBAD Objects (5)"],
                            "count": 5
                        }
                    },
                    "ESOURCE": {
                        "EPRINT_HTML": {
                            "url": ["https://arxiv.org/abs/1702.02377"],
                            "title": ['']
                        },
                        "EPRINT_PDF": {
                            "url": ["https://arxiv.org/pdf/1702.02377"],
                            "title": ['']
                        },
                        "PUB_HTML": {
                            "url": ["https://doi.org/10.1093%2Fmnras%2Fstx312"],
                            "title": ['']
                        },
                        "PUB_PDF": {
                            "url": ["https://academic.oup.com/mnras/pdf-lookup/doi/10.1093/mnras/stx312"],
                            "title": ['']
                        }
                    },
                    "ASSOCIATED": {
                        "url": ["http://www.astro.lu.se/~alexey/animations.html"],
                        "title": ["Supporting Media"]
                    },
                    "PRESENTATION": {
                        "url": ["http://www.astro.lu.se/~alexey/animations.html"],
                        "title": ['']
                    },
                    "CITATIONS": True,
                    "REFERENCES": True
                }
            },
            {
                "bibcode": "1943RvMP...15....1C",
                "identifier": ["1943RvMP...15....1C", "10.1103/RevModPhys.15.1"],
                "links": {
                    "DOI": ["10.1103/RevModPhys.15.1"],
                    "ESOURCE": {
                        "PUB_HTML": {
                            "url": ["http://link.aps.org/doi/10.1103/RevModPhys.15.1"],
                            "title": ['']
                        }
                    },
                    "INSPIRE": {
                        "url": ["http://inspirehep.net/search?p=find+j+RMPHA,15,1"],
                        "title": ['']
                    },
                    "CITATIONS": True,
                    "REFERENCES": True
                }
            },
            {
                "bibcode": "1971ATsir.615....4D",
                "identifier": ["1971ATsir.615....4D"],
                "links": {
                    "ASSOCIATED": {
                        "url": ["1971ATsir.615....4D", "1971ATsir.621....7D", "1971ATsir.624....1D", "1973ATsir.759....6D", "1974Afz....10..315D", "1974ATsir.809....1D", "1974ATsir.809....2D", "1974ATsir.837....2D", "1976Afz....12..665D", "1983Afz....19..229D", "1983Ap.....19..134D", "1984Afz....20..525D", "1984Ap.....20..290D"],
                        "title": ["Part 1", "Part 3", "Part 5", "Part 8", "Part 2", "Part 11", "Part 12", "Part 13", "Part 4", "Part 6", "Part 7", "Part 9", "Part 10"],
                        "count": 10
                    },
                    "CITATIONS": True,
                    "REFERENCES": False
                }
            },
            {
                "bibcode": "1514temg.book.....V",
                "identifier": ["1514temg.book.....V", "10.3931/e-rara-426"],
                "links": {
                    "DOI": ["10.3931/e-rara-426"],
                    "ESOURCE": {
                        "PUB_HTML": {
                            "url": ["https://doi.org/10.3931%2Fe-rara-426"],
                            "title": ['']
                        }
                    },
                    "ASSOCIATED": {
                        "url": ["1514temg.book.....V", "https://www.si.edu/object/siris_sil_154413"],
                        "title": ["Main Paper", "Supplementary Material"],
                        "count": 2
                    },
                    "CITATIONS": False,
                    "REFERENCES": False
                }
            },
            {
                "bibcode": "2007ASPC..368...27R",
                "identifier": ["2007ASPC..368...27R", "2007astro.ph..3637R", "arXiv:astro-ph/0703637"],
                "links": {
                    "ARXIV": ["arXiv:astro-ph/0703637"],
                    "TOC": True,
                    "ESOURCE": {
                        "ADS_PDF": {
                            "url": ["http://articles.adsabs.harvard.edu/pdf/2007ASPC..368...27R"],
                            "title": ['']
                        },
                        "ADS_SCAN": {
                            "url": ["http://articles.adsabs.harvard.edu/full/2007ASPC..368...27R"],
                            "title": ['']
                        },
                        "EPRINT_HTML": {
                            "url": ["http://arxiv.org/abs/astro-ph/0703637"],
                            "title": ['']
                        },
                        "EPRINT_PDF": {
                            "url": ["http://arxiv.org/pdf/astro-ph/0703637"],
"title": ['']
},
"PUB_HTML": {
"url": ["http://aspbooks.org/custom/publications/paper/368-0027.html"],
"title": ['']
}
},
"REFERENCES": True
}
},
{
"bibcode": "2004ApJS..152..163R",
"identifier": ["2004ApJS..152..163R", "2004astro.ph..1427R", "10.1086/420885", "arXiv:astro-ph/0401427"],
"links": {
"DOI": ["10.1086/420885"],
"ARXIV": ["arXiv:astro-ph/0401427"],
"DATA": {
"ESA": {
"url": ["http://archives.esac.esa.int/ehst/#bibcode=2004ApJS..152..163R"],
"title": ["European HST References (EHST)"],
"count": 1
},
"MAST": {
"url": ["http://archive.stsci.edu/mastbibref.php?bibcode=2004ApJS..152..163R", "http://archive.stsci.edu/prepds/gems"],
"title": ["MAST References (HST)", "GEMS: Galaxy Evolution from Morphologies and SEDs (Hans-Walter Rix)"],
"count": 2
},
"SIMBAD": {
"url": ["http://$SIMBAD$/simbo.pl?bibcode=2004ApJS..152..163R"],
"title": ["SIMBAD Objects (3)"],
"count": 3
}
},
"ESOURCE": {
"EPRINT_HTML": {
"url": ["http://arxiv.org/abs/astro-ph/0401427"],
"title": ['']
},
"EPRINT_PDF": {
"url": ["http://arxiv.org/pdf/astro-ph/0401427"],
"title": ['']
},
"PUB_HTML": {
"url": ["https://doi.org/10.1086%2F420885"],
"title": ['']
},
"PUB_PDF": {
"url": ["http://stacks.iop.org/0067-0049/152/163/pdf"],
"title": ['']
}
},
"CITATIONS": True,
"REFERENCES": True
}
},
{
"bibcode": "2021JOSS....6.2807C",
"identifier": ["2021JOSS....6.2807C", "10.21105/joss.02807"],
"links": {
"DOI": ["10.21105/joss.02807"],
"CITATIONS": False,
"REFERENCES": False
}
},
{
"bibcode": "2021zndo...4441439K",
"identifier": ["2021zndo...4441439K", "10.5281/zenodo.4441439"],
"links": {
"DOI": ["10.5281/zenodo.4441439"],
"ESOURCE": {
"PUB_HTML": {
"url": ["https://doi.org/10.5281/zenodo.4441439"],
"title": ["https://doi.org/10.5281/zenodo.4441439"]
}
},
"CITATIONS": True,
"REFERENCES": False
}
}
]
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.put('/update_new', data=json.dumps(stub_data), headers=headers)
self.assertEqual(response._status_code, 200)
self.assertEqual(response.json['status'], 'updated db with new data successfully')
def test_link_type_all(self):
"""
return links for all types of a bibcode
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2013MNRAS.435.1904M/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'display', u'links': {u'count': 17, u'records': [
{u'url': u'/2013MNRAS.435.1904M/ABSTRACT', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'abstract', u'title': u'ABSTRACT (1)'},
{u'url': u'/2013MNRAS.435.1904M/CITATIONS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'citations', u'title': u'CITATIONS (1)'},
{u'url': u'/2013MNRAS.435.1904M/REFERENCES', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'references', u'title': u'REFERENCES (1)'},
{u'url': u'/2013MNRAS.435.1904M/COREADS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'coreads', u'title': u'COREADS (1)'},
{u'url': u'/2013MNRAS.435.1904M/OPENURL', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'openurl', u'title': u'OPENURL (1)'},
{u'url': u'/2013MNRAS.435.1904M/GRAPHICS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'graphics', u'title': u'GRAPHICS (1)'},
{u'url': u'/2013MNRAS.435.1904M/METRICS', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'metrics', u'title': u'METRICS (1)'},
{u'url': u'/2013MNRAS.435.1904M/SIMILAR', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'similar', u'title': u'SIMILAR (1)'},
{u'url': u'/2013MNRAS.435.1904M/ESOURCE', u'count': 4, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'esource', u'title': u'ESOURCE (4)'},
{u'url': u'/2013MNRAS.435.1904M/DATA', u'count': 39, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'data', u'title': u'DATA (39)'},
{u'url': u'/2013MNRAS.435.1904M/DOI', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'doi', u'title': u'DOI (1)'},
{u'url': u'/2013MNRAS.435.1904M/ARXIV', u'count': 1, u'bibcode': u'2013MNRAS.435.1904M', u'type': u'arxiv', u'title': u'ARXIV (1)'}],
u'link_type': 'all'}, u'service': u''})
def test_link_inspire(self):
"""
return a record of link type == inspire
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/1943RvMP...15....1C/INSPIRE/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'http://inspirehep.net/search?p=find+j+RMPHA,15,1',
u'link_type': u'INSPIRE',
u'service': u'http://inspirehep.net/search?p=find+j+RMPHA,15,1'})
def test_link_presentation(self):
"""
fetch record of a link_type presentation
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2017MNRAS.467.3556B/PRESENTATION/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'http://www.astro.lu.se/~alexey/animations.html',
u'link_type': u'PRESENTATION',
u'service': u'http://www.astro.lu.se/~alexey/animations.html'})
def test_link_associated(self):
"""
returning list of url, title pairs
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/1971ATsir.615....4D/ASSOCIATED/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'display',
u'links': {u'count': 13,
u'records': [{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1971ATsir.615....4D%2Fabstract', u'bibcode': u'1971ATsir.615....4D', u'title': u'Part 1'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974Afz....10..315D%2Fabstract', u'bibcode': u'1974Afz....10..315D', u'title': u'Part 2'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1971ATsir.621....7D%2Fabstract', u'bibcode': u'1971ATsir.621....7D', u'title': u'Part 3'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1976Afz....12..665D%2Fabstract', u'bibcode': u'1976Afz....12..665D', u'title': u'Part 4'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1971ATsir.624....1D%2Fabstract', u'bibcode': u'1971ATsir.624....1D', u'title': u'Part 5'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1983Afz....19..229D%2Fabstract', u'bibcode': u'1983Afz....19..229D', u'title': u'Part 6'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1983Ap.....19..134D%2Fabstract', u'bibcode': u'1983Ap.....19..134D', u'title': u'Part 7'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1973ATsir.759....6D%2Fabstract', u'bibcode': u'1973ATsir.759....6D', u'title': u'Part 8'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1984Afz....20..525D%2Fabstract', u'bibcode': u'1984Afz....20..525D', u'title': u'Part 9'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1984Ap.....20..290D%2Fabstract', u'bibcode': u'1984Ap.....20..290D', u'title': u'Part 10'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974ATsir.809....1D%2Fabstract', u'bibcode': u'1974ATsir.809....1D', u'title': u'Part 11'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974ATsir.809....2D%2Fabstract', u'bibcode': u'1974ATsir.809....2D', u'title': u'Part 12'},
{u'url': u'/1971ATsir.615....4D/associated/:%2Fabs%2F1974ATsir.837....2D%2Fabstract', u'bibcode': u'1974ATsir.837....2D', u'title': u'Part 13'}],
u'link_type': u'ASSOCIATED'},
u'service': u'/abs/1971ATsir.615....4D/associated'})
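The percent-encoded paths asserted above follow a simple pattern: the internal `/abs/<bibcode>/abstract` target is URL-quoted with no safe characters and nested after `:` in the parent record's associated-links path. A minimal sketch (the helper name is an illustration, not part of the service):

```python
from urllib.parse import quote


def associated_redirect_path(parent_bibcode, assoc_bibcode):
    # Percent-encode the internal abstract path, including its slashes,
    # so the whole target fits into a single path segment after ':'.
    encoded = quote('/abs/%s/abstract' % assoc_bibcode, safe='')
    return '/%s/associated/:%s' % (parent_bibcode, encoded)


print(associated_redirect_path('1971ATsir.615....4D', '1974Afz....10..315D'))
# -> /1971ATsir.615....4D/associated/:%2Fabs%2F1974Afz....10..315D%2Fabstract
```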
def test_link_esource_subtype(self):
"""
check status code for calling process_request for an esource sub type link
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2013MNRAS.435.1904M/EPRINT_HTML/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'https://arxiv.org/abs/1307.6556',
u'link_type': u'ESOURCE|EPRINT_HTML',
u'service': u'https://arxiv.org/abs/1307.6556'
})
def test_link_esource(self):
"""
returning list of urls
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2013MNRAS.435.1904M/ESOURCE/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'display',
u'links': {u'count': 4,
u'link_type': u'ESOURCE',
u'bibcode': u'2013MNRAS.435.1904M',
u'records': [
{u'url': u'https://academic.oup.com/mnras/pdf-lookup/doi/10.1093/mnras/stt1379', u'title': u'https://academic.oup.com/mnras/pdf-lookup/doi/10.1093/mnras/stt1379', u'link_type': u'ESOURCE|PUB_PDF'},
{u'url': u'https://doi.org/10.1093%2Fmnras%2Fstt1379', u'title': u'https://doi.org/10.1093%2Fmnras%2Fstt1379', u'link_type': u'ESOURCE|PUB_HTML'},
{u'url': u'https://arxiv.org/pdf/1307.6556', u'title': u'https://arxiv.org/pdf/1307.6556', u'link_type': u'ESOURCE|EPRINT_PDF'},
{u'url': u'https://arxiv.org/abs/1307.6556', u'title': u'https://arxiv.org/abs/1307.6556', u'link_type': u'ESOURCE|EPRINT_HTML'},
]
},
u'service': u''})
def test_link_data_subtype(self):
"""
check status code for calling process_request for a data sub type link
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2013MNRAS.435.1904M/ESA/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'http://archives.esac.esa.int/ehst/#bibcode=2013MNRAS.435.1904M',
u'link_type': u'DATA|ESA',
u'service': u'http://archives.esac.esa.int/ehst/#bibcode=2013MNRAS.435.1904M'})
def test_link_data(self):
"""
returning list of url, title pairs
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2013MNRAS.435.1904M/DATA/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'display',
u'service': u'',
u'links': {u'count': 8,
u'records': [{u'url': u'http://archives.esac.esa.int', # ESA
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|ESA/http%3A%2F%2Farchives.esac.esa.int%2Fehst%2F%23bibcode%3D2013MNRAS.435.1904M',
u'title': u'European HST References (EHST)',
u'link_type': u'DATA|ESA'}],
u'title': u'ESAC Science Data Center'
},
{u'url': u'https://ned.ipac.caltech.edu', # NED
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|NED/https%3A%2F%2Fned.ipac.caltech.edu%2Furi%2FNED%3A%3AInRefcode%2F2013MNRAS.435.1904M',
u'title': u'NED Objects (1)',
u'link_type': u'DATA|NED'}],
u'title': u'NASA/IPAC Extragalactic Database'
},
{u'url': u'http://nxsa.esac.esa.int', # XMM
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|XMM/https%3A%2F%2Fnxsa.esac.esa.int%2Fnxsa-web%2F%23bibcode%3D2013MNRAS.435.1904M',
u'title': u'XMM data (1 observations)',
u'link_type': u'DATA|XMM'}],
u'title': u'XMM Newton Science Archive'
},
{u'url': u'http://archive.stsci.edu', # MAST
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|MAST/https%3A%2F%2Farchive.stsci.edu%2Fmastbibref.php%3Fbibcode%3D2013MNRAS.435.1904M',
u'title': u'MAST References (HST, EUVE, GALEX)',
u'link_type': u'DATA|MAST'}],
u'title': u'Mikulski Archive for Space Telescopes'
},
{u'url': u'http://simbad.u-strasbg.fr', # SIMBAD
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|SIMBAD/http%3A%2F%2Fsimbad.u-strasbg.fr%2Fsimbo.pl%3Fbibcode%3D2013MNRAS.435.1904M',
u'title': u'SIMBAD Objects (30)',
u'link_type': u'DATA|SIMBAD'}],
u'title': u'SIMBAD Database at the CDS'
},
{u'url': u'http://cxc.harvard.edu/cda', # Chandra
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|Chandra/https%3A%2F%2Fcda.harvard.edu%2Fchaser%3Fobsid%3D494%2C493%2C5290%2C5289%2C5286%2C5288%2C5287%2C3666%2C6162%2C6159%2C6163%2C6160%2C6161%2C13413%2C12028%2C10900%2C10898%2C13416%2C13414%2C12029%2C12027%2C13417%2C10899%2C13412%2C10901%2C13415%2C12026',
u'title': u'Chandra Data Archive ObsIds 494, 493, 5290, 5289, 5286, 5288, 5287, 3666, 6162, 6159, 6163, 6160, 6161, 13413, 12028, 10900, 10898, 13416, 13414, 12029, 12027, 13417, 10899, 13412, 10901, 13415, 12026',
u'link_type': u'DATA|Chandra'}],
u'title': u'Chandra X-Ray Observatory'
},
{u'url': u'https://heasarc.gsfc.nasa.gov/', # HEASARC
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|HEASARC/http%3A%2F%2Fheasarc.gsfc.nasa.gov%2Fcgi-bin%2FW3Browse%2Fbiblink.pl%3Fcode%3D2013MNRAS.435.1904M',
u'title': u'http://heasarc.gsfc.nasa.gov/cgi-bin/W3Browse/biblink.pl?code=2013MNRAS.435.1904M',
u'link_type': u'DATA|HEASARC'}],
u'title': u"NASA's High Energy Astrophysics Science Archive Research Center"
},
{u'url': u'https://www.cosmos.esa.int/web/herschel/home', #Herschel
u'data': [{u'url': u'/2013MNRAS.435.1904M/DATA|Herschel/http%3A%2F%2Farchives.esac.esa.int%2Fhsa%2Fwhsa%2F%3FACTION%3DPUBLICATION%26ID%3D2013MNRAS.435.1904M',
u'title': u'http://archives.esac.esa.int/hsa/whsa/?ACTION=PUBLICATION&ID=2013MNRAS.435.1904M',
u'link_type': u'DATA|Herschel'}],
u'title': u'Herschel Science Center'
}],
u'bibcode': u'2013MNRAS.435.1904M'}})
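The DATA deep links asserted above are built the same way: the resolver path is `/<bibcode>/DATA|<subtype>/` followed by the fully percent-encoded target URL (note that the `$SIMBAD$` and `$NED$` placeholders in the stub data are expanded to real hosts first). A sketch with a hypothetical helper name:

```python
from urllib.parse import quote


def data_subtype_path(bibcode, sub_type, target_url):
    # Encode every reserved character in the target URL (safe='') so the
    # whole URL can be carried inside a single path segment.
    return '/%s/DATA|%s/%s' % (bibcode, sub_type, quote(target_url, safe=''))


print(data_subtype_path('2013MNRAS.435.1904M', 'SIMBAD',
                        'http://simbad.u-strasbg.fr/simbo.pl?bibcode=2013MNRAS.435.1904M'))
# -> /2013MNRAS.435.1904M/DATA|SIMBAD/http%3A%2F%2Fsimbad.u-strasbg.fr%2Fsimbo.pl%3Fbibcode%3D2013MNRAS.435.1904M
```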
def test_link_all_error_bibcode(self):
"""
call get_records to fetch all the records for a non-existing bibcode
:return:
"""
results = get_records(bibcode='errorbibcode')
self.assertEqual(results, None)
def test_error_with_sub_type(self):
"""
call get_records to fetch the records for a non-existing bibcode, link_type, and link_subtype
:return:
"""
results = get_records(bibcode='errorbibcode', link_type='errorlinktype', link_sub_type='errorlinksubtype')
self.assertEqual(results, None)
def test_link_associated_error_bibcode(self):
"""
return 404 for not finding any records
:return:
"""
results = get_records_new(bibcode='errorbibcode', link_type='ASSOCIATED')
response = LinkRequest(bibcode='').request_link_type_associated(results)
self.assertEqual(response._status_code, 404)
def test_link_esource_error_bibcode(self):
"""
return 404 for not finding any records
:return:
"""
results = get_records_new(bibcode='errorbibcode', link_type='ESOURCE')
response = LinkRequest(bibcode='').request_link_type_esource(results)
self.assertEqual(response._status_code, 404)
def test_link_data_error_bibcode(self):
"""
return 404 for not finding any records
:return:
"""
results = get_records_new(bibcode='errorbibcode', link_type='DATA')
response = LinkRequest(bibcode='').request_link_type_data(results)
self.assertEqual(response._status_code, 404)
def test_process_request_upsert(self):
"""
return 200 for successful insert/update to db
:return:
"""
self.add_stub_data()
document_record = {
"bibcode": "1513efua.book.....S",
"identifier": [],
"links": {
"LIBRARYCATALOG": {
"url": ["http://catalog.loc.gov/cgi-bin/Pwebrecon.cgi?v3=1&DB=local&CMD=010a+unk82013020&CNT=10+records+per+page"],
"title": ['']
}
}
}
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
# insert it here
response = self.client.put('/update_new', data=json.dumps([document_record]), headers=headers)
self.assertEqual(response._status_code, 200)
self.assertEqual(response.json['status'], 'updated db with new data successfully')
# select it here
response = self.client.get('/1513efua.book.....S/LIBRARYCATALOG/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'http://catalog.loc.gov/cgi-bin/Pwebrecon.cgi?v3=1&DB=local&CMD=010a+unk82013020&CNT=10+records+per+page',
u'link_type': u'LIBRARYCATALOG',
u'service': u'http://catalog.loc.gov/cgi-bin/Pwebrecon.cgi?v3=1&DB=local&CMD=010a+unk82013020&CNT=10+records+per+page'})
def test_process_request_delete(self):
"""
return 200 for successful deletion from db
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
# delete it here
bibcodes = {'bibcode': ['2013MNRAS.435.1904M']}
response = self.client.delete('/delete_new', data=json.dumps(bibcodes), headers=headers)
self.assertEqual(response._status_code, 200)
self.assertEqual(response.json['status'], 'removed 1 records of 1 bibcodes')
# select it here
response = self.client.get('/2013MNRAS.435.1904M/ESOURCE/new', headers=headers)
self.assertEqual(response._status_code, 404)
self.assertEqual(response.json['error'], 'did not find any records')
def test_link_toc(self):
"""
TOC used to be one of the on-the-fly types; as of 3/27/2019 we should have bibcodes with TOC in db,
so this link should not be created if the entry does not exist in db
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
# verify that the TOC record does not exist in db and hence an error is returned
response = self.client.get('/2019AIPC.2081c0032P/TOC/new', headers=headers)
self.assertEqual(response._status_code, 404)
# insert it here
document_record = {"bibcode": "2019AIPC.2081c0032P",
"identifier": [],
"links": {"TOC": True}}
response = self.client.put('/update_new', data=json.dumps([document_record]), headers=headers)
self.assertEqual(response._status_code, 200)
self.assertEqual(response.json['status'], 'updated db with new data successfully')
# select it here
response = self.client.get('/2019AIPC.2081c0032P/TOC/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'/abs/2019AIPC.2081c0032P/toc',
u'service': u'/abs/2019AIPC.2081c0032P/toc',
u'link_type': u'TOC'})
# delete it here
bibcodes = {'bibcode': ['2019AIPC.2081c0032P']}
response = self.client.delete('/delete_new', data=json.dumps(bibcodes), headers=headers)
self.assertEqual(response._status_code, 200)
# verify that it is gone
response = self.client.get('/2019AIPC.2081c0032P/TOC/new', headers=headers)
self.assertEqual(response._status_code, 404)
def test_link_esource_subtype_article(self):
"""
check status code for calling process_request for an esource sub type link for legacy type article
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2007ASPC..368...27R/ARTICLE/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'http://articles.adsabs.harvard.edu/pdf/2007ASPC..368...27R',
u'service': u'http://articles.adsabs.harvard.edu/pdf/2007ASPC..368...27R',
u'link_type': u'ESOURCE|ADS_PDF'})
response = self.client.get('/2013MNRAS.435.1904M/ARTICLE/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'redirect',
u'link': u'https://academic.oup.com/mnras/pdf-lookup/doi/10.1093/mnras/stt1379',
u'service': u'https://academic.oup.com/mnras/pdf-lookup/doi/10.1093/mnras/stt1379',
u'link_type': u'ESOURCE|PUB_PDF'})
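The two assertions above imply a fixed resolution order for the legacy ARTICLE type: the 2007ASPC record redirects to its ADS_PDF source, while the 2013MNRAS record (which has no ADS sources) falls back to PUB_PDF. A hypothetical sketch of that logic; the exact priority order used by the service is an assumption here:

```python
# Assumed priority of full-text subtypes for the legacy ARTICLE resolution.
ARTICLE_PRIORITY = ('ADS_PDF', 'ADS_SCAN', 'PUB_PDF', 'EPRINT_PDF')


def resolve_article(esources):
    """esources maps esource subtype -> url; return (link_type, url) or None."""
    for sub_type in ARTICLE_PRIORITY:
        if sub_type in esources:
            return 'ESOURCE|%s' % sub_type, esources[sub_type]
    # No full-text source on the record; the caller would answer 404.
    return None
```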
def test_data_subtype_multiple_links(self):
"""
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/2004ApJS..152..163R/MAST/new', headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {u'action': u'display',
u'links': {u'bibcode': u'2004ApJS..152..163R',
u'count': 1,
u'records': [{u'data': [{u'link_type': u'DATA|MAST',
u'title': u'MAST References (HST)',
u'url': u'/2004ApJS..152..163R/DATA|MAST/http%3A%2F%2Farchive.stsci.edu%2Fmastbibref.php%3Fbibcode%3D2004ApJS..152..163R'}],
u'title': u'Mikulski Archive for Space Telescopes',
u'url': u'http://archive.stsci.edu'},
{u'data': [{u'link_type': u'DATA|MAST',
u'title': u'GEMS: Galaxy Evolution from Morphologies and SEDs (Hans-Walter Rix)',
u'url': u'/2004ApJS..152..163R/DATA|MAST/http%3A%2F%2Farchive.stsci.edu%2Fprepds%2Fgems'}],
u'title': u'Mikulski Archive for Space Telescopes',
u'url': u'http://archive.stsci.edu'}]},
u'service': u''})
def test_verify_url(self):
"""
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
response = self.client.get('/1514temg.book.....V/verify_url:https%3A%2F%2Fwww.si.edu%2Fobject%2Fsiris_sil_154413/new', headers=headers)
self.assertEqual(response.json, {'link': 'verified'})
response = self.client.get('/1514temg.book.....V/verify_url:http%3A%2F%2Fwww.google.com/new', headers=headers)
self.assertEqual(response.json, {'link': 'not found'})
def test_link_esource_subtype_article_no_record(self):
"""
check status code for calling process_request for an esource sub type link for legacy type article where there is no esource
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
# no esource record
response = self.client.get('/2021JOSS....6.2807C/ARTICLE/new', headers=headers)
self.assertEqual(response._status_code, 404)
self.assertDictEqual(response.json, {'error': 'did not find any records'})
# no %_PDF record
response = self.client.get('/2021zndo...4441439K/ARTICLE/new', headers=headers)
self.assertEqual(response._status_code, 404)
self.assertDictEqual(response.json, {'error': 'did not find any records'})
def test_get_records_failure(self):
"""
:return:
"""
with self.assertRaises(Exception) as context:
get_records_new({'bibcode':'is single value'})
self.assertTrue("can't adapt type 'dict'" in str(context))
with self.assertRaises(Exception) as context:
document_records = {
'status': 2, # name='new', index=2, number=2,
'document_records': [
{
'bibcode': '2021JOSS....6.2807C',
'identifier': ['2021JOSS....6.2807C'],
'links': {
'DOI': ['10.21105/joss.02807'],
'CITATIONS': False,
'REFERENCES': False
}
}, {
'bibcode': '2021JOSS....6.2807C',
'identifier': ['2021JOSS....6.2807C'],
'links': {
'DOI': ['10.21105/joss.02807'],
'CITATIONS': False,
'REFERENCES': False
}
},
]
}
# duplicate records
add_records_new(DocumentRecords(**document_records))
self.assertTrue("ON CONFLICT DO UPDATE command cannot affect row a second time" in str(context))
with self.assertRaises(Exception) as context:
del_records_new([12, 14])
self.assertTrue("operator does not exist: character varying = integer" in str(context))
with self.assertRaises(Exception) as context:
get_ids({'id': ['is a list']})
self.assertTrue("no matches found in database" in str(context))
def test_reconciliation(self):
"""
:return:
"""
self.add_stub_data()
headers = {'Content-type': 'application/json', 'Accept': 'text/plain'}
identifiers = ['doi:10.1093/mnras/stt1379', 'arXiv:1702.02377', 'https://doi.org/10.5281/zenodo.4441439', '10.1086/420885', 'astro-ph/0703637']
response = self.client.post('/reconciliation', data=json.dumps({'identifier':identifiers}), headers=headers)
self.assertEqual(response._status_code, 200)
self.assertDictEqual(response.json, {'ids': [
['10.1093/mnras/stt1379', '2013MNRAS.435.1904M'],
['arXiv:1702.02377', '2017MNRAS.467.3556B'],
['10.5281/zenodo.4441439', '2021zndo...4441439K'],
['10.1086/420885', '2004ApJS..152..163R'],
['astro-ph/0703637', 'not in database']
]})
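The expected response above suggests identifiers are normalized before lookup: a `doi:` prefix or a `doi.org` URL is reduced to the bare DOI, while arXiv-style identifiers are looked up verbatim. A minimal sketch; the function name and the exact prefix list are assumptions for illustration:

```python
def normalize_identifier(identifier):
    # Strip common DOI wrappers; anything else (e.g. arXiv ids or bare
    # DOIs) is returned unchanged and looked up as-is.
    for prefix in ('doi:', 'https://doi.org/', 'http://doi.org/'):
        if identifier.startswith(prefix):
            return identifier[len(prefix):]
    return identifier


print(normalize_identifier('doi:10.1093/mnras/stt1379'))
# -> 10.1093/mnras/stt1379
```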
if __name__ == '__main__':
unittest.main()
# Copyright 2015 Cisco Systems, Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from oslo_config import cfg
from oslo_utils import uuidutils
from neutron import context
from neutron.extensions import l3
from networking_cisco.plugins.cisco.common import cisco_constants
from networking_cisco.plugins.cisco.db.l3 import ha_db
from networking_cisco.plugins.cisco.extensions import ha
from networking_cisco.plugins.cisco.extensions import routerhostingdevice
from networking_cisco.plugins.cisco.extensions import routerrole
from networking_cisco.plugins.cisco.extensions import routertype
from networking_cisco.plugins.cisco.extensions import routertypeawarescheduler
from networking_cisco.tests.unit.cisco.l3 import (
test_ha_l3_router_appliance_plugin as cisco_ha_test)
from networking_cisco.tests.unit.cisco.l3 import (
test_l3_routertype_aware_schedulers as cisco_test_case)
_uuid = uuidutils.generate_uuid
EXTERNAL_GW_INFO = l3.EXTERNAL_GW_INFO
AGENT_TYPE_L3_CFG = cisco_constants.AGENT_TYPE_L3_CFG
ROUTER_ROLE_GLOBAL = cisco_constants.ROUTER_ROLE_GLOBAL
ROUTER_ROLE_LOGICAL_GLOBAL = cisco_constants.ROUTER_ROLE_LOGICAL_GLOBAL
ROUTER_ROLE_HA_REDUNDANCY = cisco_constants.ROUTER_ROLE_HA_REDUNDANCY
LOGICAL_ROUTER_ROLE_NAME = cisco_constants.LOGICAL_ROUTER_ROLE_NAME
ROUTER_ROLE_ATTR = routerrole.ROUTER_ROLE_ATTR
HOSTING_DEVICE_ATTR = routerhostingdevice.HOSTING_DEVICE_ATTR
AUTO_SCHEDULE_ATTR = routertypeawarescheduler.AUTO_SCHEDULE_ATTR


class Asr1kRouterTypeDriverTestCase(
cisco_test_case.L3RoutertypeAwareHostingDeviceSchedulerTestCaseBase):
    # Why use a Nexus router type for ASR1k driver tests? Because the specific
    # type does not matter here, and there is only one hosting device for that
    # router type in the test setup, which makes scheduling deterministic.
    router_type = 'Nexus_ToR_Neutron_router'

    def _verify_created_routers(self, router_ids, hd_id):
# tenant routers
q_p = '%s=None' % ROUTER_ROLE_ATTR
r_ids = {r['id'] for r in self._list(
'routers', query_params=q_p)['routers']}
self.assertEqual(len(r_ids), len(router_ids))
for r_id in r_ids:
self.assertIn(r_id, router_ids)
# global router on hosting device
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_GLOBAL)
g_rtrs = self._list('routers', query_params=q_p)['routers']
self.assertEqual(len(g_rtrs), 1)
g_rtr = g_rtrs[0]
        self.assertTrue(g_rtr['name'].endswith(
            hd_id[-cisco_constants.ROLE_ID_LEN:]))
# logical global router for global routers HA
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_LOGICAL_GLOBAL)
g_l_rtrs = self._list('routers', query_params=q_p)['routers']
self.assertEqual(len(g_l_rtrs), 1)
g_l_rtr = g_l_rtrs[0]
self.assertEqual(g_l_rtr['name'], LOGICAL_ROUTER_ROLE_NAME)
self.assertEqual(g_l_rtr[AUTO_SCHEDULE_ATTR], False)
# ensure first routers_updated notification was for global router
notifier = self.plugin.agent_notifiers[AGENT_TYPE_L3_CFG]
notify_call = notifier.method_calls[0]
self.assertEqual(notify_call[0], 'routers_updated')
updated_routers = notify_call[1][1]
self.assertEqual(len(updated_routers), 1)
self.assertEqual(updated_routers[0]['id'], g_rtr['id'])
        # ensure *no* update notifications were sent for logical global router
for call in notifier.method_calls:
self.assertNotIn(call[1][1][0][ROUTER_ROLE_ATTR],
[ROUTER_ROLE_LOGICAL_GLOBAL])
def _test_gw_router_create_adds_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id, external_gateway_info=ext_gw,
set_context=set_context) as router1:
r1 = router1['router']
self.plugin._process_backlogged_routers()
r1_after = self._show('routers', r1['id'])['router']
hd_id = r1_after[HOSTING_DEVICE_ATTR]
# should have one global router now
self._verify_created_routers({r1['id']}, hd_id)
with self.router(name='router2', tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router2:
r2 = router2['router']
self.plugin._process_backlogged_routers()
# should still have only one global router
self._verify_created_routers({r1['id'], r2['id']}, hd_id)
def test_gw_router_create_adds_global_router(self):
self._test_gw_router_create_adds_global_router()
def test_gw_router_create_adds_global_router_non_admin(self):
self._test_gw_router_create_adds_global_router(True)
def _test_router_create_adds_no_global_router(self, set_context=False):
with self.router(set_context=set_context) as router:
r = router['router']
self.plugin._process_backlogged_routers()
# tenant routers
q_p = '%s=None' % ROUTER_ROLE_ATTR
t_rtrs = self._list('routers', query_params=q_p)['routers']
self.assertEqual(len(t_rtrs), 1)
t_rtr = t_rtrs[0]
self.assertEqual(t_rtr['id'], r['id'])
# global router
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_GLOBAL)
g_rtrs = self._list('routers', query_params=q_p)['routers']
self.assertEqual(len(g_rtrs), 0)
# logical global router for global routers HA
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_LOGICAL_GLOBAL)
g_l_rtrs = self._list('routers', query_params=q_p)['routers']
self.assertEqual(len(g_l_rtrs), 0)
notifier = self.plugin.agent_notifiers[AGENT_TYPE_L3_CFG]
        # ensure *no* update notifications were sent for global
        # router (as there should be none) or logical global router
for call in notifier.method_calls:
if call[0] != 'router_deleted':
self.assertNotIn(call[1][1][0][ROUTER_ROLE_ATTR],
[ROUTER_ROLE_GLOBAL,
ROUTER_ROLE_LOGICAL_GLOBAL])
def test_router_create_adds_no_global_router(self):
self._test_router_create_adds_no_global_router()
def test_router_create_adds_no_global_router_non_admin(self):
self._test_router_create_adds_no_global_router(True)
def _verify_updated_routers(self, router_ids, hd_id=None, call_index=1):
# tenant routers
q_p = '%s=None' % ROUTER_ROLE_ATTR
r_ids = {r['id'] for r in self._list(
'routers', query_params=q_p)['routers']}
self.assertEqual(len(r_ids), len(router_ids))
for r_id in r_ids:
self.assertIn(r_id, router_ids)
# global routers
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_GLOBAL)
g_rtrs = self._list('routers', query_params=q_p)['routers']
# logical global router for global routers HA
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_LOGICAL_GLOBAL)
g_l_rtrs = self._list('routers', query_params=q_p)['routers']
notifier = self.plugin.agent_notifiers[AGENT_TYPE_L3_CFG]
if hd_id:
self.assertEqual(len(g_rtrs), 1)
g_rtr = g_rtrs[0]
self.assertEqual(
g_rtr['name'].endswith(hd_id[-cisco_constants.ROLE_ID_LEN:]),
True)
self.assertEqual(len(g_l_rtrs), 1)
g_l_rtr = g_l_rtrs[0]
self.assertEqual(g_l_rtr['name'], LOGICAL_ROUTER_ROLE_NAME)
self.assertEqual(g_l_rtr[AUTO_SCHEDULE_ATTR], False)
# routers_updated notification call_index is for global router
notify_call = notifier.method_calls[call_index]
self.assertEqual(notify_call[0], 'routers_updated')
updated_routers = notify_call[1][1]
self.assertEqual(len(updated_routers), 1)
self.assertEqual(updated_routers[0]['id'], g_rtr['id'])
else:
self.assertEqual(len(g_rtrs), 0)
self.assertEqual(len(g_l_rtrs), 0)
        # ensure *no* update notifications were sent for logical global router
for call in notifier.method_calls:
if call[0] != 'router_deleted':
self.assertNotIn(call[1][1][0][ROUTER_ROLE_ATTR],
[ROUTER_ROLE_LOGICAL_GLOBAL])
def _test_router_update_set_gw_adds_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
with self.router(tenant_id=tenant_id,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
# backlog processing will trigger one routers_updated
# notification containing r1 and r2
self.plugin._process_backlogged_routers()
# should have no global router yet
r_ids = {r1['id'], r2['id']}
self._verify_updated_routers(r_ids)
ext_gw = {'network_id': s['subnet']['network_id']}
r_spec = {'router': {l3.EXTERNAL_GW_INFO: ext_gw}}
r1_after = self._update('routers', r1['id'], r_spec)['router']
hd_id = r1_after[HOSTING_DEVICE_ATTR]
# should now have one global router
self._verify_updated_routers(r_ids, hd_id)
self._update('routers', r2['id'], r_spec)
# should still have only one global router
self._verify_updated_routers(r_ids, hd_id)
def test_router_update_set_gw_adds_global_router(self):
self._test_router_update_set_gw_adds_global_router()
def test_router_update_set_gw_adds_global_router_non_admin(self):
self._test_router_update_set_gw_adds_global_router(True)
def _test_router_update_unset_gw_keeps_global_router(self,
set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
# backlog processing will trigger one routers_updated
# notification containing r1 and r2
self.plugin._process_backlogged_routers()
r1_after = self._show('routers', r1['id'])['router']
hd_id = r1_after[HOSTING_DEVICE_ATTR]
r_ids = {r1['id'], r2['id']}
# should have one global router now
self._verify_updated_routers(r_ids, hd_id, 0)
r_spec = {'router': {l3.EXTERNAL_GW_INFO: None}}
self._update('routers', r1['id'], r_spec)
# should still have one global router
self._verify_updated_routers(r_ids, hd_id, 0)
self._update('routers', r2['id'], r_spec)
# should have no global router now
self._verify_updated_routers(r_ids)
def test_router_update_unset_gw_keeps_global_router(self):
self._test_router_update_unset_gw_keeps_global_router()
def test_router_update_unset_gw_keeps_global_router_non_admin(self):
self._test_router_update_unset_gw_keeps_global_router(True)
def _verify_deleted_routers(self, hd_id=None, id_global_router=None):
# global routers
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_GLOBAL)
g_rtrs = self._list('routers', query_params=q_p)['routers']
if hd_id:
self.assertEqual(len(g_rtrs), 1)
g_rtr = g_rtrs[0]
self.assertEqual(g_rtr['name'].endswith(
hd_id[-cisco_constants.ROLE_ID_LEN:]), True)
return g_rtrs[0]['id']
else:
self.assertEqual(len(g_rtrs), 0)
notifier = self.plugin.agent_notifiers[AGENT_TYPE_L3_CFG]
# ensure last router_deleted notification was for global router
notify_call = notifier.method_calls[-1]
self.assertEqual(notify_call[0], 'router_deleted')
deleted_router = notify_call[1][1]
self.assertEqual(deleted_router['id'], id_global_router)
def _test_gw_router_delete_removes_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id, external_gateway_info=ext_gw,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
self.plugin._process_backlogged_routers()
r1_after = self._show('routers', r1['id'])['router']
hd_id = r1_after[HOSTING_DEVICE_ATTR]
self._delete('routers', r1['id'])
# should still have the global router
id_global_router = self._verify_deleted_routers(hd_id)
self._delete('routers', r2['id'])
# should be no global router now
self._verify_deleted_routers(id_global_router=id_global_router)
def test_gw_router_delete_removes_global_router(self):
self._test_gw_router_delete_removes_global_router()
def test_gw_router_delete_removes_global_router_non_admin(self):
self._test_gw_router_delete_removes_global_router(True)
def _test_router_delete_removes_no_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
self.plugin._process_backlogged_routers()
r1_after = self._show('routers', r1['id'])['router']
hd_id = r1_after[HOSTING_DEVICE_ATTR]
self._delete('routers', r1['id'])
# should still have the global router
id_global_router = self._verify_deleted_routers(hd_id)
self._delete('routers', r2['id'])
# should be no global router now
self._verify_deleted_routers(id_global_router=id_global_router)
def test_router_delete_removes_no_global_router(self):
self._test_router_delete_removes_no_global_router()
def test_router_delete_removes_no_global_router_non_admin(self):
        self._test_router_delete_removes_no_global_router(True)


class Asr1kHARouterTypeDriverTestCase(
Asr1kRouterTypeDriverTestCase,
cisco_ha_test.HAL3RouterTestsMixin):
# For the HA tests we need more than one hosting device
router_type = 'ASR1k_Neutron_router'
_is_ha_tests = True
def setUp(self, core_plugin=None, l3_plugin=None, dm_plugin=None,
ext_mgr=None):
if l3_plugin is None:
l3_plugin = cisco_test_case.HA_L3_PLUGIN_KLASS
if ext_mgr is None:
ext_mgr = (cisco_test_case.
TestHASchedulingL3RouterApplianceExtensionManager())
cfg.CONF.set_override('default_ha_redundancy_level', 1, group='ha')
super(Asr1kHARouterTypeDriverTestCase, self).setUp(
l3_plugin=l3_plugin, ext_mgr=ext_mgr)
def _verify_ha_created_routers(self, router_ids, num_redundancy=1,
has_gw=None):
if has_gw is None:
has_gw = [True for r_id in router_ids]
temp = {}
for i in range(len(router_ids)):
temp[router_ids[i]] = has_gw[i]
has_gw = temp
# tenant HA user_visible routers
q_p = '%s=None' % ROUTER_ROLE_ATTR
uv_routers = self._list('routers', query_params=q_p)['routers']
uv_r_ids = {r['id'] for r in uv_routers}
self.assertEqual(len(uv_r_ids), len(router_ids))
for uv_r_id in uv_r_ids:
self.assertIn(uv_r_id, router_ids)
# tenant HA redundancy routers
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_HA_REDUNDANCY)
rr_id_to_rr = {
r['id']: r for r in self._list('routers',
query_params=q_p)['routers']}
all_rr_ids = rr_id_to_rr.keys()
num_rr_ids = 0
hd_ids = set()
for uv_r in uv_routers:
uv_r_hd_id = uv_r[HOSTING_DEVICE_ATTR]
if has_gw[uv_r['id']] is True:
self.assertIsNotNone(uv_r[EXTERNAL_GW_INFO])
hd_ids.add(uv_r_hd_id)
else:
self.assertIsNone(uv_r[EXTERNAL_GW_INFO])
rr_ids = [rr_info['id']
for rr_info in uv_r[ha.DETAILS][ha.REDUNDANCY_ROUTERS]]
num = len(rr_ids)
num_rr_ids += num
self.assertEqual(num, num_redundancy)
for rr_id in rr_ids:
self.assertIn(rr_id, all_rr_ids)
rr = rr_id_to_rr[rr_id]
rr_hd_id = rr[HOSTING_DEVICE_ATTR]
# redundancy router must not be hosted on same device as its
# user visible router since that defeats HA
                self.assertNotEqual(uv_r_hd_id, rr_hd_id)
if has_gw[uv_r['id']] is True:
self.assertIsNotNone(rr[EXTERNAL_GW_INFO])
hd_ids.add(rr_hd_id)
else:
self.assertIsNone(rr[EXTERNAL_GW_INFO])
self.assertEqual(num_rr_ids, len(all_rr_ids))
# we should have a global router on all hosting devices that hosts
# a router (user visible or redundancy router) with gateway set
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_GLOBAL)
g_rtrs = self._list('routers', query_params=q_p)['routers']
self.assertEqual(len(g_rtrs), len(hd_ids))
g_rtr_ids = set()
for g_rtr in g_rtrs:
self.assertIn(g_rtr[HOSTING_DEVICE_ATTR], hd_ids)
g_rtr_ids.add(g_rtr['id'])
# logical global router for global routers HA
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_LOGICAL_GLOBAL)
g_l_rtrs = self._list('routers', query_params=q_p)['routers']
if g_l_rtrs:
self.assertEqual(len(g_l_rtrs), 1)
g_l_rtr = g_l_rtrs[0]
self.assertEqual(g_l_rtr['name'], LOGICAL_ROUTER_ROLE_NAME)
self.assertEqual(g_l_rtr[AUTO_SCHEDULE_ATTR], False)
else:
self.assertEqual(len(g_l_rtrs), 0)
notifier = self.plugin.agent_notifiers[AGENT_TYPE_L3_CFG]
if g_l_rtrs:
# ensure first routers_updated notifications were
# for global routers
for i in range(len(hd_ids)):
notify_call = notifier.method_calls[i]
self.assertEqual(notify_call[0], 'routers_updated')
updated_routers = notify_call[1][1]
self.assertEqual(len(updated_routers), 1)
self.assertIn(updated_routers[0]['id'], g_rtr_ids)
g_rtr_ids.remove(updated_routers[0]['id'])
else:
            # ensure *no* update notifications were sent for global routers
for call in notifier.method_calls:
self.assertNotIn(call[1][1][0][ROUTER_ROLE_ATTR],
[ROUTER_ROLE_GLOBAL])
        # ensure *no* update notifications were sent for logical global router
for call in notifier.method_calls:
self.assertNotIn(call[1][1][0][ROUTER_ROLE_ATTR],
[ROUTER_ROLE_LOGICAL_GLOBAL])
def _test_gw_router_create_adds_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id, external_gateway_info=ext_gw,
set_context=set_context) as router1:
r = router1['router']
self.plugin._process_backlogged_routers()
# should now have one user-visible router, its single
# redundancy router and two global routers (one for each of
# the hosting devices of the aforementioned routers)
self._verify_ha_created_routers([r['id']])
def _test_router_create_adds_no_global_router(self, set_context=False):
with self.router(set_context=set_context) as router:
r = router['router']
self.plugin._process_backlogged_routers()
self._verify_ha_created_routers([r['id']], 1, has_gw=[False])
def _verify_ha_updated_router(self, router_id, hd_ids=None, call_index=1,
num_redundancy=1, has_gw=True):
# ids of hosting devices hosting routers with gateway set
hd_ids = hd_ids or set()
if router_id:
# tenant router
uv_r = self._show('routers', router_id)['router']
uv_r_hd_id = uv_r[HOSTING_DEVICE_ATTR]
if has_gw is True:
self.assertIsNotNone(uv_r[EXTERNAL_GW_INFO])
hd_ids.add(uv_r_hd_id)
else:
self.assertIsNone(uv_r[EXTERNAL_GW_INFO])
rr_ids = [rr_info['id']
for rr_info in uv_r[ha.DETAILS][ha.REDUNDANCY_ROUTERS]]
# tenant HA redundancy routers
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_HA_REDUNDANCY)
rr_id_to_rr = {
r['id']: r for r in self._list('routers',
query_params=q_p)['routers']}
all_rr_ids = rr_id_to_rr.keys()
self.assertEqual(len(rr_ids), num_redundancy)
for rr_id in rr_ids:
self.assertIn(rr_id, all_rr_ids)
rr = rr_id_to_rr[rr_id]
rr_hd_id = rr[HOSTING_DEVICE_ATTR]
# redundancy router must not be hosted on same device as its
# user visible router since that defeats HA
                self.assertNotEqual(uv_r_hd_id, rr_hd_id)
if has_gw is True:
self.assertIsNotNone(rr[EXTERNAL_GW_INFO])
hd_ids.add(rr_hd_id)
else:
self.assertIsNone(rr[EXTERNAL_GW_INFO])
# we should have a global router on all hosting devices that hosts
# a router (user visible or redundancy router) with gateway set
num_devices_hosting_gateway_routers = len(hd_ids)
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_GLOBAL)
g_rtrs = self._list('routers', query_params=q_p)['routers']
self.assertEqual(len(g_rtrs), num_devices_hosting_gateway_routers)
g_rtr_ids = set()
for g_rtr in g_rtrs:
self.assertIn(g_rtr[HOSTING_DEVICE_ATTR], hd_ids)
g_rtr_ids.add(g_rtr['id'])
# logical global router for global routers HA
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_LOGICAL_GLOBAL)
g_l_rtrs = self._list('routers', query_params=q_p)['routers']
if num_devices_hosting_gateway_routers > 0:
self.assertEqual(len(g_l_rtrs), 1)
g_l_rtr = g_l_rtrs[0]
self.assertEqual(g_l_rtr['name'], LOGICAL_ROUTER_ROLE_NAME)
self.assertEqual(g_l_rtr[AUTO_SCHEDULE_ATTR], False)
else:
self.assertEqual(len(g_l_rtrs), 0)
# global routers
notifier = self.plugin.agent_notifiers[AGENT_TYPE_L3_CFG]
# routers_updated notification call_index is for global router
notify_call = notifier.method_calls[call_index]
self.assertEqual(notify_call[0], 'routers_updated')
updated_routers = notify_call[1][1]
self.assertEqual(len(updated_routers), 1)
self.assertEqual(updated_routers[0][ROUTER_ROLE_ATTR],
ROUTER_ROLE_GLOBAL)
        # ensure *no* update notifications were sent for logical global router
for call in notifier.method_calls:
if call[0] != 'router_deleted':
self.assertNotIn(call[1][1][0][ROUTER_ROLE_ATTR],
[ROUTER_ROLE_LOGICAL_GLOBAL])
return hd_ids
def _test_router_update_set_gw_adds_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
with self.router(tenant_id=tenant_id,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
# backlog processing to schedule the routers
self.plugin._process_backlogged_routers()
# should have no global router yet
r_ids = [r1['id'], r2['id']]
self._verify_ha_created_routers(r_ids, 1, has_gw=[False,
False])
ext_gw = {'network_id': s['subnet']['network_id']}
r_spec = {'router': {l3.EXTERNAL_GW_INFO: ext_gw}}
self._update('routers', r1['id'], r_spec)
# should now have two global routers, one for hosting device
# of user visible router r1 and one for the hosting device r1's
# redundancy router
hd_ids = self._verify_ha_updated_router(r1['id'])
self._update('routers', r2['id'], r_spec)
self._verify_ha_updated_router(r2['id'], hd_ids)
def _test_router_update_unset_gw_keeps_global_router(self,
set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id, external_gateway_info=ext_gw,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
# make sure we have only two eligible hosting devices
# in this test
qp = "template_id=00000000-0000-0000-0000-000000000005"
hds = self._list('hosting_devices', query_params=qp)
self._delete('hosting_devices',
hds['hosting_devices'][1]['id'])
# backlog processing to schedule the routers
self.plugin._process_backlogged_routers()
self._verify_ha_created_routers([r1['id'], r2['id']])
r_spec = {'router': {l3.EXTERNAL_GW_INFO: None}}
self._update('routers', r1['id'], r_spec)
# should still have two global routers, we verify using r2
self._verify_ha_updated_router(r2['id'])
self._update('routers', r2['id'], r_spec)
                # should have no global routers now, we verify using r2
self._verify_ha_updated_router(r2['id'], has_gw=False)
def _test_gw_router_delete_removes_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id, external_gateway_info=ext_gw,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
# make sure we have only two eligible hosting devices
# in this test
qp = "template_id=00000000-0000-0000-0000-000000000005"
hds = self._list('hosting_devices', query_params=qp)
self._delete('hosting_devices',
hds['hosting_devices'][1]['id'])
# backlog processing to schedule the routers
self.plugin._process_backlogged_routers()
self._verify_ha_created_routers([r1['id'], r2['id']])
self._delete('routers', r1['id'])
# should still have two global routers, we verify using r2
self._verify_ha_updated_router(r2['id'])
self._delete('routers', r2['id'])
# should have no global routers now
self._verify_ha_updated_router(None)
def _test_router_delete_removes_no_global_router(self, set_context=False):
tenant_id = _uuid()
with self.network(tenant_id=tenant_id) as n_external:
res = self._create_subnet(self.fmt, n_external['network']['id'],
cidr='10.0.1.0/24', tenant_id=tenant_id)
s = self.deserialize(self.fmt, res)
self._set_net_external(s['subnet']['network_id'])
ext_gw = {'network_id': s['subnet']['network_id']}
with self.router(tenant_id=tenant_id,
set_context=set_context) as router1,\
self.router(name='router2', tenant_id=tenant_id,
external_gateway_info=ext_gw,
set_context=set_context) as router2:
r1 = router1['router']
r2 = router2['router']
# make sure we have only two eligible hosting devices
# in this test
qp = "template_id=00000000-0000-0000-0000-000000000005"
hds = self._list('hosting_devices', query_params=qp)
self._delete('hosting_devices',
hds['hosting_devices'][1]['id'])
self.plugin._process_backlogged_routers()
self._verify_ha_created_routers([r1['id'], r2['id']],
has_gw=[False, True])
self._delete('routers', r1['id'])
# should still have two global routers, we verify using r2
self._verify_ha_updated_router(r2['id'])
self._delete('routers', r2['id'])
# should have no global routers now
                self._verify_ha_updated_router(None)


class L3CfgAgentAsr1kRouterTypeDriverTestCase(
cisco_test_case.L3RoutertypeAwareHostingDeviceSchedulerTestCaseBase,
cisco_ha_test.HAL3RouterTestsMixin):
_is_ha_tests = True
def setUp(self, core_plugin=None, l3_plugin=None, dm_plugin=None,
ext_mgr=None):
if l3_plugin is None:
l3_plugin = cisco_test_case.HA_L3_PLUGIN_KLASS
if ext_mgr is None:
ext_mgr = (cisco_test_case.
TestHASchedulingL3RouterApplianceExtensionManager())
cfg.CONF.set_override('default_ha_redundancy_level', 1, group='ha')
super(L3CfgAgentAsr1kRouterTypeDriverTestCase, self).setUp(
l3_plugin=l3_plugin, ext_mgr=ext_mgr)
self.orig_get_sync_data = self.plugin.get_sync_data
self.plugin.get_sync_data = self.plugin.get_sync_data_ext
def tearDown(self):
self.plugin.get_sync_data = self.orig_get_sync_data
super(L3CfgAgentAsr1kRouterTypeDriverTestCase, self).tearDown()
def _verify_sync_data(self, context, ids_colocated_routers, g_l_rtr,
g_l_rtr_rr_ids, ha_settings):
routers = self.plugin.get_sync_data_ext(context,
ids_colocated_routers)
self.assertEqual(len(routers), 2)
global_router = [r for r in routers if
r[ROUTER_ROLE_ATTR] == ROUTER_ROLE_GLOBAL][0]
# verify that global router has HA information from logical
# global router, in particular VIP address for the gw port
# comes from the gw port of the logical global router
ha_info = global_router['gw_port']['ha_info']
ha_port_id = ha_info['ha_port']['id']
vip_address = g_l_rtr[l3.EXTERNAL_GW_INFO][
'external_fixed_ips'][0]['ip_address']
self.assertEqual(
ha_info['ha_port']['fixed_ips'][0]['ip_address'],
vip_address)
        self.assertNotEqual(global_router['gw_port_id'], ha_port_id)
self._verify_ha_settings(global_router, ha_settings)
rr_info_list = global_router[ha.DETAILS][ha.REDUNDANCY_ROUTERS]
self.assertEqual(len(rr_info_list), len(g_l_rtr_rr_ids))
for rr_info in rr_info_list:
self.assertIn(rr_info['id'], g_l_rtr_rr_ids)
def test_l3_cfg_agent_query_global_router_info(self):
with self.subnet(cidr='10.0.1.0/24') as s_ext:
self._set_net_external(s_ext['subnet']['network_id'])
ext_gw = {'network_id': s_ext['subnet']['network_id']}
with self.router(external_gateway_info=ext_gw) as router:
r = router['router']
self.plugin._process_backlogged_routers()
r_after = self._show('routers', r['id'])['router']
hd_id = r_after[HOSTING_DEVICE_ATTR]
id_r_ha_backup = r_after[ha.DETAILS][
ha.REDUNDANCY_ROUTERS][0]['id']
r_ha_backup_after = self._show('routers',
id_r_ha_backup)['router']
ha_backup_hd_id = r_ha_backup_after[HOSTING_DEVICE_ATTR]
# logical global router for global routers HA
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_LOGICAL_GLOBAL)
g_l_rtrs = self._list('routers', query_params=q_p)['routers']
# should be only one logical global router
self.assertEqual(len(g_l_rtrs), 1)
g_l_rtr = g_l_rtrs[0]
g_l_rtr_rr_ids = {r_info['id'] for r_info in g_l_rtr[
ha.DETAILS][ha.REDUNDANCY_ROUTERS]}
self.assertEqual(g_l_rtr[ha.ENABLED], True)
self.assertEqual(g_l_rtr[routertype.TYPE_ATTR],
r[routertype.TYPE_ATTR])
# no auto-scheduling to ensure logical global router is never
# instantiated (unless an admin does some bad thing...)
self.assertEqual(g_l_rtr[AUTO_SCHEDULE_ATTR], False)
# global router on hosting devices
q_p = '%s=%s' % (ROUTER_ROLE_ATTR, ROUTER_ROLE_GLOBAL)
g_rtrs = {g_r[HOSTING_DEVICE_ATTR]: g_r for g_r in self._list(
'routers', query_params=q_p)['routers']}
self.assertEqual(len(g_rtrs), 2)
for g_r in g_rtrs.values():
self.assertEqual(g_r[routertype.TYPE_ATTR],
r[routertype.TYPE_ATTR])
# global routers should have HA disabled in db
self.assertEqual(g_r[ha.ENABLED], False)
# global routers should never be auto-scheduled as that
# can result in them being moved to another hosting device
self.assertEqual(g_r[AUTO_SCHEDULE_ATTR], False)
# global router should be redundancy router of the logical
# global router for this router type
self.assertIn(g_r['id'], g_l_rtr_rr_ids)
e_context = context.get_admin_context()
# global routers should here have HA setup information from
# the logical global router
ha_settings = self._get_ha_defaults(
ha_type=cfg.CONF.ha.default_ha_mechanism,
redundancy_level=2, priority=ha_db.DEFAULT_MASTER_PRIORITY)
# verify global router co-located with the user visible router
ids_colocated_routers = [r['id'], g_rtrs[hd_id]['id']]
self._verify_sync_data(e_context, ids_colocated_routers,
g_l_rtr, g_l_rtr_rr_ids, ha_settings)
                # verify global router co-located with the HA backup
                # router of the user visible router
ids_colocated_routers = [r_ha_backup_after['id'],
g_rtrs[ha_backup_hd_id]['id']]
self._verify_sync_data(e_context, ids_colocated_routers,
g_l_rtr, g_l_rtr_rr_ids, ha_settings)

# readers/__init__.py (nbalas/advent_of_code, MIT)
| 18 | 35 | 0.861111 | 5 | 36 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 36 | 1 | 36 | 36 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |

# Level2/Lessons12980/najy.py (StudyForCoding/ProgrammersLevel, MIT)
def solution(n): # n까지 가는 최소 연료 = n/2에서 순간이동
if n==1:
return 1
if n%2 == 0:
return solution(n/2)
else:
return solution((n-1)/2)+1 # 홀수에선 n칸 전에 한번 점프 후 순간이동
'''
Accuracy tests
Test 1 〉 passed (0.00ms, 10.1MB)
Test 2 〉 passed (0.00ms, 10.1MB)
Test 3 〉 passed (0.01ms, 10.2MB)
Test 4 〉 passed (0.01ms, 10.2MB)
Test 5 〉 passed (0.01ms, 10.2MB)
Test 6 〉 passed (0.01ms, 10.1MB)
Test 7 〉 passed (0.01ms, 10.1MB)
Test 8 〉 passed (0.01ms, 10.1MB)
Test 9 〉 passed (0.01ms, 10.2MB)
Test 10 〉 passed (0.01ms, 10.2MB)
Test 11 〉 passed (0.01ms, 10.1MB)
Test 12 〉 passed (0.01ms, 10.2MB)
Test 13 〉 passed (0.01ms, 10.2MB)
Test 14 〉 passed (0.01ms, 10.2MB)
Test 15 〉 passed (0.01ms, 10.1MB)
Test 16 〉 passed (0.01ms, 10.2MB)
Test 17 〉 passed (0.01ms, 10.1MB)
Test 18 〉 passed (0.01ms, 10.1MB)
Efficiency tests
Test 1 〉 passed (0.02ms, 10.3MB)
Test 2 〉 passed (0.02ms, 10.2MB)
Test 3 〉 passed (0.02ms, 10MB)
Test 4 〉 passed (0.02ms, 10.2MB)
Test 5 〉 passed (0.02ms, 10.1MB)
Test 6 〉 passed (0.02ms, 10.2MB)
Test 7 〉 passed (0.02ms, 10.2MB)
Test 8 〉 passed (0.02ms, 10.2MB)
Test 9 〉 passed (0.01ms, 10.1MB)
Test 10 〉 passed (0.01ms, 10.1MB)
'''
c0490b866b0b1915729667fff5c3e78d386a4579 | 12,218 | py | Python | pangea/core/tests/test_nested_api.py | LongTailBio/pangea-django | 630551dded7f9e38f95eda8c36039e0de46961e7 | [
"MIT"
] | null | null | null | pangea/core/tests/test_nested_api.py | LongTailBio/pangea-django | 630551dded7f9e38f95eda8c36039e0de46961e7 | [
"MIT"
] | 27 | 2020-03-26T02:55:12.000Z | 2022-03-12T00:55:04.000Z | pangea/core/tests/test_nested_api.py | LongTailBio/pangea-django | 630551dded7f9e38f95eda8c36039e0de46961e7 | [
"MIT"
] | 1 | 2021-09-14T08:15:54.000Z | 2021-09-14T08:15:54.000Z | from django.urls import reverse
from rest_framework import status
from rest_framework.test import APITestCase
from pangea.core.models import (
PangeaUser,
Organization,
SampleGroup,
SampleLibrary,
Sample,
SampleGroupAnalysisResult,
SampleAnalysisResult,
)
class NestedSampleGroupTests(APITestCase):
@classmethod
def setUpTestData(cls):
cls.organization = Organization.objects.create(name='Test Organization HJDH')
cls.user = PangeaUser.objects.create(email='user@domain.com', password='Foobar22')
def test_create_sample_group(self):
"""Ensure authorized user can create sample group."""
self.organization.users.add(self.user)
self.client.force_authenticate(user=self.user)
url = reverse('nested-sample-group-create', kwargs={'org_pk': self.organization.pk})
data = {'name': 'Test Sample Group HJKHJ', 'organization': self.organization.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(SampleGroup.objects.count(), 1)
self.assertEqual(SampleGroup.objects.get().name, 'Test Sample Group HJKHJ')
def test_create_sample_group_with_name(self):
"""Ensure authorized user can create sample group."""
self.organization.users.add(self.user)
self.client.force_authenticate(user=self.user)
url = reverse('nested-sample-group-create', kwargs={'org_pk': self.organization.name})
data = {'name': 'Test Sample Group NBYTU', 'organization': self.organization.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(SampleGroup.objects.count(), 1)
self.assertEqual(SampleGroup.objects.get().name, 'Test Sample Group NBYTU')
def test_retrieve_sample_group(self):
"""Ensure authorized user can create sample group."""
grp_name = 'Test Sample Group HGJHJ'
self.organization.create_sample_group(name=grp_name, is_public=True)
url = reverse('nested-sample-group-detail', kwargs={
'org_pk': self.organization.pk,
'grp_pk': grp_name,
})
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_retrieve_sample_group_case_insensitive(self):
"""Ensure authorized user can create sample group."""
grp_name = 'Test Sample Group TRUGKH'
self.organization.create_sample_group(name=grp_name, is_public=True)
url = reverse('nested-sample-group-detail', kwargs={
'org_pk': self.organization.pk,
'grp_pk': grp_name.lower(),
})
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
class NestedSampleTests(APITestCase):
@classmethod
def setUpTestData(cls):
cls.organization = Organization.objects.create(name='Test Organization EWKKK')
cls.user = PangeaUser.objects.create(email='user@domain.com', password='Foobar22')
def test_create_sample(self):
"""Ensure authorized user can create sample group."""
self.organization.users.add(self.user)
self.client.force_authenticate(user=self.user)
sample_library = self.organization.create_sample_group(name='Test Library JKLLL', is_library=True)
url = reverse('nested-sample-create', kwargs={
'org_pk': self.organization.pk,
'grp_pk': sample_library.pk,
})
data = {'name': 'Test Sample JKLLL', 'library': sample_library.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(Sample.objects.count(), 1)
self.assertEqual(Sample.objects.get().name, 'Test Sample JKLLL')
self.assertTrue(sample_library.sample_set.filter(pk=response.data.get('uuid')).exists())
def test_create_sample_with_name(self):
"""Ensure authorized user can create sample group."""
self.organization.users.add(self.user)
self.client.force_authenticate(user=self.user)
sample_library = self.organization.create_sample_group(name='Test Library', is_library=True)
url = reverse('nested-sample-create', kwargs={
'org_pk': self.organization.name,
'grp_pk': sample_library.name,
})
data = {'name': 'Test Sample', 'library': sample_library.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(Sample.objects.count(), 1)
self.assertEqual(Sample.objects.get().name, 'Test Sample')
self.assertTrue(sample_library.sample_set.filter(pk=response.data.get('uuid')).exists())
def test_retrieve_sample(self):
"""Ensure authorized user can create sample group."""
grp = self.organization.create_sample_group(
name='Test Sample Group KKJSGHFG',
is_public=True,
is_library=True,
)
grp.create_sample(name='Test Sample KKJSGHFG')
url = reverse('nested-sample-details', kwargs={
'org_pk': self.organization.pk,
'grp_pk': grp.name,
'sample_pk': 'Test Sample KKJSGHFG',
})
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_retrieve_sample_from_non_library(self):
"""Ensure authorized user can create sample group."""
lib = self.organization.create_sample_group(
name='Test Library RYJTSDCGEH',
is_public=True,
is_library=True,
)
sample = lib.create_sample(name='Test Sample RYJTSDCGEH')
grp = self.organization.create_sample_group(
name='Test Sample Group RYJTSDCGEH',
is_public=True,
is_library=True,
)
grp.add_sample(sample)
url = reverse('nested-sample-details', kwargs={
'org_pk': self.organization.pk,
'grp_pk': grp.name,
'sample_pk': 'Test Sample RYJTSDCGEH',
})
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
class AnalysisResultTests(APITestCase):
@classmethod
def setUpTestData(cls):
cls.organization = Organization.objects.create(name='Test Organization')
cls.user = PangeaUser.objects.create(email='user@domain.com', password='Foobar22')
cls.organization.users.add(cls.user)
cls.sample_group = cls.organization.create_sample_group(name='Test Library', is_library=True)
cls.sample_library = cls.sample_group.library
cls.sample = cls.sample_group.create_sample(name='Test Sample')
def test_retrieve_sample_analysis_result(self):
"""Ensure we can retrieve analysis result."""
self.sample.create_analysis_result(module_name='test_module_KKJSGHFG')
url = reverse('nested-sample-ar-details', kwargs={
'org_pk': self.organization.pk,
'grp_pk': self.sample_group.name,
'sample_pk': self.sample.pk,
'ar_pk': 'test_module_KKJSGHFG',
})
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_retrieve_sample_analysis_result_with_nonexistent_replicate(self):
"""Ensure we can retrieve analysis result."""
self.sample.create_analysis_result(module_name='test_module_IUWJHREWJ', replicate='foo')
url = reverse('nested-sample-ar-details', kwargs={
'org_pk': self.organization.pk,
'grp_pk': self.sample_group.name,
'sample_pk': self.sample.pk,
'ar_pk': 'test_module_IUWJHREWJ',
})
url += '?replicate=bar'
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_404_NOT_FOUND)
def test_retrieve_sample_analysis_result_with_replicate(self):
"""Ensure we can retrieve analysis result."""
s1 = self.sample.create_analysis_result(module_name='test_module_TYUKDSJGHV', replicate='foo')
s2 = self.sample.create_analysis_result(module_name='test_module_TYUKDSJGHV', replicate='bar')
base_url = reverse('nested-sample-ar-details', kwargs={
'org_pk': self.organization.pk,
'grp_pk': self.sample_group.name,
'sample_pk': self.sample.pk,
'ar_pk': 'test_module_TYUKDSJGHV',
})
url = base_url + '?replicate=foo'
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json()['uuid'], str(s1.uuid))
url = base_url + '?replicate=bar'
response = self.client.get(url, format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(response.json()['uuid'], str(s2.uuid))
def test_create_sample_group_analysis_result(self):
self.client.force_authenticate(user=self.user)
url = reverse('nested-sample-group-ar-create', kwargs={
'org_pk': self.organization.pk,
'grp_pk': self.sample_library.pk,
})
data = {'module_name': 'taxa', 'sample_group': self.sample_group.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(SampleGroupAnalysisResult.objects.count(), 1)
self.assertEqual(SampleGroupAnalysisResult.objects.get().sample_group, self.sample_group)
self.assertEqual(SampleGroupAnalysisResult.objects.get().module_name, 'taxa')
def test_create_sample_analysis_result(self):
self.client.force_authenticate(user=self.user)
url = reverse('nested-sample-ar-create', kwargs={
'org_pk': self.organization.pk,
'grp_pk': self.sample_library.pk,
'sample_pk': self.sample.pk,
})
data = {'module_name': 'taxa', 'sample': self.sample.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(SampleAnalysisResult.objects.count(), 1)
self.assertEqual(SampleAnalysisResult.objects.get().sample, self.sample)
self.assertEqual(SampleAnalysisResult.objects.get().module_name, 'taxa')
def test_create_sample_group_analysis_result_with_name(self):
self.client.force_authenticate(user=self.user)
url = reverse('nested-sample-group-ar-create', kwargs={
'org_pk': self.organization.name,
'grp_pk': self.sample_library.group.name,
})
data = {'module_name': 'taxa', 'sample_group': self.sample_group.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(SampleGroupAnalysisResult.objects.count(), 1)
self.assertEqual(SampleGroupAnalysisResult.objects.get().sample_group, self.sample_group)
self.assertEqual(SampleGroupAnalysisResult.objects.get().module_name, 'taxa')
def test_create_sample_analysis_result_with_name(self):
self.client.force_authenticate(user=self.user)
url = reverse('nested-sample-ar-create', kwargs={
'org_pk': self.organization.name,
'grp_pk': self.sample_library.group.name,
'sample_pk': self.sample.name,
})
data = {'module_name': 'taxa', 'sample': self.sample.pk}
response = self.client.post(url, data, format='json')
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(SampleAnalysisResult.objects.count(), 1)
self.assertEqual(SampleAnalysisResult.objects.get().sample, self.sample)
self.assertEqual(SampleAnalysisResult.objects.get().module_name, 'taxa')
| 44.754579 | 106 | 0.670977 | 1,436 | 12,218 | 5.522284 | 0.079387 | 0.069357 | 0.042875 | 0.050441 | 0.915006 | 0.881715 | 0.874275 | 0.849937 | 0.8314 | 0.820681 | 0 | 0.006807 | 0.206417 | 12,218 | 272 | 107 | 44.919118 | 0.811056 | 0.041169 | 0 | 0.62212 | 0 | 0 | 0.136633 | 0.036881 | 0 | 0 | 0 | 0 | 0.184332 | 1 | 0.082949 | false | 0.013825 | 0.018433 | 0 | 0.115207 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fbebdd25a13948ec86f09da238b915a6d1385980 | 168 | py | Python | board/views.py | dbwodlf3/DjangoBoard | 1d7728f48d83d97c20c46995f2e5e4d2b3052130 | [
"MIT"
] | null | null | null | board/views.py | dbwodlf3/DjangoBoard | 1d7728f48d83d97c20c46995f2e5e4d2b3052130 | [
"MIT"
] | null | null | null | board/views.py | dbwodlf3/DjangoBoard | 1d7728f48d83d97c20c46995f2e5e4d2b3052130 | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.http import HttpResponse
# Create your views here.
def views1(request):
return render(request, "board/index.html") | 24 | 46 | 0.779762 | 23 | 168 | 5.695652 | 0.782609 | 0.152672 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006897 | 0.136905 | 168 | 7 | 46 | 24 | 0.896552 | 0.136905 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
2200f8509df07cb5093489a5f2c78afc2deaa9ae | 113 | py | Python | src/kinorrt/utilities/__init__.py | XianyiCheng/kinorrt | 99e22914e52623bad5aad797971735ad5f339e0f | [
"MIT"
] | 1 | 2022-03-21T01:08:34.000Z | 2022-03-21T01:08:34.000Z | src/kinorrt/utilities/__init__.py | XianyiCheng/kinorrt | 99e22914e52623bad5aad797971735ad5f339e0f | [
"MIT"
] | null | null | null | src/kinorrt/utilities/__init__.py | XianyiCheng/kinorrt | 99e22914e52623bad5aad797971735ad5f339e0f | [
"MIT"
] | 2 | 2021-09-30T21:38:45.000Z | 2022-03-01T07:43:42.000Z | from .geometry import *
from .obstacle_generation import *
from .plotting import *
from .transformations import * | 28.25 | 34 | 0.79646 | 13 | 113 | 6.846154 | 0.538462 | 0.337079 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132743 | 113 | 4 | 35 | 28.25 | 0.908163 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
223eaac0cddd7fa955926a13accebb3ee5a535d4 | 201 | py | Python | tests/test_package_info.py | boo13/xCV | 9287ba0289420f5d3460821ad2e95c1dab08d1b8 | [
"MIT"
] | 2 | 2019-06-05T17:21:03.000Z | 2019-07-10T01:34:23.000Z | tests/test_package_info.py | boo13/xCV | 9287ba0289420f5d3460821ad2e95c1dab08d1b8 | [
"MIT"
] | 6 | 2019-07-16T21:46:42.000Z | 2022-01-13T01:30:15.000Z | tests/test_package_info.py | boo13/xCV | 9287ba0289420f5d3460821ad2e95c1dab08d1b8 | [
"MIT"
] | 1 | 2019-07-16T21:44:20.000Z | 2019-07-16T21:44:20.000Z | def test_package_info():
import xcv
from xcv.version import XCV_VERSION
assert XCV_VERSION == "0.1.3"
assert xcv.__email__ == "boo13bot@gmail.com"
assert xcv.__author__ == "Boo13"
| 25.125 | 48 | 0.686567 | 28 | 201 | 4.5 | 0.642857 | 0.238095 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04375 | 0.20398 | 201 | 7 | 49 | 28.714286 | 0.74375 | 0 | 0 | 0 | 0 | 0 | 0.139303 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.166667 | true | 0 | 0.333333 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
22602431a2915090ee31c9f7951b3b3bc3290d8b | 233 | py | Python | tests/test_fileregistry.py | ownbite/delta_utils | 3fd006fb90e5b633486e5f7923c779bfd5ff3140 | [
"MIT"
] | null | null | null | tests/test_fileregistry.py | ownbite/delta_utils | 3fd006fb90e5b633486e5f7923c779bfd5ff3140 | [
"MIT"
] | null | null | null | tests/test_fileregistry.py | ownbite/delta_utils | 3fd006fb90e5b633486e5f7923c779bfd5ff3140 | [
"MIT"
] | null | null | null | from delta_utils.fileregistry import S3FullScan
def test_initiate_fileregistry(spark):
    file_registry = S3FullScan("/mnt/husqvarna-datalake-omega-live/analytics/usr/linus.wallin/step-poc/file-registry/raw/step-product", spark)
| 38.833333 | 143 | 0.815451 | 31 | 233 | 6 | 0.806452 | 0.129032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009259 | 0.072961 | 233 | 5 | 144 | 46.6 | 0.851852 | 0 | 0 | 0 | 0 | 0.333333 | 0.433476 | 0.433476 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3f3d2e0057a233abee5a1c670deec9a08f402a3f | 53 | py | Python | lib/GeneralSchoolCourse/__init__.py | JoshOrndorff/snippets | ef06e03de09897014f88d89a84b597aabde7edaa | [
"Unlicense"
] | null | null | null | lib/GeneralSchoolCourse/__init__.py | JoshOrndorff/snippets | ef06e03de09897014f88d89a84b597aabde7edaa | [
"Unlicense"
] | null | null | null | lib/GeneralSchoolCourse/__init__.py | JoshOrndorff/snippets | ef06e03de09897014f88d89a84b597aabde7edaa | [
"Unlicense"
] | null | null | null | from .GeneralSchoolCourse import GeneralSchoolCourse
| 26.5 | 52 | 0.90566 | 4 | 53 | 12 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 53 | 1 | 53 | 53 | 0.979592 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
58c402e2d553f72dcf25fcce18e44b3703fee45d | 7,327 | py | Python | cyder/cydns/domain/tests/auto_delete.py | ngokevin/chili | 36c354ac567471d5e36dccf9eea5096c6b02d4b9 | [
"BSD-3-Clause"
] | 2 | 2019-03-16T00:47:09.000Z | 2022-03-04T14:39:08.000Z | cyder/cydns/domain/tests/auto_delete.py | ngokevin/chili | 36c354ac567471d5e36dccf9eea5096c6b02d4b9 | [
"BSD-3-Clause"
] | 1 | 2020-04-24T08:24:55.000Z | 2020-04-24T08:24:55.000Z | cyder/cydns/domain/tests/auto_delete.py | ngokevin/chili | 36c354ac567471d5e36dccf9eea5096c6b02d4b9 | [
"BSD-3-Clause"
] | null | null | null | from django.test import TestCase
from cyder.core.system.models import System
from cyder.cydns.address_record.models import AddressRecord
from cyder.cydhcp.interface.static_intr.models import StaticInterface
from cyder.cydns.cname.models import CNAME
from cyder.cydns.txt.models import TXT
from cyder.cydns.mx.models import MX
from cyder.cydns.srv.models import SRV
from cyder.cydns.domain.models import Domain
from cyder.cydns.nameserver.models import Nameserver
from cyder.cydns.utils import ensure_label_domain, prune_tree
from cyder.cydns.tests.utils import create_fake_zone
class AutoDeleteTests(TestCase):
def setUp(self):
c = Domain(name='poo')
c.save()
self.assertFalse(c.purgeable)
self.f_c = create_fake_zone('foo.poo', suffix="")
self.assertEqual(self.f_c.name, 'foo.poo')
def test_cleanup_txt(self):
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
self.assertFalse(self.f_c.purgeable)
fqdn = "bar.x.y.z.foo.poo"
label, the_domain = ensure_label_domain(fqdn)
txt = TXT(label=label, domain=the_domain, txt_data="Nthing")
txt.save()
self.assertFalse(prune_tree(the_domain))
txt.delete()
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
def test_cleanup_address(self):
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
fqdn = "bar.x.y.z.foo.poo"
label, the_domain = ensure_label_domain(fqdn)
addr = AddressRecord(label=label, domain=the_domain,
ip_type='4', ip_str="10.2.3.4")
addr.save()
self.assertFalse(prune_tree(the_domain))
addr.delete()
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
def test_cleanup_mx(self):
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
fqdn = "bar.x.y.z.foo.poo"
label, the_domain = ensure_label_domain(fqdn)
mx = MX(label=label, domain=the_domain, server="foo", priority=4)
mx.save()
self.assertFalse(prune_tree(the_domain))
mx.delete()
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
def test_ns_cleanup(self):
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
fqdn = "bar.x.y.z.foo.poo"
label, the_domain = ensure_label_domain(fqdn)
ns = Nameserver(domain=the_domain, server="asdfasffoo")
ns.save()
self.assertFalse(prune_tree(the_domain))
ns.delete()
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
def test_srv_cleanup(self):
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
fqdn = "bar.x.y.z.foo.poo"
label, the_domain = ensure_label_domain(fqdn)
srv = SRV(
label='_' + label, domain=the_domain, target="foo", priority=4,
weight=4, port=34
)
srv.save()
self.assertFalse(prune_tree(the_domain))
srv.delete()
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
def test_cleanup_cname(self):
# Make sure CNAME records block domain pruning
c = Domain(name='foo1')
c.save()
self.assertFalse(c.purgeable)
f_c = create_fake_zone('foo.foo1', suffix="")
self.assertEqual(f_c.name, 'foo.foo1')
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.foo1"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.foo1"))
self.assertFalse(Domain.objects.filter(name="z.foo.foo1"))
self.assertTrue(Domain.objects.filter(name="foo.foo1"))
self.assertFalse(f_c.purgeable)
fqdn = "cname.x.y.z.foo.foo1"
label, the_domain = ensure_label_domain(fqdn)
cname = CNAME(label=label, domain=the_domain, target="foo")
cname.save()
self.assertFalse(prune_tree(the_domain))
cname.delete()
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.foo1"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.foo1"))
self.assertFalse(Domain.objects.filter(name="z.foo.foo1"))
fqdn = "bar.x.y.z.foo.poo"
self.assertTrue(Domain.objects.filter(name="foo.foo1"))
def test_cleanup_intr(self):
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
Domain.objects.get_or_create(name="arpa")
Domain.objects.get_or_create(name="in-addr.arpa")
Domain.objects.get_or_create(name="10.in-addr.arpa")
fqdn = "bar.x.y.z.foo.poo"
label, the_domain = ensure_label_domain(fqdn)
system = System()
addr = StaticInterface(
label=label, domain=the_domain, ip_type='4', ip_str="10.2.3.4",
mac="00:11:22:33:44:55", system=system
)
addr.save()
self.assertFalse(prune_tree(the_domain))
addr.delete()
self.assertFalse(Domain.objects.filter(name="x.y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="y.z.foo.poo"))
self.assertFalse(Domain.objects.filter(name="z.foo.poo"))
self.assertTrue(Domain.objects.filter(name="foo.poo"))
| 42.109195 | 75 | 0.658796 | 1,027 | 7,327 | 4.617332 | 0.089581 | 0.161746 | 0.224378 | 0.271615 | 0.793336 | 0.776677 | 0.745044 | 0.668916 | 0.657318 | 0.657318 | 0 | 0.007209 | 0.185888 | 7,327 | 173 | 76 | 42.352601 | 0.787762 | 0.003821 | 0 | 0.594406 | 0 | 0 | 0.114568 | 0 | 0 | 0 | 0 | 0 | 0.482517 | 1 | 0.055944 | false | 0 | 0.083916 | 0 | 0.146853 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
58fd78b31ac89d33700f97e5aa69becd83be57ef | 12,705 | py | Python | test-framework/test-suites/unit/tests/command/stack/commands/remove/host/firmware/mapping/test_command_stack_commands_remove_host_firmware_mapping_plugin_basic.py | kmcm0/stacki | eb9dff1b45d5725b4986e567876bf61707fec28f | [
"BSD-3-Clause"
] | 123 | 2015-05-12T23:36:45.000Z | 2017-07-05T23:26:57.000Z | test-framework/test-suites/unit/tests/command/stack/commands/remove/host/firmware/mapping/test_command_stack_commands_remove_host_firmware_mapping_plugin_basic.py | kmcm0/stacki | eb9dff1b45d5725b4986e567876bf61707fec28f | [
"BSD-3-Clause"
] | 177 | 2015-06-05T19:17:47.000Z | 2017-07-07T17:57:24.000Z | test-framework/test-suites/unit/tests/command/stack/commands/remove/host/firmware/mapping/test_command_stack_commands_remove_host_firmware_mapping_plugin_basic.py | kmcm0/stacki | eb9dff1b45d5725b4986e567876bf61707fec28f | [
"BSD-3-Clause"
] | 32 | 2015-06-07T02:25:03.000Z | 2017-06-23T07:35:35.000Z | from unittest.mock import create_autospec, ANY, patch, call
import pytest
from stack.commands import DatabaseConnection
from stack.commands.remove.host.firmware.mapping import Command
from stack.exception import CommandError
from stack.commands.remove.host.firmware.mapping.plugin_basic import Plugin
class TestRemoveHostFirmwareMappingBasicPlugin:
"""A test case for the remove host firmware mapping basic plugin."""
@pytest.fixture
def basic_plugin(self):
"""A fixture that returns the plugin instance for use in tests.
This sets up the required mocks needed to construct the plugin class.
"""
mock_command = create_autospec(
spec = Command,
instance = True,
)
mock_command.db = create_autospec(
spec = DatabaseConnection,
spec_set = True,
instance = True,
)
return Plugin(command = mock_command)
def test_provides(self, basic_plugin):
"""Ensure that provides returns 'basic'."""
assert basic_plugin.provides() == "basic"
def test_validate_make(self, basic_plugin):
"""Ensure the make is validated if it exists."""
mock_make = "foo"
basic_plugin.validate_make(make = mock_make)
basic_plugin.owner.ensure_make_exists.assert_called_once_with(
make = mock_make,
)
def test_validate_make_not_provided(self, basic_plugin):
"""Ensure the make is not validated if not provided."""
mock_make = ""
basic_plugin.validate_make(make = mock_make)
basic_plugin.owner.ensure_make_exists.assert_not_called()
def test_validate_make_error(self, basic_plugin):
"""Ensure that validation fails when the make is invalid."""
mock_make = "foo"
basic_plugin.owner.ensure_make_exists.side_effect = CommandError(
cmd = basic_plugin.owner,
msg = "Test error",
)
with pytest.raises(CommandError):
basic_plugin.validate_make(make = mock_make)
def test_validate_model(self, basic_plugin):
"""Ensure the model is validated if it exists."""
mock_make = "foo"
mock_model = "bar"
basic_plugin.validate_model(make = mock_make, model = mock_model)
basic_plugin.owner.ensure_model_exists.assert_called_once_with(
make = mock_make,
model = mock_model,
)
def test_validate_model_not_provided(self, basic_plugin):
"""Ensure the model is not validated if not provided."""
mock_make = "foo"
mock_model = ""
basic_plugin.validate_model(make = mock_make, model = mock_model)
basic_plugin.owner.ensure_model_exists.assert_not_called()
def test_validate_model_error(self, basic_plugin):
"""Ensure that validation fails when the model is invalid."""
mock_make = "foo"
mock_model = "bar"
basic_plugin.owner.ensure_model_exists.side_effect = CommandError(
cmd = basic_plugin.owner,
msg = "Test error",
)
with pytest.raises(CommandError):
basic_plugin.validate_model(make = mock_make, model = mock_model)
@pytest.mark.parametrize(
"hosts, versions, make, model",
(
(["foo"], ["bar"], "baz", "bag"),
(["foo"], [], "baz", "bag"),
(["foo"], [], "baz", ""),
(["foo"], [], "", ""),
([], ["bar"], "baz", "bag"),
([], [], "baz", "bag"),
([], [], "baz", ""),
([], [], "", ""),
)
)
def test_get_firmware_mappings_to_remove(self, hosts, versions, make, model, basic_plugin):
"""Test that get_firmware_mappings_to_remove works as expected for every valid argument combination."""
test_inputs = {
"hosts": hosts,
"versions": versions,
"make": make,
"model": model,
}
basic_plugin.owner.db.select.return_value = [["1"]]
expected_query_params = list(value for value in test_inputs.values() if value)
assert [basic_plugin.owner.db.select.return_value[0][0]] == basic_plugin.get_firmware_mappings_to_remove(**test_inputs)
basic_plugin.owner.db.select.assert_called_once_with(ANY, expected_query_params)
@patch.object(target = Plugin, attribute = "get_firmware_mappings_to_remove", autospec = True)
@patch.object(target = Plugin, attribute = "validate_model", autospec = True)
@patch.object(target = Plugin, attribute = "validate_make", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.lowered", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.unique_everseen", autospec = True)
def test_run(
self,
mock_unique_everseen,
mock_lowered,
mock_validate_make,
mock_validate_model,
mock_get_firmware_mappings_to_remove,
basic_plugin,
):
"""Test that run works as expected when all params and args are provided and valid."""
mock_args = ["foo", "bar"]
expected_hosts = tuple(mock_args)
mock_params = {"make": "fizz", "model": "buzz", "versions": "bazz, bang"}
expected_versions = tuple(version.strip() for version in mock_params["versions"].split(",") if version.strip())
mock_lowered.return_value = mock_params.values()
mock_unique_everseen.side_effect = (
mock_args,
expected_versions,
)
basic_plugin.owner.getHosts.return_value = expected_hosts
mock_get_firmware_mappings_to_remove.return_value = ["1", "2"]
basic_plugin.run(args = (mock_params, mock_args))
assert [call(mock_args), call(basic_plugin.owner.fillParams.return_value)] == mock_lowered.mock_calls
mock_unique_everseen.assert_any_call(mock_lowered.return_value)
# Check the generator expression passed to the second call of unique_everseen
assert tuple(mock_unique_everseen.call_args_list[1][0][0]) == expected_versions
basic_plugin.owner.getHosts.assert_called_once_with(args = expected_hosts)
basic_plugin.owner.fillParams.assert_called_once_with(
names = [
("make", ""),
("model", ""),
("versions", ""),
],
params = mock_params,
)
mock_validate_make.assert_called_once_with(
basic_plugin,
make = mock_params["make"],
)
mock_validate_model.assert_called_once_with(
basic_plugin,
make = mock_params["make"],
model = mock_params["model"],
)
basic_plugin.owner.ensure_firmwares_exist.assert_called_once_with(
make = mock_params["make"],
model = mock_params["model"],
versions = expected_versions,
)
mock_get_firmware_mappings_to_remove.assert_called_once_with(
basic_plugin,
hosts = expected_hosts,
make = mock_params["make"],
model = mock_params["model"],
versions = expected_versions,
)
basic_plugin.owner.db.execute.assert_called_once_with(
ANY,
(mock_get_firmware_mappings_to_remove.return_value,),
)
@patch.object(target = Plugin, attribute = "get_firmware_mappings_to_remove", autospec = True)
@patch.object(target = Plugin, attribute = "validate_model", autospec = True)
@patch.object(target = Plugin, attribute = "validate_make", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.lowered", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.unique_everseen", autospec = True)
def test_run_no_hosts_or_versions(
self,
mock_unique_everseen,
mock_lowered,
mock_validate_make,
mock_validate_model,
mock_get_firmware_mappings_to_remove,
basic_plugin,
):
"""Test that run works as expected when hosts and versions are not provided."""
mock_args = []
expected_hosts = tuple(mock_args)
mock_params = {"make": "fizz", "model": "buzz", "versions": ""}
mock_lowered.return_value = mock_params.values()
mock_unique_everseen.return_value = mock_args
mock_get_firmware_mappings_to_remove.return_value = ["1", "2"]
basic_plugin.run(args = (mock_params, mock_args))
assert [call(mock_args), call(basic_plugin.owner.fillParams.return_value)] == mock_lowered.mock_calls
mock_unique_everseen.assert_called_once_with(mock_lowered.return_value)
basic_plugin.owner.getHosts.assert_not_called()
basic_plugin.owner.fillParams.assert_called_once_with(
names = [
("make", ""),
("model", ""),
("versions", ""),
],
params = mock_params,
)
mock_validate_make.assert_called_once_with(
basic_plugin,
make = mock_params["make"],
)
mock_validate_model.assert_called_once_with(
basic_plugin,
make = mock_params["make"],
model = mock_params["model"],
)
basic_plugin.owner.ensure_firmwares_exist.assert_not_called()
mock_get_firmware_mappings_to_remove.assert_called_once_with(
basic_plugin,
hosts = expected_hosts,
make = mock_params["make"],
model = mock_params["model"],
versions = mock_params["versions"],
)
basic_plugin.owner.db.execute.assert_called_once_with(
ANY,
(mock_get_firmware_mappings_to_remove.return_value,),
)
@patch.object(target = Plugin, attribute = "get_firmware_mappings_to_remove", autospec = True)
@patch.object(target = Plugin, attribute = "validate_model", autospec = True)
@patch.object(target = Plugin, attribute = "validate_make", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.lowered", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.unique_everseen", autospec = True)
def test_run_no_mappings_to_remove(
self,
mock_unique_everseen,
mock_lowered,
mock_validate_make,
mock_validate_model,
mock_get_firmware_mappings_to_remove,
basic_plugin,
):
"""Test that run works as expected when there are no mappings to remove."""
mock_args = ["foo", "bar"]
expected_hosts = tuple(mock_args)
mock_params = {"make": "fizz", "model": "buzz", "versions": "bazz, bang"}
expected_versions = tuple(version.strip() for version in mock_params["versions"].split(",") if version.strip())
mock_lowered.return_value = mock_params.values()
mock_unique_everseen.side_effect = (
mock_args,
expected_versions,
)
basic_plugin.owner.getHosts.return_value = expected_hosts
mock_get_firmware_mappings_to_remove.return_value = []
basic_plugin.run(args = (mock_params, mock_args))
assert [call(mock_args), call(basic_plugin.owner.fillParams.return_value)] == mock_lowered.mock_calls
mock_unique_everseen.assert_any_call(mock_lowered.return_value)
# Check the generator expression passed to the second call of unique_everseen
assert tuple(mock_unique_everseen.call_args_list[1][0][0]) == expected_versions
basic_plugin.owner.getHosts.assert_called_once_with(args = expected_hosts)
basic_plugin.owner.fillParams.assert_called_once_with(
names = [
("make", ""),
("model", ""),
("versions", ""),
],
params = mock_params,
)
mock_validate_make.assert_called_once_with(
basic_plugin,
make = mock_params["make"],
)
mock_validate_model.assert_called_once_with(
basic_plugin,
make = mock_params["make"],
model = mock_params["model"],
)
basic_plugin.owner.ensure_firmwares_exist.assert_called_once_with(
make = mock_params["make"],
model = mock_params["model"],
versions = expected_versions,
)
mock_get_firmware_mappings_to_remove.assert_called_once_with(
basic_plugin,
hosts = expected_hosts,
make = mock_params["make"],
model = mock_params["model"],
versions = expected_versions,
)
basic_plugin.owner.db.execute.assert_not_called()
@pytest.mark.parametrize("failure_mock", ("validate_make", "validate_model", "ensure_firmwares_exist"))
@patch.object(target = Plugin, attribute = "get_firmware_mappings_to_remove", autospec = True)
@patch.object(target = Plugin, attribute = "validate_model", autospec = True)
@patch.object(target = Plugin, attribute = "validate_make", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.lowered", autospec = True)
@patch(target = "stack.commands.remove.host.firmware.mapping.plugin_basic.unique_everseen", autospec = True)
def test_run_errors(
self,
mock_unique_everseen,
mock_lowered,
mock_validate_make,
mock_validate_model,
mock_get_firmware_mappings_to_remove,
failure_mock,
basic_plugin,
):
"""Test that run fails when the params or args are invalid."""
mock_args = ["foo", "bar"]
expected_hosts = tuple(mock_args)
mock_params = {"make": "fizz", "model": "buzz", "versions": "bazz, bang"}
expected_versions = tuple(version.strip() for version in mock_params["versions"].split(",") if version.strip())
mock_lowered.return_value = mock_params.values()
mock_unique_everseen.side_effect = (
mock_args,
expected_versions,
)
basic_plugin.owner.getHosts.return_value = expected_hosts
mock_validation_functions = {
"validate_make": mock_validate_make,
"validate_model": mock_validate_model,
"ensure_firmwares_exist": basic_plugin.owner.ensure_firmwares_exist,
}
mock_validation_functions[failure_mock].side_effect = CommandError(
cmd = basic_plugin.owner,
msg = "test error",
)
with pytest.raises(CommandError):
basic_plugin.run(args = (mock_params, mock_args))
basic_plugin.owner.db.execute.assert_not_called()
| 36.09375 | 121 | 0.741991 | 1,669 | 12,705 | 5.315758 | 0.08568 | 0.08307 | 0.05771 | 0.049594 | 0.843215 | 0.802976 | 0.789225 | 0.753607 | 0.713593 | 0.713593 | 0 | 0.001185 | 0.136403 | 12,705 | 351 | 122 | 36.196581 | 0.807419 | 0.083747 | 0 | 0.655518 | 0 | 0 | 0.117866 | 0.06157 | 0 | 0 | 0 | 0 | 0.123746 | 1 | 0.043478 | false | 0 | 0.020067 | 0 | 0.070234 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4519a092e3b4dba6cd108a0022c8e4d69d652043 | 7,935 | py | Python | examples/ccsd_d2.py | maxscheurer/pdaggerq | e9fef3466e0d0170afc3094ab79e603200e78dfb | [
"Apache-2.0"
] | 37 | 2020-09-17T19:29:18.000Z | 2022-03-03T16:29:16.000Z | examples/ccsd_d2.py | maxscheurer/pdaggerq | e9fef3466e0d0170afc3094ab79e603200e78dfb | [
"Apache-2.0"
] | 7 | 2021-02-28T19:22:12.000Z | 2022-02-22T15:17:47.000Z | examples/ccsd_d2.py | maxscheurer/pdaggerq | e9fef3466e0d0170afc3094ab79e603200e78dfb | [
"Apache-2.0"
] | 6 | 2021-02-16T22:34:29.000Z | 2021-12-04T19:37:23.000Z |
import pdaggerq

from pdaggerq.parser import contracted_strings_to_tensor_terms


def main():

    pq = pdaggerq.pq_helper("fermi")

    # D2(p,q,r,s) = <0|(1 + l1 + l2) e(-T) p*q*sr e(T) |0>
    pq.set_left_operators(['1','l1','l2'])

    print('\n', '# D2(i,j,k,l):', '\n')

    pq.set_left_operators(['1','l1','l2'])
    pq.add_st_operator(1.0,['e2(i,j,l,k)'],['t1','t2'])
    pq.simplify()

    d2_terms_deprince = pq.fully_contracted_strings()
    d2_terms_ncr = contracted_strings_to_tensor_terms(d2_terms_deprince)
    for my_term, deprince_term in zip(d2_terms_ncr, d2_terms_deprince):
        print("#\t", deprince_term)
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, o, o, o]',
                                    output_variables=['i', 'j', 'k', 'l']))
        print()

    pq.clear()

    print('\n', '# D2(i,j,k,a):', '\n')

    pq.set_left_operators(['1','l1','l2'])
    pq.add_st_operator(1.0,['e2(i,j,a,k)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, o, o, v]',
                                    output_variables=['i', 'j', 'k', 'a']))
        print()

    pq.clear()

    print('\n', '# D2(i,j,a,l):', '\n')

    pq.set_left_operators(['1','l1','l2'])
    pq.add_st_operator(1.0,['e2(i,j,l,a)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, o, v, o]',
                                    output_variables=['i', 'j', 'a', 'l']))
        print()

    pq.clear()

    print('\n', '# D2(i,a,k,l):', '\n')

    pq.set_left_operators(['1','l1','l2'])
    pq.add_st_operator(1.0,['e2(i,a,l,k)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, v, o, o]',
                                    output_variables=['i', 'a', 'k', 'l']))
        print()

    pq.clear()

    print('\n', '# D2(a,j,k,l):', '\n')

    pq.set_left_operators(['1','l1','l2'])
    pq.add_st_operator(1.0,['e2(a,j,l,k)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, o, o, o]',
                                    output_variables=['a', 'j', 'k', 'l']))
        print()

    pq.clear()

    print('\n', '# D2(a,b,c,d):', '\n')

    pq.set_left_operators(['1','l1','l2'])
    pq.add_st_operator(1.0,['e2(a,b,d,c)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, v, v, v]',
                                    output_variables=['a', 'b', 'c', 'd']))
        print()

    pq.clear()

    print('\n', '# D2(a,b,c,i):', '\n')

    pq.add_st_operator(1.0,['e2(a,b,i,c)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, v, v, o]',
                                    output_variables=['a', 'b', 'c', 'i']))
        print()

    pq.clear()

    print('\n', '# D2(a,b,i,d):', '\n')

    pq.add_st_operator(1.0,['e2(a,b,d,i)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, v, o, v]',
                                    output_variables=['a', 'b', 'i', 'd']))
        print()

    pq.clear()

    print('\n', '# D2(i,b,c,d):', '\n')

    pq.add_st_operator(1.0,['e2(i,b,d,c)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, v, v, v]',
                                    output_variables=['i', 'b', 'c', 'd']))
        print()

    pq.clear()

    print('\n', '# D2(a,i,c,d):', '\n')

    pq.add_st_operator(1.0,['e2(a,i,d,c)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, o, v, v]',
                                    output_variables=['a', 'i', 'c', 'd']))
        print()

    pq.clear()

    print('\n', '# D2(i,j,a,b):', '\n')

    pq.add_st_operator(1.0,['e2(i,j,b,a)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, o, v, v]',
                                    output_variables=['i', 'j', 'a', 'b']))
        print()

    pq.clear()

    print('\n', '# D2(a,b,i,j):', '\n')

    pq.add_st_operator(1.0, ['e2(a,b,j,i)'], ['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, v, o, o]',
                                    output_variables=['a', 'b', 'i', 'j']))
        print()

    pq.clear()

    print('\n', '# D2(i,a,j,b):', '\n')

    pq.add_st_operator(1.0,['e2(i,a,b,j)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, v, o, v]',
                                    output_variables=['i', 'a', 'j', 'b']))
        print()

    pq.clear()

    print('\n', '# D2(a,i,j,b):', '\n')

    pq.add_st_operator(1.0,['e2(a,i,b,j)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, o, o, v]',
                                    output_variables=['a', 'i', 'j', 'b']))
        print()

    pq.clear()

    print('\n', '# D2(i,a,b,j):', '\n')

    pq.add_st_operator(1.0, ['e2(i,a,j,b)'], ['t1', 't2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[o, v, v, o]',
                                    output_variables=['i', 'a', 'b', 'j']))
        print()

    pq.clear()

    print('\n', '# D2(a,i,b,j):', '\n')

    pq.add_st_operator(1.0,['e2(a,i,j,b)'],['t1','t2'])
    pq.simplify()

    d2_terms = pq.fully_contracted_strings()
    d2_terms = contracted_strings_to_tensor_terms(d2_terms)
    for my_term in d2_terms:
        print("#\t", my_term)
        print(my_term.einsum_string(update_val='tpdm[v, o, v, o]',
                                    output_variables=['a', 'i', 'b', 'j']))
        print()

    pq.clear()


if __name__ == "__main__":
    main()
452e331b47da099c036a21e2ea034f678cae5291 | 17,905 | py | Python | datadog_checks_dev/tests/tooling/manifest_validator/test_v2_validator.py | abraham-leal/integrations-core | 5062702ddae5314f504e6161c7720a1fcde777ea | [
"BSD-3-Clause"
] | 663 | 2016-08-23T05:23:45.000Z | 2022-03-29T00:37:23.000Z | datadog_checks_dev/tests/tooling/manifest_validator/test_v2_validator.py | abraham-leal/integrations-core | 5062702ddae5314f504e6161c7720a1fcde777ea | [
"BSD-3-Clause"
] | 6,642 | 2016-06-09T16:29:20.000Z | 2022-03-31T22:24:09.000Z | datadog_checks_dev/tests/tooling/manifest_validator/test_v2_validator.py | abraham-leal/integrations-core | 5062702ddae5314f504e6161c7720a1fcde777ea | [
"BSD-3-Clause"
] | 1,222 | 2017-01-27T15:51:38.000Z | 2022-03-31T18:17:51.000Z | # (C) Datadog, Inc. 2021-present
# All rights reserved
# Licensed under a 3-clause BSD style license (see LICENSE)
import json
import os
from copy import deepcopy
from pathlib import Path
import mock
import pytest
import tests.tooling.manifest_validator.input_constants as input_constants
import datadog_checks.dev.tooling.manifest_validator.common.validator as common
import datadog_checks.dev.tooling.manifest_validator.v2.validator as v2_validators
from datadog_checks.dev.tooling.constants import get_root, set_root
from datadog_checks.dev.tooling.datastructures import JSONDict
from datadog_checks.dev.tooling.manifest_validator import get_all_validators
from datadog_checks.dev.tooling.manifest_validator.constants import V2
# Helpers
def get_changed_immutable_short_name_manifest():
"""
Helper function to change immutable short names in a manifest
"""
immutable_attributes_changed_short_name = JSONDict(deepcopy(input_constants.V2_VALID_MANIFEST))
immutable_attributes_changed_short_name['assets']['dashboards'] = {
"oracle-changed": "assets/dashboards/example.json"
}
return immutable_attributes_changed_short_name
def get_changed_immutable_attribute_manifest():
"""
Helper function to change other immutable attributes in a manifest
"""
immutable_attributes_changed_attribute = JSONDict(deepcopy(input_constants.V2_VALID_MANIFEST))
immutable_attributes_changed_attribute['app_id'] = 'datadog-oracle-changed'
return immutable_attributes_changed_attribute
@pytest.fixture
def setup_route():
# We want to change the path before and after running each individual test
root = Path(os.path.realpath(__file__)).parent.parent.parent.parent.parent.absolute()
current_root = get_root()
set_root(str(root))
yield root
set_root(current_root)
@mock.patch(
'datadog_checks.dev.tooling.utils.read_metadata_rows', return_value=input_constants.ORACLE_METADATA_CSV_EXAMPLE
)
def test_manifest_v2_all_pass(_, setup_route):
"""
Run a valid manifest through all V2 validators
"""
validators = get_all_validators(False, "2.0.0")
for validator in validators:
# Currently skipping SchemaValidator because of no context object and config
if isinstance(validator, v2_validators.SchemaValidator):
continue
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
assert not validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_maintainer_validator_incorrect_maintainer(setup_route):
"""
Ensure MaintainerValidator fails if supplied an incorrect support_email
"""
incorrect_maintainer_manifest = JSONDict(
{
"author": {
"homepage": "https://www.datadoghq.com",
"name": "Datadog",
"sales_email": "help@datadoghq.com",
"support_email": "fake_email@datadoghq.com",
},
}
)
# Use specific validator
validator = common.MaintainerValidator(
is_extras=False, is_marketplace=False, check_in_extras=False, check_in_marketplace=False, version=V2
)
validator.validate('active_directory', incorrect_maintainer_manifest, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_maintainer_validator_invalid_maintainer(setup_route):
"""
Ensure MaintainerValidator fails if supplied a support_email with non-ASCII characters
"""
invalid_maintainer_manifest = JSONDict(
{
"author": {
"homepage": "https://www.datadoghq.com",
"name": "Datadog",
"sales_email": "help@datadoghq.com",
"support_email": "Ǩ_help@datadoghq.com",
},
}
)
# Use specific validator
validator = common.MaintainerValidator(
is_extras=False, is_marketplace=False, check_in_extras=False, check_in_marketplace=False, version=V2
)
validator.validate('active_directory', invalid_maintainer_manifest, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_maintainer_validator_correct_maintainer(setup_route):
# Use specific validator
validator = common.MaintainerValidator(
is_extras=False, is_marketplace=False, check_in_extras=False, check_in_marketplace=False, version=V2
)
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_metrics_metadata_validator_file_exists_not_in_manifest(setup_route):
"""
Ensure MetricsMetadataValidator fails if supplied an empty metadata_path value
"""
file_exists_not_in_manifest = JSONDict(
{
"assets": {
"integration": {
"metrics": {
"auto_install": True,
"check": "oracle.session_count",
"metadata_path": "",
"prefix": "oracle.",
},
},
},
}
)
# Use specific validator
validator = common.MetricsMetadataValidator(version=V2)
validator.validate('active_directory', file_exists_not_in_manifest, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('os.path.isfile', return_value=False)
def test_manifest_v2_metrics_metadata_validator_file_in_manifest_not_exist(_, setup_route):
"""
Ensure MetricsMetadataValidator fails if supplied a path to a non-existant metadata.csv
"""
file_in_manifest_does_not_exist = JSONDict(
{
"assets": {
"integration": {
"metrics": {
"auto_install": True,
"check": "oracle.session_count",
"metadata_path": "metrics_metadata1.csv",
"prefix": "oracle.",
},
},
},
}
)
# Use specific validator
validator = common.MetricsMetadataValidator(version=V2)
validator.validate('active_directory', file_in_manifest_does_not_exist, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch(
'datadog_checks.dev.tooling.utils.read_metadata_rows', return_value=input_constants.ORACLE_METADATA_CSV_EXAMPLE
)
def test_manifest_v2_metrics_metadata_validator_correct_metadata(_, setup_route):
# Use specific validator
validator = common.MetricsMetadataValidator(version=V2)
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_metrics_to_check_validator_check_not_in_metadata(setup_route):
"""
Ensure MetricToCheckValidator fails if the check value is not present in
the metadata.csv
"""
check_not_in_metadata_csv = JSONDict(
{
"assets": {
"integration": {
"metrics": {
"auto_install": True,
"check": "oracle.session_count",
"metadata_path": "metrics_metadata.csv",
"prefix": "oracle.",
},
},
},
}
)
# Use specific validator
validator = common.MetricToCheckValidator(version=V2)
validator.validate('active_directory', check_not_in_metadata_csv, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_metrics_to_check_validator_check_not_in_manifest(setup_route):
"""
Ensure MetricToCheckValidator fails if supplied an empty check value
"""
check_not_in_manifest = JSONDict(
{
"assets": {
"integration": {
"metrics": {
"auto_install": True,
"check": "",
"metadata_path": "metrics_metadata.csv",
"prefix": "oracle.",
},
},
},
}
)
# Use specific validator
validator = common.MetricToCheckValidator(version=V2)
validator.validate('active_directory', check_not_in_manifest, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch(
'datadog_checks.dev.tooling.utils.read_metadata_rows', return_value=input_constants.ORACLE_METADATA_CSV_EXAMPLE
)
def test_manifest_v2_metrics_metadata_validator_correct_check_in_metadata(_, setup_route):
# Use specific validator
validator = common.MetricToCheckValidator(version=V2)
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('datadog_checks.dev.tooling.utils.has_logs', return_value=True)
def test_manifest_v2_logs_category_validator_has_logs_no_tag(_, setup_route):
"""
Ensure LogsCategoryValidator fails if the integration has logs but no Log Collection tag
"""
has_logs_no_tag_manifest = JSONDict(
{
"classifier_tags": [
"Category::Marketplace",
"Offering::Integration",
"Offering::UI Extension",
],
}
)
# Use specific validator
validator = common.LogsCategoryValidator(version=V2)
validator.validate('active_directory', has_logs_no_tag_manifest, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('datadog_checks.dev.tooling.utils.has_logs', return_value=True)
def test_manifest_v2_logs_category_validator_correct_has_logs_correct_tag(_, setup_route):
# Use specific validator
validator = common.LogsCategoryValidator(version=V2)
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_display_on_public_validator_invalid(setup_route):
"""
Ensure DisplayOnPublicValidator fails if the display_on_public_website attribute is not True
"""
display_on_public_invalid_manifest = JSONDict({"app_id": "datadog-oracle"})
# Use specific validator
validator = v2_validators.DisplayOnPublicValidator(version=V2)
validator.validate('active_directory', display_on_public_invalid_manifest, False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
def test_manifest_v2_display_on_public_validator_valid(setup_route):
display_on_public_valid_manifest = JSONDict({"display_on_public_website": True})
# Use specific validator
validator = v2_validators.DisplayOnPublicValidator(version=V2)
validator.validate('active_directory', display_on_public_valid_manifest, False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('requests.post', return_value=input_constants.MockedResponseInvalid())
def test_manifest_v2_schema_validator_manifest_invalid(_, setup_route):
"""
Ensure SchemaValidator fails if a 400 status_code is received from request
"""
# Use specific validator
validator = v2_validators.SchemaValidator(ctx=input_constants.MockedContextObj(), version=V2, skip_if_errors=False)
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('requests.post', return_value=input_constants.MockedResponseValid())
def test_manifest_v2_schema_validator_manifest_valid(_, setup_route):
# Use specific validator
validator = v2_validators.SchemaValidator(ctx=input_constants.MockedContextObj(), version=V2, skip_if_errors=False)
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch(
'datadog_checks.dev.tooling.manifest_validator.common.validator.git_show_file',
return_value=json.dumps(input_constants.V2_VALID_MANIFEST),
)
def test_manifest_v2_immutable_attributes_validator_invalid_attribute_change(_, setup_route):
"""
Ensure ImmutableAttributesValidator fails if an immutable attribute is changed
"""
# Use specific validator
validator = common.ImmutableAttributesValidator(version=V2)
validator.validate('active_directory', JSONDict(get_changed_immutable_attribute_manifest()), False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch(
'datadog_checks.dev.tooling.manifest_validator.common.validator.git_show_file',
return_value=json.dumps(input_constants.V2_VALID_MANIFEST),
)
def test_manifest_v2_immutable_attributes_validator_invalid_short_name_change(_, setup_route):
"""
Ensure ImmutableAttributesValidator fails if the short name of an asset is changed
"""
# Use specific validator
validator = common.ImmutableAttributesValidator(version=V2)
validator.validate('active_directory', JSONDict(get_changed_immutable_short_name_manifest()), False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch(
'datadog_checks.dev.tooling.manifest_validator.common.validator.git_show_file',
return_value=input_constants.IMMUTABLE_ATTRIBUTES_V1_MANIFEST,
)
def test_manifest_v2_immutable_attributes_validator_version_upgrade(_, setup_route):
"""
Ensure ImmutableAttributesValidator skips validations if the manifest is being upgraded from v1 to v2
"""
# Use specific validator
validator = common.ImmutableAttributesValidator(version=V2)
validator.validate('active_directory', input_constants.IMMUTABLE_ATTRIBUTES_V2_MANIFEST, False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch(
'datadog_checks.dev.tooling.manifest_validator.common.validator.git_show_file',
return_value=json.dumps(input_constants.V2_VALID_MANIFEST),
)
def test_manifest_v2_immutable_attributes_validator_valid_change(_, setup_route):
# Use specific validator
validator = common.ImmutableAttributesValidator(version=V2)
validator.validate('active_directory', JSONDict(input_constants.V2_VALID_MANIFEST), False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('os.path.getsize', return_value=300000)
def test_manifest_v2_media_gallery_validator_pass(_, setup_route):
# Use specific validator
validator = v2_validators.MediaGalleryValidator(is_marketplace=True, version=V2, check_in_extras=False)
validator.validate('active_directory', JSONDict(input_constants.VALID_MEDIA_MANIFEST), False)
# Assert test case
assert not validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('os.path.getsize', return_value=1300000)
def test_manifest_v2_media_gallery_validator_image_size_too_large(_, setup_route):
# Use specific validator
validator = v2_validators.MediaGalleryValidator(is_marketplace=True, version=V2, check_in_extras=False)
validator.validate('active_directory', JSONDict(input_constants.VALID_MEDIA_MANIFEST), False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('os.path.getsize', return_value=300000)
def test_manifest_v2_media_gallery_validator_too_many_videos(_, setup_route):
# Use specific validator
validator = v2_validators.MediaGalleryValidator(is_marketplace=True, version=V2, check_in_extras=False)
validator.validate('active_directory', JSONDict(input_constants.INVALID_MEDIA_MANIFEST_TOO_MANY_VIDEOS), False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('os.path.getsize', return_value=300000)
def test_manifest_v2_media_gallery_validator_bad_structure(_, setup_route):
# Use specific validator
validator = v2_validators.MediaGalleryValidator(is_marketplace=True, version=V2, check_in_extras=False)
validator.validate('active_directory', JSONDict(input_constants.INVALID_MEDIA_MANIFEST_BAD_STRUCTURE), False)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
@mock.patch('os.path.getsize', return_value=300000)
def test_manifest_v2_media_gallery_validator_incorrect_vimeo_id_type(_, setup_route):
# Use specific validator
validator = v2_validators.MediaGalleryValidator(is_marketplace=True, version=V2, check_in_extras=False)
validator.validate(
'active_directory', JSONDict(input_constants.INVALID_MEDIA_MANIFEST_INCORRECT_VIMEO_ID_TYPE), False
)
# Assert test case
assert validator.result.failed, validator.result
assert not validator.result.fixed
| 37.458159 | 119 | 0.730075 | 2,033 | 17,905 | 6.131825 | 0.113133 | 0.090245 | 0.050537 | 0.067383 | 0.828654 | 0.800738 | 0.773223 | 0.732151 | 0.704637 | 0.704637 | 0 | 0.008465 | 0.188439 | 17,905 | 477 | 120 | 37.536688 | 0.849425 | 0.134432 | 0 | 0.496528 | 0 | 0 | 0.124088 | 0.046204 | 0 | 0 | 0 | 0 | 0.173611 | 1 | 0.097222 | false | 0.006944 | 0.045139 | 0 | 0.149306 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
453d02047f5575a1896445c8fa5317d5dc2bd4c9 | 213 | py | Python | tests/calendar/test_calendar.py | matthewgdv/office | 8a779cecb9382a196a34a358c43d23a30c48bb04 | [
"MIT"
] | 1 | 2020-12-26T16:08:42.000Z | 2020-12-26T16:08:42.000Z | tests/calendar/test_calendar.py | matthewgdv/office | 8a779cecb9382a196a34a358c43d23a30c48bb04 | [
"MIT"
] | null | null | null | tests/calendar/test_calendar.py | matthewgdv/office | 8a779cecb9382a196a34a358c43d23a30c48bb04 | [
"MIT"
] | 1 | 2021-05-30T11:25:20.000Z | 2021-05-30T11:25:20.000Z | # import pytest
class TestCalendar:
def test_events(self): # synced
assert True
def test_event(self): # synced
assert True
def test_new_event(self): # synced
assert True
| 16.384615 | 39 | 0.624413 | 26 | 213 | 4.961538 | 0.5 | 0.162791 | 0.372093 | 0.465116 | 0.651163 | 0.418605 | 0 | 0 | 0 | 0 | 0 | 0 | 0.309859 | 213 | 12 | 40 | 17.75 | 0.877551 | 0.159624 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.428571 | 1 | 0.428571 | false | 0 | 0 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
453ff6ed71daa082904511b1c47aad1e6bfce96a | 13,026 | py | Python | models/encoder.py | laddie132/LW-PT | 28b469ba68a5d4fba68b992cff5372e63ec2ed42 | [
"MIT"
] | 9 | 2020-08-20T18:15:43.000Z | 2022-02-10T02:54:30.000Z | models/encoder.py | laddie132/LW-PT | 28b469ba68a5d4fba68b992cff5372e63ec2ed42 | [
"MIT"
] | 1 | 2021-11-19T01:29:47.000Z | 2021-11-19T09:58:38.000Z | models/encoder.py | laddie132/LW-PT | 28b469ba68a5d4fba68b992cff5372e63ec2ed42 | [
"MIT"
] | 3 | 2021-05-29T02:11:34.000Z | 2021-12-14T15:43:22.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
__author__ = "Han"
__email__ = "liuhan132@foxmail.com"
from .layers import *
class LWBiRNNEncoder(torch.nn.Module):
"""
Label-Wise Bidirectional RNN encoder on word-level for document representation
Inputs:
doc_emb: (batch, doc_len, emb_dim)
doc_mask: (batch, doc_len)
label: (batch, label_size) or None
Outputs:
doc_rep: (batch, hidden_size * 2) / (batch, label_size, hidden_size * 2)
"""
def __init__(self, model_config):
super(LWBiRNNEncoder, self).__init__()
embedding_dim = model_config['embedding_dim']
hidden_size = model_config['hidden_size']
label_size = model_config['label_size']
dropout_p = model_config['dropout_p']
enable_layer_norm = model_config['layer_norm']
cell = model_config['cell']
num_layers = model_config['num_layers']
self.doc_rnn = MyRNNBase(mode=cell,
input_size=embedding_dim,
hidden_size=hidden_size,
bidirectional=True,
dropout_p=dropout_p,
enable_layer_norm=enable_layer_norm,
batch_first=True,
num_layers=num_layers)
self.doc_attention = MultiHeadSelfAttention(in_features=hidden_size * 2,
labels=label_size)
def forward(self, doc_emb, doc_mask, label=None):
visual_parm = {}
# (batch, doc_len, hidden_size * 2)
doc_rep, _ = self.doc_rnn(doc_emb, doc_mask)
# (batch, hidden_size * 2) / (batch, label_size, hidden_size * 2)
doc_rep, doc_word_att_p = self.doc_attention(doc_rep, label, doc_mask)
visual_parm['doc_word_att_p'] = doc_word_att_p
return doc_rep, visual_parm
class HLWANEncoder(torch.nn.Module):
"""
Hierarchical Label-Wise Attention Network for document representation
Inputs:
doc_emb: (batch, doc_sent_len, doc_word_len, emb_dim)
doc_mask: (batch, doc_sent_len, doc_word_len)
label: (batch, label_size)
Outputs:
        doc_rep: (batch, hidden_size * 2) / (batch, label_size, hidden_size * 2)
"""
def __init__(self, model_config):
super(HLWANEncoder, self).__init__()
embedding_dim = model_config['embedding_dim']
hidden_size = model_config['hidden_size']
label_size = model_config['label_size']
dropout_p = model_config['dropout_p']
enable_layer_norm = model_config['layer_norm']
cell = model_config['cell']
num_layers = model_config['num_layers']
self.doc_word_rnn = MyRNNBase(mode=cell,
input_size=embedding_dim,
hidden_size=hidden_size,
bidirectional=True,
dropout_p=dropout_p,
enable_layer_norm=enable_layer_norm,
batch_first=True,
num_layers=num_layers)
self.doc_word_attention = MultiHeadSelfAttention(in_features=hidden_size * 2,
labels=label_size)
self.doc_sentence_rnn = MyRNNBase(mode=cell,
input_size=hidden_size * 2,
hidden_size=hidden_size,
bidirectional=True,
dropout_p=dropout_p,
enable_layer_norm=enable_layer_norm,
batch_first=True,
num_layers=num_layers)
self.doc_sentence_attention = MultiHeadSelfAttention(in_features=hidden_size * 2,
labels=label_size)
def forward(self, doc_emb, doc_mask, label=None):
visual_parm = {}
batch, doc_sent_len, doc_word_len, _ = doc_emb.size()
doc_word_emb = doc_emb.view(batch * doc_sent_len, doc_word_len, -1)
doc_word_mask = doc_mask.view(batch * doc_sent_len, doc_word_len)
if label is not None:
word_level_label = label.unsqueeze(1).expand(batch, doc_sent_len, -1).contiguous()
word_level_label = word_level_label.view(batch * doc_sent_len, -1)
else:
word_level_label = None
# (batch * doc_sent_len, doc_word_len, hidden_size * 2)
doc_word_rep, _ = self.doc_word_rnn(doc_word_emb, doc_word_mask)
# (batch * doc_sent_len, hidden_size * 2) / (batch * doc_sent_len, label_size, hidden_size * 2)
doc_sent_emb, doc_word_att_p = self.doc_word_attention(doc_word_rep, word_level_label, doc_word_mask)
visual_parm['doc_word_att_p'] = doc_word_att_p
doc_sent_mask = compute_top_layer_mask(doc_mask) # TODO: mismatch for aapd hier
if label is not None:
# (batch, doc_sent_len, hidden_size * 2)
doc_sent_emb = doc_sent_emb.view(batch, doc_sent_len, -1)
else:
# (batch * label_size, doc_sent_len, hidden * 2)
label_size = doc_sent_emb.shape[1]
doc_sent_emb = doc_sent_emb.view(batch, doc_sent_len, label_size, -1).transpose(1, 2)
doc_sent_emb = doc_sent_emb.reshape(batch * label_size, doc_sent_len, -1)
doc_sent_mask = doc_sent_mask.unsqueeze(1).expand(-1, label_size, -1)
doc_sent_mask = doc_sent_mask.reshape(batch * label_size, -1)
# (batch, doc_sent_len, hidden_size * 2) / (batch * label_size, doc_sent_len, hidden_size * 2)
doc_sent_rep, _ = self.doc_sentence_rnn(doc_sent_emb, doc_sent_mask)
doc_sent_len = doc_sent_rep.shape[1]
doc_sent_mask = doc_sent_mask[:, :doc_sent_len]
# (batch, hidden_size * 2) / (batch * label_size, label_size, hidden_size * 2)
doc_rep, doc_sent_att_p = self.doc_sentence_attention(doc_sent_rep, label, doc_sent_mask)
if label is None:
label_size = doc_rep.shape[1]
hsize = doc_rep.shape[-1]
doc_rep = doc_rep.view(batch, label_size, label_size, -1)
select_idx = torch.eye(label_size, device=doc_rep.device).unsqueeze(-1).\
repeat(batch, 1, 1, hsize).bool()
doc_rep = doc_rep[select_idx].view(batch, label_size, -1) # (batch, label_size, h * 2)
doc_sent_att_p = doc_sent_att_p.view(batch, label_size, label_size, -1)
select_idx = torch.eye(label_size, device=doc_rep.device).unsqueeze(-1).\
repeat(batch, 1, 1, doc_sent_len).bool()
doc_sent_att_p = doc_sent_att_p[select_idx].view(batch, label_size, -1) # (batch, label_size, len)
visual_parm['doc_sent_att_p'] = doc_sent_att_p
return doc_rep, visual_parm
class BiRNNEncoder(torch.nn.Module):
"""
Bidirectional RNN encoder on word-level for document representation
Inputs:
doc_emb: (batch, doc_len, emb_dim)
doc_mask: (batch, doc_len)
label: (batch, label_size)
Outputs:
doc_rep: (batch, hidden_size * 2)
"""
def __init__(self, model_config):
super(BiRNNEncoder, self).__init__()
embedding_dim = model_config['embedding_dim']
hidden_size = model_config['hidden_size']
dropout_p = model_config['dropout_p']
enable_layer_norm = model_config['layer_norm']
cell = model_config['cell']
num_layers = model_config['num_layers']
self.doc_rnn = MyRNNBase(mode=cell,
input_size=embedding_dim,
hidden_size=hidden_size,
bidirectional=True,
dropout_p=dropout_p,
enable_layer_norm=enable_layer_norm,
batch_first=True,
num_layers=num_layers)
self.doc_attention = SelfAttention(in_features=hidden_size * 2)
def forward(self, doc_emb, doc_mask, label=None):
visual_parm = {}
# (batch, doc_len, hidden_size * 2)
doc_rep, _ = self.doc_rnn(doc_emb, doc_mask)
# (batch, hidden_size * 2)
doc_rep, doc_word_att_p = self.doc_attention(doc_rep, doc_mask)
visual_parm['doc_word_att_p'] = doc_word_att_p
return doc_rep, visual_parm
class HANEncoder(torch.nn.Module):
"""
Hierarchical Attention Network for document representation
Inputs:
doc_emb: (batch, doc_sent_len, doc_word_len, emb_dim)
doc_mask: (batch, doc_sent_len, doc_word_len)
Outputs:
doc_rep: (batch, hidden_size * 2)
"""
def __init__(self, model_config):
super(HANEncoder, self).__init__()
embedding_dim = model_config['embedding_dim']
hidden_size = model_config['hidden_size']
dropout_p = model_config['dropout_p']
enable_layer_norm = model_config['layer_norm']
cell = model_config['cell']
num_layers = model_config['num_layers']
self.doc_word_rnn = MyRNNBase(mode=cell,
input_size=embedding_dim,
hidden_size=hidden_size,
bidirectional=True,
dropout_p=dropout_p,
enable_layer_norm=enable_layer_norm,
batch_first=True,
num_layers=num_layers)
self.doc_word_attention = SelfAttention(in_features=hidden_size * 2)
self.doc_sentence_rnn = MyRNNBase(mode=cell,
input_size=hidden_size * 2,
hidden_size=hidden_size,
bidirectional=True,
dropout_p=dropout_p,
enable_layer_norm=enable_layer_norm,
batch_first=True,
num_layers=num_layers)
self.doc_sentence_attention = SelfAttention(in_features=hidden_size * 2)
def forward(self, doc_emb, doc_mask, label=None):
visual_parm = {}
batch, doc_sent_len, doc_word_len, _ = doc_emb.size()
doc_word_emb = doc_emb.view(batch * doc_sent_len, doc_word_len, -1)
doc_word_mask = doc_mask.view(batch * doc_sent_len, doc_word_len)
# (batch * doc_sent_len, doc_word_len, hidden_size * 2)
doc_word_rep, _ = self.doc_word_rnn(doc_word_emb, doc_word_mask)
# (batch * doc_sent_len, hidden_size * 2)
doc_sent_emb, doc_word_att_p = self.doc_word_attention(doc_word_rep, doc_word_mask)
visual_parm['doc_word_att_p'] = doc_word_att_p
# (batch, doc_sent_len, hidden_size * 2)
doc_sent_emb = doc_sent_emb.view(batch, doc_sent_len, -1)
doc_sent_mask = compute_top_layer_mask(doc_mask)
# (batch, doc_sent_len, hidden_size * 2)
doc_sent_rep, _ = self.doc_sentence_rnn(doc_sent_emb, doc_sent_mask)
doc_sent_len = doc_sent_rep.shape[1]
doc_sent_mask = doc_sent_mask[:, :doc_sent_len]
# (batch, hidden_size * 2)
doc_rep, doc_sent_att_p = self.doc_sentence_attention(doc_sent_rep, doc_sent_mask)
visual_parm['doc_sent_att_p'] = doc_sent_att_p
return doc_rep, visual_parm
class CNNEncoder(torch.nn.Module):
def __init__(self, model_config):
super(CNNEncoder, self).__init__()
embedding_dim = model_config['embedding_dim']
dropout_p = model_config['dropout_p']
Co = 100
Ks = [3, 4, 5]
self.convs1 = torch.nn.ModuleList([torch.nn.Conv2d(1, Co, (K, embedding_dim)) for K in Ks])
'''
self.conv13 = nn.Conv2d(Ci, Co, (3, D))
self.conv14 = nn.Conv2d(Ci, Co, (4, D))
self.conv15 = nn.Conv2d(Ci, Co, (5, D))
'''
self.dropout = torch.nn.Dropout(dropout_p)
def conv_and_pool(self, x, conv):
x = F.relu(conv(x)).squeeze(3) # (N, Co, W)
x = F.max_pool1d(x, x.size(2)).squeeze(2)
return x
def forward(self, x, *args):
x = x.unsqueeze(1) # (N, Ci, W, D)
x = [F.relu(conv(x)).squeeze(3) for conv in self.convs1] # [(N, Co, W), ...]*len(Ks)
x = [F.max_pool1d(i, i.size(2)).squeeze(2) for i in x] # [(N, Co), ...]*len(Ks)
x = torch.cat(x, 1)
'''
x1 = self.conv_and_pool(x,self.conv13) #(N,Co)
x2 = self.conv_and_pool(x,self.conv14) #(N,Co)
x3 = self.conv_and_pool(x,self.conv15) #(N,Co)
x = torch.cat((x1, x2, x3), 1) # (N,len(Ks)*Co)
'''
x = self.dropout(x) # (N, len(Ks)*Co)
return x, None
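The convolution-and-pool flow in `CNNEncoder.forward()` above can be traced shape-by-shape with plain PyTorch. In this sketch `Co = 100` and `Ks = [3, 4, 5]` match the defaults hard-coded in the class, while the batch size `N`, sequence length `W`, and `embedding_dim` are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

# Illustrative sizes; Co and Ks mirror the CNNEncoder defaults above.
N, W, embedding_dim = 8, 30, 50
Co, Ks = 100, [3, 4, 5]

convs = torch.nn.ModuleList(
    [torch.nn.Conv2d(1, Co, (K, embedding_dim)) for K in Ks]
)

x = torch.randn(N, W, embedding_dim)                    # (N, W, D) word embeddings
x = x.unsqueeze(1)                                      # (N, 1, W, D): add channel dim
x = [F.relu(conv(x)).squeeze(3) for conv in convs]      # each (N, Co, W - K + 1)
x = [F.max_pool1d(i, i.size(2)).squeeze(2) for i in x]  # each (N, Co)
x = torch.cat(x, 1)                                     # (N, len(Ks) * Co)
```

Max-over-time pooling keeps only the strongest filter response per feature map, so the final representation is `(N, len(Ks) * Co)` regardless of the input sequence length `W`.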
| 42.568627 | 111 | 0.577691 | 1,652 | 13,026 | 4.142857 | 0.085956 | 0.07671 | 0.046756 | 0.052601 | 0.841029 | 0.831239 | 0.808445 | 0.77294 | 0.753653 | 0.73948 | 0 | 0.012755 | 0.325887 | 13,026 | 305 | 112 | 42.708197 | 0.766655 | 0.150699 | 0 | 0.691892 | 0 | 0 | 0.036031 | 0.002002 | 0 | 0 | 0 | 0.003279 | 0 | 1 | 0.059459 | false | 0 | 0.005405 | 0 | 0.124324 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
18a6176a4ac419eb508a129af46b130cc0b1f5ff | 398 | py | Python | h/claim/util.py | noscripter/h | a7a4095a46683ea08dae62335bbcd53f7ab313e2 | [
"MIT"
] | null | null | null | h/claim/util.py | noscripter/h | a7a4095a46683ea08dae62335bbcd53f7ab313e2 | [
"MIT"
] | null | null | null | h/claim/util.py | noscripter/h | a7a4095a46683ea08dae62335bbcd53f7ab313e2 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
def generate_claim_token(request, userid):
return request.registry.claim_serializer.dumps({'userid': userid})
def generate_claim_url(request, userid):
''' Generates a url that a user can visit to claim their account. '''
token = generate_claim_token(request, userid)
return request.route_url('claim_account', token=token)
def includeme(config):
pass
| 28.428571 | 73 | 0.723618 | 53 | 398 | 5.264151 | 0.509434 | 0.139785 | 0.114695 | 0.179211 | 0.315412 | 0.315412 | 0.315412 | 0 | 0 | 0 | 0 | 0.002976 | 0.155779 | 398 | 13 | 74 | 30.615385 | 0.827381 | 0.213568 | 0 | 0 | 1 | 0 | 0.062092 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.142857 | 0 | 0.142857 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
7a00120e15918c9370a96ab94db0dd57a5ce8b17 | 32 | py | Python | teste.py | gabrielvieira1/Python-course | 20e6deca6a75a41e71dcc311ce78acc443feb642 | [
"MIT"
] | null | null | null | teste.py | gabrielvieira1/Python-course | 20e6deca6a75a41e71dcc311ce78acc443feb642 | [
"MIT"
] | null | null | null | teste.py | gabrielvieira1/Python-course | 20e6deca6a75a41e71dcc311ce78acc443feb642 | [
"MIT"
] | null | null | null | print('Geek University gabriel') | 32 | 32 | 0.8125 | 4 | 32 | 6.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 32 | 1 | 32 | 32 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0.69697 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
e13b598e657c8eeba9152419e8963d91acf68028 | 123 | py | Python | self_supervised/models/layers/__init__.py | jianzhnie/self_supervised | d1e0f31ab032150ab0ad007c1e19773135a5fb79 | [
"Apache-2.0"
] | 1 | 2021-12-13T12:31:47.000Z | 2021-12-13T12:31:47.000Z | self_supervised/models/layers/__init__.py | jianzhnie/self_supervised | d1e0f31ab032150ab0ad007c1e19773135a5fb79 | [
"Apache-2.0"
] | null | null | null | self_supervised/models/layers/__init__.py | jianzhnie/self_supervised | d1e0f31ab032150ab0ad007c1e19773135a5fb79 | [
"Apache-2.0"
] | null | null | null | '''
Author: jianzhnie
Date: 2021-12-14 15:15:19
LastEditTime: 2021-12-14 15:15:20
LastEditors: jianzhnie
Description:
'''
| 13.666667 | 33 | 0.731707 | 19 | 123 | 4.736842 | 0.631579 | 0.133333 | 0.177778 | 0.222222 | 0.266667 | 0 | 0 | 0 | 0 | 0 | 0 | 0.256881 | 0.113821 | 123 | 8 | 34 | 15.375 | 0.568807 | 0.918699 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e13b776c9b1fef30b51f97637730369a1abc1e67 | 36 | py | Python | celery_app.py | Hack4Eugene/15th-night-by-dumpsterfire | e130c5bee868f11e7f5c34f1efc5ad6a6dcc022a | [
"MIT"
] | null | null | null | celery_app.py | Hack4Eugene/15th-night-by-dumpsterfire | e130c5bee868f11e7f5c34f1efc5ad6a6dcc022a | [
"MIT"
] | null | null | null | celery_app.py | Hack4Eugene/15th-night-by-dumpsterfire | e130c5bee868f11e7f5c34f1efc5ad6a6dcc022a | [
"MIT"
] | null | null | null | from _15thnight.queue import celery
| 18 | 35 | 0.861111 | 5 | 36 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.111111 | 36 | 1 | 36 | 36 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e13ec23a33c2e6713e9fa947efca242468c7808c | 36 | py | Python | rplugin/python3/lfx/__init__.py | dcampos/nvim-ulf | 9b70096bf04133a33f09691b1ef1ac89f8f29a85 | [
"MIT"
] | 2 | 2020-10-15T09:34:53.000Z | 2020-10-26T10:31:48.000Z | rplugin/python3/lfx/__init__.py | dcampos/nvim-ulf | 9b70096bf04133a33f09691b1ef1ac89f8f29a85 | [
"MIT"
] | null | null | null | rplugin/python3/lfx/__init__.py | dcampos/nvim-ulf | 9b70096bf04133a33f09691b1ef1ac89f8f29a85 | [
"MIT"
] | 1 | 2020-10-15T09:35:05.000Z | 2020-10-15T09:35:05.000Z | from .lfx import LFX, RequestHelper
| 18 | 35 | 0.805556 | 5 | 36 | 5.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 36 | 1 | 36 | 36 | 0.935484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e16cec62a95786acfea3c7181b4e46463c00af5b | 94 | py | Python | acondbs/blueprint/graphql_ide/__init__.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | [
"MIT"
] | null | null | null | acondbs/blueprint/graphql_ide/__init__.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | [
"MIT"
] | 24 | 2020-04-02T19:29:07.000Z | 2022-03-08T03:05:43.000Z | acondbs/blueprint/graphql_ide/__init__.py | simonsobs/acondbs | 6ca11c2889d827ecdb2b54d0cf3b94b8cdd281e6 | [
"MIT"
] | 1 | 2020-04-08T15:48:28.000Z | 2020-04-08T15:48:28.000Z | from .graphiql_newer import GRAPHIQL_NEWER
from .graphql_playground import GRAPHQL_PLAYGROUND
| 31.333333 | 50 | 0.893617 | 12 | 94 | 6.666667 | 0.5 | 0.325 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 94 | 2 | 51 | 47 | 0.930233 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e19c251189f575ab7ec7168997d0b6307ee0edb0 | 19 | py | Python | python/testData/psi/FStringTerminatedByQuoteOfNestedStringLiteral.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/psi/FStringTerminatedByQuoteOfNestedStringLiteral.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/psi/FStringTerminatedByQuoteOfNestedStringLiteral.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | s = f'{f"{'foo'}"}' | 19 | 19 | 0.315789 | 4 | 19 | 1.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 19 | 1 | 19 | 19 | 0.352941 | 0 | 0 | 0 | 0 | 0 | 0.35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e1ac294cd876b4c88339f435a34c4e7092e67025 | 1,280 | py | Python | blog/migrations/0006_auto_20191202_1249.py | TEOTD/Travel | 013deb247429cc4fbff910d1d068a7b87bebe324 | [
"MIT"
] | null | null | null | blog/migrations/0006_auto_20191202_1249.py | TEOTD/Travel | 013deb247429cc4fbff910d1d068a7b87bebe324 | [
"MIT"
] | null | null | null | blog/migrations/0006_auto_20191202_1249.py | TEOTD/Travel | 013deb247429cc4fbff910d1d068a7b87bebe324 | [
"MIT"
] | null | null | null | # Generated by Django 2.2.7 on 2019-12-02 12:49
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('blog', '0005_book'),
]
operations = [
migrations.AlterField(
model_name='places',
name='adv1',
field=models.CharField(max_length=100),
),
migrations.AlterField(
model_name='places',
name='adv2',
field=models.CharField(max_length=100),
),
migrations.AlterField(
model_name='places',
name='adv3',
field=models.CharField(max_length=100),
),
migrations.AlterField(
model_name='places',
name='adv4',
field=models.CharField(max_length=100),
),
migrations.AlterField(
model_name='places',
name='name',
field=models.CharField(max_length=100),
),
migrations.AlterField(
model_name='places',
name='per',
field=models.CharField(max_length=100),
),
migrations.AlterField(
model_name='places',
name='price',
field=models.CharField(max_length=100),
),
]
| 26.122449 | 51 | 0.527344 | 118 | 1,280 | 5.59322 | 0.313559 | 0.212121 | 0.265152 | 0.307576 | 0.75303 | 0.75303 | 0.645455 | 0.645455 | 0.645455 | 0.645455 | 0 | 0.053204 | 0.353906 | 1,280 | 48 | 52 | 26.666667 | 0.744861 | 0.035156 | 0 | 0.666667 | 1 | 0 | 0.067315 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.02381 | 0 | 0.095238 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e1c0622e3670de287f2c139a1ee37d8b6f011285 | 6,669 | py | Python | dlmb/losses.py | Jonathan-Andrews/dlmb | 552148bcac2ffb4308c8db24599c458652684ed2 | [
"MIT"
] | 5 | 2019-11-23T13:32:21.000Z | 2022-01-01T16:32:48.000Z | dlmb/losses.py | Jonathan-Andrews/dlmb | 552148bcac2ffb4308c8db24599c458652684ed2 | [
"MIT"
] | null | null | null | dlmb/losses.py | Jonathan-Andrews/dlmb | 552148bcac2ffb4308c8db24599c458652684ed2 | [
"MIT"
] | null | null | null | from abc import ABCMeta, abstractmethod
import numpy as np
from utils.function_helpers import *
class Base_Loss(metaclass=ABCMeta):
@abstractmethod
def __init__(self) -> None:
"""
The Base_Loss class is an abstract class for all loss functions.
All loss functions must inherit from Base_Loss.
"""
pass
@abstractmethod
def map_data(self, y_true:np.ndarray, y_pred:np.ndarray) -> np.ndarray:
"""
map_data() takes some data and applies a mathematical mapping to it.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Return:
output : np.ndarray : An n dimensional numpy array of the mapped data.
"""
return output
@abstractmethod
def calculate_gradients(self, y_true:np.ndarray, y_pred:np.ndarray) -> np.ndarray:
"""
calculate_gradients returns the derivative of the loss function W.R.T the data.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Return:
output : np.ndarray : An n dimensional numpy array of gradients.
"""
return output
class Mean_Squared_Error(Base_Loss):
def __init__(self) -> None:
"""
The MSE class is a commonly used regression loss function.
"""
pass
@accepts(self="any", y_true=np.ndarray, y_pred=np.ndarray)
def map_data(self, y_true, y_pred) -> np.ndarray:
"""
Calculates the squared distance between y_true and y_pred.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Returns:
output : np.ndarray : An n dimensional numpy array of the mean squared distance between y_true and y_pred.
"""
return (y_pred-y_true)**2/2
@accepts(self="any", y_true=np.ndarray, y_pred=np.ndarray)
def calculate_gradients(self, y_true, y_pred) -> np.ndarray:
"""
Calculates the derivatives of the function W.R.T y_pred.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Returns:
output : np.ndarray : An n dimensional numpy array of the calculated derivatives of the function W.R.T y_pred.
"""
return y_pred - y_true
class Binary_Crossentropy(Base_Loss):
def __init__(self) -> None:
"""
		The Binary_Crossentropy class measures the performance of a classification model whose output is a probability value between 0 and 1,
and where the number of outputs is less than 3.
"""
pass
@accepts(self="any", y_true=np.ndarray, y_pred=np.ndarray)
def map_data(self, y_true, y_pred) -> np.ndarray:
"""
Calculates the distance between y_true and y_pred.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Returns:
			output : np.ndarray : The binary cross-entropy between y_true and y_pred.
"""
part1 = y_true*np.log(y_pred+1.0e-8) # I add 1.0e-8 to make sure 0 isn't going into np.log
part2 = (1-y_true)*np.log(1-y_pred+1.0e-8)
return -(part1 + part2)
@accepts(self="any", y_true=np.ndarray, y_pred=np.ndarray)
def calculate_gradients(self, y_true, y_pred) -> np.ndarray:
"""
Calculates the derivatives of the function W.R.T y_pred.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Returns:
output : np.ndarray : An n dimensional numpy array of the calculated derivatives of the function W.R.T y_pred.
"""
return division_check(y_true,y_pred) - division_check(1-y_true, 1-y_pred)
class Crossentropy(Base_Loss):
def __init__(self) -> None:
"""
The Crossentropy class measures the performance of a classification model whose output is a probability value between 0 and 1,
and where the number of outputs is more than 2.
"""
pass
@accepts(self="any", y_true=np.ndarray, y_pred=np.ndarray)
def map_data(self, y_true, y_pred) -> np.ndarray:
"""
Calculates the distance between y_true and y_pred.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Returns:
			output : np.ndarray : The cross-entropy between y_true and y_pred.
"""
return -(y_true*np.log(y_pred+1.0e-8))
	@accepts(self="any", y_true=np.ndarray, y_pred=np.ndarray)
	def calculate_gradients(self, y_true, y_pred) -> np.ndarray:
"""
Calculates the derivatives of the function W.R.T y_pred.
Arguments:
y_true : np.ndarray : An n dimensional numpy array of target values for the output of a neural-net model.
y_pred : np.ndarray : An n dimensional numpy array of predicted values from a neural-net model.
Returns:
output : np.ndarray : An n dimensional numpy array of the calculated derivatives of the function W.R.T y_pred.
"""
return division_check(y_true, y_pred)
def get(loss) -> Base_Loss:
"""
Finds and returns the correct loss function.
Arguments:
loss : Base_Loss/str : The loss function the user wants to use.
Returns:
loss : Base_Loss : The correct loss function.
"""
if isinstance(loss, str):
if loss.lower() in ("mse", "mean_squared_error"):
return Mean_Squared_Error()
elif loss.lower() in ("bc", "bce", "binary_crossentropy"):
return Binary_Crossentropy()
elif loss.lower() in ("ce", "crossentropy"):
return Crossentropy()
else:
			print("At losses.get(): '%s' is not an available loss function. Has been set to 'Mean_Squared_Error' by default" % loss)
			return Mean_Squared_Error()
elif isinstance(loss, Base_Loss):
return loss
else:
		raise ValueError("At losses.get(): Expected 'class inheriting from Base_Loss' or 'str' for the argument 'loss', received '%s'" % type(loss))
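As a self-contained numeric check of the `Mean_Squared_Error` mapping and gradient defined above, the same formulas can be evaluated with plain NumPy (the input values are arbitrary illustrations):

```python
import numpy as np

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.6])

# map_data: element-wise squared error halved; halving makes the gradient
# reduce to simply y_pred - y_true (the 1/2 cancels the exponent).
loss = (y_pred - y_true) ** 2 / 2
grads = y_pred - y_true
```

Here `loss` evaluates to `[0.005, 0.02, 0.08]` and `grads` to `[-0.1, 0.2, -0.4]`, matching what `map_data()` and `calculate_gradients()` return for these arrays.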
| 29.25 | 143 | 0.691258 | 1,032 | 6,669 | 4.340116 | 0.141473 | 0.09645 | 0.05403 | 0.058942 | 0.737888 | 0.714445 | 0.707524 | 0.697477 | 0.678946 | 0.670685 | 0 | 0.005593 | 0.222522 | 6,669 | 227 | 144 | 29.378855 | 0.858245 | 0.595142 | 0 | 0.446429 | 0 | 0.035714 | 0.12066 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.232143 | false | 0.071429 | 0.053571 | 0 | 0.589286 | 0.017857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
e1c913142bc63abc8cec3392d856474b28b98f6c | 30 | py | Python | sscls/datasets/utils/__init__.py | poodarchu/sscls | 8b1bd94b1ef4f0cef3ec6ecbb48be9dab129687b | [
"MIT"
] | 2 | 2020-04-26T13:41:24.000Z | 2020-05-06T10:15:06.000Z | sscls/datasets/utils/__init__.py | poodarchu/sscls | 8b1bd94b1ef4f0cef3ec6ecbb48be9dab129687b | [
"MIT"
] | null | null | null | sscls/datasets/utils/__init__.py | poodarchu/sscls | 8b1bd94b1ef4f0cef3ec6ecbb48be9dab129687b | [
"MIT"
] | null | null | null | from .imgproc import imdecode
| 15 | 29 | 0.833333 | 4 | 30 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.961538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bee12bb84a70201abbdc7911e6ca9609efd95316 | 40 | py | Python | pycozmo/expressions/__init__.py | nalbion/pycozmo | 35ee1ea741ecf7a39affc38d4ff5ad17865fea16 | [
"MIT"
] | 123 | 2019-08-25T21:28:23.000Z | 2022-03-12T13:54:59.000Z | pycozmo/expressions/__init__.py | nalbion/pycozmo | 35ee1ea741ecf7a39affc38d4ff5ad17865fea16 | [
"MIT"
] | 41 | 2019-08-25T21:21:37.000Z | 2022-02-09T14:20:54.000Z | pycozmo/expressions/__init__.py | nalbion/pycozmo | 35ee1ea741ecf7a39affc38d4ff5ad17865fea16 | [
"MIT"
] | 51 | 2019-09-04T13:30:02.000Z | 2022-01-09T01:20:24.000Z |
from .expressions import * # noqa
| 13.333333 | 38 | 0.625 | 4 | 40 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.3 | 40 | 2 | 39 | 20 | 0.892857 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
83531a340bd8c3617aff81809391e3e8b6eb63cb | 14,984 | py | Python | tests/models/programdb/usage_profile/usage_profile_integration_test.py | weibullguy/ramstk | 3ec41d7e2933045a7a8028aed6c6b04365495095 | [
"BSD-3-Clause"
] | 4 | 2018-08-26T09:11:36.000Z | 2019-05-24T12:01:02.000Z | tests/models/programdb/usage_profile/usage_profile_integration_test.py | weibullguy/ramstk | 3ec41d7e2933045a7a8028aed6c6b04365495095 | [
"BSD-3-Clause"
] | 52 | 2018-08-24T12:51:22.000Z | 2020-12-28T04:59:42.000Z | tests/models/programdb/usage_profile/usage_profile_integration_test.py | weibullguy/ramstk | 3ec41d7e2933045a7a8028aed6c6b04365495095 | [
"BSD-3-Clause"
] | 1 | 2018-10-11T07:57:55.000Z | 2018-10-11T07:57:55.000Z | # pylint: skip-file
# type: ignore
# -*- coding: utf-8 -*-
#
# tests.models.usage_profile.usage_profile_integration_test.py is part of The
# RAMSTK Project
#
# All rights reserved.
# Copyright since 2007 Doyle "weibullguy" Rowland doyle.rowland <AT> reliaqual <DOT> com
"""Class for testing Usage Profile integrations."""
# Third Party Imports
import pytest
from pubsub import pub
from treelib import Tree
# RAMSTK Package Imports
from ramstk.models.dbrecords import (
RAMSTKEnvironmentRecord,
RAMSTKMissionPhaseRecord,
RAMSTKMissionRecord,
)
from ramstk.models.dbviews import RAMSTKUsageProfileView
@pytest.mark.usefixtures(
"test_view_model",
"test_mission_table_model",
"test_mission_phase_table_model",
"test_environment_table_model",
)
class TestSelectUsageProfile:
"""Class for testing Usage Profile do_select() and do_select_all() methods."""
def on_succeed_on_select_all(self, tree):
"""Listen for succeed_retrieve messages."""
assert isinstance(tree, Tree)
assert isinstance(tree.get_node("1").data["usage_profile"], RAMSTKMissionRecord)
assert isinstance(
tree.get_node("1.1").data["usage_profile"], RAMSTKMissionPhaseRecord
)
assert isinstance(
tree.get_node("1.1.1").data["usage_profile"], RAMSTKEnvironmentRecord
)
print("\033[36m\n\tsucceed_retrieve_usage_profile topic was broadcast.")
@pytest.mark.integration
def test_on_select_all(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should return records tree with missions, mission phases, environments."""
pub.subscribe(self.on_succeed_on_select_all, "succeed_retrieve_usage_profile")
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(attributes={"revision_id": 1})
test_environment_table_model.do_select_all(attributes={"revision_id": 1})
assert isinstance(
test_view_model.tree.get_node("1").data["usage_profile"],
RAMSTKMissionRecord,
)
assert isinstance(
test_view_model.tree.get_node("1.1").data["usage_profile"],
RAMSTKMissionPhaseRecord,
)
assert isinstance(
test_view_model.tree.get_node("1.1.1").data["usage_profile"],
RAMSTKEnvironmentRecord,
)
assert isinstance(
test_view_model.tree.get_node("2").data["usage_profile"],
RAMSTKMissionRecord,
)
assert isinstance(
test_view_model.tree.get_node("2.2").data["usage_profile"],
RAMSTKMissionPhaseRecord,
)
assert isinstance(
test_view_model.tree.get_node("2.2.2").data["usage_profile"],
RAMSTKEnvironmentRecord,
)
assert isinstance(
test_view_model.tree.get_node("3").data["usage_profile"],
RAMSTKMissionRecord,
)
assert isinstance(
test_view_model.tree.get_node("3.3").data["usage_profile"],
RAMSTKMissionPhaseRecord,
)
assert isinstance(
test_view_model.tree.get_node("3.3.3").data["usage_profile"],
RAMSTKEnvironmentRecord,
)
pub.unsubscribe(self.on_succeed_on_select_all, "succeed_retrieve_usage_profile")
@pytest.mark.integration
def test_on_select_all_populated_tree(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should clear existing nodes from the records tree and then re-populate."""
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(attributes={"revision_id": 1})
test_environment_table_model.do_select_all(attributes={"revision_id": 1})
assert isinstance(
test_view_model.tree.get_node("1").data["usage_profile"],
RAMSTKMissionRecord,
)
assert isinstance(
test_view_model.tree.get_node("1.1").data["usage_profile"],
RAMSTKMissionPhaseRecord,
)
assert isinstance(
test_view_model.tree.get_node("1.1.1").data["usage_profile"],
RAMSTKEnvironmentRecord,
)
pub.subscribe(self.on_succeed_on_select_all, "succeed_retrieve_usage_profile")
test_view_model.on_select_all()
assert isinstance(
test_view_model.tree.get_node("1").data["usage_profile"],
RAMSTKMissionRecord,
)
assert isinstance(
test_view_model.tree.get_node("1.1").data["usage_profile"],
RAMSTKMissionPhaseRecord,
)
assert isinstance(
test_view_model.tree.get_node("1.1.1").data["usage_profile"],
RAMSTKEnvironmentRecord,
)
pub.unsubscribe(self.on_succeed_on_select_all, "succeed_retrieve_usage_profile")
@pytest.mark.integration
def test_on_select_all_empty_base_tree(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should return an empty records tree if the base tree is empty."""
test_view_model._dic_trees["mission"] = Tree()
assert test_view_model.on_select_all() is None
assert test_view_model.tree.depth() == 0
@pytest.mark.usefixtures(
"test_view_model",
"test_mission_table_model",
"test_mission_phase_table_model",
"test_environment_table_model",
)
class TestInsertUsageProfile:
"""Class for testing the Usage Profile on_insert() method."""
def on_succeed_insert_mission(self, tree):
"""Listen for succeed_insert messages."""
assert isinstance(tree, Tree)
assert tree.contains("4")
print(
            "\033[36m\n\tsucceed_retrieve_usage_profile topic was broadcast on "
            "mission insert."
)
def on_succeed_insert_mission_phase(self, tree):
"""Listen for succeed_insert messages."""
assert isinstance(tree, Tree)
assert tree.contains("1.4")
print(
            "\033[36m\n\tsucceed_retrieve_usage_profile topic was broadcast on "
            "mission phase insert."
)
def on_succeed_insert_environment(self, tree):
"""Listen for succeed_insert messages."""
assert isinstance(tree, Tree)
assert tree.contains("1.1.4")
print(
            "\033[36m\n\tsucceed_retrieve_usage_profile topic was broadcast on "
            "environment insert."
)
@pytest.mark.integration
def test_do_insert_mission(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should add a new mission record to the records tree."""
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(
attributes={"revision_id": 1, "mission_id": 1}
)
test_environment_table_model.do_select_all(
attributes={"revision_id": 1, "mission_phase_id": 1}
)
assert not test_view_model.tree.contains("4")
pub.subscribe(self.on_succeed_insert_mission, "succeed_retrieve_usage_profile")
pub.sendMessage(
"request_insert_mission",
attributes={
"revision_id": 1,
"mission_id": 1,
},
)
        assert test_view_model.tree.contains("4")
pub.unsubscribe(
self.on_succeed_insert_mission, "succeed_retrieve_usage_profile"
)
@pytest.mark.integration
def test_do_insert_mission_phase(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should add a new mission phase record to the records tree."""
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(
attributes={"revision_id": 1, "mission_id": 1}
)
test_environment_table_model.do_select_all(
attributes={"revision_id": 1, "mission_phase_id": 1}
)
assert not test_view_model.tree.contains("1.4")
pub.subscribe(
self.on_succeed_insert_mission_phase, "succeed_retrieve_usage_profile"
)
pub.sendMessage(
"request_insert_mission_phase",
attributes={
"revision_id": 1,
"mission_id": 1,
"mission_phase_id": 1,
},
)
assert test_view_model.tree.contains("1.4")
pub.unsubscribe(
self.on_succeed_insert_mission_phase, "succeed_retrieve_usage_profile"
)
@pytest.mark.integration
def test_do_insert_environment(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should add a new environment record to the records tree."""
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(
attributes={"revision_id": 1, "mission_id": 1}
)
test_environment_table_model.do_select_all(
attributes={"revision_id": 1, "mission_id": 1, "mission_phase_id": 1}
)
assert not test_view_model.tree.contains("1.1.4")
pub.subscribe(
self.on_succeed_insert_environment, "succeed_retrieve_usage_profile"
)
pub.sendMessage(
"request_insert_environment",
attributes={
"revision_id": 1,
"mission_id": 1,
"mission_phase_id": 1,
"environment_id": 1,
"name": "Condition Name",
},
)
assert test_view_model.tree.contains("1.1.4")
pub.unsubscribe(
self.on_succeed_insert_environment, "succeed_retrieve_usage_profile"
)
@pytest.mark.usefixtures(
"test_view_model",
"test_mission_table_model",
"test_mission_phase_table_model",
"test_environment_table_model",
)
class TestDeleteUsageProfile:
"""Class for testing the Usage Profile do_delete() method."""
def on_succeed_delete_mission(self, tree):
"""Listen for succeed_delete messages."""
assert isinstance(tree, Tree)
assert not tree.contains("1.1.1")
assert not tree.contains("1.1")
assert not tree.contains("1")
print(
"\033[36m\n\tsucceed_retrieve_usage_profile topic was broadcast on mission "
"delete."
)
def on_succeed_delete_mission_phase(self, tree):
"""Listen for succeed_delete messages."""
assert isinstance(tree, Tree)
assert not tree.contains("2.2.2")
assert not tree.contains("2.2")
assert tree.contains("2")
print(
"\033[36m\n\tsucceed_retrieve_usage_profile topic was broadcast on mission "
"phase delete."
)
def on_succeed_delete_environment(self, tree):
"""Listen for succeed_delete messages."""
assert isinstance(tree, Tree)
assert not tree.contains("3.3.3")
assert tree.contains("3.3")
assert tree.contains("3")
print(
"\033[36m\n\tsucceed_retrieve_usage_profile topic was broadcast on "
"environment delete."
)
@pytest.mark.integration
def test_do_delete_environment(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should remove deleted environment record from the records tree."""
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(
attributes={"revision_id": 1, "mission_id": 3}
)
test_environment_table_model.do_select_all(
attributes={"revision_id": 1, "mission_phase_id": 3}
)
assert test_view_model.tree.contains("3.3.3")
assert test_view_model.tree.contains("3.3")
assert test_view_model.tree.contains("3")
pub.subscribe(
self.on_succeed_delete_environment, "succeed_retrieve_usage_profile"
)
pub.sendMessage("request_delete_environment", node_id=3)
pub.unsubscribe(
self.on_succeed_delete_environment, "succeed_retrieve_usage_profile"
)
@pytest.mark.integration
def test_do_delete_mission_phase(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should remove deleted phase and environment records from records tree."""
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(
attributes={"revision_id": 1, "mission_id": 2}
)
test_environment_table_model.do_select_all(
attributes={"revision_id": 1, "mission_phase_id": 2}
)
assert test_view_model.tree.contains("2.2.2")
assert test_view_model.tree.contains("2.2")
assert test_view_model.tree.contains("2")
pub.subscribe(
self.on_succeed_delete_mission_phase, "succeed_retrieve_usage_profile"
)
pub.sendMessage("request_delete_mission_phase", node_id=2)
pub.unsubscribe(
self.on_succeed_delete_mission_phase, "succeed_retrieve_usage_profile"
)
@pytest.mark.integration
def test_do_delete_mission(
self,
test_view_model,
test_mission_table_model,
test_mission_phase_table_model,
test_environment_table_model,
):
"""Should remove deleted mission, phase, and environment from records tree."""
test_mission_table_model.do_select_all(attributes={"revision_id": 1})
test_mission_phase_table_model.do_select_all(
attributes={"revision_id": 1, "mission_id": 1}
)
test_environment_table_model.do_select_all(
attributes={"revision_id": 1, "mission_phase_id": 1}
)
assert test_view_model.tree.contains("1.1.1")
assert test_view_model.tree.contains("1.1")
assert test_view_model.tree.contains("1")
pub.subscribe(self.on_succeed_delete_mission, "succeed_retrieve_usage_profile")
pub.sendMessage("request_delete_mission", node_id=1)
pub.unsubscribe(
self.on_succeed_delete_mission, "succeed_retrieve_usage_profile"
)
| 34.054545 | 88 | 0.648692 | 1,733 | 14,984 | 5.234853 | 0.079631 | 0.066138 | 0.064484 | 0.056217 | 0.889881 | 0.853726 | 0.818563 | 0.795745 | 0.763558 | 0.672399 | 0 | 0.016073 | 0.256741 | 14,984 | 439 | 89 | 34.132118 | 0.798509 | 0.091831 | 0 | 0.584527 | 0 | 0 | 0.178558 | 0.085797 | 0 | 0 | 0 | 0 | 0.151862 | 1 | 0.045845 | false | 0 | 0.014327 | 0 | 0.068768 | 0.020057 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
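The subscribe → sendMessage → unsubscribe pattern used throughout the tests above comes from PyPubSub (`pub`). As a minimal sketch, the `Broker` class below is a hypothetical stand-in that only mimics the three calls the tests make — it is not PyPubSub's implementation or API:

```python
class Broker:
    """Hypothetical stand-in for PyPubSub's ``pub`` object (illustration only)."""

    def __init__(self):
        # Map topic name -> list of listener callables.
        self._topics = {}

    def subscribe(self, listener, topic):
        self._topics.setdefault(topic, []).append(listener)

    def unsubscribe(self, listener, topic):
        self._topics.get(topic, []).remove(listener)

    def sendMessage(self, topic, **kwargs):
        # Deliver the message's keyword arguments to every current listener.
        for listener in list(self._topics.get(topic, [])):
            listener(**kwargs)


broker = Broker()
received = []


def on_succeed(tree):
    # Plays the role of the on_succeed_* listeners in the tests above.
    received.append(tree)


# Same shape as the tests: subscribe, trigger, then unsubscribe so later
# broadcasts no longer reach the listener.
broker.subscribe(on_succeed, "succeed_retrieve_usage_profile")
broker.sendMessage("succeed_retrieve_usage_profile", tree="fake-tree")
broker.unsubscribe(on_succeed, "succeed_retrieve_usage_profile")
broker.sendMessage("succeed_retrieve_usage_profile", tree="ignored")
```

After the second `sendMessage`, `received` still holds only `"fake-tree"`, which is why the tests pair every `pub.subscribe` with a `pub.unsubscribe` — leaked listeners would fire (and assert) during later tests.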
55cf6a2582273a280e4356d752cb29b4f15375e6 | 168 | py | Python | swampyer/__init__.py | zabertech/python-swampyer | 4181db7d11439cc2a79b5a7d7b2b7c581abf59ec | [
"MIT"
] | 1 | 2022-03-10T00:10:19.000Z | 2022-03-10T00:10:19.000Z | swampyer/__init__.py | zabertech/python-swampyer | 4181db7d11439cc2a79b5a7d7b2b7c581abf59ec | [
"MIT"
] | 4 | 2021-05-12T21:56:56.000Z | 2021-05-12T22:02:20.000Z | swampyer/__init__.py | zabertech/python-swampyer | 4181db7d11439cc2a79b5a7d7b2b7c581abf59ec | [
"MIT"
] | 2 | 2018-06-10T12:50:21.000Z | 2019-10-23T17:16:26.000Z | from .common import *
from .messages import *
from .utils import logger
from .exceptions import *
from .transport import *
from .queues import *
from .client import *
| 18.666667 | 25 | 0.75 | 22 | 168 | 5.727273 | 0.454545 | 0.396825 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172619 | 168 | 8 | 26 | 21 | 0.906475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
55ee606c2fe5385d533c68b78a27c069b69af8d4 | 4,765 | py | Python | maximum_profit_dp.py | ndsvw/Maximum-Profit-Problem | a97359039f448fcb110573d68afc7e87e47821d8 | [
"MIT"
] | null | null | null | maximum_profit_dp.py | ndsvw/Maximum-Profit-Problem | a97359039f448fcb110573d68afc7e87e47821d8 | [
"MIT"
] | null | null | null | maximum_profit_dp.py | ndsvw/Maximum-Profit-Problem | a97359039f448fcb110573d68afc7e87e47821d8 | [
"MIT"
] | null | null | null | def maximum_profit_unlimited_dp(prices):
"""
A method that calculates the maximum profit that can be made by buying and selling unlimited often
Problem description: https://practice.geeksforgeeks.org/problems/maximum-profit/0
time complexity: O(n)
space complexity: O(n)
Parameters
----------
prices : int[]
a list of int values representing the price of something over a time span
Returns
-------
x : int
the maximum profit that can be made by buying and selling unlimited often
"""
n = len(prices)
if n == 0:
return 0
s, t = [None] * n, [None] * n
s[0], t[0] = -prices[0], 0
for i in range(1, n):
t[i] = max(prices[i] + s[i-1], t[i-1])
s[i] = max(-prices[i] + t[i-1], s[i-1])
return t[n-1]
def maximum_profit_limited_dp(prices, k):
"""
A method that calculates the maximum profit that can be made by buying and selling at most k times
Problem description: https://practice.geeksforgeeks.org/problems/maximum-profit/0
time complexity: O(n*k)
space complexity: O(n*k)
Parameters
----------
prices : int[]
a list of int values representing the price of something over a time span
k : int
integer that restricts how often you can buy and sell
Returns
-------
x : int
        the maximum profit that can be made by buying and selling at most k times
"""
n = len(prices)
if n == 0:
return 0
s = [[0 for _ in range(n)] for _ in range(k+1)]
t = [[0 for _ in range(n)] for _ in range(k+1)]
for i in range(k+1):
s[i][0], t[i][0] = -prices[0], 0
for j in range(1, k+1):
for i in range(1, n):
t[j][i] = max(prices[i] + s[j][i-1], t[j][i-1])
s[j][i] = max(-prices[i] + t[j-1][i-1], s[j][i-1])
return t[k][n-1]
def maximum_profit_limited_spaceoptimized1_dp(prices, k):
"""
A method that calculates the maximum profit that can be made by buying and selling at most k times (space-optimized 1/2)
Problem description: https://practice.geeksforgeeks.org/problems/maximum-profit/0
time complexity: O(n*k)
space complexity: O(n*k)
Parameters
----------
prices : int[]
a list of int values representing the price of something over a time span
k : int
integer that restricts how often you can buy and sell
Returns
-------
x : int
        the maximum profit that can be made by buying and selling at most k times
"""
n = len(prices)
if n == 0:
return 0
s = [0 for _ in range(n)]
t = [[0 for _ in range(n)] for _ in range(k+1)]
s[0] = -prices[0]
for i in range(k+1):
t[i][0] = 0
for j in range(1, k+1):
for i in range(1, n):
t[j][i] = max(prices[i] + s[i-1], t[j][i-1])
s[i] = max(-prices[i] + t[j-1][i-1], s[i-1])
return t[k][n-1]
def maximum_profit_limited_spaceoptimized2_dp(prices, k):
"""
A method that calculates the maximum profit that can be made by buying and selling at most k times (space-optimized 2/2)
Problem description: https://practice.geeksforgeeks.org/problems/maximum-profit/0
time complexity: O(n*k)
space complexity: O(n)
Parameters
----------
prices : int[]
a list of int values representing the price of something over a time span
k : int
integer that restricts how often you can buy and sell
Returns
-------
x : int
        the maximum profit that can be made by buying and selling at most k times
"""
n = len(prices)
if n == 0:
return 0
s = [0 for _ in range(n)]
t = [
[0 for _ in range(n)],
[0 for _ in range(n)]
]
s[0] = -prices[0]
t[0][0] = 0
for _ in range(1, k+1):
for i in range(1, n):
t[1][i] = max(prices[i] + s[i-1], t[1][i-1])
s[i] = max(-prices[i] + t[0][i-1], s[i-1])
t[0] = [e for e in t[1]]
return t[1][n-1]
def max_profit_dp(prices, k):
"""
A method that calculates the maximum profit that can be made by buying and selling at most k times (space-optimized)
Problem description: https://practice.geeksforgeeks.org/problems/maximum-profit/0
time complexity: O(n*k)
space complexity: O(n)
Parameters
----------
prices : int[]
a list of int values representing the price of something over a time span
k : int
integer that restricts how often you can buy and sell
Returns
-------
x : int
        the maximum profit that can be made by buying and selling at most k times
"""
return maximum_profit_limited_spaceoptimized2_dp(prices, k)
| 30.158228 | 124 | 0.581952 | 766 | 4,765 | 3.579634 | 0.0953 | 0.094821 | 0.040117 | 0.072939 | 0.951131 | 0.936543 | 0.917943 | 0.884026 | 0.859956 | 0.859956 | 0 | 0.02577 | 0.291501 | 4,765 | 157 | 125 | 30.350318 | 0.786434 | 0.547954 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.089286 | false | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
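A self-contained sanity check of the two core recurrences from `maximum_profit_dp.py` above. The function bodies are copied from that file (docstrings elided); the price series and the expected answers (97 unlimited, 87 with at most two trades) are a classic worked example chosen for illustration, not taken from the original file:

```python
def maximum_profit_unlimited_dp(prices):
    # t[i]: best profit up to day i holding nothing; s[i]: best holding stock.
    n = len(prices)
    if n == 0:
        return 0
    s, t = [None] * n, [None] * n
    s[0], t[0] = -prices[0], 0
    for i in range(1, n):
        t[i] = max(prices[i] + s[i - 1], t[i - 1])
        s[i] = max(-prices[i] + t[i - 1], s[i - 1])
    return t[n - 1]


def maximum_profit_limited_dp(prices, k):
    # Same recurrence, with a second index counting transactions used.
    n = len(prices)
    if n == 0:
        return 0
    s = [[0 for _ in range(n)] for _ in range(k + 1)]
    t = [[0 for _ in range(n)] for _ in range(k + 1)]
    for i in range(k + 1):
        s[i][0], t[i][0] = -prices[0], 0
    for j in range(1, k + 1):
        for i in range(1, n):
            t[j][i] = max(prices[i] + s[j][i - 1], t[j][i - 1])
            s[j][i] = max(-prices[i] + t[j - 1][i - 1], s[j][i - 1])
    return t[k][n - 1]


prices = [10, 22, 5, 75, 65, 80]
# Unlimited: buy 10/sell 22, buy 5/sell 75, buy 65/sell 80 -> 12 + 70 + 15 = 97.
assert maximum_profit_unlimited_dp(prices) == 97
# At most two trades: buy 10/sell 22 and buy 5/sell 80 -> 12 + 75 = 87.
assert maximum_profit_limited_dp(prices, 2) == 87
```

With `k` large enough to cover every profitable swing (here `k = 3`), the limited version matches the unlimited one, which is a handy cross-check between the implementations.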
36251b042ddda173f5949810b59cf3ed4f60608b | 32 | py | Python | datasets/torchvision_datasets/__init__.py | jaehyek/deformable-DETR-2 | 930f36b9c491e35ce7870f2711c28243152ee058 | [
"MIT"
] | 12 | 2021-03-16T15:33:06.000Z | 2022-03-03T00:31:52.000Z | datasets/torchvision_datasets/__init__.py | ver0z/Deformable-DETR | 7f1f4ffd1d716f681c7cbb2570e2c7a3d4bcf417 | [
"Apache-2.0"
] | 3 | 2021-07-15T20:55:13.000Z | 2022-01-20T11:56:05.000Z | datasets/torchvision_datasets/__init__.py | ver0z/Deformable-DETR | 7f1f4ffd1d716f681c7cbb2570e2c7a3d4bcf417 | [
"Apache-2.0"
] | 6 | 2021-03-16T15:26:15.000Z | 2021-12-29T01:55:15.000Z |
from .coco import CocoDetection | 16 | 31 | 0.84375 | 4 | 32 | 6.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 2 | 31 | 16 | 0.964286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
364c2263fe67f476dab34b9f46e5ace20a13fe64 | 5,940 | py | Python | testsuite/mix-reg/run.py | luyatshimbalanga/OpenShadingLanguage | 2120647911af732f0d12d70e2f7f4e1ebe8fadcb | [
"BSD-3-Clause"
] | 1,105 | 2015-01-02T20:47:19.000Z | 2021-01-25T13:20:56.000Z | testsuite/mix-reg/run.py | luyatshimbalanga/OpenShadingLanguage | 2120647911af732f0d12d70e2f7f4e1ebe8fadcb | [
"BSD-3-Clause"
] | 696 | 2015-01-07T23:42:08.000Z | 2021-01-25T03:55:08.000Z | testsuite/mix-reg/run.py | luyatshimbalanga/OpenShadingLanguage | 2120647911af732f0d12d70e2f7f4e1ebe8fadcb | [
"BSD-3-Clause"
] | 248 | 2015-01-05T13:41:28.000Z | 2021-01-24T23:29:55.000Z | #!/usr/bin/env python
# Copyright Contributors to the Open Shading Language project.
# SPDX-License-Identifier: BSD-3-Clause
# https://github.com/AcademySoftwareFoundation/OpenShadingLanguage
# mix float, float, float includes masking
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_float_u_float_u_float.tif test_mix_u_float_u_float_u_float")
outputs.append ("mix_u_float_u_float_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_float_u_float_v_float.tif test_mix_u_float_u_float_v_float")
outputs.append ("mix_u_float_u_float_v_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_float_u_float_v_float.tif test_mix_v_float_u_float_v_float")
outputs.append ("mix_v_float_u_float_v_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_float_v_float_v_float.tif test_mix_u_float_v_float_v_float")
outputs.append ("mix_u_float_v_float_v_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_float_v_float_u_float.tif test_mix_u_float_v_float_u_float")
outputs.append ("mix_u_float_v_float_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_float_u_float_u_float.tif test_mix_v_float_u_float_u_float")
outputs.append ("mix_v_float_u_float_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_float_v_float_u_float.tif test_mix_v_float_v_float_u_float")
outputs.append ("mix_v_float_v_float_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_float_v_float_v_float.tif test_mix_v_float_v_float_v_float")
outputs.append ("mix_v_float_v_float_v_float.tif")
command += testshade("--vary_udxdy --vary_udxdy -t 1 -g 32 32 -od uint8 -o Cout mix_v_dfloat_v_dfloat_v_dfloat.tif test_mix_v_dfloat_v_dfloat_v_dfloat")
outputs.append ("mix_v_dfloat_v_dfloat_v_dfloat.tif")
command += testshade("--vary_udxdy --vary_udxdy -t 1 -g 32 32 -od uint8 -o Cout mix_v_dfloat_v_dfloat_c_float.tif test_mix_v_dfloat_v_dfloat_c_float")
outputs.append ("mix_v_dfloat_v_dfloat_c_float.tif")
# mix vector, vector, float includes masking
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_u_vector_u_float.tif test_mix_u_vector_u_vector_u_float")
outputs.append ("mix_u_vector_u_vector_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_u_vector_v_float.tif test_mix_u_vector_u_vector_v_float")
outputs.append ("mix_u_vector_u_vector_v_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_v_vector_u_float.tif test_mix_u_vector_v_vector_u_float")
outputs.append ("mix_u_vector_v_vector_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_v_vector_v_float.tif test_mix_u_vector_v_vector_v_float")
outputs.append ("mix_u_vector_v_vector_v_float.tif")
command += testshade("--vary_udxdy --vary_udxdy -t 1 -g 32 32 -od uint8 -o Cout mix_v_dvector_v_dvector_c_float.tif test_mix_v_dvector_v_dvector_c_float")
outputs.append ("mix_v_dvector_v_dvector_c_float.tif")
command += testshade("--vary_udxdy --vary_udxdy -t 1 -g 32 32 -od uint8 -o Cout mix_v_dvector_v_dvector_v_dfloat.tif test_mix_v_dvector_v_dvector_v_dfloat")
outputs.append ("mix_v_dvector_v_dvector_v_dfloat.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_u_vector_u_float.tif test_mix_v_vector_u_vector_u_float")
outputs.append ("mix_v_vector_u_vector_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_u_vector_v_float.tif test_mix_v_vector_u_vector_v_float")
outputs.append ("mix_v_vector_u_vector_v_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_v_vector_u_float.tif test_mix_v_vector_v_vector_u_float")
outputs.append ("mix_v_vector_v_vector_u_float.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_v_vector_v_float.tif test_mix_v_vector_v_vector_v_float")
outputs.append ("mix_v_vector_v_vector_v_float.tif")
# mix vector, vector, vector includes masking
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_u_vector_u_vector.tif test_mix_u_vector_u_vector_u_vector")
outputs.append ("mix_u_vector_u_vector_u_vector.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_u_vector_v_vector.tif test_mix_u_vector_u_vector_v_vector")
outputs.append ("mix_u_vector_u_vector_v_vector.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_v_vector_u_vector.tif test_mix_u_vector_v_vector_u_vector")
outputs.append ("mix_u_vector_v_vector_u_vector.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_u_vector_v_vector_v_vector.tif test_mix_u_vector_v_vector_v_vector")
outputs.append ("mix_u_vector_v_vector_v_vector.tif")
command += testshade("--vary_udxdy --vary_udxdy -t 1 -g 32 32 -od uint8 -o Cout mix_v_dvector_v_dvector_c_vector.tif test_mix_v_dvector_v_dvector_c_vector")
outputs.append ("mix_v_dvector_v_dvector_c_vector.tif")
command += testshade("--vary_udxdy --vary_udxdy -t 1 -g 32 32 -od uint8 -o Cout mix_v_dvector_v_dvector_v_dvector.tif test_mix_v_dvector_v_dvector_v_dvector")
outputs.append ("mix_v_dvector_v_dvector_v_dvector.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_u_vector_u_vector.tif test_mix_v_vector_u_vector_u_vector")
outputs.append ("mix_v_vector_u_vector_u_vector.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_u_vector_v_vector.tif test_mix_v_vector_u_vector_v_vector")
outputs.append ("mix_v_vector_u_vector_v_vector.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_v_vector_u_vector.tif test_mix_v_vector_v_vector_u_vector")
outputs.append ("mix_v_vector_v_vector_u_vector.tif")
command += testshade("-t 1 -g 32 32 -od uint8 -o Cout mix_v_vector_v_vector_v_vector.tif test_mix_v_vector_v_vector_v_vector")
outputs.append ("mix_v_vector_v_vector_v_vector.tif")
# expect a few LSB failures
failthresh = 0.008
failpercent = 3
| 53.513514 | 158 | 0.811785 | 1,185 | 5,940 | 3.578059 | 0.048945 | 0.099057 | 0.110377 | 0.035377 | 0.947642 | 0.940566 | 0.940566 | 0.915802 | 0.516981 | 0.516981 | 0 | 0.034515 | 0.092761 | 5,940 | 110 | 159 | 54 | 0.752273 | 0.056902 | 0 | 0 | 0 | 0.096774 | 0.743966 | 0.539067 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
365974881dc4f0379ee51a7e302c4dd5c7e252b1 | 39 | py | Python | basics/hello.py | SimoPrG/python_playground | 65e824b219f093c9071a1cffed3a2d43967ed9de | [
"MIT"
] | null | null | null | basics/hello.py | SimoPrG/python_playground | 65e824b219f093c9071a1cffed3a2d43967ed9de | [
"MIT"
] | null | null | null | basics/hello.py | SimoPrG/python_playground | 65e824b219f093c9071a1cffed3a2d43967ed9de | [
"MIT"
] | null | null | null | print "Hello world, I code anywhere :)" | 39 | 39 | 0.717949 | 6 | 39 | 4.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 39 | 1 | 39 | 39 | 0.848485 | 0 | 0 | 0 | 0 | 0 | 0.775 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
3689e871bf99c45ed41e1dcce24661b35294362b | 15,295 | py | Python | project/editorial/migrations/0028_auto_20171019_2207.py | cojennin/facet | 230e65316134b3399a35d40034728e61ba63cb2a | [
"MIT"
] | 25 | 2015-07-13T22:16:36.000Z | 2021-11-11T02:45:32.000Z | project/editorial/migrations/0028_auto_20171019_2207.py | cojennin/facet | 230e65316134b3399a35d40034728e61ba63cb2a | [
"MIT"
] | 74 | 2015-12-01T18:57:47.000Z | 2022-03-11T23:25:47.000Z | project/editorial/migrations/0028_auto_20171019_2207.py | cojennin/facet | 230e65316134b3399a35d40034728e61ba63cb2a | [
"MIT"
] | 6 | 2016-01-08T21:12:43.000Z | 2019-05-20T16:07:56.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
import django.contrib.postgres.fields
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
('editorial', '0027_auto_20171012_2329'),
]
operations = [
migrations.CreateModel(
name='ContentLicense',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.TextField(help_text=b'Name for the license.')),
('terms', models.TextField(help_text=b'Content of the terms.', blank=True)),
('upload', models.FileField(null=True, upload_to=b'license/%Y/%m/%d/', blank=True)),
('organization', models.ForeignKey(help_text=b'Organization that owns this license.', to='editorial.Organization')),
],
options={
'ordering': ['name'],
'verbose_name': 'Content License',
'verbose_name_plural': 'Content Licenses',
},
),
migrations.CreateModel(
name='Facet',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('original', models.BooleanField(default=True, help_text=b'Was this facet originally created by a user from this organization?')),
('creation_date', models.DateTimeField(help_text=b'Day facet was created.', auto_now_add=True)),
('name', models.TextField(help_text=b'Internal name for facet.')),
('headline', models.TextField(help_text=b'Headline of the facet')),
('description', models.TextField(help_text=b'Description of the facet.', blank=True)),
('content', models.TextField(help_text=b'Content of the facet.', blank=True)),
('status', models.CharField(help_text=b'Facet status choice.', max_length=25, choices=[(b'Draft', b'Draft'), (b'Pitch', b'Pitch'), (b'In Progress', b'In Progress'), (b'Edit', b'Edit'), (b'Revision', b'Revision'), (b'Needs Review', b'Needs Review'), (b'Ready', b'Ready')])),
('due_edit', models.DateTimeField(help_text=b'Due for edit.', null=True, blank=True)),
('run_date', models.DateTimeField(help_text=b'Planned run date.', null=True, blank=True)),
('keywords', django.contrib.postgres.fields.ArrayField(default=list, help_text=b'List of keywords for search.', size=None, base_field=models.CharField(max_length=100), blank=True)),
('update_notes', models.TextField(help_text=b'Text commenting regarding any updates or corrections made to the facet.', blank=True)),
('excerpt', models.TextField(help_text=b'Excerpt from the facet.', blank=True)),
('dateline', models.CharField(help_text=b'Where and when the facet was created.', max_length=150, blank=True)),
('share_note', models.TextField(help_text=b'Information for organizations making a copy of the facet.', blank=True)),
('topic_code', models.CharField(help_text=b'Unique code as needed to designate topic or coverage.', max_length=75, blank=True)),
                ('internal_code', models.CharField(help_text=b'Unique code as needed for ingest systems or internal use. Use as needed.', max_length=75, blank=True)),
('length', models.CharField(help_text=b'Length of facet for audio or video.', max_length=75, blank=True)),
('wordcount', models.CharField(help_text=b'Wordcount for text-based facets.', max_length=75, blank=True)),
('related_links', models.TextField(help_text=b'Relevant links that can be included with the facet.', blank=True)),
('github_link', models.URLField(help_text=b'Link to code for any custom feature.', max_length=300, blank=True)),
('source', models.TextField(help_text=b'List of sources in the facet.', blank=True)),
('edit_notes', models.TextField(help_text=b'Information regarding allowable extent of editing and suggestions for specific kinds of edits.', blank=True)),
('pronunciations', models.TextField(help_text=b'Information about pronouncing names or potentially difficult words.', blank=True)),
('sponsors', models.TextField(help_text=b'Sponsors or underwriters if need to indicate any.', blank=True)),
('pull_quotes', models.TextField(help_text=b'List of quotes and attributions to be used as pull quotes.', blank=True)),
('embeds', models.TextField(help_text=b'The necessary information to embed something like a Tweet, FB post, map or video.', blank=True)),
('pushed_to_wp', models.BooleanField(default=False, help_text=b'Whether the facet has been pushed to the organization WordPress site.')),
('sidebar_content', models.TextField(help_text=b'Content separate from body text meant for sidebar or inset presentation.', blank=True)),
('series_title', models.TextField(help_text=b'Title of the video series.', blank=True)),
('episode_number', models.CharField(help_text=b'If the video is part of a series, the episode number.', max_length=75, blank=True)),
('usage_rights', models.TextField(help_text=b'Information regarding the usage of the video if shared.', blank=True)),
('tape_datetime', models.DateTimeField(help_text=b'Tape date.', null=True, blank=True)),
('locations', models.TextField(help_text=b'Shoot locations.', blank=True)),
('custom_one', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_two', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_three', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_four', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_five', models.TextField(help_text=b'User-defined field.', blank=True)),
('audio_assets', models.ManyToManyField(to='editorial.AudioAsset', blank=True)),
('content_license', models.ForeignKey(related_name='facetlicense', blank=True, to='editorial.ContentLicense')),
('credit', models.ManyToManyField(help_text=b'The full user name(s) to be listed as the credit for the facet.', related_name='facetcredit', to=settings.AUTH_USER_MODEL, blank=True)),
('document_assets', models.ManyToManyField(to='editorial.DocumentAsset', blank=True)),
('editor', models.ManyToManyField(help_text=b'The full user name(s) to be listed as the editor(s) for the facet.', related_name='faceteditor', to=settings.AUTH_USER_MODEL, blank=True)),
('image_assets', models.ManyToManyField(to='editorial.ImageAsset', blank=True)),
('organization', models.ForeignKey(help_text=b'Organization that owns this facet.', to='editorial.Organization')),
('owner', models.ForeignKey(related_name='facetowner', to=settings.AUTH_USER_MODEL)),
('producer', models.ForeignKey(related_name='facetproducer', blank=True, to=settings.AUTH_USER_MODEL)),
('story', models.ForeignKey(related_name='facetstory', to='editorial.Story')),
],
options={
'verbose_name': 'Facet',
'verbose_name_plural': 'Facets',
},
),
migrations.CreateModel(
name='FacetContributor',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('user_role', models.CharField(help_text=b'What did the user do?', max_length=255)),
('facet', models.ForeignKey(to='editorial.Facet')),
('user', models.ForeignKey(to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='HistoricalFacet',
fields=[
('id', models.IntegerField(verbose_name='ID', db_index=True, auto_created=True, blank=True)),
('original', models.BooleanField(default=True, help_text=b'Was this facet originally created by a user from this organization?')),
('creation_date', models.DateTimeField(help_text=b'Day facet was created.', editable=False, blank=True)),
('name', models.TextField(help_text=b'Internal name for facet.')),
('headline', models.TextField(help_text=b'Headline of the facet.')),
('description', models.TextField(help_text=b'Description of the facet.', blank=True)),
('content', models.TextField(help_text=b'Content of the facet.', blank=True)),
('status', models.CharField(help_text=b'Facet status choice.', max_length=25, choices=[(b'Draft', b'Draft'), (b'Pitch', b'Pitch'), (b'In Progress', b'In Progress'), (b'Edit', b'Edit'), (b'Revision', b'Revision'), (b'Needs Review', b'Needs Review'), (b'Ready', b'Ready')])),
('due_edit', models.DateTimeField(help_text=b'Due for edit.', null=True, blank=True)),
('run_date', models.DateTimeField(help_text=b'Planned run date.', null=True, blank=True)),
('keywords', django.contrib.postgres.fields.ArrayField(default=list, help_text=b'List of keywords for search.', size=None, base_field=models.CharField(max_length=100), blank=True)),
('update_notes', models.TextField(help_text=b'Notes regarding any updates or corrections made to the facet.', blank=True)),
('excerpt', models.TextField(help_text=b'Excerpt from the facet.', blank=True)),
('dateline', models.CharField(help_text=b'Where and when the facet was created.', max_length=150, blank=True)),
('share_note', models.TextField(help_text=b'Information for organizations making a copy of the facet.', blank=True)),
('topic_code', models.CharField(help_text=b'Unique code as needed to designate topic or coverage.', max_length=75, blank=True)),
('internal_code', models.CharField(help_text=b'Unique code as needed for ingest systems or internal use. Use as needed.', max_length=75, blank=True)),
('length', models.CharField(help_text=b'Length of facet for audio or video.', max_length=75, blank=True)),
('wordcount', models.CharField(help_text=b'Wordcount for text-based facets.', max_length=75, blank=True)),
('related_links', models.TextField(help_text=b'Relevant links that can be included with the facet.', blank=True)),
('github_link', models.URLField(help_text=b'Link to code for any custom feature.', max_length=300, blank=True)),
('source', models.TextField(help_text=b'List of sources in the facet.', blank=True)),
('edit_notes', models.TextField(help_text=b'Information regarding allowable extent of editing and suggestions for specific kinds of edits.', blank=True)),
('pronunciations', models.TextField(help_text=b'Information about pronouncing names or potentially difficult words.', blank=True)),
('sponsors', models.TextField(help_text=b'Sponsors or underwriters, if any need to be indicated.', blank=True)),
('pull_quotes', models.TextField(help_text=b'List of quotes and attributions to be used as pull quotes.', blank=True)),
('embeds', models.TextField(help_text=b'The necessary information to embed something like a Tweet, FB post, map or video.', blank=True)),
('pushed_to_wp', models.BooleanField(default=False, help_text=b'Whether the facet has been pushed to the organization WordPress site.')),
('sidebar_content', models.TextField(help_text=b'Content separate from body text meant for sidebar or inset presentation.', blank=True)),
('series_title', models.TextField(help_text=b'Title of the video series.', blank=True)),
('episode_number', models.CharField(help_text=b'If the video is part of a series, the episode number.', max_length=75, blank=True)),
('usage_rights', models.TextField(help_text=b'Information regarding the usage of the video if shared.', blank=True)),
('tape_datetime', models.DateTimeField(help_text=b'Tape date.', null=True, blank=True)),
('locations', models.TextField(help_text=b'Shoot locations.', blank=True)),
('custom_one', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_two', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_three', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_four', models.TextField(help_text=b'User-defined field.', blank=True)),
('custom_five', models.TextField(help_text=b'User-defined field.', blank=True)),
('history_id', models.AutoField(serialize=False, primary_key=True)),
('history_date', models.DateTimeField()),
('history_type', models.CharField(max_length=1, choices=[('+', 'Created'), ('~', 'Changed'), ('-', 'Deleted')])),
('content_license', models.ForeignKey(related_name='+', on_delete=django.db.models.deletion.DO_NOTHING, db_constraint=False, blank=True, to='editorial.ContentLicense', null=True)),
('history_user', models.ForeignKey(related_name='+', on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL, null=True)),
('organization', models.ForeignKey(related_name='+', on_delete=django.db.models.deletion.DO_NOTHING, db_constraint=False, blank=True, to='editorial.Organization', null=True)),
('owner', models.ForeignKey(related_name='+', on_delete=django.db.models.deletion.DO_NOTHING, db_constraint=False, blank=True, to=settings.AUTH_USER_MODEL, null=True)),
('producer', models.ForeignKey(related_name='+', on_delete=django.db.models.deletion.DO_NOTHING, db_constraint=False, blank=True, to=settings.AUTH_USER_MODEL, null=True)),
('story', models.ForeignKey(related_name='+', on_delete=django.db.models.deletion.DO_NOTHING, db_constraint=False, blank=True, to='editorial.Story', null=True)),
],
options={
'ordering': ('-history_date', '-history_id'),
'get_latest_by': 'history_date',
'verbose_name': 'historical Facet',
},
),
migrations.AddField(
model_name='facet',
name='team',
field=models.ManyToManyField(help_text=b'Users that contributed to a facet. Used to associate multiple users to a facet.', to=settings.AUTH_USER_MODEL, through='editorial.FacetContributor', blank=True),
),
migrations.AddField(
model_name='facet',
name='video_assets',
field=models.ManyToManyField(to='editorial.VideoAsset', blank=True),
),
]
| 91.041667 | 289 | 0.650016 | 1,899 | 15,295 | 5.104792 | 0.148499 | 0.069321 | 0.077986 | 0.113885 | 0.84413 | 0.798123 | 0.785022 | 0.769032 | 0.765319 | 0.765319 | 0 | 0.005216 | 0.21033 | 15,295 | 167 | 290 | 91.586826 | 0.7974 | 0.001373 | 0 | 0.63354 | 0 | 0 | 0.342326 | 0.012179 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.031056 | 0 | 0.049689 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3696ca0009362c14befa65dca0c2c1355d248a07 | 142 | py | Python | src/lambdaplatform/tasks/lambda_function.py | schlarpc/lambdaplatform | 0f58174bf156cdfd03302b6beef998c24b5d0654 | [
"MIT"
] | 4 | 2021-07-14T23:09:59.000Z | 2021-08-07T02:33:34.000Z | src/lambdaplatform/tasks/lambda_function.py | schlarpc/lambdaplatform | 0f58174bf156cdfd03302b6beef998c24b5d0654 | [
"MIT"
] | null | null | null | src/lambdaplatform/tasks/lambda_function.py | schlarpc/lambdaplatform | 0f58174bf156cdfd03302b6beef998c24b5d0654 | [
"MIT"
] | null | null | null | import urllib.request
def handler(event, context):
return urllib.request.urlopen("https://checkip.amazonaws.com").read().decode("utf-8").strip()
| 23.666667 | 97 | 0.725352 | 19 | 142 | 5.421053 | 0.894737 | 0.252427 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007692 | 0.084507 | 142 | 5 | 98 | 28.4 | 0.784615 | 0 | 0 | 0 | 0 | 0 | 0.239437 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 6 |
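The handler above returns the body of checkip.amazonaws.com with trailing whitespace stripped. That post-processing can be isolated and exercised without network access; the helper name below is illustrative, not part of the repo:

```python
def parse_checkip_response(body: bytes) -> str:
    """Decode checkip's raw response and drop the trailing newline,
    mirroring the .decode("utf-8").strip() chain in the handler."""
    return body.decode("utf-8").strip()

# checkip.amazonaws.com answers with a bare IP address plus a newline
print(parse_checkip_response(b"203.0.113.7\n"))  # -> 203.0.113.7
```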
36c69227cde80ff3de0a37328a70a7b7e1cea257 | 202 | py | Python | functions/__init__.py | Jerrypiglet/Deformable-Convolution-V2-PyTorch | b7598e95507d1848081f1bb315c67059555e14af | [
"MIT"
] | 3 | 2021-05-15T03:49:39.000Z | 2022-01-24T05:05:04.000Z | functions/__init__.py | Jerrypiglet/Deformable-im2col-unfold-Deformable-Convolution-V2-PyTorch | b7598e95507d1848081f1bb315c67059555e14af | [
"MIT"
] | null | null | null | functions/__init__.py | Jerrypiglet/Deformable-im2col-unfold-Deformable-Convolution-V2-PyTorch | b7598e95507d1848081f1bb315c67059555e14af | [
"MIT"
] | null | null | null | from .deform_conv_func import DeformConvFunction, DeformIm2colFunction
from .modulated_deform_conv_func import ModulatedDeformConvFunction
from .deform_psroi_pooling_func import DeformRoIPoolingFunction | 67.333333 | 70 | 0.920792 | 21 | 202 | 8.47619 | 0.571429 | 0.168539 | 0.157303 | 0.224719 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005263 | 0.059406 | 202 | 3 | 71 | 67.333333 | 0.931579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
36e0060030db375dbcc5b414a67e93dee5f6fab4 | 252 | py | Python | rpython/jit/backend/ppc/test/test_del.py | nanjekyejoannah/pypy | e80079fe13c29eda7b2a6b4cd4557051f975a2d9 | [
"Apache-2.0",
"OpenSSL"
] | 381 | 2018-08-18T03:37:22.000Z | 2022-02-06T23:57:36.000Z | rpython/jit/backend/ppc/test/test_del.py | nanjekyejoannah/pypy | e80079fe13c29eda7b2a6b4cd4557051f975a2d9 | [
"Apache-2.0",
"OpenSSL"
] | 16 | 2018-09-22T18:12:47.000Z | 2022-02-22T20:03:59.000Z | rpython/jit/backend/ppc/test/test_del.py | nanjekyejoannah/pypy | e80079fe13c29eda7b2a6b4cd4557051f975a2d9 | [
"Apache-2.0",
"OpenSSL"
] | 55 | 2015-08-16T02:41:30.000Z | 2022-03-20T20:33:35.000Z |
from rpython.jit.backend.ppc.test.support import JitPPCMixin
from rpython.jit.metainterp.test.test_del import DelTests
class TestDel(JitPPCMixin, DelTests):
# for the individual tests see
# ====> ../../../metainterp/test/test_del.py
pass
| 28 | 60 | 0.734127 | 33 | 252 | 5.545455 | 0.636364 | 0.120219 | 0.153005 | 0.229508 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138889 | 252 | 8 | 61 | 31.5 | 0.843318 | 0.281746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
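TestDel above reuses a backend-neutral suite (DelTests) through a backend mixin (JitPPCMixin): the tests are written once and each backend contributes only its configuration. A minimal sketch of that mixin pattern, with illustrative class names rather than the real pypy ones:

```python
# Backend-agnostic tests are written once; each backend supplies a mixin
# with its own configuration, and a trivial subclass combines the two.
class DelTests:
    def test_backend_present(self):
        # relies on an attribute the backend mixin provides
        assert self.backend_name

class PPCMixin:
    backend_name = "ppc"

class TestDel(PPCMixin, DelTests):
    # no body needed: the MRO wires the mixin's config into the suite
    pass

t = TestDel()
t.test_backend_present()
print(t.backend_name)  # -> ppc
```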
36edab6be5ffa86b73e8b328ce15ed011d2543c2 | 71 | py | Python | jacdac/uv_index/__init__.py | microsoft/jacdac-python | 712ad5559e29065f5eccb5dbfe029c039132df5a | [
"MIT"
] | 1 | 2022-02-15T21:30:36.000Z | 2022-02-15T21:30:36.000Z | jacdac/uv_index/__init__.py | microsoft/jacdac-python | 712ad5559e29065f5eccb5dbfe029c039132df5a | [
"MIT"
] | null | null | null | jacdac/uv_index/__init__.py | microsoft/jacdac-python | 712ad5559e29065f5eccb5dbfe029c039132df5a | [
"MIT"
] | 1 | 2022-02-08T19:32:45.000Z | 2022-02-08T19:32:45.000Z | # Autogenerated file.
from .client import UvIndexClient # type: ignore
| 23.666667 | 48 | 0.788732 | 8 | 71 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140845 | 71 | 2 | 49 | 35.5 | 0.918033 | 0.450704 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
36ef45a718fbec327c4ce18df55d55df45f4a570 | 44,142 | py | Python | sixparser.py | charlesap/ibnf | 0efc87c7f250594d77fe400ba2f90f812ebe5aa1 | [
"MIT",
"Unlicense"
] | 1 | 2019-07-29T11:12:53.000Z | 2019-07-29T11:12:53.000Z | sixparser.py | charlesap/ibnf | 0efc87c7f250594d77fe400ba2f90f812ebe5aa1 | [
"MIT",
"Unlicense"
] | null | null | null | sixparser.py | charlesap/ibnf | 0efc87c7f250594d77fe400ba2f90f812ebe5aa1 | [
"MIT",
"Unlicense"
] | null | null | null | import sys
from binascii import *
fi = open(sys.argv[1]).read()
semantics = open(sys.argv[2]).read()
fo = open(sys.argv[3], "w+")
h={}; registers={}; context={}; mseq=0; dseq=1; T=True; F=False
def n2z( a ):
return ( '0' if a=='' else a )
def be2le( a ):
return a[6:8]+a[4:6]+a[2:4]+a[0:2]
def mark( p, s, t ):
( v, m, ss, l, c, a ) = t
if t[1]: x = p +"-" + str(s); h[x]=(v,m,l,a); return t
else:
if not t[0]: x = p +"-" + str(s); h[x]=(v,m,l,a); return t
return t
def been(p, s):
    if p + "-" + str(s) in h: return h[p +"-" + str(s)][1]
else: return False
def was(c,p,s): (v,m,l,a) = h[p+"-"+str(s)]; return (v,m,s,l,c,a)
def cm( ch, s, c ):
if s < len(fi):
if fi[s] == ch: return ( T, T, s, 1, c, ( "cm", fi[s] ) )
return ( False, True, s, 0, c, ( "cm", "") )
def andmemo( m ):
r = True
for i in m:
if not m[i]: r = False
return r
outdata = ""
def output( s ):
global outdata
outdata = outdata + str(s)
def syntax_p( s, c):
if been("syntax",s): return was( c, "syntax",s)
else:
mark("syntax",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
srules_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
end_p ( (ts+tl), tc)
if ok:
rv=syntax_s(a,andmemo(mem),s,ts+tl,tc,"syntax")
return mark("syntax",s,rv)
return mark("syntax",s,(F,T,s,0,c,("","")))
def srules_p( s, c):
if been("srules",s): return was( c, "srules",s)
else:
mark("srules",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
srule_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
srules_p ( (ts+tl), tc)
if ok:
rv=srules_s(a,andmemo(mem),s,ts+tl,tc,"srules")
return mark("srules",s,rv)
return mark("srules",s,(F,T,s,0,c,("","")))
def srule_p( s, c):
if been("srule",s): return was( c, "srule",s)
else:
mark("srule",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=blankline_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=base_p(s,c)
if not met:
return mark("srule",s,(F,T,s,0,c,("","")))
else:
return mark("srule",s,(met,mem,s,tl,tc,ta))
def blankline_p( s, c):
if been("blankline",s): return was( c, "blankline",s)
else:
mark("blankline",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(10) ,(ts+tl), tc)
if ok:
rv=blankline_s(a,andmemo(mem),s,ts+tl,tc,"blankline")
return mark("blankline",s,rv)
return mark("blankline",s,(F,T,s,0,c,("","")))
def base_p( s, c):
if been("base",s): return was( c, "base",s)
else:
mark("base",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
name_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
setup_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
body_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
recr_p ( (ts+tl), tc)
if ok:
rv=base_s(a,andmemo(mem),s,ts+tl,tc,"base")
return mark("base",s,rv)
return mark("base",s,(F,T,s,0,c,("","")))
def body_p( s, c):
if been("body",s): return was( c, "body",s)
else:
mark("body",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=qlineset_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=cline_p(s,c)
if not met:
return mark("body",s,(F,T,s,0,c,("","")))
else:
return mark("body",s,(met,mem,s,tl,tc,ta))
def setup_p( s, c):
if been("setup",s): return was( c, "setup",s)
else:
mark("setup",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(':',(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
code_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(10) ,(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
setup_p ( (ts+tl), tc)
if ok:
rv=setup_s(a,andmemo(mem),s,ts+tl,tc,"setup")
return mark("setup",s,rv)
return mark("setup",s,(F,T,s,0,c,("","")))
def recr_p( s, c):
if been("recr",s): return was( c, "recr",s)
else:
mark("recr",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('.',(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
name_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
rsetup_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
body_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
recr_p ( (ts+tl), tc)
if ok:
rv=recr_s(a,andmemo(mem),s,ts+tl,tc,"recr")
return mark("recr",s,rv)
return mark("recr",s,(F,T,s,0,c,("","")))
def rsetup_p( s, c):
if been("rsetup",s): return was( c, "rsetup",s)
else:
mark("rsetup",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(':',(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
rcode_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(10) ,(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
rsetup_p ( (ts+tl), tc)
if ok:
rv=rsetup_s(a,andmemo(mem),s,ts+tl,tc,"rsetup")
return mark("rsetup",s,rv)
return mark("rsetup",s,(F,T,s,0,c,("","")))
def rcode_p( s, c):
if been("rcode",s): return was( c, "rcode",s)
else:
mark("rcode",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
ritm_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
rcode_p ( (ts+tl), tc)
if ok:
rv=rcode_s(a,andmemo(mem),s,ts+tl,tc,"rcode")
return mark("rcode",s,rv)
return mark("rcode",s,(F,T,s,0,c,("","")))
def ritm_p( s, c):
if been("ritm",s): return was( c, "ritm",s)
else:
mark("ritm",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=string_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=rcr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=lwr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=dpathw_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=dhas_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=pnt_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('>',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('<',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('{',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('}',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(':',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('%',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('(',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(',',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(')',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('_',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('[',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(']',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(';',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('+',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('-',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('*',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('/',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('=',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('!',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(' ',s,c)
if not met:
return mark("ritm",s,(F,T,s,0,c,("","")))
else:
return mark("ritm",s,(met,mem,s,tl,tc,ta))
def cline_p( s, c):
if been("cline",s): return was( c, "cline",s)
else:
mark("cline",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('^',(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
code_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(10) ,(ts+tl), tc)
if ok:
rv=cline_s(a,andmemo(mem),s,ts+tl,tc,"cline")
return mark("cline",s,rv)
return mark("cline",s,(F,T,s,0,c,("","")))
def qlineset_p( s, c):
if been("qlineset",s): return was( c, "qlineset",s)
else:
mark("qlineset",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
qlines_p ( (ts+tl), tc)
if ok:
rv=qlineset_s(a,andmemo(mem),s,ts+tl,tc,"qlineset")
return mark("qlineset",s,rv)
return mark("qlineset",s,(F,T,s,0,c,("","")))
def qlines_p( s, c):
if been("qlines",s): return was( c, "qlines",s)
else:
mark("qlines",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
qlsep_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
qline_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
qlines_p ( (ts+tl), tc)
if ok:
rv=qlines_s(a,andmemo(mem),s,ts+tl,tc,"qlines")
return mark("qlines",s,rv)
return mark("qlines",s,(F,T,s,0,c,("","")))
def qlsep_p( s, c):
if been("qlsep",s): return was( c, "qlsep",s)
else:
mark("qlsep",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('|',(ts+tl), tc)
if ok:
rv=qlsep_s(a,andmemo(mem),s,ts+tl,tc,"qlsep")
return mark("qlsep",s,rv)
return mark("qlsep",s,(F,T,s,0,c,("","")))
def qline_p( s, c):
if been("qline",s): return was( c, "qline",s)
else:
mark("qline",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
qchs_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(10) ,(ts+tl), tc)
if ok:
rv=qline_s(a,andmemo(mem),s,ts+tl,tc,"qline")
return mark("qline",s,rv)
return mark("qline",s,(F,T,s,0,c,("","")))
def qchs_p( s, c):
if been("qchs",s): return was( c, "qchs",s)
else:
mark("qchs",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
qch_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
qchs_p ( (ts+tl), tc)
if ok:
rv=qchs_s(a,andmemo(mem),s,ts+tl,tc,"qchs")
return mark("qchs",s,rv)
return mark("qchs",s,(F,T,s,0,c,("","")))
def qch_p( s, c):
if been("qch",s): return was( c, "qch",s)
else:
mark("qch",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=aln_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=qq_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=qt_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=qs_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=qsmb_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(' ',s,c)
if not met: (met,mem,ts,tl,tc,ta)=qcode_p(s,c)
if not met:
return mark("qch",s,(F,T,s,0,c,("","")))
else:
return mark("qch",s,(met,mem,s,tl,tc,ta))
def qq_p( s, c):
if been("qq",s): return was( c, "qq",s)
else:
mark("qq",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(34) ,(ts+tl), tc)
if ok:
rv=qq_s(a,andmemo(mem),s,ts+tl,tc,"qq")
return mark("qq",s,rv)
return mark("qq",s,(F,T,s,0,c,("","")))
def qt_p( s, c):
if been("qt",s): return was( c, "qt",s)
else:
mark("qt",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(39) ,(ts+tl), tc)
if ok:
rv=qt_s(a,andmemo(mem),s,ts+tl,tc,"qt")
return mark("qt",s,rv)
return mark("qt",s,(F,T,s,0,c,("","")))
def qs_p( s, c):
if been("qs",s): return was( c, "qs",s)
else:
mark("qs",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(92) ,(ts+tl), tc)
if ok:
rv=qs_s(a,andmemo(mem),s,ts+tl,tc,"qs")
return mark("qs",s,rv)
return mark("qs",s,(F,T,s,0,c,("","")))
def qcode_p( s, c):
if been("qcode",s): return was( c, "qcode",s)
else:
mark("qcode",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('`',(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
code_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('`',(ts+tl), tc)
if ok:
rv=qcode_s(a,andmemo(mem),s,ts+tl,tc,"qcode")
return mark("qcode",s,rv)
return mark("qcode",s,(F,T,s,0,c,("","")))
def name_p( s, c):
if been("name",s): return was( c, "name",s)
else:
mark("name",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
lwr_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
name_p ( (ts+tl), tc)
if ok:
rv=name_s(a,andmemo(mem),s,ts+tl,tc,"name")
return mark("name",s,rv)
return mark("name",s,(F,T,s,0,c,("","")))
def name_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def qsmb_p( s, c):
if been("qsmb",s): return was( c, "qsmb",s)
else:
mark("qsmb",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=cm('-',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('_',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('+',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('=',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('~',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('!',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('@',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('#',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('$',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('%',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('^',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('&',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('!',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('|',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('/',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(':',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(';',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('*',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('(',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(')',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('[',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(']',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('{',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('}',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(',',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('.',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('<',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('>',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('?',s,c)
if not met:
return mark("qsmb",s,(F,T,s,0,c,("","")))
else:
return mark("qsmb",s,(met,mem,s,tl,tc,ta))
def string_p( s, c):
if been("string",s): return was( c, "string",s)
else:
mark("string",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(34) ,(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
strcs_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(34) ,(ts+tl), tc)
if ok:
rv=string_s(a,andmemo(mem),s,ts+tl,tc,"string")
return mark("string",s,rv)
return mark("string",s,(F,T,s,0,c,("","")))
def strcs_p( s, c):
if been("strcs",s): return was( c, "strcs",s)
else:
mark("strcs",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
sch_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
strcs_p ( (ts+tl), tc)
if ok:
rv=strcs_s(a,andmemo(mem),s,ts+tl,tc,"strcs")
return mark("strcs",s,rv)
return mark("strcs",s,(F,T,s,0,c,("","")))
def code_p( s, c):
if been("code",s): return was( c, "code",s)
else:
mark("code",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
citm_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
code_p ( (ts+tl), tc)
if ok:
rv=code_s(a,andmemo(mem),s,ts+tl,tc,"code")
return mark("code",s,rv)
return mark("code",s,(F,T,s,0,c,("","")))
def citm_p( s, c):
if been("citm",s): return was( c, "citm",s)
else:
mark("citm",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=string_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=cnl_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=rcr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=lwr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=dpathw_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=dhas_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=pnt_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('>',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('<',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('{',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('}',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(':',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('%',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('(',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(',',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(')',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('_',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('[',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(']',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(';',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('+',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('-',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('*',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('/',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('=',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('!',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(' ',s,c)
if not met:
return mark("citm",s,(F,T,s,0,c,("","")))
else:
return mark("citm",s,(met,mem,s,tl,tc,ta))
def cnl_p( s, c):
if been("cnl",s): return was( c, "cnl",s)
else:
mark("cnl",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(10) ,(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('^',(ts+tl), tc)
if ok:
rv=cnl_s(a,andmemo(mem),s,ts+tl,tc,"cnl")
return mark("cnl",s,rv)
return mark("cnl",s,(F,T,s,0,c,("","")))
def dpathw_p( s, c):
if been("dpathw",s): return was( c, "dpathw",s)
else:
mark("dpathw",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
dpath_p ( (ts+tl), tc)
if ok:
rv=dpathw_s(a,andmemo(mem),s,ts+tl,tc,"dpathw")
return mark("dpathw",s,rv)
return mark("dpathw",s,(F,T,s,0,c,("","")))
def dpath_p( s, c):
if been("dpath",s): return was( c, "dpath",s)
else:
mark("dpath",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('.',(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
pnt_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
dpath_p ( (ts+tl), tc)
if ok:
rv=dpath_s(a,andmemo(mem),s,ts+tl,tc,"dpath")
return mark("dpath",s,rv)
return mark("dpath",s,(F,T,s,0,c,("","")))
def dhas_p( s, c):
if been("dhas",s): return was( c, "dhas",s)
else:
mark("dhas",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('.',(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('.',(ts+tl), tc)
if ok:
rv=dhas_s(a,andmemo(mem),s,ts+tl,tc,"dhas")
return mark("dhas",s,rv)
return mark("dhas",s,(F,T,s,0,c,("","")))
def rcr_p( s, c):
if been("rcr",s): return was( c, "rcr",s)
else:
mark("rcr",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=rca_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=rcb_p(s,c)
if not met:
return mark("rcr",s,(F,T,s,0,c,("","")))
else:
return mark("rcr",s,(met,mem,s,tl,tc,ta))
def rca_p( s, c):
if been("rca",s): return was( c, "rca",s)
else:
mark("rca",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('#',(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
name_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(':',(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
code_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('#',(ts+tl), tc)
if ok:
rv=rca_s(a,andmemo(mem),s,ts+tl,tc,"rca")
return mark("rca",s,rv)
return mark("rca",s,(F,T,s,0,c,("","")))
def rcb_p( s, c):
if been("rcb",s): return was( c, "rcb",s)
else:
mark("rcb",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('#',(ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
code_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('#',(ts+tl), tc)
if ok:
rv=rcb_s(a,andmemo(mem),s,ts+tl,tc,"rcb")
return mark("rcb",s,rv)
return mark("rcb",s,(F,T,s,0,c,("","")))
def end_p( s, c):
if been("end",s): return was( c, "end",s)
else:
mark("end",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
rv=end_s(a,andmemo(mem),s,ts+tl,tc,"end")
return mark("end",s,rv)
return mark("end",s,(F,T,s,0,c,("","")))
def dgt_p( s, c):
if been("dgt",s): return was( c, "dgt",s)
else:
mark("dgt",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=cm('0',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('1',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('2',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('3',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('4',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('5',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('6',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('7',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('8',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('9',s,c)
if not met:
return mark("dgt",s,(F,T,s,0,c,("","")))
else:
return mark("dgt",s,(met,mem,s,tl,tc,ta))
def upr_p( s, c):
if been("upr",s): return was( c, "upr",s)
else:
mark("upr",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=cm('A',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('B',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('C',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('D',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('E',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('F',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('G',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('H',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('I',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('J',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('K',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('L',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('M',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('N',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('O',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('P',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('Q',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('R',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('S',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('T',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('U',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('V',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('W',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('X',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('Y',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('Z',s,c)
if not met:
return mark("upr",s,(F,T,s,0,c,("","")))
else:
return mark("upr",s,(met,mem,s,tl,tc,ta))
def lwr_p( s, c):
if been("lwr",s): return was( c, "lwr",s)
else:
mark("lwr",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=cm('a',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('b',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('c',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('d',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('e',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('f',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('g',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('h',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('i',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('j',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('k',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('l',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('m',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('n',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('o',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('p',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('q',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('r',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('s',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('t',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('u',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('v',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('w',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('x',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('y',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('z',s,c)
if not met:
return mark("lwr",s,(F,T,s,0,c,("","")))
else:
return mark("lwr",s,(met,mem,s,tl,tc,ta))
def alp_p( s, c):
if been("alp",s): return was( c, "alp",s)
else:
mark("alp",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=upr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=lwr_p(s,c)
if not met:
return mark("alp",s,(F,T,s,0,c,("","")))
else:
return mark("alp",s,(met,mem,s,tl,tc,ta))
def aln_p( s, c):
if been("aln",s): return was( c, "aln",s)
else:
mark("aln",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=upr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=lwr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=dgt_p(s,c)
if not met:
return mark("aln",s,(F,T,s,0,c,("","")))
else:
return mark("aln",s,(met,mem,s,tl,tc,ta))
def hex_p( s, c):
if been("hex",s): return was( c, "hex",s)
else:
mark("hex",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=dgt_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('A',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('B',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('C',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('D',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('E',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('F',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('a',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('b',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('c',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('d',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('e',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('f',s,c)
if not met:
return mark("hex",s,(F,T,s,0,c,("","")))
else:
return mark("hex",s,(met,mem,s,tl,tc,ta))
def smb_p( s, c):
if been("smb",s): return was( c, "smb",s)
else:
mark("smb",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=cm('-',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('_',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('+',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('=',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('`',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('~',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('!',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('@',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('#',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('$',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('%',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('^',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('&',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('|',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('/',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(':',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(';',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('*',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('(',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(')',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('[',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(']',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('{',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('}',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm(',',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('.',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('<',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('>',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('?',s,c)
if not met:
return mark("smb",s,(F,T,s,0,c,("","")))
else:
return mark("smb",s,(met,mem,s,tl,tc,ta))
def sps_p( s, c):
if been("sps",s): return was( c, "sps",s)
else:
mark("sps",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=bsl_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=btk_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=bqt_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=bnl_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=btb_p(s,c)
if not met:
return mark("sps",s,(F,T,s,0,c,("","")))
else:
return mark("sps",s,(met,mem,s,tl,tc,ta))
def bsl_p( s, c):
if been("bsl",s): return was( c, "bsl",s)
else:
mark("bsl",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(92) ,(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(92) ,(ts+tl), tc)
if ok:
rv=bsl_s(a,andmemo(mem),s,ts+tl,tc,"bsl")
return mark("bsl",s,rv)
return mark("bsl",s,(F,T,s,0,c,("","")))
def bsl_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def btk_p( s, c):
if been("btk",s): return was( c, "btk",s)
else:
mark("btk",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(92) ,(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(39) ,(ts+tl), tc)
if ok:
rv=btk_s(a,andmemo(mem),s,ts+tl,tc,"btk")
return mark("btk",s,rv)
return mark("btk",s,(F,T,s,0,c,("","")))
def btk_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def bqt_p( s, c):
if been("bqt",s): return was( c, "bqt",s)
else:
mark("bqt",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(92) ,(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(34) ,(ts+tl), tc)
if ok:
rv=bqt_s(a,andmemo(mem),s,ts+tl,tc,"bqt")
return mark("bqt",s,rv)
return mark("bqt",s,(F,T,s,0,c,("","")))
def bqt_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def bnl_p( s, c):
if been("bnl",s): return was( c, "bnl",s)
else:
mark("bnl",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(92) ,(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('n',(ts+tl), tc)
if ok:
rv=bnl_s(a,andmemo(mem),s,ts+tl,tc,"bnl")
return mark("bnl",s,rv)
return mark("bnl",s,(F,T,s,0,c,("","")))
def bnl_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def btb_p( s, c):
if been("btb",s): return was( c, "btb",s)
else:
mark("btb",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm(chr(92) ,(ts+tl), tc)
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
cm('t',(ts+tl), tc)
if ok:
rv=btb_s(a,andmemo(mem),s,ts+tl,tc,"btb")
return mark("btb",s,rv)
return mark("btb",s,(F,T,s,0,c,("","")))
def btb_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def wsc_p( s, c):
if been("wsc",s): return was( c, "wsc",s)
else:
mark("wsc",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=cm(' ',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('\t',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('\n',s,c)
if not met:
return mark("wsc",s,(F,T,s,0,c,("","")))
else:
return mark("wsc",s,(met,mem,s,tl,tc,ta))
def s_p( s, c):
if been("s",s): return was( c, "s",s)
else:
mark("s",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
sp_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
s_p ( (ts+tl), tc)
if ok:
rv=s_s(a,andmemo(mem),s,ts+tl,tc,"s")
return mark("s",s,rv)
return mark("s",s,(F,T,s,0,c,("","")))
def s_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def sp_p( s, c):
if been("sp",s): return was( c, "sp",s)
else:
mark("sp",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=cm(' ',s,c)
if not met: (met,mem,ts,tl,tc,ta)=cm('\t',s,c)
if not met:
return mark("sp",s,(F,T,s,0,c,("","")))
else:
return mark("sp",s,(met,mem,s,tl,tc,ta))
def sch_p( s, c):
if been("sch",s): return was( c, "sch",s)
else:
mark("sch",s,(F,T,s,0,c,("","")));met = F
if not met: (met,mem,ts,tl,tc,ta)=dgt_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=upr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=lwr_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=smb_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=wsc_p(s,c)
if not met: (met,mem,ts,tl,tc,ta)=sps_p(s,c)
if not met:
return mark("sch",s,(F,T,s,0,c,("","")))
else:
return mark("sch",s,(met,mem,s,tl,tc,ta))
def chs_p( s, c):
if been("chs",s): return was( c, "chs",s)
else:
mark("chs",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
sch_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
chs_p ( (ts+tl), tc)
if ok:
rv=chs_s(a,andmemo(mem),s,ts+tl,tc,"chs")
return mark("chs",s,rv)
return mark("chs",s,(F,T,s,0,c,("","")))
def chs_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def pnt_p( s, c):
if been("pnt",s): return was( c, "pnt",s)
else:
mark("pnt",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
dgt_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
pnt_p ( (ts+tl), tc)
if ok:
rv=pnt_s(a,andmemo(mem),s,ts+tl,tc,"pnt")
return mark("pnt",s,rv)
return mark("pnt",s,(F,T,s,0,c,("","")))
def pnt_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def als_p( s, c):
if been("als",s): return was( c, "als",s)
else:
mark("als",s,(F,T,s,0,c,("","")))
ok=True; ts=s; tl=0; a={0: ("","")}
mem={0:True}; tc=c; n=0
if ok:
n=n+1; ( ok,mem[n],ts,tl,tc,a[n])=\
aln_p ( (ts+tl), tc)
if ok:
n=n+1; ( nok,mem[n],ts,tl,tc,a[n])=\
als_p ( (ts+tl), tc)
if ok:
rv=als_s(a,andmemo(mem),s,ts+tl,tc,"als")
return mark("als",s,rv)
return mark("als",s,(F,T,s,0,c,("","")))
def als_s(a,m,s,e,c,n): return(T,T,s,e-s,c,(n,fi[s:e]))
def syntax_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "syntax", a[1][1] + a[2][1] ))
def srules_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "srules", a[1][1] + a[2][1] ))
def blankline_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "blankline", "" ))
def base_s(a,m,s,e,c,n):
rbody=" o = \"\"\n rx="+a[2][1]+"_r\n if a != \"\":\n"+a[5][1]+" return (o)\n"
return (T,T,s,e-s,c,( "base", "\n" + \
"def " + a[2][1] + "_s(a,m,s,e,c,n):\n" + \
"" + (" rx="+a[2][1]+"_r " if a[5][1] != "" else "") + "\n" + \
"" + (" "+a[3][1] if a[3][1] != "" else "") + "\n" + \
" return (T,T,s,e-s,c,( \"" + a[2][1] + "\", " + a[4][1] + " ))\n" + \
"" + ("def "+a[2][1]+"_r(a,m,s,e,c,n):\n"+rbody if a[5][1] != "" else "") + "" ))
def setup_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "setup", ( ( a[4][1] + "\n " + a[7][1]) if a[7][1] != "" else a[4][1] ) ))
def rsetup_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "rsetup", ( ( " " + a[4][1] + "\n " + a[7][1]+"\n") if a[7][1] != "" else "\n "+a[4][1]+"\n" ) ))
def recr_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "recr", " if a[0] ==\"" + a[3][1] + "\":\n" + a[4][1] + " o=o+" + a[5][1] +"\n"+ a[6][1] ))
def cline_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "cline", a[4][1] ))
def qlineset_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qlineset", "\"" + a[1][1] + "\"" ))
def qlines_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qlines", a[2][1] + a[3][1] + a[4][1] ))
def qlsep_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qlsep", "\\n\" + \\\n\"" ))
def qline_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qline", a[1][1] ))
def qchs_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qchs", a[1][1] + a[2][1] ))
def qq_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qq", "\\\"" ))
def qt_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qt", "\\\'" ))
def qs_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qs", "\\\\" ))
def qcode_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "qcode", "\" + " + a[3][1] + " + \"" ))
def string_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "string", "\"" + a[2][1] + "\"" ))
def strcs_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "strcs", a[1][1] + a[2][1] ))
def code_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "code", a[1][1] + a[2][1] ))
def rcode_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "rcode", a[1][1] + a[2][1] ))
def cnl_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "cnl", "\\\n" ))
def dpathw_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "dpathw", "a" + a[1][1] ))
def dpath_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "dpath", "[" + a[2][1] + "]" + a[3][1] ))
def dhas_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "dhas", "a" ))
def rca_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "rca", a[3][1] +"_s(" + a[7][1] + ",m,s,e,c,n)[5][1]" ))
def rcb_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "rcb", "rx(" + a[3][1] + ",m,s,e,c,n)" ))
def end_s(a,m,s,e,c,n):
return (T,T,s,e-s,c,( "end", "\n" + \
"prologue=\"\"\"import sys\n" + \
"from binascii import *\n" + \
"fi = file(sys.argv[1]).read()\n" + \
"semantics = file(sys.argv[2]).read()\n" + \
"fo = open(sys.argv[3], \"w+\")\n" + \
"\n" + \
"h={}; registers={}; context={}; mseq=0; dseq=1; T=True; F=False\n" + \
"\n" + \
"def n2z( a ):\n" + \
" return ( \'0\' if a==\'\' else a )\n" + \
"\n" + \
"def be2le( a ):\n" + \
" return a[6:8]+a[4:6]+a[2:4]+a[0:2]\n" + \
"\n" + \
"def mark( p, s, t ):\n" + \
" ( v, m, ss, l, c, a ) = t\n" + \
" if t[1]: x = p +\"-\" + str(s); h[x]=(v,m,l,a); return t\n" + \
" else:\n" + \
" if not t[0]: x = p +\"-\" + str(s); h[x]=(v,m,l,a); return t\n" + \
" return t\n" + \
"\n" + \
"def been(p, s):\n" + \
" if h.has_key( p +\"-\" + str(s) ): return h[p +\"-\" + str(s)][1]\n" + \
" else: return False\n" + \
"\n" + \
"def was(c,p,s): (v,m,l,a) = h[p+\"-\"+str(s)]; return (v,m,s,l,c,a) \n" + \
"\n" + \
"def cm( ch, s, c ):\n" + \
" if s < len(fi):\n" + \
" if fi[s] == ch: return ( T, T, s, 1, c, ( \"cm\", fi[s] ) )\n" + \
" return ( False, True, s, 0, c, ( \"cm\", \"\") )\n" + \
"\n" + \
"def andmemo( m ):\n" + \
" r = True\n" + \
" for i in m:\n" + \
" if not m[i]: r = False\n" + \
" return r\n" + \
"\n" + \
"outdata = \"\"\n" + \
"\n" + \
"def output( s ):\n" + \
" global outdata\n" + \
" outdata = outdata + str(s)\n" + \
"\n" + \
"\"\"\"; epilogue=\"\"\"\n" + \
"\n" + \
"(v,m,s,l,c,a) = syntax_p( 0, ({},\'<1>\',\'<0>\') )\n" + \
"if v: \n" + \
" print \"Parsed \"+a[0]+\" OK\"\n" + \
"else: print \"Failed to Parse\"\n" + \
"print >> fo, a[1] \n" + \
"fo.close()\n" + \
"\"\"\"" ))
def build(v, m, s, l, c, a): return 'success'
T=True; F=False
prologue = """import sys
from binascii import *
fi = file(sys.argv[1]).read()
semantics = file(sys.argv[2]).read()
fo = open(sys.argv[3], "w+")
h={}; registers={}; context={}; mseq=0; dseq=1; T=True; F=False
def n2z( a ):
return ( '0' if a=='' else a )
def be2le( a ):
return a[6:8]+a[4:6]+a[2:4]+a[0:2]
def mark( p, s, t ):
( v, m, ss, l, c, a ) = t
if t[1]: x = p +"-" + str(s); h[x]=(v,m,l,a); return t
else:
if not t[0]: x = p +"-" + str(s); h[x]=(v,m,l,a); return t
return t
def been(p, s):
if h.has_key( p +"-" + str(s) ): return h[p +"-" + str(s)][1]
else: return False
def was(c,p,s): (v,m,l,a) = h[p+"-"+str(s)]; return (v,m,s,l,c,a)
def cm( ch, s, c ):
if s < len(fi):
if fi[s] == ch: return ( T, T, s, 1, c, ( "cm", fi[s] ) )
return ( False, True, s, 0, c, ( "cm", "") )
def andmemo( m ):
r = True
for i in m:
if not m[i]: r = False
return r
outdata = ""
def output( s ):
global outdata
outdata = outdata + str(s)
"""
epilogue = """
q3 = chr(34)+chr(34)+chr(34)
predefs = "prologue="+q3+prologue+q3+"; epilogue="+q3+epilogue+q3
(v,m,s,l,c,a) = syntax_p( 0, ({},'<1>','<0>') )
if v:
print "Parsed "+a[0]+" OK"
else: print "Failed to Parse"
print >> fo, a[1]
fo.close()
"""
outdata = ""
def output( s ):
global outdata
outdata = outdata + str(s)
q3 = chr(34)+chr(34)+chr(34)
predefs = "prologue="+q3+prologue+q3+"; epilogue="+q3+epilogue+q3
(v,m,s,l,c,a) = syntax_p( 0, ({},'<1>','<0>') )
if v:
print "Parsed "+a[0]+" OK"
else: print "Failed to Parse"
print >> fo, a[1]
fo.close()
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
"""Unit tests for //compiler_gym:compiler_env_state."""
import pytest
from compiler_gym import CompilerEnvState
from tests.test_main import main
def test_state_to_csv_from_csv():
original_state = CompilerEnvState(
benchmark="foo", walltime=100, reward=1.5, commandline="-a -b -c"
)
state_from_csv = CompilerEnvState.from_csv(original_state.to_csv())
assert state_from_csv.benchmark == "foo"
assert state_from_csv.walltime == 100
assert state_from_csv.reward == 1.5
assert state_from_csv.commandline == "-a -b -c"
def test_state_to_csv_from_csv_no_reward():
original_state = CompilerEnvState(
benchmark="foo", walltime=100, commandline="-a -b -c"
)
state_from_csv = CompilerEnvState.from_csv(original_state.to_csv())
assert state_from_csv.benchmark == "foo"
assert state_from_csv.walltime == 100
assert state_from_csv.reward is None
assert state_from_csv.commandline == "-a -b -c"
def test_state_from_csv_empty():
with pytest.raises(ValueError) as ctx:
CompilerEnvState.from_csv("")
assert str(ctx.value) == "Failed to parse input: ``"
def test_state_from_csv_invalid_format():
with pytest.raises(ValueError) as ctx:
CompilerEnvState.from_csv("abcdef")
assert str(ctx.value).startswith("Failed to parse input: `abcdef`: ")
def test_state_to_json_from_json():
original_state = CompilerEnvState(
benchmark="foo", walltime=100, reward=1.5, commandline="-a -b -c"
)
state_from_json = CompilerEnvState.from_json(original_state.json())
assert state_from_json.benchmark == "foo"
assert state_from_json.walltime == 100
assert state_from_json.reward == 1.5
assert state_from_json.commandline == "-a -b -c"
def test_state_to_json_from_json_no_reward():
original_state = CompilerEnvState(
benchmark="foo", walltime=100, commandline="-a -b -c"
)
state_from_json = CompilerEnvState.from_json(original_state.json())
assert state_from_json.benchmark == "foo"
assert state_from_json.walltime == 100
assert state_from_json.reward is None
assert state_from_json.commandline == "-a -b -c"
def test_state_from_json_empty():
with pytest.raises(TypeError):
CompilerEnvState.from_json({})
def test_state_equality_different_types():
state = CompilerEnvState(benchmark="foo", walltime=10, commandline="-a -b -c")
assert not state == 5 # noqa testing __eq__
assert state != 5 # testing __ne__
def test_state_equality_same():
a = CompilerEnvState(benchmark="foo", walltime=10, commandline="-a -b -c")
b = CompilerEnvState(benchmark="foo", walltime=10, commandline="-a -b -c")
assert a == b # testing __eq__
assert not a != b # noqa testing __ne__
def test_state_equality_different_walltime():
"""Test that walltime is not compared."""
a = CompilerEnvState(benchmark="foo", walltime=10, commandline="-a -b -c")
b = CompilerEnvState(benchmark="foo", walltime=5, commandline="-a -b -c")
assert a == b # testing __eq__
assert not a != b # noqa testing __ne__
def test_state_equality_one_sided_reward():
a = CompilerEnvState(benchmark="foo", walltime=5, commandline="-a -b -c", reward=2)
b = CompilerEnvState(benchmark="foo", walltime=5, commandline="-a -b -c")
assert a == b # testing __eq__
assert b == a # testing __eq__
assert not a != b # noqa testing __ne__
assert not b != a # noqa testing __ne__
def test_state_equality_equal_reward():
a = CompilerEnvState(benchmark="foo", walltime=5, commandline="-a -b -c", reward=2)
b = CompilerEnvState(benchmark="foo", walltime=5, commandline="-a -b -c", reward=2)
assert a == b # testing __eq__
assert b == a # testing __eq__
assert not a != b # noqa testing __ne__
assert not b != a # noqa testing __ne__
def test_state_equality_unequal_reward():
a = CompilerEnvState(benchmark="foo", walltime=5, commandline="-a -b -c", reward=2)
b = CompilerEnvState(benchmark="foo", walltime=5, commandline="-a -b -c", reward=3)
assert not a == b # noqa testing __eq__
    assert not b == a  # noqa testing __eq__
assert a != b # testing __ne__
assert b != a # testing __ne__
if __name__ == "__main__":
main()
#!/usr/bin/env python3
#
# Copyright 2018 | Dario Ostuni <dario.ostuni@gmail.com>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from enum import Enum
from .util import UnreachableError
class DataType(Enum):
int32 = "I32"
uint32 = "U32"
float32 = "F32"
bool = "Bool"
class IoType(Enum):
input = "input"
output = "output"
private = "private"
class Variable:
pass
class Array:
def _get_key(self, key):
if type(key) == int:
key = Constant.uint32(key, self._ctx)
elif type(key) == Constant and key._ty == DataType.uint32:
pass
else:
raise TypeError
return key
def __setitem__(self, key, value):
key = self._get_key(key)
value = self._ctx._sanitize(value)
if type(value) != Constant or value._ty != self._ty:
raise TypeError
self._ctx.getProgramBuilder()._add_command({
"ArrayStore": [self._tid, key._tid, value._tid]
})
def __getitem__(self, key):
key = self._get_key(key)
element = Constant._new_constant(self._ctx, self._ty)
self._ctx.getProgramBuilder()._add_command({
"ArrayLoad": [element._tid, self._tid, key._tid]
})
return element
def __len__(self):
length = Constant._new_constant(self._ctx, DataType.uint32)
self._ctx.getProgramBuilder()._add_command({
"ArrayLen": [length._tid, self._tid]
})
return length
class Constant:
def getContext(self):
return self._ctx
def getProgramBuilder(self):
return self.getContext().getProgramBuilder()
def __add__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Add": [result._tid, self._tid, other._tid]
})
return result
def __sub__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Sub": [result._tid, self._tid, other._tid]
})
return result
def __mul__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Mul": [result._tid, self._tid, other._tid]
})
return result
def __floordiv__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Div": [result._tid, self._tid, other._tid]
})
return result
def __truediv__(self, other):
return self.__floordiv__(other)
def __mod__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Rem": [result._tid, self._tid, other._tid]
})
return result
    def __invert__(self):
assert self._ty in (DataType.int32, DataType.uint32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Not": [result._tid, self._tid]
})
return result
def not_(self):
assert self._ty in (DataType.bool,)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Not": [result._tid, self._tid]
})
return result
def __neg__(self):
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Neg": [result._tid, self._tid]
})
return result
def __lshift__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Shl": [result._tid, self._tid, other._tid]
})
return result
def __rshift__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"Shr": [result._tid, self._tid, other._tid]
})
return result
def __xor__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.bool)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"BitXor": [result._tid, self._tid, other._tid]
})
return result
def __and__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.bool)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"BitAnd": [result._tid, self._tid, other._tid]
})
return result
def __or__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.bool)
result = Constant._new_constant(self.getContext(), self._ty)
self.getProgramBuilder()._add_command({
"BitOr": [result._tid, self._tid, other._tid]
})
return result
def __eq__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32,
DataType.float32, DataType.bool)
result = Constant._new_constant(self.getContext(), DataType.bool)
self.getProgramBuilder()._add_command({
"Eq": [result._tid, self._tid, other._tid]
})
return result
def __ne__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32,
DataType.float32, DataType.bool)
result = Constant._new_constant(self.getContext(), DataType.bool)
self.getProgramBuilder()._add_command({
"Ne": [result._tid, self._tid, other._tid]
})
return result
def __lt__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), DataType.bool)
self.getProgramBuilder()._add_command({
"Lt": [result._tid, self._tid, other._tid]
})
return result
def __le__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), DataType.bool)
self.getProgramBuilder()._add_command({
"Le": [result._tid, self._tid, other._tid]
})
return result
def __gt__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), DataType.bool)
self.getProgramBuilder()._add_command({
"Gt": [result._tid, self._tid, other._tid]
})
return result
def __ge__(self, other):
other = self._ctx._sanitize(other)
assert self.getProgramBuilder() == other.getProgramBuilder()
assert self._ty == other._ty
assert self._ty in (DataType.int32, DataType.uint32, DataType.float32)
result = Constant._new_constant(self.getContext(), DataType.bool)
self.getProgramBuilder()._add_command({
"Ge": [result._tid, self._tid, other._tid]
})
return result
@staticmethod
def _new_constant(ctx, ty):
const = Constant()
const._ctx = ctx
const._ty = ty
const._tid = ctx._new_constant(ty)
return const
@staticmethod
def int32(value, ctx=None):
if type(value) not in (int, Constant):
value = int(value)
if type(value) == Constant:
ctx = value.getContext()
const = Constant._new_constant(ctx, DataType.int32)
p = ctx.getProgramBuilder()
if type(value) == int:
if value < -2**31 or value >= 2**31:
                raise ValueError("value out of int32 range [-2**31, 2**31)")
p._add_command({
"Constant": [const._tid, {DataType.int32.value: value}]
})
elif type(value) == Constant:
assert p == value.getProgramBuilder()
if value._ty == DataType.int32:
const._tid = value._tid
elif value._ty == DataType.uint32:
p._add_command({"I32fromU32": [const._tid, value._tid]})
elif value._ty == DataType.float32:
p._add_command({"I32fromF32": [const._tid, value._tid]})
else:
raise TypeError
else:
raise UnreachableError
return const
@staticmethod
def uint32(value, ctx=None):
if type(value) not in (int, Constant):
value = int(value)
if type(value) == Constant:
ctx = value.getContext()
const = Constant._new_constant(ctx, DataType.uint32)
p = ctx.getProgramBuilder()
if type(value) == int:
if value < 0 or value >= 2**32:
                raise ValueError("value out of uint32 range [0, 2**32)")
p._add_command({
"Constant": [const._tid, {DataType.uint32.value: value}]
})
elif type(value) == Constant:
assert p == value.getProgramBuilder()
if value._ty == DataType.uint32:
const._tid = value._tid
elif value._ty == DataType.int32:
p._add_command({"U32fromI32": [const._tid, value._tid]})
elif value._ty == DataType.float32:
p._add_command({"U32fromF32": [const._tid, value._tid]})
else:
raise TypeError
else:
raise UnreachableError
return const
@staticmethod
def float32(value, ctx=None):
if type(value) not in (float, Constant):
value = float(value)
if type(value) == Constant:
ctx = value.getContext()
const = Constant._new_constant(ctx, DataType.float32)
p = ctx.getProgramBuilder()
if type(value) == float:
p._add_command({
"Constant": [const._tid, {DataType.float32.value: value}]
})
elif type(value) == Constant:
assert p == value.getProgramBuilder()
if value._ty == DataType.uint32:
p._add_command({"F32fromU32": [const._tid, value._tid]})
elif value._ty == DataType.int32:
p._add_command({"F32fromI32": [const._tid, value._tid]})
elif value._ty == DataType.float32:
const._tid = value._tid
else:
raise TypeError
else:
raise UnreachableError
return const
@staticmethod
def bool(value, ctx=None):
if type(value) not in (bool, Constant):
value = bool(value)
if type(value) == Constant:
ctx = value.getContext()
const = Constant._new_constant(ctx, DataType.bool)
p = ctx.getProgramBuilder()
if type(value) == bool:
p._add_command({
"Constant": [const._tid, {DataType.bool.value: value}]
})
elif type(value) == Constant:
assert p == value.getProgramBuilder()
if value._ty == DataType.bool:
const._tid = value._tid
else:
raise TypeError
else:
raise UnreachableError
return const
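The `int32` and `uint32` factories above reject Python ints that do not fit the target width before emitting a `Constant` command. A standalone sketch of that bounds check (the helper name and table are illustrative, not part of the class):

```python
def check_32bit_range(value, ty):
    """Validate a Python int against a fixed 32-bit range, mirroring
    the checks in Constant.int32 / Constant.uint32."""
    bounds = {
        "int32": (-2**31, 2**31),   # two's-complement signed range
        "uint32": (0, 2**32),       # unsigned range
    }
    lo, hi = bounds[ty]
    if value < lo or value >= hi:
        raise ValueError(f"{value} does not fit in {ty}")
    return value
```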
import pickle
import os
import numpy as np
import tensorflow as tf
def get_model_predictions(original_image_features_pickle_file):
filenames_and_features = []
with open(original_image_features_pickle_file, "rb") as input_file:
filenames_and_features = pickle.load(input_file)
filenames = filenames_and_features[0]
features = filenames_and_features[1]
max_x = -1
max_y = -1
index_level_x_y = []
for filename in filenames:
tile_name = os.path.split(filename)[-1]
tile_name = tile_name.split(".")[0]
splited_tile_name = tile_name.split("_")
tile_level = int(splited_tile_name[-3])
tile_x = int(splited_tile_name[-2])
tile_y = int(splited_tile_name[-1])
index_level_x_y.append([tile_level,tile_x,tile_y])
if max_x<tile_x:
max_x = tile_x
if max_y<tile_y:
max_y = tile_y
num_feature = np.array(features[0]).shape[-1]
feature_map = {}
# print(index_level_x_y)
for i in range(len(features)):
feature_i = np.mean(np.mean(features[i],axis=0),axis=0)
feature_i = np.reshape(feature_i,(num_feature))
start_channel = num_feature*index_level_x_y[i][0]
end_channel = num_feature*(index_level_x_y[i][0]+1)
x = index_level_x_y[i][1]
y = index_level_x_y[i][2]
if (x,y) in feature_map:
temp_feature = feature_map[(x,y)]
temp_feature[start_channel:end_channel] = feature_i
feature_map[(x,y)] = temp_feature
else:
temp_feature = np.zeros(3*num_feature)
temp_feature[start_channel:end_channel] = feature_i
feature_map[(x,y)] = temp_feature
return list(feature_map.values())
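Both `get_model_predictions` and `create_tensors` recover the magnification level and grid position by splitting the tile filename pattern `<image>_<level>_<x>_<y>.png`. A minimal standalone sketch of that parsing step (the helper name is hypothetical):

```python
import os

def parse_tile_name(path):
    """Parse '<image>_<level>_<x>_<y>.png' into (level, x, y) ints,
    matching the filename handling used above."""
    stem = os.path.split(path)[-1].split(".")[0]
    parts = stem.split("_")
    return int(parts[-3]), int(parts[-2]), int(parts[-1])
```

Taking the last three underscore-separated fields keeps the parse robust even when the image name itself contains underscores.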
def pca_transform_tensors(pca, num_feature, tensor_width, tensor_height, tensor_3D):
number_of_samples = len(tensor_3D)
resized_tensor = []
for sample in range(number_of_samples):
w = len(tensor_3D[sample])
h = len(tensor_3D[sample][0])
feature_map = np.zeros((w,h,num_feature))
for i in range(w):
for j in range(h):
feature_i_j = tensor_3D[sample][i][j]
transformed_feature_i_j = pca.transform([feature_i_j])[0]
feature_map[i,j,:] = transformed_feature_i_j
resized_tensor.append(tf.image.resize_with_crop_or_pad(feature_map, tensor_width, tensor_height))
resized_tensor = np.asarray(resized_tensor)
return resized_tensor
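`pca_transform_tensors` and `create_tensors` rely on `tf.image.resize_with_crop_or_pad` to force every feature map to a fixed spatial size. A NumPy-only sketch of the same center-crop-or-zero-pad behavior (assumed semantics; the helper name is illustrative):

```python
import numpy as np

def crop_or_pad(arr, target_h, target_w):
    """Center-crop or zero-pad an (H, W, C) array to
    (target_h, target_w, C), a stand-in for
    tf.image.resize_with_crop_or_pad."""
    h, w, c = arr.shape
    if h > target_h:                      # crop rows from the center
        top = (h - target_h) // 2
        arr = arr[top:top + target_h]
    if w > target_w:                      # crop columns from the center
        left = (w - target_w) // 2
        arr = arr[:, left:left + target_w]
    h, w, _ = arr.shape
    out = np.zeros((target_h, target_w, c), dtype=arr.dtype)
    pad_top = (target_h - h) // 2         # center the remaining content
    pad_left = (target_w - w) // 2
    out[pad_top:pad_top + h, pad_left:pad_left + w] = arr
    return out
```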
def create_tensors(original_image_features_pickle_file, pca, pca_num_feature, tensors_size, saving_folder):
filenames_and_features = []
with open(original_image_features_pickle_file, "rb") as input_file:
filenames_and_features = pickle.load(input_file)
filenames = filenames_and_features[0]
features = filenames_and_features[1]
### find the height and width of input tensor using filenames_and_features
### filename pattern tile_path = os.path.join(tile_folder,image_name+"_"+str(i)+"_"+str(x)+"_"+str(y)+".png")
max_x = -1
max_y = -1
index_level_x_y = []
for filename in filenames:
tile_name = os.path.split(filename)[-1]
tile_name = tile_name.split(".")[0]
splited_tile_name = tile_name.split("_")
tile_level = int(splited_tile_name[-3])
tile_x = int(splited_tile_name[-2])
tile_y = int(splited_tile_name[-1])
index_level_x_y.append([tile_level,tile_x,tile_y])
if max_x<tile_x:
max_x = tile_x
if max_y<tile_y:
max_y = tile_y
num_feature = np.array(features[0]).shape[-1]
feature_map = np.zeros(((max_x+1),(max_y+1),num_feature*3))
for i in range(len(features)):
feature_i = np.mean(np.mean(features[i],axis=0),axis=0)
feature_i = np.reshape(feature_i,(num_feature))
start_channel = num_feature*index_level_x_y[i][0]
end_channel = num_feature*(index_level_x_y[i][0]+1)
x = index_level_x_y[i][1]
y = index_level_x_y[i][2]
feature_map[x,y,start_channel:end_channel] = feature_i
w = len(feature_map)
h = len(feature_map[0])
resized_tensor = np.zeros((w,h,pca_num_feature))
for i in range(w):
for j in range(h):
feature_i_j = feature_map[i][j]
transformed_feature_i_j = pca.transform([feature_i_j])[0]
resized_tensor[i,j,:] = transformed_feature_i_j
resized_tensor = tf.image.resize_with_crop_or_pad(resized_tensor, tensors_size, tensors_size)
original_image_features_pickle_file_name = os.path.split(original_image_features_pickle_file)[-1]
# print(original_image_features_pickle_file_name)
with open(os.path.join(saving_folder,original_image_features_pickle_file_name), 'wb') as handle:
pickle.dump(resized_tensor, handle)
# return resized_tensor
# for i in range(len(features)):
# feature_i = np.mean(np.mean(features[i],axis=0),axis=0)
# feature_i = np.reshape(feature_i,(num_feature))
# start_channel = num_feature*index_level_x_y[i][0]
# end_channel = num_feature*(index_level_x_y[i][0]+1)
# x = index_level_x_y[i][1]
# y = index_level_x_y[i][2]
# feature_map[x,y,start_channel:end_channel] = feature_i
# return np.amax(np.amax(feature_map,axis=0),axis=0)
# num_feature = np.array(features[0]).shape[-1]
# feature_map = np.zeros(num_feature)
# for i in range(len(features)):
# feature_i = np.mean(np.mean(features[i],axis=0),axis=0)
# feature_i = np.reshape(feature_i,(num_feature))
# if index_level_x_y[i][0]==1:
# feature_map = feature_map+feature_i
# return feature_map/(len(features)/3.0)
# feature_map = np.zeros((42,42,num_feature))
# for i in range(len(features)):
# if index_level_x_y[i][0]==0:
# feature_i = np.mean(np.mean(features[i],axis=0),axis=0)
# feature_i = np.reshape(feature_i,(num_feature))
# x = index_level_x_y[i][1]
# y = index_level_x_y[i][2]
# feature_map[x,y,:] = feature_i
# return np.amax(np.amax(feature_map,axis=0),axis=0)
# def create_tensors(original_image_features_pickle_file):
# filenames_and_features = []
# with open(original_image_features_pickle_file, "rb") as input_file:
# filenames_and_features = pickle.load(input_file)
# filenames = filenames_and_features[0]
# features = filenames_and_features[1]
# ### find the height and width of input tensor using filenames_and_features
# ### filename pattern tile_path = os.path.join(tile_folder,image_name+"_"+str(i)+"_"+str(x)+"_"+str(y)+".png")
# max_x = -1
# max_y = -1
# index_level_x_y = []
# for filename in filenames:
# tile_name = os.path.split(filename)[-1]
# tile_name = tile_name.split(".")[0]
# splited_tile_name = tile_name.split("_")
# tile_level = int(splited_tile_name[-3])
# tile_x = int(splited_tile_name[-2])
# tile_y = int(splited_tile_name[-1])
# index_level_x_y.append([tile_level,tile_x,tile_y])
# if max_x<tile_x:
# max_x = tile_x
# if max_y<tile_y:
# max_y = tile_y
#
# num_feature = np.array(features[0]).shape[-1]
# feature_map = np.zeros(num_feature)
# for i in range(len(features)):
# feature_i = np.mean(np.mean(features[i],axis=0),axis=0)
# feature_i = np.reshape(feature_i,(num_feature))
# if index_level_x_y[i][0]==1:
# feature_map = feature_map+feature_i
# return feature_map/(len(features)/3.0)
#
# # feature_map = np.zeros((42,42,num_feature))
# # for i in range(len(features)):
# # if index_level_x_y[i][0]==0:
# # feature_i = np.mean(np.mean(features[i],axis=0),axis=0)
# # feature_i = np.reshape(feature_i,(num_feature))
# # x = index_level_x_y[i][1]
# # y = index_level_x_y[i][2]
# # feature_map[x,y,:] = feature_i
# # return np.amax(np.amax(feature_map,axis=0),axis=0)
import unittest
import orca
import os.path as path
from setup.settings import *
from pandas.util.testing import *
def _create_odf_csv(datal, datar):
dfsDatabase = "dfs://testMergeAsofDB"
s = orca.default_session()
dolphindb_script = """
login('admin', '123456')
if(existsDatabase('{dbPath}'))
dropDatabase('{dbPath}')
db=database('{dbPath}', VALUE, 2010.01M..2010.05M)
stb1=extractTextSchema('{data1}')
update stb1 set type="SYMBOL" where name="type"
stb2=extractTextSchema('{data2}')
update stb2 set type="SYMBOL" where name="ticker"
loadTextEx(db,`tickers,`date, '{data1}',,stb1)
loadTextEx(db,`values,`date, '{data2}',,stb2)
""".format(dbPath=dfsDatabase, data1=datal, data2=datar)
s.run(dolphindb_script)
class Csv:
odfs_csv_left = None
odfs_csv_right = None
pdf_csv_left = None
pdf_csv_right = None
class DfsMergeTest(unittest.TestCase):
@classmethod
def setUpClass(cls):
# configure data directory
DATA_DIR = path.abspath(path.join(__file__, "../setup/data"))
left_fileName = 'test_merge_asof_left_table.csv'
right_fileName = 'test_merge_asof_right_table.csv'
datal = os.path.join(DATA_DIR, left_fileName)
        datal = datal.replace('\\', '/')
datar = os.path.join(DATA_DIR, right_fileName)
datar = datar.replace('\\', '/')
dfsDatabase = "dfs://testMergeAsofDB"
# connect to a DolphinDB server
orca.connect(HOST, PORT, "admin", "123456")
_create_odf_csv(datal, datar)
# import
Csv.odfs_csv_left = orca.read_table(dfsDatabase, 'tickers')
Csv.pdf_csv_left = pd.read_csv(datal, parse_dates=[0])
Csv.odfs_csv_right = orca.read_table(dfsDatabase, 'values')
Csv.pdf_csv_right = pd.read_csv(datar, parse_dates=[0])
@property
def odfs_csv_left(self):
return Csv.odfs_csv_left
@property
def odfs_csv_right(self):
return Csv.odfs_csv_right
@property
def pdf_csv_left(self):
return Csv.pdf_csv_left
@property
def pdf_csv_right(self):
return Csv.pdf_csv_right
@property
def odfs_csv_left_index(self):
return Csv.odfs_csv_left.set_index("date")
@property
def odfs_csv_right_index(self):
return Csv.odfs_csv_right.set_index("date")
@property
def pdf_csv_left_index(self):
return Csv.pdf_csv_left.set_index("date")
@property
def pdf_csv_right_index(self):
return Csv.pdf_csv_right.set_index("date")
@property
def odfs_bid_csv_left(self):
return self.odfs_csv_left.sort_values(by=['bid', 'date']).reset_index(drop=True)
@property
def odfs_bid_csv_right(self):
return self.odfs_csv_right.sort_values(by=['bid', 'date']).reset_index(drop=True)
@property
def pdf_bid_csv_left(self):
return self.pdf_csv_left.sort_values(by=['bid', 'date']).reset_index(drop=True)
@property
def pdf_bid_csv_right(self):
return self.pdf_csv_right.sort_values(by=['bid', 'date']).reset_index(drop=True)
@property
def odfs_bid_csv_left_index(self):
return self.odfs_csv_left.sort_values(by=['bid', 'date']).set_index('bid')
@property
def odfs_bid_csv_right_index(self):
return self.odfs_csv_right.sort_values(by=['bid', 'date']).set_index('bid')
@property
def pdf_bid_csv_left_index(self):
return self.pdf_csv_left.sort_values(by=['bid', 'date']).set_index('bid')
@property
def pdf_bid_csv_right_index(self):
return self.pdf_csv_right.sort_values(by=['bid', 'date']).set_index('bid')
def test_assert_original_dataframe_equal(self):
assert_frame_equal(self.odfs_csv_left.to_pandas(), self.pdf_csv_left, check_dtype=False)
assert_frame_equal(self.odfs_csv_right.to_pandas(), self.pdf_csv_right, check_dtype=False)
assert_frame_equal(self.odfs_csv_left_index.to_pandas(), self.pdf_csv_left_index, check_dtype=False)
assert_frame_equal(self.odfs_csv_right_index.to_pandas(), self.pdf_csv_right_index, check_dtype=False)
assert_frame_equal(self.odfs_bid_csv_left.to_pandas(), self.pdf_bid_csv_left, check_dtype=False)
assert_frame_equal(self.odfs_bid_csv_right.to_pandas(), self.pdf_bid_csv_right, check_dtype=False)
assert_frame_equal(self.odfs_bid_csv_left_index.to_pandas(), self.pdf_bid_csv_left_index, check_dtype=False)
assert_frame_equal(self.odfs_bid_csv_right_index.to_pandas(), self.pdf_bid_csv_right_index, check_dtype=False)
def test_merge_asof_from_dfs_param_on(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, on='date')
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, on='date')
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, on='bid')
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, on='bid')
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_leftonrighton(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, left_on='date', right_on='date')
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, left_on='date', right_on='date')
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, left_on='bid', right_on='bid')
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, left_on='bid', right_on='bid')
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_index(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right_index, left_index=True, right_index=True)
odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right_index, left_index=True, right_index=True)
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right_index, left_index=True, right_index=True)
odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right_index, left_index=True, right_index=True)
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_by(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, on='date', by='ticker')
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, on='date', by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, on='bid', by='ticker')
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, on='bid', by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_leftbyrightby(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, on='date', left_by='ticker', right_by='ticker')
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, on='date', left_by='ticker', right_by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, on='bid', left_by='ticker',
right_by='ticker')
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, on='bid', left_by='ticker', right_by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, on='date', suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, on='date', suffixes=('_left', '_right'))
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, on='bid', suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, on='bid', suffixes=('_left', '_right'))
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_leftonrighton_param_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, left_on='date', right_on='date',
suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, left_on='date', right_on='date',
suffixes=('_left', '_right'))
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, left_on='bid', right_on='bid',
suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, left_on='bid', right_on='bid',
suffixes=('_left', '_right'))
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_leftonrighton_param_by(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, left_on='date', right_on='date', by='ticker')
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, left_on='date', right_on='date', by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, left_on='bid', right_on='bid', by='ticker')
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, left_on='bid', right_on='bid', by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_leftonrighton_param_leftbyrightby(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, on='date', left_by='ticker', right_by='ticker')
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, on='date', left_by='ticker', right_by='ticker')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, on='bid', left_by='ticker',
right_by='ticker')
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, on='bid', left_by='ticker', right_by='ticker')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_index_param_by(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right_index, left_index=True, right_index=True,
by='ticker')
odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right_index, left_index=True, right_index=True,
by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right_index, left_index=True,
right_index=True, by='ticker')
odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right_index, left_index=True,
right_index=True, by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_index_param_leftbyrightby(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right_index, left_index=True, right_index=True,
left_by='ticker', right_by='ticker')
odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right_index, left_index=True, right_index=True,
left_by='ticker', right_by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right_index, left_index=True,
right_index=True, left_by='ticker',
right_by='ticker')
odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right_index, left_index=True,
right_index=True, left_by='ticker',
right_by='ticker')
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_index_param_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right_index, left_index=True, right_index=True,
suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right_index, left_index=True, right_index=True,
suffixes=('_left', '_right'))
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right_index, left_index=True,
right_index=True, suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right_index, left_index=True,
right_index=True,
suffixes=('_left', '_right'))
assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_on_param_by_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, on='date', by='ticker', suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, on='date', by='ticker',
suffixes=('_left', '_right'))
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, on='bid', by='ticker',
suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, on='bid', by='ticker',
suffixes=('_left', '_right'))
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_on_param_leftbyrightby_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, on='date',
left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, on='date',
# left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, on='bid', left_by='ticker',
right_by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, on='bid', left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_leftonrighton_param_by_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, left_on='date', right_on='date',
by='ticker', suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, left_on='date', right_on='date',
by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, left_on='bid', right_on='bid', by='ticker',
suffixes=('_left', '_right'))
pdf.fillna("", inplace=True)
odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, left_on='bid', right_on='bid', by='ticker',
suffixes=('_left', '_right'))
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_leftonrighton_param_leftbyrightby_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right, left_on='date', right_on='date',
left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right, left_on='date', right_on='date',
# left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right, left_on='bid', right_on='bid',
left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right, left_on='bid', right_on='bid',
# left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_index_param_by_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right_index, left_index=True, right_index=True,
by='ticker', suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right_index, left_index=True, right_index=True,
by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right_index, left_index=True,
right_index=True,
by='ticker', suffixes=('_left', '_right'))
odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right_index, left_index=True,
right_index=True,
by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_index_param_leftbyrightby_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right_index, left_index=True, right_index=True,
left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right_index, left_index=True, right_index=True,
# left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right_index, left_index=True,
right_index=True,
left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right_index, left_index=True, right_index=True,
# left_by='ticker', right_by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_on_param_index(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right, left_index=True, right_on='date')
# TODO:ORCA error left_index, right_on not supported
        # odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right, left_index=True, right_on='date')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right_index, right_index=True, left_on='date')
# TODO:ORCA error left_index, right_on not supported
# odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right_index, right_index=True, left_on='date')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right, left_index=True, right_on='bid')
# TODO:ORCA error left_index, right_on not supported
# odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right, left_index=True, right_on='bid')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right_index, right_index=True, left_on='bid')
# TODO:ORCA error left_index, right_on not supported
# odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right_index, right_index=True, left_on='bid')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_on_param_index_param_by(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right, left_index=True, right_on='date', by='ticker')
# TODO:ORCA by bug
        # odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right, left_index=True, right_on='date', by='ticker')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right_index, right_index=True, left_on='date', by='ticker')
# TODO:ORCA by bug
        # odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right_index, right_index=True, left_on='date', by='ticker')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right, left_index=True, right_on='bid',
by='ticker')
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right, left_index=True, right_on='bid', by='ticker')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right_index, right_index=True, left_on='bid',
by='ticker')
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right_index, right_index=True, left_on='bid', by='ticker')
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
def test_merge_asof_from_dfs_param_on_param_index_param_by_param_suffixes(self):
pdf = pd.merge_asof(self.pdf_csv_left_index, self.pdf_csv_right, left_index=True, right_on='date',
by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_csv_left_index, self.odfs_csv_right_index, left_index=True, right_on='date',
# by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_csv_left, self.pdf_csv_right_index, right_index=True, left_on='date',
by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
        # odf = orca.merge_asof(self.odfs_csv_left, self.odfs_csv_right_index, right_index=True, left_on='date',
# by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left_index, self.pdf_bid_csv_right, left_index=True, right_on='bid',
by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left_index, self.odfs_bid_csv_right, left_index=True, right_on='bid',
# by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
pdf = pd.merge_asof(self.pdf_bid_csv_left, self.pdf_bid_csv_right_index, right_index=True, left_on='bid',
by='ticker', suffixes=('_left', '_right'))
# TODO:ORCA by bug
# odf = orca.merge_asof(self.odfs_bid_csv_left, self.odfs_bid_csv_right_index, right_index=True, left_on='bid',
# by='ticker', suffixes=('_left', '_right'))
# assert_frame_equal(odf.to_pandas().fillna(""), pdf.fillna(""), check_dtype=False, check_like=False)
if __name__ == '__main__':
unittest.main()
| 60.748299 | 157 | 0.661627 | 3,858 | 26,790 | 4.190254 | 0.032141 | 0.055858 | 0.077199 | 0.045033 | 0.930471 | 0.916553 | 0.898305 | 0.877088 | 0.877088 | 0.874737 | 0 | 0.00178 | 0.202986 | 26,790 | 440 | 158 | 60.886364 | 0.755339 | 0.272116 | 0 | 0.424125 | 0 | 0 | 0.096746 | 0.015626 | 0 | 0 | 0 | 0.002273 | 0.081712 | 1 | 0.155642 | false | 0 | 0.019455 | 0.062257 | 0.2607 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3d131a81b328755630687d48ce0728b9904b030e | 2,525 | py | Python | ensemble_transformers/ensemble.py | jaketae/ensemble-transformers | 14eb9639de20d694356b374e2bd8223b02c3472e | [
"MIT"
] | 18 | 2022-03-28T09:24:37.000Z | 2022-03-31T06:32:00.000Z | ensemble_transformers/ensemble.py | jaketae/ensemble-transformers | 14eb9639de20d694356b374e2bd8223b02c3472e | [
"MIT"
] | null | null | null | ensemble_transformers/ensemble.py | jaketae/ensemble-transformers | 14eb9639de20d694356b374e2bd8223b02c3472e | [
"MIT"
] | null | null | null | from typing import List, Union
import numpy as np
import torch
from PIL.Image import Image
from ensemble_transformers.base import EnsembleBaseModel
class EnsembleModelForSequenceClassification(EnsembleBaseModel):
def forward(
self,
text: List[str],
main_device: Union[str, torch.device] = "cpu",
return_all_outputs: bool = False,
preprocessor_kwargs: dict = {"return_tensors": "pt", "padding": True},
):
        """Run each model on `text` and return the weighted sum of logits (or all raw outputs)."""
        outputs = []
for i, (model, preprocessor) in enumerate(zip(self.models, self.preprocessors)):
inputs = preprocessor(text, **preprocessor_kwargs).to(self.devices[i])
output = model(**inputs)
outputs.append(output)
if return_all_outputs:
return outputs
return torch.stack(
[weight * output.logits.to(main_device) for weight, output in zip(self.config.weights, outputs)]
).sum(dim=0)
class EnsembleModelForImageClassification(EnsembleBaseModel):
def forward(
self,
images: List[Image],
main_device: Union[str, torch.device] = "cpu",
return_all_outputs: bool = False,
preprocessor_kwargs: dict = {"return_tensors": "pt"},
):
        """Run each model on `images` and return the weighted sum of logits (or all raw outputs)."""
        outputs = []
for i, (model, preprocessor) in enumerate(zip(self.models, self.preprocessors)):
inputs = preprocessor(images, **preprocessor_kwargs).to(self.devices[i])
output = model(**inputs)
outputs.append(output)
if return_all_outputs:
return outputs
return torch.stack(
[weight * output.logits.to(main_device) for weight, output in zip(self.config.weights, outputs)]
).sum(dim=0)
class EnsembleModelForAudioClassification(EnsembleBaseModel):
def forward(
self,
audio: np.ndarray,
main_device: Union[str, torch.device] = "cpu",
return_all_outputs: bool = False,
preprocessor_kwargs: dict = {"return_tensors": "pt", "sampling_rate": None, "padding": "longest"},
):
        """Run each model on `audio` and return the weighted sum of logits (or all raw outputs)."""
        outputs = []
for i, (model, preprocessor) in enumerate(zip(self.models, self.preprocessors)):
inputs = preprocessor(audio, **preprocessor_kwargs).to(self.devices[i])
output = model(**inputs)
outputs.append(output)
if return_all_outputs:
return outputs
return torch.stack(
[weight * output.logits.to(main_device) for weight, output in zip(self.config.weights, outputs)]
).sum(dim=0)
| 37.132353 | 108 | 0.632475 | 272 | 2,525 | 5.764706 | 0.25 | 0.038265 | 0.061224 | 0.059311 | 0.727679 | 0.727679 | 0.727679 | 0.727679 | 0.727679 | 0.727679 | 0 | 0.001596 | 0.255446 | 2,525 | 67 | 109 | 37.686567 | 0.832447 | 0 | 0 | 0.711864 | 0 | 0 | 0.03604 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.050847 | false | 0 | 0.084746 | 0 | 0.288136 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
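Each `forward` method in the ensemble file above ends with the same combination step: stack the per-model logits and take a weight-scaled sum. A minimal sketch of just that step, using NumPy in place of torch tensors (the weights and logit values below are made up for illustration):

```python
import numpy as np

def combine_logits(logits_per_model, weights):
    """Weighted sum of per-model logit arrays (mirrors torch.stack(...).sum(dim=0))."""
    stacked = np.stack([w * logits for w, logits in zip(weights, logits_per_model)])
    return stacked.sum(axis=0)

logits_a = np.array([[1.0, 0.0, 0.0]])  # hypothetical model A logits
logits_b = np.array([[0.0, 1.0, 0.0]])  # hypothetical model B logits
combined = combine_logits([logits_a, logits_b], weights=[0.75, 0.25])
# combined -> [[0.75, 0.25, 0.0]]
```

The same arithmetic happens in the torch version, with the extra step of moving every logit tensor to `main_device` before stacking.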
3d41e9a07845b7d59eb29ff545db8929e09e40c1 | 130 | py | Python | DonkiPlayer/scripts/mcstas-generator/src/neventarray.py | ess-dmsc/do-ess-data-simulator | 37ef0d87ad0152b092e3a636ef8d080db0711aaa | [
"BSD-2-Clause"
] | null | null | null | DonkiPlayer/scripts/mcstas-generator/src/neventarray.py | ess-dmsc/do-ess-data-simulator | 37ef0d87ad0152b092e3a636ef8d080db0711aaa | [
"BSD-2-Clause"
] | null | null | null | DonkiPlayer/scripts/mcstas-generator/src/neventarray.py | ess-dmsc/do-ess-data-simulator | 37ef0d87ad0152b092e3a636ef8d080db0711aaa | [
"BSD-2-Clause"
] | null | null | null | import numpy as np
event_t = np.dtype(np.uint64)
def multiplyNEventArray(data, multiplier):
    """Repeat the event array `data` `multiplier` times."""
    return np.tile(data, multiplier)
| 18.571429 | 42 | 0.761538 | 19 | 130 | 5.157895 | 0.736842 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017857 | 0.138462 | 130 | 6 | 43 | 21.666667 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
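For reference, `np.tile` with a scalar repetition count simply repeats a 1-D array end-to-end, so `multiplyNEventArray` with a multiplier of 3 triples the event list while preserving the `uint64` event dtype (toy values below, not real event data):

```python
import numpy as np

event_t = np.dtype(np.uint64)

def multiplyNEventArray(data, multiplier):
    """Repeat the event array `data` `multiplier` times."""
    return np.tile(data, multiplier)

events = np.array([10, 20, 30], dtype=event_t)
tripled = multiplyNEventArray(events, 3)
# tripled -> [10 20 30 10 20 30 10 20 30], dtype uint64
```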
3d4fded10fddcb5590f4e2a2320bcfa1726fa1ac | 45 | py | Python | spacy/tests/enable_gpu.py | snosrap/spaCy | 3f68bbcfec44ef55d101e6db742d353b72652129 | [
"MIT"
] | 22,040 | 2016-10-03T11:58:15.000Z | 2022-03-31T21:08:19.000Z | spacy/tests/enable_gpu.py | snosrap/spaCy | 3f68bbcfec44ef55d101e6db742d353b72652129 | [
"MIT"
] | 6,927 | 2016-10-03T13:11:11.000Z | 2022-03-31T17:01:25.000Z | spacy/tests/enable_gpu.py | snosrap/spaCy | 3f68bbcfec44ef55d101e6db742d353b72652129 | [
"MIT"
] | 4,403 | 2016-10-04T03:36:33.000Z | 2022-03-31T14:12:34.000Z | from spacy import require_gpu
require_gpu()
| 11.25 | 29 | 0.822222 | 7 | 45 | 5 | 0.714286 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 45 | 3 | 30 | 15 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
18503c11b24cdd3fce4041c5c79c4283681a0823 | 6,355 | py | Python | boltzgen/distributions.py | VincentStimper/boltzmann-generators | 2ef7d9d5ee90277a4556339b71d0eba1443351ec | [
"MIT"
] | 2 | 2021-06-09T11:16:18.000Z | 2022-02-03T08:12:15.000Z | boltzgen/distributions.py | VincentStimper/boltzmann-generators | 2ef7d9d5ee90277a4556339b71d0eba1443351ec | [
"MIT"
] | null | null | null | boltzgen/distributions.py | VincentStimper/boltzmann-generators | 2ef7d9d5ee90277a4556339b71d0eba1443351ec | [
"MIT"
] | null | null | null | import torch
import numpy as np
import normflow as nf
import multiprocessing as mp
from torch import nn
from . import openmm_interface as omi
class Boltzmann(nf.distributions.PriorDistribution):
"""
Boltzmann distribution using OpenMM to get energy and forces
"""
def __init__(self, sim_context, temperature, energy_cut, energy_max):
"""
Constructor
:param sim_context: Context of the simulation object used for energy
and force calculation
        :param temperature: Temperature of System
        :param energy_cut: Energy at which logarithm is applied
        :param energy_max: Maximum energy
        """
# Save input parameters
self.sim_context = sim_context
self.temperature = temperature
self.energy_cut = torch.tensor(energy_cut)
self.energy_max = torch.tensor(energy_max)
# Set up functions
self.openmm_energy = omi.OpenMMEnergyInterface.apply
self.regularize_energy = omi.regularize_energy
self.norm_energy = lambda pos: self.regularize_energy(
self.openmm_energy(pos, self.sim_context, temperature)[:, 0],
self.energy_cut, self.energy_max)
def log_prob(self, z):
return -self.norm_energy(z)
class TransformedBoltzmann(nn.Module):
"""
Boltzmann distribution with respect to transformed variables,
uses OpenMM to get energy and forces
"""
def __init__(self, sim_context, temperature, energy_cut, energy_max, transform):
"""
Constructor
:param sim_context: Context of the simulation object used for energy
and force calculation
:param temperature: Temperature of System
:param energy_cut: Energy at which logarithm is applied
:param energy_max: Maximum energy
:param transform: Coordinate transformation
"""
super().__init__()
# Save input parameters
self.sim_context = sim_context
self.temperature = temperature
self.energy_cut = torch.tensor(energy_cut)
self.energy_max = torch.tensor(energy_max)
# Set up functions
self.openmm_energy = omi.OpenMMEnergyInterface.apply
self.regularize_energy = omi.regularize_energy
self.norm_energy = lambda pos: self.regularize_energy(
self.openmm_energy(pos, self.sim_context, temperature)[:, 0],
self.energy_cut, self.energy_max)
self.transform = transform
def log_prob(self, z):
z, log_det = self.transform(z)
return -self.norm_energy(z) + log_det
class BoltzmannParallel(nf.distributions.PriorDistribution):
"""
Boltzmann distribution using OpenMM to get energy and forces and processes the
batch of states in parallel
"""
def __init__(self, system, temperature, energy_cut, energy_max, n_threads=None):
"""
Constructor
:param system: Molecular system
:param temperature: Temperature of System
:param energy_cut: Energy at which logarithm is applied
:param energy_max: Maximum energy
:param n_threads: Number of threads to use to process batches, set
to the number of cpus if None
"""
# Save input parameters
self.system = system
self.temperature = temperature
self.energy_cut = torch.tensor(energy_cut)
self.energy_max = torch.tensor(energy_max)
self.n_threads = mp.cpu_count() if n_threads is None else n_threads
# Create pool for parallel processing
self.pool = mp.Pool(self.n_threads, omi.OpenMMEnergyInterfaceParallel.var_init,
(system, temperature))
# Set up functions
self.openmm_energy = omi.OpenMMEnergyInterfaceParallel.apply
self.regularize_energy = omi.regularize_energy
self.norm_energy = lambda pos: self.regularize_energy(
self.openmm_energy(pos, self.pool)[:, 0],
self.energy_cut, self.energy_max)
def log_prob(self, z):
return -self.norm_energy(z)
class TransformedBoltzmannParallel(nn.Module):
"""
Boltzmann distribution with respect to transformed variables,
uses OpenMM to get energy and forces and processes the batch of
states in parallel
"""
def __init__(self, system, temperature, energy_cut, energy_max, transform,
n_threads=None):
"""
Constructor
:param system: Molecular system
:param temperature: Temperature of System
:param energy_cut: Energy at which logarithm is applied
:param energy_max: Maximum energy
:param transform: Coordinate transformation
:param n_threads: Number of threads to use to process batches, set
to the number of cpus if None
"""
super().__init__()
# Save input parameters
self.system = system
self.temperature = temperature
self.energy_cut = torch.tensor(energy_cut)
self.energy_max = torch.tensor(energy_max)
self.n_threads = mp.cpu_count() if n_threads is None else n_threads
# Create pool for parallel processing
self.pool = mp.Pool(self.n_threads, omi.OpenMMEnergyInterfaceParallel.var_init,
(system, temperature))
# Set up functions
self.openmm_energy = omi.OpenMMEnergyInterfaceParallel.apply
self.regularize_energy = omi.regularize_energy
self.norm_energy = lambda pos: self.regularize_energy(
self.openmm_energy(pos, self.pool)[:, 0],
self.energy_cut, self.energy_max)
self.transform = transform
def log_prob(self, z):
z_, log_det = self.transform(z)
return -self.norm_energy(z_) + log_det
class DoubleWell(nf.distributions.PriorDistribution):
"""
Boltzmann distribution of the double well potential of the form
U(x, y) = 1/4 * a * x**4 - 1/2 * b * x**2 + c * x + 1/2 * d * y**2
"""
def __init__(self, a=1, b=6, c=1, d=1):
"""
Constructor
:param a: Parameter of the potential
:param b: Parameter of the potential
:param c: Parameter of the potential
:param d: Parameter of the potential
"""
self.a = a
self.b = b
self.c = c
self.d = d
def log_prob(self, z):
return -self.a / 4 * z[:, 0] ** 4 + self.b / 2 * z[:, 0] ** 2 - self.c * z[:, 0] - self.d / 2 * z[:, 1] ** 2 | 35.702247 | 116 | 0.647994 | 783 | 6,355 | 5.090677 | 0.155811 | 0.0429 | 0.026091 | 0.038133 | 0.906172 | 0.871801 | 0.86277 | 0.856498 | 0.856498 | 0.856498 | 0 | 0.005828 | 0.270968 | 6,355 | 178 | 116 | 35.702247 | 0.854522 | 0.306058 | 0 | 0.670886 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.126582 | false | 0 | 0.075949 | 0.037975 | 0.329114 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
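The `DoubleWell.log_prob` above is just `-U(x, y)` for the quartic potential in its docstring. A quick numerical cross-check, with NumPy standing in for torch tensors and `double_well_log_prob` a standalone stand-in for the class method:

```python
import numpy as np

def double_well_log_prob(z, a=1, b=6, c=1, d=1):
    """-U(x, y) with U = a/4 * x**4 - b/2 * x**2 + c * x + d/2 * y**2."""
    x, y = z[:, 0], z[:, 1]
    return -a / 4 * x ** 4 + b / 2 * x ** 2 - c * x - d / 2 * y ** 2

z = np.array([[1.0, 0.0], [0.0, 2.0]])
lp = double_well_log_prob(z)
# At (1, 0): -1/4 + 3 - 1 = 1.75; at (0, 2): -d/2 * 4 = -2.0
```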
185b9ebef27c6f66b6db599680fd48cace05d26b | 14,736 | py | Python | tests/test_class_decoration.py | paulroujansky/pydeco | 9bb31e8cf801f6316fd0c5c554000ae26256019a | [
"MIT"
] | 1 | 2019-08-26T09:10:48.000Z | 2019-08-26T09:10:48.000Z | tests/test_class_decoration.py | paulroujansky/pydeco | 9bb31e8cf801f6316fd0c5c554000ae26256019a | [
"MIT"
] | 4 | 2019-07-23T13:38:08.000Z | 2019-08-26T07:53:00.000Z | tests/test_class_decoration.py | paulroujansky/pydeco | 9bb31e8cf801f6316fd0c5c554000ae26256019a | [
"MIT"
] | 2 | 2019-08-26T09:06:00.000Z | 2021-04-29T16:42:33.000Z | """Test class decoration."""
import os
import pickle as pkl
import sys
from copy import deepcopy
import pytest
from joblib import Parallel, delayed
from pydeco import Decorator, MethodsDecorator
from pydeco.utils.register import unregister_all
from pydeco.utils import PYTHON_VERSION
global logs
logs = []
# Utils
# -----------------------------------------------------------------------------
# Defining custom func decorators
# -------------------------------
class Decorator1(Decorator):
"""Decorator 1."""
def __init__(self, name, *args, **kwargs):
self.name = name
Decorator.__init__(self)
def __repr__(self):
"""Return the string representation."""
return '{}[id={}]'.format(self.__class__.__name__, id(self))
def wrapper(self, instance, func, *args, **kwargs):
"""Wrap input instance method with runtime measurement."""
logs.append({
self.__class__.__name__: id(self),
instance.__class__.__name__: id(instance)
})
print('{} decorating {}[id={}]'.format(self, instance, id(instance)))
instance.cnt_dec_1 += 1 # updating instance cnt for decorator
return func(instance, *args, **kwargs)
class Decorator2(Decorator):
"""Decorator 2."""
def __init__(self, name, *args, **kwargs):
self.name = name
Decorator.__init__(self)
def __repr__(self):
"""Return the string representation."""
return '{}[id={}]'.format(self.__class__.__name__, id(self))
def wrapper(self, instance, func, *args, **kwargs):
"""Wrap input instance method with runtime measurement."""
logs.append({
self.__class__.__name__: id(self),
instance.__class__.__name__: id(instance)
})
print('{} decorating {}[id={}]'.format(self, instance, id(instance)))
instance.cnt_dec_2 += 1 # updating instance cnt for decorator
return func(instance, *args, **kwargs)
# Defining custom processing class
# --------------------------------
class MyClass():
"""Custom class."""
def __init__(self, *args, **kwargs):
self.cnt_dec_1 = 0
self.cnt_dec_2 = 0
def method_1(self, *args, **kwargs):
# print('Run method 1')
pass
def method_2(self, *args, **kwargs):
# print('Run method 2')
pass
def method_3(self, *args, **kwargs):
# print('Run method 3')
pass
    def __repr__(self):
        return '{}(cnt_dec_1={}, cnt_dec_2={})'.format(
            self.__class__.__name__, self.cnt_dec_1, self.cnt_dec_2)
# Define custom function
# ----------------------
def myfunc(i, instance, copy, iter_process, verbose=True):
    """Run the instance's methods and return (pid, instance id)."""
    # keep track of iterations within each PID
    pid = os.getpid()  # get current process ID
    if verbose:
        print('Iter {}: PID={}\n'.format(i + 1, pid))
    # run methods
    instance.method_1()
    instance.method_2()
    instance.method_3()
    return (pid, id(instance))
# Tests
# ----------------------------------------------------------------------------
def test_class_decoration(verbose=False):
    """Test class decoration."""
    from pydeco.utils.parser import CONFIG
    CONFIG['N_DISPATCH'] = None
    unregister_all()
    global logs
    logs = []
    # instantiate the class
    instance = MyClass()
    assert repr(instance) == 'MyClass(cnt_dec_1=0, cnt_dec_2=0)'
    # run methods
    instance.method_1()
    instance.method_2()
    instance.method_3()
    assert instance.cnt_dec_1 == 0 and instance.cnt_dec_2 == 0
    # decorate methods
    MyClass_deco = MethodsDecorator(
        mapping={
            Decorator1(name='decorator_1'): ['method_1', 'method_2'],
            Decorator2(name='decorator_2'): 'method_1'
        })(MyClass)
    # instantiate the class
    instance = MyClass_deco()
    assert repr(instance) == 'Wrapped(MyClass)(cnt_dec_1=0, cnt_dec_2=0)'
    # run methods
    instance.method_1()
    instance.method_2()
    instance.method_3()
    assert repr(instance) == 'Wrapped(MyClass)(cnt_dec_1=2, cnt_dec_2=1)'
assert instance.cnt_dec_1 == 2 and instance.cnt_dec_2 == 1
# decorate methods
with pytest.raises(ValueError,
match='Input class has not method "method_4"'):
MyClass_deco = MethodsDecorator(
mapping={
Decorator1(name='decorator_1'): ['method_1', 'method_2'],
Decorator2(name='decorator_2'): ['method_1', 'method_4']
})(MyClass)
unregister_all()
def test_deepcopying(verbose=True):
"""Test deepcopying."""
from pydeco.utils.parser import CONFIG
CONFIG['N_DISPATCH'] = 1
unregister_all()
global logs
logs = []
# decorate methods of base class
MyClass_deco = MethodsDecorator(
mapping={
Decorator1(name='decorator_1'): ['method_1', 'method_2'],
Decorator2(name='decorator_2'): 'method_1'
})(MyClass)
# instantiate the decorated class
instance = MyClass_deco()
if verbose:
print('Before deepcopy...\n' + '-' * 18)
for inst in [instance]:
print('Instance: {} (id={})'.format(inst, id(inst)))
print('Mapping:')
print(inst._decorator_mapping)
print('Decorators')
for deco_name, deco in inst.decorators.items():
print('\t Decorator: {}'.format(deco))
print('\n')
# create a deepcopy of `instance`
instance_2 = deepcopy(instance)
if verbose:
print('After deepcopy...\n' + '-' * 17)
for inst in [instance, instance_2]:
print('Instance: {} (id={})'.format(inst, id(inst)))
print('Mapping:')
print(inst._decorator_mapping)
print('Decorators')
for deco_name, deco in inst.decorators.items():
print('\t Decorator: {}'.format(deco))
print('\n')
# check that `instance` and `instance_2` are distinct objects
assert instance is not instance_2
# check that decorators of `instance` and `instance_2` are distinct objects
for (deco1_name, deco1), (deco2_name, deco2) in zip(
instance.decorators.items(), instance_2.decorators.items()):
assert deco1 is not deco2
assert instance.cnt_dec_1 == 0 and instance.cnt_dec_2 == 0
assert instance_2.cnt_dec_1 == 0 and instance_2.cnt_dec_2 == 0
assert instance.cnt_dec_1 == 0 and instance.cnt_dec_2 == 0
assert instance_2.cnt_dec_1 == 0 and instance_2.cnt_dec_2 == 0
# run methods for `instance`
# run methods
instance_2.method_1()
instance_2.method_2()
instance_2.method_3()
for j, entry in enumerate(logs):
assert entry['Wrapped2(MyClass)'] == id(instance_2)
if j == 0:
assert (entry['Decorator2'] ==
id(instance_2.decorators['Decorator2']))
elif j == 1:
assert (entry['Decorator1'] ==
id(instance_2.decorators['Decorator1']))
elif j == 2:
assert (entry['Decorator1'] ==
id(instance_2.decorators['Decorator1']))
# check that internal variables of `instance`' have changed but not of
# `instance_2`
assert instance.cnt_dec_1 == 0 and instance.cnt_dec_2 == 0
assert instance_2.cnt_dec_1 == 2 and instance_2.cnt_dec_2 == 1
# run methods
instance.method_1()
instance.method_2()
instance.method_3()
for j, entry in enumerate(logs):
if j < 3:
assert entry['Wrapped2(MyClass)'] == id(instance_2)
if j == 0:
assert (entry['Decorator2'] ==
id(instance_2.decorators['Decorator2']))
elif j == 1:
assert (entry['Decorator1'] ==
id(instance_2.decorators['Decorator1']))
elif j == 2:
assert (entry['Decorator1'] ==
id(instance_2.decorators['Decorator1']))
else:
assert entry['Wrapped(MyClass)'] == id(instance)
if j == 3:
assert (entry['Decorator2'] ==
id(instance.decorators['Decorator2']))
elif j == 4:
assert (entry['Decorator1'] ==
id(instance.decorators['Decorator1']))
elif j == 5:
assert (entry['Decorator1'] ==
id(instance.decorators['Decorator1']))
assert instance.cnt_dec_1 == 2 and instance.cnt_dec_2 == 1
assert instance_2.cnt_dec_1 == 2 and instance_2.cnt_dec_2 == 1
unregister_all()
@pytest.mark.parametrize('dcopy', (False, True))
def test_pickling(dcopy, verbose=True):
"""Test pickling."""
from pydeco.utils.parser import CONFIG
CONFIG['N_DISPATCH'] = 1
unregister_all()
global logs
logs = []
# decorate methods of base class
MyClass_deco = MethodsDecorator(
mapping={
Decorator1(name='decorator_1'): ['method_1', 'method_2'],
Decorator2(name='decorator_2'): 'method_1'
})(MyClass)
# instantiate the decorated class
instance = MyClass_deco()
if verbose:
print('Before pickling...\n' + '-' * 18)
for inst in [instance]:
print('Instance: {} (id={})'.format(inst, id(inst)))
print('Mapping:')
print(inst._decorator_mapping)
print('Decorators')
for deco_name, deco in inst.decorators.items():
print('\t Decorator: {}'.format(deco))
print('\n')
instance_ = deepcopy(instance) if dcopy else instance
# Save instance as a pickle object
tmp = pkl.dumps(instance_)
# Load pickled module
instance_2 = pkl.loads(tmp)
if verbose:
print('After pickling...\n' + '-' * 17)
for inst in [instance, instance_2]:
print('Instance: {} (id={})'.format(inst, id(inst)))
print('Mapping:')
print(inst._decorator_mapping)
print('Decorators')
for deco_name, deco in inst.decorators.items():
print('\t Decorator: {}'.format(deco))
print('\n')
# check that `instance` and `instance_2` are distinct objects
assert instance is not instance_2
# check that decorators of `instance` and `instance_2` are distinct objects
for (deco1_name, deco1), (deco2_name, deco2) in zip(
instance.decorators.items(), instance_2.decorators.items()):
assert deco1 is not deco2
assert instance.cnt_dec_1 == 0 and instance.cnt_dec_2 == 0
assert instance_2.cnt_dec_1 == 0 and instance_2.cnt_dec_2 == 0
# run methods for `instance`
# run methods
instance_2.method_1()
instance_2.method_2()
instance_2.method_3()
new_classname = 'Wrapped2(MyClass)' if dcopy else 'Wrapped(MyClass)'
for j, entry in enumerate(logs):
assert entry[new_classname] == id(instance_2)
if j == 0:
assert (entry['Decorator2'] ==
id(instance_.decorators['Decorator2']))
elif j == 1:
assert (entry['Decorator1'] ==
id(instance_.decorators['Decorator1']))
elif j == 2:
assert (entry['Decorator1'] ==
id(instance_.decorators['Decorator1']))
# check that internal variables of `instance`' have changed but not of
# `instance_2`
assert instance.cnt_dec_1 == 0 and instance.cnt_dec_2 == 0
assert instance_2.cnt_dec_1 == 2 and instance_2.cnt_dec_2 == 1
# run methods
instance.method_1()
instance.method_2()
instance.method_3()
for j, entry in enumerate(logs):
if j < 3:
assert entry[new_classname] == id(instance_2)
if j == 0:
assert (entry['Decorator2'] ==
id(instance_.decorators['Decorator2']))
elif j == 1:
assert (entry['Decorator1'] ==
id(instance_.decorators['Decorator1']))
elif j == 2:
assert (entry['Decorator1'] ==
id(instance_.decorators['Decorator1']))
else:
assert entry['Wrapped(MyClass)'] == id(instance)
if j == 3:
assert (entry['Decorator2'] ==
id(instance.decorators['Decorator2']))
elif j == 4:
assert (entry['Decorator1'] ==
id(instance.decorators['Decorator1']))
elif j == 5:
assert (entry['Decorator1'] ==
id(instance.decorators['Decorator1']))
assert instance.cnt_dec_1 == 2 and instance.cnt_dec_2 == 1
assert instance_2.cnt_dec_1 == 2 and instance_2.cnt_dec_2 == 1
unregister_all()
@pytest.mark.parametrize(argnames='copy', argvalues=(True, False))
@pytest.mark.parametrize(argnames='n_iter', argvalues=(10, 20, 30))
def test_parallelizing(copy, n_iter, n_jobs=1, verbose=True):
"""Test parallelizing."""
from pydeco.utils.parser import CONFIG
CONFIG['N_DISPATCH'] = n_iter + 1
unregister_all()
global logs
logs = []
# store number of iterations for each process
global iter_process
iter_process = dict()
# decorate methods of base class
MyClass_deco = MethodsDecorator(
mapping={
Decorator1(name='decorator_1'): ['method_1', 'method_2'],
Decorator2(name='decorator_2'): 'method_1'
})(MyClass)
# instantiate the decorated class
instance = MyClass_deco()
# create a "reference" instance
instance_ref = deepcopy(instance)
if verbose:
print('Before deepcopy...\n' + '-' * 18)
for inst in [instance, instance_ref]:
print('Instance: {} (id={})'.format(inst, id(inst)))
print('Mapping:')
print(inst._decorator_mapping)
print('Decorators')
for deco_name, deco in inst.decorators.items():
print('\t Decorator: {}'.format(deco))
print('\n')
# run parallel jobs
if verbose:
print('Parallelizing: {} iterations distributed on {} jobs'.format(
n_iter, n_jobs))
backend = 'multiprocessing' if copy else 'threading'
with Parallel(n_jobs=n_jobs,
verbose=verbose,
pre_dispatch='all',
backend=backend) as parallel:
res = parallel(
delayed(myfunc)(i, deepcopy(instance), copy=copy, verbose=verbose,
iter_process=iter_process)
for i in range(n_iter)
)
unregister_all()
if __name__ == "__main__":
pytest.main([__file__])
| 31.286624 | 79 | 0.576208 | 1,675 | 14,736 | 4.853134 | 0.107463 | 0.033953 | 0.02325 | 0.013778 | 0.764055 | 0.747324 | 0.733547 | 0.733547 | 0.720507 | 0.708697 | 0 | 0.02762 | 0.282573 | 14,736 | 470 | 80 | 31.353191 | 0.741298 | 0.127986 | 0 | 0.779935 | 0 | 0 | 0.116686 | 0.004554 | 0 | 0 | 0 | 0 | 0.152104 | 1 | 0.05178 | false | 0.009709 | 0.042071 | 0.003236 | 0.122977 | 0.126214 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
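The `MethodsDecorator` pattern these tests exercise, wrapping selected methods of a class by name, can be sketched without `pydeco` in a few lines. This is an illustrative reimplementation of the idea, not pydeco's actual code; `decorate_methods` and `counting` are hypothetical names:

```python
def decorate_methods(mapping):
    """Class decorator: wrap each named method with the given wrapper function."""
    def decorate(cls):
        for wrapper, names in mapping.items():
            for name in ([names] if isinstance(names, str) else names):
                if not hasattr(cls, name):
                    raise ValueError('Input class has no method "{}"'.format(name))
                setattr(cls, name, wrapper(getattr(cls, name)))
        return cls
    return decorate

calls = []

def counting(func):
    """Record each call to the wrapped method, then delegate to it."""
    def inner(self, *args, **kwargs):
        calls.append(func.__name__)
        return func(self, *args, **kwargs)
    return inner

@decorate_methods({counting: ['method_1', 'method_2']})
class Demo:
    def method_1(self): pass
    def method_2(self): pass
    def method_3(self): pass

d = Demo()
d.method_1(); d.method_2(); d.method_3()
# calls -> ['method_1', 'method_2']
```

pydeco additionally deep-copies and registers decorator instances per wrapped class, which is what the deepcopy/pickle tests above are checking.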
43e56935f5c7f9942ef424cf34e77d1555d573b4 | 177 | py | Python | gs_extensions/exceptions.py | ilyachch/gnome_shell_extensions_install_tool | e1e19b4d390db99e243fa0e0cd4466fd843482c9 | [
"MIT"
] | null | null | null | gs_extensions/exceptions.py | ilyachch/gnome_shell_extensions_install_tool | e1e19b4d390db99e243fa0e0cd4466fd843482c9 | [
"MIT"
] | 4 | 2019-03-25T07:27:45.000Z | 2020-01-16T14:54:07.000Z | gs_extensions/exceptions.py | ilyachch/gs_extensions | e1e19b4d390db99e243fa0e0cd4466fd843482c9 | [
"MIT"
] | null | null | null | class GnomeShellNotInstalledError(RuntimeError):
pass
class ExtensionNotFoundInHub(RuntimeError):
pass
class NoExtensionVersionForGnomeShell(RuntimeError):
pass
| 16.090909 | 52 | 0.80791 | 12 | 177 | 11.916667 | 0.5 | 0.335664 | 0.293706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141243 | 177 | 10 | 53 | 17.7 | 0.940789 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
a176754ec9a778e1b2045ae7defa8b110abf246f | 123 | py | Python | tracklib/util/__init__.py | SGrosse-Holz/tracklib | e0b88e3959db2ce65869d8292ce5792f4c77c7a4 | [
"MIT"
] | 1 | 2022-01-30T15:10:51.000Z | 2022-01-30T15:10:51.000Z | tracklib/util/__init__.py | SGrosse-Holz/tracklib | e0b88e3959db2ce65869d8292ce5792f4c77c7a4 | [
"MIT"
] | null | null | null | tracklib/util/__init__.py | SGrosse-Holz/tracklib | e0b88e3959db2ce65869d8292ce5792f4c77c7a4 | [
"MIT"
] | null | null | null | from .util import *
from . import mcmc
from .sweep import Sweeper
from . import plotting
from .parallel import Parallelize
| 20.5 | 33 | 0.788618 | 17 | 123 | 5.705882 | 0.529412 | 0.206186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162602 | 123 | 5 | 34 | 24.6 | 0.941748 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
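The `__init__.py` above flattens the package namespace, and its `from .util import *` line re-exports only what `util` advertises via `__all__`. A self-contained sketch of that star-import rule, using a module built in memory (module and symbol names are hypothetical):

```python
import types

# util.py-style module: __all__ controls what `from .util import *` re-exports.
util = types.ModuleType("util")
util.__dict__.update({"__all__": ["public_fn"],
                      "public_fn": lambda: "ok",
                      "_private_fn": lambda: "hidden"})

# Emulate `from .util import *`: only names listed in __all__ are pulled in.
namespace = {name: getattr(util, name) for name in util.__all__}
# "public_fn" is re-exported; "_private_fn" stays hidden
```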
a188a17ac12a6847a914161fd1960b3020690f3a | 49 | py | Python | src/token_secret.py | hpkishere/telebiblebot | de72006d1a08f7e3c15641406c99a78d6609bbac | [
"MIT"
] | null | null | null | src/token_secret.py | hpkishere/telebiblebot | de72006d1a08f7e3c15641406c99a78d6609bbac | [
"MIT"
] | 1 | 2018-01-03T08:30:46.000Z | 2018-01-03T08:30:46.000Z | src/token_secret.py | hpkishere/telebiblebot | de72006d1a08f7e3c15641406c99a78d6609bbac | [
"MIT"
] | null | null | null | #insert token here
token = '<insert token here>' | 24.5 | 29 | 0.714286 | 7 | 49 | 5 | 0.428571 | 0.628571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 49 | 2 | 29 | 24.5 | 0.853659 | 0.346939 | 0 | 0 | 0 | 0 | 0.612903 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
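Committing a real token in `token_secret.py` would leak it with the repository; a common alternative is to read the secret from the environment (the variable name and helper below are hypothetical):

```python
import os

def load_token(env_var="TELEGRAM_BOT_TOKEN"):
    # Read the secret from the environment so it never enters version control.
    token = os.environ.get(env_var)
    if token is None:
        raise RuntimeError(env_var + " is not set")
    return token

os.environ["TELEGRAM_BOT_TOKEN"] = "123:abc"  # simulated value for the example
token = load_token()
# token == "123:abc"
```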
a1a57fbdcd7e54a8d8a64e827541c84d1783830a | 275 | py | Python | snmpagent_unity/unity_impl/DiskSerialNumber.py | factioninc/snmp-unity-agent | 3525dc0fac60d1c784dcdd7c41693544bcbef843 | [
"Apache-2.0"
] | 2 | 2019-03-01T11:14:59.000Z | 2019-10-02T17:47:59.000Z | snmpagent_unity/unity_impl/DiskSerialNumber.py | factioninc/snmp-unity-agent | 3525dc0fac60d1c784dcdd7c41693544bcbef843 | [
"Apache-2.0"
] | 2 | 2019-03-01T11:26:29.000Z | 2019-10-11T18:56:54.000Z | snmpagent_unity/unity_impl/DiskSerialNumber.py | factioninc/snmp-unity-agent | 3525dc0fac60d1c784dcdd7c41693544bcbef843 | [
"Apache-2.0"
] | 1 | 2019-10-03T21:09:17.000Z | 2019-10-03T21:09:17.000Z | class DiskSerialNumber(object):
def read_get(self, name, idx_name, unity_client):
return unity_client.get_disk_serial_number(idx_name)
class DiskSerialNumberColumn(object):
def get_idx(self, name, idx, unity_client):
return unity_client.get_disks()
| 30.555556 | 60 | 0.752727 | 37 | 275 | 5.27027 | 0.459459 | 0.225641 | 0.112821 | 0.225641 | 0.317949 | 0.317949 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 275 | 8 | 61 | 34.375 | 0.844156 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
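The handler classes above do nothing but delegate to a `unity_client`, which makes them trivial to exercise with a stub (the stub client and its data are hypothetical):

```python
class DiskSerialNumber(object):
    def read_get(self, name, idx_name, unity_client):
        return unity_client.get_disk_serial_number(idx_name)

class StubUnityClient(object):
    # Hypothetical stand-in for the real Unity REST client.
    _serials = {"disk_0": "SN-001", "disk_1": "SN-002"}

    def get_disk_serial_number(self, idx_name):
        return self._serials[idx_name]

value = DiskSerialNumber().read_get("diskSerialNumber", "disk_1", StubUnityClient())
# value == "SN-002"
```

Keeping the client behind a single method call is what lets the SNMP layer be tested without a live array.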
a1dafe77e624a0abc16c90271e68e80a9c1a9022 | 22,160 | py | Python | pybind/slxos/v16r_1_00b/brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v16r_1_00b/brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import stp
import rstp
import mstp
import pvstp
import rpvstp
class spanning_tree_info(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-xstp-ext - based on the path /brocade_xstp_ext_rpc/get-stp-brief-info/output/spanning-tree-info. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__stp_mode','__stp','__rstp','__mstp','__pvstp','__rpvstp',)
_yang_name = 'spanning-tree-info'
_rest_name = 'spanning-tree-info'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__rstp = YANGDynClass(base=rstp.rstp, is_container='container', presence=False, yang_name="rstp", rest_name="rstp", parent=self, choice=(u'spanning-tree-mode', u'rstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
self.__mstp = YANGDynClass(base=mstp.mstp, is_container='container', presence=False, yang_name="mstp", rest_name="mstp", parent=self, choice=(u'spanning-tree-mode', u'mstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
self.__rpvstp = YANGDynClass(base=YANGListType("vlan_id",rpvstp.rpvstp, yang_name="rpvstp", rest_name="rpvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="rpvstp", rest_name="rpvstp", parent=self, choice=(u'spanning-tree-mode', u'rpvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)
self.__stp_mode = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'none': {'value': 1}, u'rstp': {'value': 3}, u'mstp': {'value': 4}, u'rpvstp': {'value': 6}, u'pvstp': {'value': 5}, u'stp': {'value': 2}},), is_leaf=True, yang_name="stp-mode", rest_name="stp-mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='stp-type', is_config=True)
self.__pvstp = YANGDynClass(base=YANGListType("vlan_id",pvstp.pvstp, yang_name="pvstp", rest_name="pvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="pvstp", rest_name="pvstp", parent=self, choice=(u'spanning-tree-mode', u'pvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)
self.__stp = YANGDynClass(base=stp.stp, is_container='container', presence=False, yang_name="stp", rest_name="stp", parent=self, choice=(u'spanning-tree-mode', u'stp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'brocade_xstp_ext_rpc', u'get-stp-brief-info', u'output', u'spanning-tree-info']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'get-stp-brief-info', u'output', u'spanning-tree-info']
def _get_stp_mode(self):
"""
Getter method for stp_mode, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/stp_mode (stp-type)
YANG Description: Type of the spanning tree protocol configured
on this switch
"""
return self.__stp_mode
def _set_stp_mode(self, v, load=False):
"""
Setter method for stp_mode, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/stp_mode (stp-type)
If this variable is read-only (config: false) in the
source YANG file, then _set_stp_mode is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_stp_mode() directly.
YANG Description: Type of the spanning tree protocol configured
on this switch
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'none': {'value': 1}, u'rstp': {'value': 3}, u'mstp': {'value': 4}, u'rpvstp': {'value': 6}, u'pvstp': {'value': 5}, u'stp': {'value': 2}},), is_leaf=True, yang_name="stp-mode", rest_name="stp-mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='stp-type', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """stp_mode must be of a type compatible with stp-type""",
'defined-type': "brocade-xstp-ext:stp-type",
'generated-type': """YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'none': {'value': 1}, u'rstp': {'value': 3}, u'mstp': {'value': 4}, u'rpvstp': {'value': 6}, u'pvstp': {'value': 5}, u'stp': {'value': 2}},), is_leaf=True, yang_name="stp-mode", rest_name="stp-mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='stp-type', is_config=True)""",
})
self.__stp_mode = t
if hasattr(self, '_set'):
self._set()
def _unset_stp_mode(self):
self.__stp_mode = YANGDynClass(base=RestrictedClassType(base_type=unicode, restriction_type="dict_key", restriction_arg={u'none': {'value': 1}, u'rstp': {'value': 3}, u'mstp': {'value': 4}, u'rpvstp': {'value': 6}, u'pvstp': {'value': 5}, u'stp': {'value': 2}},), is_leaf=True, yang_name="stp-mode", rest_name="stp-mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='stp-type', is_config=True)
def _get_stp(self):
"""
Getter method for stp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/stp (container)
"""
return self.__stp
def _set_stp(self, v, load=False):
"""
Setter method for stp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/stp (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_stp is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_stp() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=stp.stp, is_container='container', presence=False, yang_name="stp", rest_name="stp", parent=self, choice=(u'spanning-tree-mode', u'stp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """stp must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=stp.stp, is_container='container', presence=False, yang_name="stp", rest_name="stp", parent=self, choice=(u'spanning-tree-mode', u'stp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)""",
})
self.__stp = t
if hasattr(self, '_set'):
self._set()
def _unset_stp(self):
self.__stp = YANGDynClass(base=stp.stp, is_container='container', presence=False, yang_name="stp", rest_name="stp", parent=self, choice=(u'spanning-tree-mode', u'stp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
def _get_rstp(self):
"""
Getter method for rstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/rstp (container)
"""
return self.__rstp
def _set_rstp(self, v, load=False):
"""
Setter method for rstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/rstp (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_rstp is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_rstp() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=rstp.rstp, is_container='container', presence=False, yang_name="rstp", rest_name="rstp", parent=self, choice=(u'spanning-tree-mode', u'rstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """rstp must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=rstp.rstp, is_container='container', presence=False, yang_name="rstp", rest_name="rstp", parent=self, choice=(u'spanning-tree-mode', u'rstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)""",
})
self.__rstp = t
if hasattr(self, '_set'):
self._set()
def _unset_rstp(self):
self.__rstp = YANGDynClass(base=rstp.rstp, is_container='container', presence=False, yang_name="rstp", rest_name="rstp", parent=self, choice=(u'spanning-tree-mode', u'rstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
def _get_mstp(self):
"""
Getter method for mstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/mstp (container)
YANG Description: CIST information
"""
return self.__mstp
def _set_mstp(self, v, load=False):
"""
Setter method for mstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/mstp (container)
If this variable is read-only (config: false) in the
source YANG file, then _set_mstp is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mstp() directly.
YANG Description: CIST information
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=mstp.mstp, is_container='container', presence=False, yang_name="mstp", rest_name="mstp", parent=self, choice=(u'spanning-tree-mode', u'mstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mstp must be of a type compatible with container""",
'defined-type': "container",
'generated-type': """YANGDynClass(base=mstp.mstp, is_container='container', presence=False, yang_name="mstp", rest_name="mstp", parent=self, choice=(u'spanning-tree-mode', u'mstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)""",
})
self.__mstp = t
if hasattr(self, '_set'):
self._set()
def _unset_mstp(self):
self.__mstp = YANGDynClass(base=mstp.mstp, is_container='container', presence=False, yang_name="mstp", rest_name="mstp", parent=self, choice=(u'spanning-tree-mode', u'mstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='container', is_config=True)
def _get_pvstp(self):
"""
Getter method for pvstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/pvstp (list)
YANG Description: PVST instance information
"""
return self.__pvstp
def _set_pvstp(self, v, load=False):
"""
Setter method for pvstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/pvstp (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_pvstp is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_pvstp() directly.
YANG Description: PVST instance information
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("vlan_id",pvstp.pvstp, yang_name="pvstp", rest_name="pvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="pvstp", rest_name="pvstp", parent=self, choice=(u'spanning-tree-mode', u'pvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """pvstp must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("vlan_id",pvstp.pvstp, yang_name="pvstp", rest_name="pvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="pvstp", rest_name="pvstp", parent=self, choice=(u'spanning-tree-mode', u'pvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)""",
})
self.__pvstp = t
if hasattr(self, '_set'):
self._set()
def _unset_pvstp(self):
self.__pvstp = YANGDynClass(base=YANGListType("vlan_id",pvstp.pvstp, yang_name="pvstp", rest_name="pvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="pvstp", rest_name="pvstp", parent=self, choice=(u'spanning-tree-mode', u'pvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)
def _get_rpvstp(self):
"""
Getter method for rpvstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/rpvstp (list)
YANG Description: RPVST instance information
"""
return self.__rpvstp
def _set_rpvstp(self, v, load=False):
"""
Setter method for rpvstp, mapped from YANG variable /brocade_xstp_ext_rpc/get_stp_brief_info/output/spanning_tree_info/rpvstp (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_rpvstp is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_rpvstp() directly.
YANG Description: RPVST instance information
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("vlan_id",rpvstp.rpvstp, yang_name="rpvstp", rest_name="rpvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="rpvstp", rest_name="rpvstp", parent=self, choice=(u'spanning-tree-mode', u'rpvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """rpvstp must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("vlan_id",rpvstp.rpvstp, yang_name="rpvstp", rest_name="rpvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="rpvstp", rest_name="rpvstp", parent=self, choice=(u'spanning-tree-mode', u'rpvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)""",
})
self.__rpvstp = t
if hasattr(self, '_set'):
self._set()
def _unset_rpvstp(self):
self.__rpvstp = YANGDynClass(base=YANGListType("vlan_id",rpvstp.rpvstp, yang_name="rpvstp", rest_name="rpvstp", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='vlan-id', extensions=None, choice=False), is_container='list', yang_name="rpvstp", rest_name="rpvstp", parent=self, choice=(u'spanning-tree-mode', u'rpvstp'), path_helper=self._path_helper, extmethods=self._extmethods, register_paths=False, extensions=None, namespace='urn:brocade.com:mgmt:brocade-xstp-ext', defining_module='brocade-xstp-ext', yang_type='list', is_config=True)
stp_mode = __builtin__.property(_get_stp_mode, _set_stp_mode)
stp = __builtin__.property(_get_stp, _set_stp)
rstp = __builtin__.property(_get_rstp, _set_rstp)
mstp = __builtin__.property(_get_mstp, _set_mstp)
pvstp = __builtin__.property(_get_pvstp, _set_pvstp)
rpvstp = __builtin__.property(_get_rpvstp, _set_rpvstp)
__choices__ = {u'spanning-tree-mode': {u'pvstp': [u'pvstp'], u'mstp': [u'mstp'], u'stp': [u'stp'], u'rstp': [u'rstp'], u'rpvstp': [u'rpvstp']}}
_pyangbind_elements = {'stp_mode': stp_mode, 'stp': stp, 'rstp': rstp, 'mstp': mstp, 'pvstp': pvstp, 'rpvstp': rpvstp, }
| 69.034268 | 630 | 0.719043 | 3,093 | 22,160 | 4.903977 | 0.062722 | 0.051424 | 0.059072 | 0.037975 | 0.842629 | 0.81573 | 0.806632 | 0.799578 | 0.796941 | 0.788502 | 0 | 0.001577 | 0.141561 | 22,160 | 320 | 631 | 69.25 | 0.795774 | 0.165839 | 0 | 0.438144 | 0 | 0.030928 | 0.335624 | 0.131259 | 0 | 0 | 0 | 0 | 0 | 1 | 0.108247 | false | 0 | 0.06701 | 0 | 0.298969 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
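The generated class above pairs each `_get_*`/`_set_*` method through `property()` and validates assignments against the YANG enum. A minimal hand-written sketch of that same pattern (the `Leaf` class is hypothetical; the mode values come from the `stp-mode` enum above):

```python
class Leaf(object):
    # Private storage plus _get/_set methods exposed through property(),
    # mirroring the generated stp_mode = property(_get_stp_mode, _set_stp_mode).
    def __init__(self):
        self.__stp_mode = "none"

    def _get_stp_mode(self):
        return self.__stp_mode

    def _set_stp_mode(self, v):
        if v not in ("none", "stp", "rstp", "mstp", "pvstp", "rpvstp"):
            raise ValueError("stp_mode must be a known spanning-tree mode")
        self.__stp_mode = v

    stp_mode = property(_get_stp_mode, _set_stp_mode)

leaf = Leaf()
leaf.stp_mode = "rstp"
# leaf.stp_mode == "rstp"; assigning an unknown mode raises ValueError
```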
b80e1741c139aa8cd0d394288c8be26e9f3580de | 57 | py | Python | examples/example2.py | rsoaresp/pydep | bcd391d7fa538431e364f611e6cf652d5baa8556 | [
"MIT"
] | null | null | null | examples/example2.py | rsoaresp/pydep | bcd391d7fa538431e364f611e6cf652d5baa8556 | [
"MIT"
] | null | null | null | examples/example2.py | rsoaresp/pydep | bcd391d7fa538431e364f611e6cf652d5baa8556 | [
"MIT"
] | null | null | null | from examples.example1 import f
def s():
print(f(2)) | 14.25 | 31 | 0.666667 | 10 | 57 | 3.8 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043478 | 0.192982 | 57 | 4 | 32 | 14.25 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
62a78e4aa886261faef7ec9a6f0a0e6f4ded0139 | 76 | py | Python | py_tdlib/constructors/inline_keyboard_button_type_buy.py | Mr-TelegramBot/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 24 | 2018-10-05T13:04:30.000Z | 2020-05-12T08:45:34.000Z | py_tdlib/constructors/inline_keyboard_button_type_buy.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 3 | 2019-06-26T07:20:20.000Z | 2021-05-24T13:06:56.000Z | py_tdlib/constructors/inline_keyboard_button_type_buy.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 5 | 2018-10-05T14:29:28.000Z | 2020-08-11T15:04:10.000Z | from ..factory import Type
class inlineKeyboardButtonTypeBuy(Type):
pass
| 12.666667 | 40 | 0.802632 | 8 | 76 | 7.625 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131579 | 76 | 5 | 41 | 15.2 | 0.924242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
62dc6e227ac884df6cccce08d76ba69cdd0182bc | 30,353 | py | Python | examples/servicenet/plotresults.py | oascigil/icarus_edge_comp | b7bb9f9b8d0f27b4b01469dcba9cfc0c4949d64b | [
"MIT"
] | 5 | 2021-03-20T09:22:55.000Z | 2021-12-20T17:01:33.000Z | examples/servicenet/plotresults.py | oascigil/icarus_edge_comp | b7bb9f9b8d0f27b4b01469dcba9cfc0c4949d64b | [
"MIT"
] | null | null | null | examples/servicenet/plotresults.py | oascigil/icarus_edge_comp | b7bb9f9b8d0f27b4b01469dcba9cfc0c4949d64b | [
"MIT"
] | 1 | 2020-01-02T12:39:54.000Z | 2020-01-02T12:39:54.000Z | # -*- coding: utf-8 -*-
"""Plot results read from a result set
"""
from __future__ import division
import os
import argparse
import collections
import logging
import numpy as np
import matplotlib as mpl
mpl.use('Agg')
import matplotlib.pyplot as plt
from icarus.util import Settings, Tree, config_logging, step_cdf
from icarus.tools import means_confidence_interval
from icarus.results import plot_lines, plot_bar_chart
from icarus.registry import RESULTS_READER
# Logger object
logger = logging.getLogger('plot')
# These lines prevent insertion of Type 3 fonts in figures
# Publishers don't want them
plt.rcParams['ps.useafm'] = True
plt.rcParams['pdf.use14corefonts'] = True
# If True text is interpreted as LaTeX, e.g. underscore are interpreted as
# subscript. If False, text is interpreted literally
plt.rcParams['text.usetex'] = False
# Aspect ratio of the output figures
plt.rcParams['figure.figsize'] = 8, 5
# Size of font in legends
LEGEND_SIZE = 14
# Line width in pixels
LINE_WIDTH = 1.5
# Plot
PLOT_EMPTY_GRAPHS = True
# This dict maps strategy names to the style of the line to be used in the plots
# Off-path strategies: solid lines
# On-path strategies: dashed lines
# No-cache: dotted line
STRATEGY_STYLE = {
'HR_SYMM': 'b-o',
'HR_ASYMM': 'g-D',
'HR_MULTICAST': 'm-^',
'HR_HYBRID_AM': 'c-s',
'HR_HYBRID_SM': 'r-v',
'LCE': 'b--p',
'LCD': 'g-->',
'CL4M': 'g-->',
'PROB_CACHE': 'c--<',
'RAND_CHOICE': 'r--<',
'RAND_BERNOULLI': 'g--*',
'NO_CACHE': 'k:o',
'OPTIMAL': 'k-o'
}
# This dict maps name of strategies to names to be displayed in the legend
STRATEGY_LEGEND = {
'LCE': 'LCE',
'LCD': 'LCD',
'HR_SYMM': 'HR Symm',
'HR_ASYMM': 'HR Asymm',
'HR_MULTICAST': 'HR Multicast',
'HR_HYBRID_AM': 'HR Hybrid AM',
'HR_HYBRID_SM': 'HR Hybrid SM',
'CL4M': 'CL4M',
'PROB_CACHE': 'ProbCache',
'RAND_CHOICE': 'Random (choice)',
'RAND_BERNOULLI': 'Random (Bernoulli)',
'NO_CACHE': 'No caching',
'OPTIMAL': 'Optimal'
}
# Color and hatch styles for bar charts of cache hit ratio and link load vs topology
STRATEGY_BAR_COLOR = {
'LCE': 'k',
'LCD': '0.4',
'NO_CACHE': '0.5',
'HR_ASYMM': '0.6',
'HR_SYMM': '0.7'
}
STRATEGY_BAR_HATCH = {
'LCE': None,
'LCD': '//',
'NO_CACHE': 'x',
'HR_ASYMM': '+',
'HR_SYMM': '\\'
}
def plot_cache_hits_vs_alpha(resultset, topology, cache_size, alpha_range, strategies, plotdir):
if 'NO_CACHE' in strategies:
strategies.remove('NO_CACHE')
desc = {}
desc['title'] = 'Cache hit ratio: T=%s C=%s' % (topology, cache_size)
desc['ylabel'] = 'Cache hit ratio'
desc['xlabel'] = u'Content distribution \u03b1'
desc['xparam'] = ('workload', 'alpha')
desc['xvals'] = alpha_range
desc['filter'] = {'topology': {'name': topology},
'cache_placement': {'network_cache': cache_size}}
desc['ymetrics'] = [('CACHE_HIT_RATIO', 'MEAN')]*len(strategies)
desc['ycondnames'] = [('strategy', 'name')]*len(strategies)
desc['ycondvals'] = strategies
desc['errorbar'] = True
desc['legend_loc'] = 'upper left'
desc['line_style'] = STRATEGY_STYLE
desc['legend'] = STRATEGY_LEGEND
desc['plotempty'] = PLOT_EMPTY_GRAPHS
plot_lines(resultset, desc, 'CACHE_HIT_RATIO_T=%s@C=%s.pdf'
% (topology, cache_size), plotdir)
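Each plot function here builds a `desc` dict whose `ymetrics`, `ycondnames`, and `ycondvals` lists must stay aligned one entry per strategy; a dependency-free sketch of that replication pattern (strategy names taken from the dicts above, the `zip` is illustrative):

```python
strategies = ["LCE", "LCD", "PROB_CACHE"]
desc = {
    # One (metric, field) pair per strategy, replicated to match the list length.
    "ymetrics":   [("CACHE_HIT_RATIO", "MEAN")] * len(strategies),
    "ycondnames": [("strategy", "name")] * len(strategies),
    "ycondvals":  strategies,
}
# The three lists line up index-by-index into one y-series per strategy.
series = list(zip(desc["ymetrics"], desc["ycondnames"], desc["ycondvals"]))
# len(series) == 3; series[0] pairs the metric with strategy "LCE"
```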
def plot_cache_hits_vs_cache_size(resultset, topology, alpha, cache_size_range, strategies, plotdir):
desc = {}
if 'NO_CACHE' in strategies:
strategies.remove('NO_CACHE')
desc['title'] = 'Cache hit ratio: T=%s A=%s' % (topology, alpha)
desc['xlabel'] = u'Cache to population ratio'
desc['ylabel'] = 'Cache hit ratio'
desc['xscale'] = 'log'
desc['xparam'] = ('cache_placement','network_cache')
desc['xvals'] = cache_size_range
desc['filter'] = {'topology': {'name': topology},
'workload': {'name': 'STATIONARY', 'alpha': alpha}}
desc['ymetrics'] = [('CACHE_HIT_RATIO', 'MEAN')]*len(strategies)
desc['ycondnames'] = [('strategy', 'name')]*len(strategies)
desc['ycondvals'] = strategies
desc['errorbar'] = True
desc['legend_loc'] = 'upper left'
desc['line_style'] = STRATEGY_STYLE
desc['legend'] = STRATEGY_LEGEND
desc['plotempty'] = PLOT_EMPTY_GRAPHS
plot_lines(resultset, desc,'CACHE_HIT_RATIO_T=%s@A=%s.pdf'
% (topology, alpha), plotdir)


def plot_link_load_vs_alpha(resultset, topology, cache_size, alpha_range, strategies, plotdir):
    desc = {}
    desc['title'] = 'Internal link load: T=%s C=%s' % (topology, cache_size)
    desc['xlabel'] = u'Content distribution \u03b1'
    desc['ylabel'] = 'Internal link load'
    desc['xparam'] = ('workload', 'alpha')
    desc['xvals'] = alpha_range
    desc['filter'] = {'topology': {'name': topology},
                      'cache_placement': {'network_cache': cache_size}}
    desc['ymetrics'] = [('LINK_LOAD', 'MEAN_INTERNAL')] * len(strategies)
    desc['ycondnames'] = [('strategy', 'name')] * len(strategies)
    desc['ycondvals'] = strategies
    desc['errorbar'] = True
    desc['legend_loc'] = 'upper right'
    desc['line_style'] = STRATEGY_STYLE
    desc['legend'] = STRATEGY_LEGEND
    desc['plotempty'] = PLOT_EMPTY_GRAPHS
    plot_lines(resultset, desc, 'LINK_LOAD_INTERNAL_T=%s@C=%s.pdf'
               % (topology, cache_size), plotdir)


def plot_link_load_vs_cache_size(resultset, topology, alpha, cache_size_range, strategies, plotdir):
    desc = {}
    desc['title'] = 'Internal link load: T=%s A=%s' % (topology, alpha)
    desc['xlabel'] = 'Cache to population ratio'
    desc['ylabel'] = 'Internal link load'
    desc['xscale'] = 'log'
    desc['xparam'] = ('cache_placement', 'network_cache')
    desc['xvals'] = cache_size_range
    desc['filter'] = {'topology': {'name': topology},
                      'workload': {'name': 'STATIONARY', 'alpha': alpha}}
    desc['ymetrics'] = [('LINK_LOAD', 'MEAN_INTERNAL')] * len(strategies)
    desc['ycondnames'] = [('strategy', 'name')] * len(strategies)
    desc['ycondvals'] = strategies
    desc['errorbar'] = True
    desc['legend_loc'] = 'upper right'
    desc['line_style'] = STRATEGY_STYLE
    desc['legend'] = STRATEGY_LEGEND
    desc['plotempty'] = PLOT_EMPTY_GRAPHS
    plot_lines(resultset, desc, 'LINK_LOAD_INTERNAL_T=%s@A=%s.pdf'
               % (topology, alpha), plotdir)


def plot_latency_vs_alpha(resultset, topology, cache_size, alpha_range, strategies, plotdir):
    desc = {}
    desc['title'] = 'Latency: T=%s C=%s' % (topology, cache_size)
    desc['xlabel'] = u'Content distribution \u03b1'
    desc['ylabel'] = 'Latency (ms)'
    desc['xparam'] = ('workload', 'alpha')
    desc['xvals'] = alpha_range
    desc['filter'] = {'topology': {'name': topology},
                      'cache_placement': {'network_cache': cache_size}}
    desc['ymetrics'] = [('LATENCY', 'MEAN')] * len(strategies)
    desc['ycondnames'] = [('strategy', 'name')] * len(strategies)
    desc['ycondvals'] = strategies
    desc['errorbar'] = True
    desc['legend_loc'] = 'upper right'
    desc['line_style'] = STRATEGY_STYLE
    desc['legend'] = STRATEGY_LEGEND
    desc['plotempty'] = PLOT_EMPTY_GRAPHS
    plot_lines(resultset, desc, 'LATENCY_T=%s@C=%s.pdf'
               % (topology, cache_size), plotdir)


def plot_latency_vs_cache_size(resultset, topology, alpha, cache_size_range, strategies, plotdir):
    desc = {}
    desc['title'] = 'Latency: T=%s A=%s' % (topology, alpha)
    desc['xlabel'] = 'Cache to population ratio'
    desc['ylabel'] = 'Latency'
    desc['xscale'] = 'log'
    desc['xparam'] = ('cache_placement', 'network_cache')
    desc['xvals'] = cache_size_range
    desc['filter'] = {'topology': {'name': topology},
                      'workload': {'name': 'STATIONARY', 'alpha': alpha}}
    desc['ymetrics'] = [('LATENCY', 'MEAN')] * len(strategies)
    desc['ycondnames'] = [('strategy', 'name')] * len(strategies)
    desc['ycondvals'] = strategies
    desc['metric'] = ('LATENCY', 'MEAN')
    desc['errorbar'] = True
    desc['legend_loc'] = 'upper right'
    desc['line_style'] = STRATEGY_STYLE
    desc['legend'] = STRATEGY_LEGEND
    desc['plotempty'] = PLOT_EMPTY_GRAPHS
    plot_lines(resultset, desc, 'LATENCY_T=%s@A=%s.pdf'
               % (topology, alpha), plotdir)


def plot_cache_hits_vs_topology(resultset, alpha, cache_size, topology_range, strategies, plotdir):
    """
    Plot bar graphs of cache hit ratio for specific values of alpha and cache
    size for various topologies.

    The objective here is to show that our algorithms work well on all
    topologies considered.
    """
    if 'NO_CACHE' in strategies:
        strategies.remove('NO_CACHE')
    desc = {}
    desc['title'] = 'Cache hit ratio: A=%s C=%s' % (alpha, cache_size)
    desc['ylabel'] = 'Cache hit ratio'
    desc['xparam'] = ('topology', 'name')
    desc['xvals'] = topology_range
    desc['filter'] = {'cache_placement': {'network_cache': cache_size},
                      'workload': {'name': 'STATIONARY', 'alpha': alpha}}
    desc['ymetrics'] = [('CACHE_HIT_RATIO', 'MEAN')] * len(strategies)
    desc['ycondnames'] = [('strategy', 'name')] * len(strategies)
    desc['ycondvals'] = strategies
    desc['errorbar'] = True
    desc['legend_loc'] = 'lower right'
    desc['bar_color'] = STRATEGY_BAR_COLOR
    desc['bar_hatch'] = STRATEGY_BAR_HATCH
    desc['legend'] = STRATEGY_LEGEND
    desc['plotempty'] = PLOT_EMPTY_GRAPHS
    plot_bar_chart(resultset, desc, 'CACHE_HIT_RATIO_A=%s_C=%s.pdf'
                   % (alpha, cache_size), plotdir)


def plot_link_load_vs_topology(resultset, alpha, cache_size, topology_range, strategies, plotdir):
    """
    Plot bar graphs of link load for specific values of alpha and cache
    size for various topologies.

    The objective here is to show that our algorithms work well on all
    topologies considered.
    """
    desc = {}
    desc['title'] = 'Internal link load: A=%s C=%s' % (alpha, cache_size)
    desc['ylabel'] = 'Internal link load'
    desc['xparam'] = ('topology', 'name')
    desc['xvals'] = topology_range
    desc['filter'] = {'cache_placement': {'network_cache': cache_size},
                      'workload': {'name': 'STATIONARY', 'alpha': alpha}}
    desc['ymetrics'] = [('LINK_LOAD', 'MEAN_INTERNAL')] * len(strategies)
    desc['ycondnames'] = [('strategy', 'name')] * len(strategies)
    desc['ycondvals'] = strategies
    desc['errorbar'] = True
    desc['legend_loc'] = 'lower right'
    desc['bar_color'] = STRATEGY_BAR_COLOR
    desc['bar_hatch'] = STRATEGY_BAR_HATCH
    desc['legend'] = STRATEGY_LEGEND
    desc['plotempty'] = PLOT_EMPTY_GRAPHS
    plot_bar_chart(resultset, desc, 'LINK_LOAD_INTERNAL_A=%s_C=%s.pdf'
                   % (alpha, cache_size), plotdir)


def searchDictMultipleCat(lst, category_list, attr_value_pairs, num_pairs, collector, subtype):
    """
    Search the resultset list `lst` for an entry matching [category,
    attribute, value] parameters such as ['strategy', 'extra_quota', 3];
    `attr_value_pairs` holds the attribute-value pairs to match under the
    categories in `category_list`. Once a matching entry is found, extract
    the result for a collector and subtype such as
    ['CACHE_HIT_RATIO', 'MEAN'].

    Returns the result if found; otherwise returns None.
    """
    result = None
    for l in lst:
        num_match = 0
        for key, val in l[0].items():
            #print key + '-and-' + category + '-\n'
            if key in category_list:
                if isinstance(val, dict):
                    for key1, val1 in val.items():
                        for key2, val2 in attr_value_pairs.items():
                            if key1 == key2 and val1 == val2:
                                num_match = num_match + 1
                    if num_match == num_pairs:
                        result = l[1]
                        break
                else:
                    print 'Something is wrong with the search for attr-value pairs\n'
                    return None
        if result is not None:
            break
    if result is None:
        print 'Error searched attribute, value pairs:\n'
        for k, v in attr_value_pairs.items():
            print '[ ' + repr(k) + ' , ' + repr(v) + ' ] '
        print 'is not found, returning none\n'
        return None
    found = None
    for key, val in result.items():
        if key == collector:
            for key1, val1 in val.items():
                if key1 == subtype:
                    found = val1
                    break
        if found is not None:
            break
    if found is None:
        print 'Error searched collector, subtype ' + repr(collector) + ', ' + repr(subtype) + ' is not found\n'
    return found


def searchDictMultipleCat1(lst, category_list, attr_value_list, num_pairs, collector, subtype):
    """
    Search the resultset list `lst` for an entry matching [category,
    attribute, value] parameters such as ['strategy', 'extra_quota', 3].
    Unlike searchDictMultipleCat, `attr_value_list` is a list of
    [attribute, value] pairs, so the same attribute name (e.g. 'name') may
    appear more than once under different categories. Once a matching entry
    is found, extract the result for a collector and subtype such as
    ['CACHE_HIT_RATIO', 'MEAN'].

    Returns the result if found; otherwise returns None.
    """
    result = None
    for l in lst:
        num_match = 0
        for key, val in l[0].items():
            #print key + '-and-' + category + '-\n'
            if key in category_list:
                if isinstance(val, dict):
                    for key1, val1 in val.items():
                        for arr in attr_value_list:
                            key2 = arr[0]
                            val2 = arr[1]
                            if key1 == key2 and val1 == val2:
                                num_match = num_match + 1
                    if num_match == num_pairs:
                        result = l[1]
                        break
                else:
                    print 'Something is wrong with the search for attr-value pairs\n'
                    return None
        if result is not None:
            break
    if result is None:
        print 'Error searched attribute, value pairs:\n'
        for arr in attr_value_list:
            k = arr[0]
            v = arr[1]
            print '[ ' + repr(k) + ' , ' + repr(v) + ' ] '
        print 'is not found, returning none\n'
        return None
    found = None
    for key, val in result.items():
        if key == collector:
            for key1, val1 in val.items():
                if key1 == subtype:
                    found = val1
                    break
        if found is not None:
            break
    if found is None:
        print 'Error searched collector, subtype ' + repr(collector) + ', ' + repr(subtype) + ' is not found\n'
    return found


def searchDict(lst, category, attr_value_pairs, num_pairs, collector, subtype):
    """
    Search the resultset list `lst` for an entry matching [category,
    attribute, value] parameters such as ['strategy', 'extra_quota', 3];
    `attr_value_pairs` holds the attribute-value pairs to match under the
    single `category`. Once a matching entry is found, extract the result
    for a collector and subtype such as ['CACHE_HIT_RATIO', 'MEAN'].

    Returns the result if found; otherwise returns None.
    """
    result = None
    for l in lst:
        for key, val in l[0].items():
            #print key + '-and-' + category + '-\n'
            if key == category:
                if isinstance(val, dict):
                    num_match = 0
                    for key1, val1 in val.items():
                        for key2, val2 in attr_value_pairs.items():
                            if key1 == key2 and val1 == val2:
                                num_match = num_match + 1
                    if num_match == num_pairs:
                        result = l[1]
                        break
                else:
                    print 'Something is wrong with the search for attr-value pairs\n'
                    return None
        if result is not None:
            break
    if result is None:
        print 'Error searched attribute, value pairs:\n'
        for k, v in attr_value_pairs.items():
            print '[ ' + repr(k) + ' , ' + repr(v) + ' ] '
        print 'is not found, returning none\n'
        return None
    found = None
    for key, val in result.items():
        if key == collector:
            for key1, val1 in val.items():
                if key1 == subtype:
                    found = val1
                    break
        if found is not None:
            break
    if found is None:
        print 'Error searched collector, subtype ' + repr(collector) + ', ' + repr(subtype) + ' is not found\n'
    return found
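

# Illustrative sketch (not part of the original module): the resultset
# `lst` traversed by the search helpers above is a list of
# (parameters, results) tree pairs, as in the hypothetical entry below.
# The lookup amounts to matching attribute-value pairs under a category
# and then indexing the results tree by (collector, subtype).
_EXAMPLE_LST = [
    ({'strategy': {'name': 'LRU', 'p': 0.5}},
     {'CACHE_HIT_RATIO': {'MEAN': 0.42}}),
]
_example_hit = None
for _params, _results in _EXAMPLE_LST:
    _strategy = _params.get('strategy', {})
    # Equivalent to searchDict(lst, 'strategy', {'name': 'LRU', 'p': 0.5},
    #                          2, 'CACHE_HIT_RATIO', 'MEAN')
    if _strategy.get('name') == 'LRU' and _strategy.get('p') == 0.5:
        _example_hit = _results['CACHE_HIT_RATIO']['MEAN']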


def print_lru_probability_results(lst):
    probs = [0.1, 0.25, 0.50, 0.75, 1.0]
    strategies = ['LRU']
    for strategy in strategies:
        for p in probs:
            filename = 'sat_' + str(strategy) + '_' + str(p)
            f = open(filename, 'w')
            f.write('# Sat. rate for LRU over time\n')
            f.write('#\n')
            f.write('# Time Sat. Rate\n')
            sat_times = searchDict(lst, 'strategy', {'name': strategy, 'p': p}, 2, 'LATENCY', 'SAT_TIMES')
            for k in sorted(sat_times):
                s = str(k[0][0]) + "\t" + str(k[1]) + "\n"
                f.write(s)
            f.close()
    for strategy in strategies:
        for p in probs:
            filename = 'idle_' + str(strategy) + '_' + str(p)
            f = open(filename, 'w')
            f.write('# Idle time of strategies over time\n')
            f.write('#\n')
            f.write('# Time Idle percentage\n')
            idle_times = searchDict(lst, 'strategy', {'name': strategy, 'p': p}, 2, 'LATENCY', 'IDLE_TIMES')
            for k in sorted(idle_times):
                s = str(k[0][0]) + "\t" + str(k[1]) + "\n"
                f.write(s)
            f.close()


def print_strategies_performance(lst):
    strategies = ['SDF', 'HYBRID', 'MFU', 'COORDINATED']
    #strategies = ['HYBRID']
    service_budget = 500
    alpha = 0.5  # 0.75
    replacement_interval = 30.0
    n_services = 1000
    # Print Sat. rates:
    for strategy in strategies:
        filename = 'sat_' + str(strategy)
        f = open(filename, 'w')
        f.write('# Sat. rate over time\n')
        f.write('#\n')
        f.write('# Time Sat. Rate\n')
        sat_times = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': service_budget, 'alpha': alpha}, 3, 'LATENCY', 'SAT_TIMES')
        for k in sorted(sat_times):
            s = str(k[0][0]) + "\t" + str(k[1]) + "\n"
            f.write(s)
        f.close()
    # Print Idle times:
    for strategy in strategies:
        filename = 'idle_' + str(strategy)
        f = open(filename, 'w')
        f.write('# Idle time of strategies over time\n')
        f.write('#\n')
        f.write('# Time Idle percentage\n')
        idle_times = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': service_budget, 'alpha': alpha}, 3, 'LATENCY', 'IDLE_TIMES')
        for k in sorted(idle_times):
            s = str(k[0][0]) + "\t" + str(k[1]) + "\n"
            f.write(s)
        f.close()
    # Print per-service Sat. rates:
    for strategy in strategies:
        filename = 'sat_service_' + str(strategy)
        f = open(filename, 'w')
        f.write('# Per-service Sat. rate over time\n')
        f.write('#\n')
        f.write('# Time Sat. Rate\n')
        sat_services = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': service_budget, 'alpha': alpha}, 3, 'LATENCY', 'PER_SERVICE_SATISFACTION')
        #f.write(str(sat_services))
        for indx in range(1, n_services):
            s = str(indx) + "\t" + str(sat_services[indx]) + "\n"
            f.write(s)
        f.close()


def print_scheduling_experiments(lst):
    strategies = ['SDF', 'HYBRID', 'MFU']
    schedule_policies = ['EDF', 'FIFO']
    service_budget = 500
    alpha = 0.75
    replacement_interval = 30.0
    # Print Sat. rates:
    for strategy in strategies:
        for policy in schedule_policies:
            filename = 'sat_' + str(strategy) + '_' + str(policy)
            f = open(filename, 'w')
            f.write('# Sat. rate over time\n')
            f.write('#\n')
            f.write('# Time Sat. Rate\n')
            sat_times = searchDictMultipleCat1(lst, ['strategy', 'computation_placement', 'workload', 'sched_policy'], [['name', strategy], ['service_budget', service_budget], ['alpha', alpha], ['name', policy]], 4, 'LATENCY', 'SAT_TIMES')
            for k in sorted(sat_times):
                s = str(k[0][0]) + "\t" + str(k[1]) + "\n"
                f.write(s)
            f.close()
    # Print idle times:
    for strategy in strategies:
        for policy in schedule_policies:
            filename = 'idle_' + str(strategy) + '_' + str(policy)
            f = open(filename, 'w')
            f.write('# Idle times over time\n')
            f.write('#\n')
            f.write('# Time Idle percentage\n')
            idle_times = searchDictMultipleCat1(lst, ['strategy', 'computation_placement', 'workload', 'sched_policy'], [['name', strategy], ['service_budget', service_budget], ['alpha', alpha], ['name', policy]], 4, 'LATENCY', 'IDLE_TIMES')
            for k in sorted(idle_times):
                s = str(k[0][0]) + "\t" + str((1.0 * k[1])) + "\n"
                f.write(s)
            f.close()


def print_zipf_experiment(lst):
    strategies = ['SDF', 'HYBRID', 'MFU']
    alphas = [0.1, 0.25, 0.50, 0.75, 1.0]
    replacement_interval = 30.0
    service_budget = 500
    # Print Sat. rates:
    for strategy in strategies:
        for alpha in alphas:
            filename = 'sat_' + str(strategy) + '_' + str(alpha)
            f = open(filename, 'w')
            f.write('# Sat. rate over time\n')
            f.write('#\n')
            f.write('# Time Sat. Rate\n')
            sat_times = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': service_budget, 'alpha': alpha}, 3, 'LATENCY', 'SAT_TIMES')
            for k in sorted(sat_times):
                s = str(k[0][0]) + "\t" + str(k[1]) + "\n"
                f.write(s)
            f.close()
    # Print Idle times:
    for strategy in strategies:
        for alpha in alphas:
            filename = 'idle_' + str(strategy) + '_' + str(alpha)
            f = open(filename, 'w')
            f.write('# Idle times over time\n')
            f.write('#\n')
            f.write('# Time Idle percentage\n')
            idle_times = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': service_budget, 'alpha': alpha}, 3, 'LATENCY', 'IDLE_TIMES')
            for k in sorted(idle_times):
                s = str(k[0][0]) + "\t" + str((1.0 * k[1])) + "\n"
                f.write(s)
            f.close()


def print_budget_experiment(lst):
    strategies = ['COORDINATED', 'SDF', 'HYBRID', 'MFU']
    TREE_DEPTH = 3
    BRANCH_FACTOR = 2
    NUM_CORES = 50
    # Number of nodes in a complete tree of the given depth and branching factor
    NUM_NODES = int(pow(BRANCH_FACTOR, TREE_DEPTH + 1) - 1)
    alpha = 0.75
    replacement_interval = 30.0
    N_SERVICES = 1000
    # Budgets at 1x, 1.5x, 2x, 2.5x and 3x the total core count
    # (integer division under Python 2)
    budgets = [NUM_CORES * NUM_NODES * 1, NUM_CORES * NUM_NODES * 3 / 2, NUM_CORES * NUM_NODES * 2, NUM_CORES * NUM_NODES * 5 / 2, NUM_CORES * NUM_NODES * 3]
    # Print Sat. rates:
    for strategy in strategies:
        for budget in budgets:
            filename = 'Results/VM_Budget_Results/' + 'sat_' + str(strategy) + '_' + str(budget)
            f = open(filename, 'w')
            f.write('# Sat. rate over time\n')
            f.write('#\n')
            f.write('# Time Sat. Rate\n')
            sat_times = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': budget, 'alpha': alpha}, 3, 'LATENCY', 'SAT_TIMES')
            for k in sorted(sat_times):
                s = str(k[0][0]) + "\t" + str(k[1]) + "\n"
                f.write(s)
            f.close()
    # Print Idle times:
    for strategy in strategies:
        for budget in budgets:
            filename = 'Results/VM_Budget_Results/' + 'idle_' + str(strategy) + '_' + str(budget)
            f = open(filename, 'w')
            f.write('# Idle times over time\n')
            f.write('#\n')
            f.write('# Time Idle percentage\n')
            idle_times = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': budget, 'alpha': alpha}, 3, 'LATENCY', 'IDLE_TIMES')
            for k in sorted(idle_times):
                s = str(k[0][0]) + "\t" + str((1.0 * k[1])) + "\n"
                f.write(s)
            f.close()
    # Print Overhead times:
    for strategy in strategies:
        for budget in budgets:
            filename = 'Results/VM_Budget_Results/' + 'overhead_' + str(strategy) + '_' + str(budget)
            f = open(filename, 'w')
            f.write('# VM instantiation overhead over time\n')
            f.write('#\n')
            f.write('# Time Overhead\n')
            overhead_times = searchDictMultipleCat(lst, ['strategy', 'computation_placement', 'workload'], {'name': strategy, 'service_budget': budget, 'alpha': alpha}, 3, 'LATENCY', 'INSTANTIATION_OVERHEAD')
            for k in sorted(overhead_times):
                s = str(k[0][0]) + "\t" + str((1.0 * k[1])) + "\n"
                f.write(s)
            f.close()


def printTree(tree, d=0):
    if tree is None or len(tree) == 0:
        print "\t" * d, "-"
    else:
        for key, val in tree.items():
            if isinstance(val, dict):
                print "\t" * d, key
                printTree(val, d + 1)
            else:
                print "\t" * d, key, str(val)
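

# Illustrative sketch (not part of the original module): the traversal
# performed by printTree, collecting indented lines instead of printing
# them, so the output shape can be inspected (hypothetical sample tree).
def _tree_lines(tree, d=0):
    if tree is None or len(tree) == 0:
        return ["\t" * d + "-"]
    lines = []
    for key, val in sorted(tree.items()):
        if isinstance(val, dict):
            lines.append("\t" * d + str(key))
            lines.extend(_tree_lines(val, d + 1))
        else:
            lines.append("\t" * d + str(key) + " " + str(val))
    return lines

_example_tree_lines = _tree_lines({'workload': {'alpha': 0.75}})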


def run(config, results, plotdir):
    """Run the plot script.

    Parameters
    ----------
    config : str
        The path of the configuration file
    results : str
        The file storing the experiment results
    plotdir : str
        The directory into which graphs will be saved
    """
    resultset = RESULTS_READER['PICKLE'](results)
    # Onur: added this BEGIN
    lst = resultset.dump()
    for l in lst:
        print 'PARAMETERS:\n'
        printTree(l[0])
        print 'RESULTS:\n'
        printTree(l[1])
    #print_lru_probability_results(lst)
    #print_strategies_performance(lst)
    print_budget_experiment(lst)
    #print_scheduling_experiments(lst)
    #print_zipf_experiment(lst)
    # /home/uceeoas/.local/bin/python ./plotresults.py --results results.pickle --output ./ config.py
    """
    settings = Settings()
    settings.read_from(config)
    config_logging(settings.LOG_LEVEL)
    resultset = RESULTS_READER[settings.RESULTS_FORMAT](results)
    # Create dir if not existing
    if not os.path.exists(plotdir):
        os.makedirs(plotdir)
    # Parse params from settings
    topologies = settings.TOPOLOGIES
    cache_sizes = settings.NETWORK_CACHE
    alphas = settings.ALPHA
    strategies = settings.STRATEGIES
    # Plot graphs
    for topology in topologies:
        for cache_size in cache_sizes:
            logger.info('Plotting cache hit ratio for topology %s and cache size %s vs alpha' % (topology, str(cache_size)))
            plot_cache_hits_vs_alpha(resultset, topology, cache_size, alphas, strategies, plotdir)
            logger.info('Plotting link load for topology %s vs cache size %s' % (topology, str(cache_size)))
            plot_link_load_vs_alpha(resultset, topology, cache_size, alphas, strategies, plotdir)
            logger.info('Plotting latency for topology %s vs cache size %s' % (topology, str(cache_size)))
            plot_latency_vs_alpha(resultset, topology, cache_size, alphas, strategies, plotdir)
    for topology in topologies:
        for alpha in alphas:
            logger.info('Plotting cache hit ratio for topology %s and alpha %s vs cache size' % (topology, str(alpha)))
            plot_cache_hits_vs_cache_size(resultset, topology, alpha, cache_sizes, strategies, plotdir)
            logger.info('Plotting link load for topology %s and alpha %s vs cache size' % (topology, str(alpha)))
            plot_link_load_vs_cache_size(resultset, topology, alpha, cache_sizes, strategies, plotdir)
            logger.info('Plotting latency for topology %s and alpha %s vs cache size' % (topology, str(alpha)))
            plot_latency_vs_cache_size(resultset, topology, alpha, cache_sizes, strategies, plotdir)
    for cache_size in cache_sizes:
        for alpha in alphas:
            logger.info('Plotting cache hit ratio for cache size %s vs alpha %s against topologies' % (str(cache_size), str(alpha)))
            plot_cache_hits_vs_topology(resultset, alpha, cache_size, topologies, strategies, plotdir)
            logger.info('Plotting link load for cache size %s vs alpha %s against topologies' % (str(cache_size), str(alpha)))
            plot_link_load_vs_topology(resultset, alpha, cache_size, topologies, strategies, plotdir)
    logger.info('Exit. Plots were saved in directory %s' % os.path.abspath(plotdir))
    """


def main():
    parser = argparse.ArgumentParser(__doc__)
    parser.add_argument("-r", "--results", dest="results",
                        help='the results file',
                        required=True)
    parser.add_argument("-o", "--output", dest="output",
                        help='the output directory where plots will be saved',
                        required=True)
    parser.add_argument("config",
                        help="the configuration file")
    args = parser.parse_args()
    run(args.config, args.results, args.output)


if __name__ == '__main__':
    main()
# -*- coding: utf-8 -*-
# Generated by Django 1.11.3 on 2017-09-06 19:09
from __future__ import unicode_literals

import django.contrib.gis.db.models.fields
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
import re


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('backends', '0001_initial'),
    ]

    operations = [
        migrations.CreateModel(
            name='AcquisitionStation',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('value', models.CharField(db_index=True, max_length=256, unique=True)),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='AcquisitionSubType',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('value', models.CharField(db_index=True, max_length=256, unique=True)),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='AllowedValueRange',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('start', models.FloatField()),
                ('end', models.FloatField()),
            ],
        ),
        migrations.CreateModel(
            name='ArchivingCenter',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('value', models.CharField(db_index=True, max_length=256, unique=True)),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='ArrayDataItem',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('location', models.CharField(max_length=1024)),
                ('format', models.CharField(blank=True, max_length=64, null=True)),
                ('field_index', models.PositiveSmallIntegerField(default=0)),
                ('band_count', models.PositiveSmallIntegerField(default=1)),
                ('subdataset_type', models.CharField(blank=True, max_length=64, null=True)),
                ('subdataset_locator', models.CharField(blank=True, max_length=1024, null=True)),
                ('bands_interpretation', models.PositiveSmallIntegerField(choices=[(0, b'fields'), (1, b'dimension')], default=0)),
            ],
        ),
        migrations.CreateModel(
            name='Browse',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('location', models.CharField(max_length=1024)),
                ('format', models.CharField(blank=True, max_length=64, null=True)),
                ('style', models.CharField(blank=True, max_length=256, null=True)),
                ('coordinate_reference_system', models.TextField()),
                ('min_x', models.FloatField()),
                ('min_y', models.FloatField()),
                ('max_x', models.FloatField()),
                ('max_y', models.FloatField()),
                ('width', models.PositiveIntegerField()),
                ('height', models.PositiveIntegerField()),
            ],
        ),
        migrations.CreateModel(
            name='BrowseType',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=256, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-z_][a-zA-Z0-9_]*$'), message=b'This field must contain a valid Name.')])),
                ('red_or_grey_expression', models.CharField(blank=True, max_length=512, null=True)),
                ('green_expression', models.CharField(blank=True, max_length=512, null=True)),
                ('blue_expression', models.CharField(blank=True, max_length=512, null=True)),
                ('alpha_expression', models.CharField(blank=True, max_length=512, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='CollectionMetadata',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('product_type', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('doi', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('platform', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('platform_serial_identifier', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('instrument', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('sensor_type', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('composite_type', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('processing_level', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('orbit_type', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('spectral_range', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
                ('wavelength', models.IntegerField(blank=True, db_index=True, null=True)),
                ('product_metadata_summary', models.TextField(blank=True, null=True)),
                ('coverage_metadata_summary', models.TextField(blank=True, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='CollectionType',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=512, unique=True, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-z_][a-zA-Z0-9_]*$'), message=b'This field must contain a valid Name.')])),
            ],
        ),
        migrations.CreateModel(
            name='CoverageMetadata',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
            ],
        ),
        migrations.CreateModel(
            name='CoverageType',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=512, unique=True, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-z_][a-zA-Z0-9_]*$'), message=b'This field must contain a valid Name.')])),
            ],
        ),
        migrations.CreateModel(
            name='EOObject',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('identifier', models.CharField(max_length=256, unique=True, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-z_][a-zA-Z0-9_.-]*$'), message=b'This field must contain a valid NCName.')])),
                ('begin_time', models.DateTimeField(blank=True, null=True)),
                ('end_time', models.DateTimeField(blank=True, null=True)),
                ('footprint', django.contrib.gis.db.models.fields.GeometryField(blank=True, null=True, srid=4326)),
                ('inserted', models.DateTimeField(auto_now_add=True)),
                ('updated', models.DateTimeField(auto_now=True)),
            ],
        ),
        migrations.CreateModel(
            name='FieldType',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('index', models.PositiveSmallIntegerField()),
                ('identifier', models.CharField(max_length=512, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-z_][a-zA-Z0-9_.-]*$'), message=b'This field must contain a valid NCName.')])),
                ('description', models.TextField(blank=True, null=True)),
                ('definition', models.CharField(blank=True, max_length=512, null=True)),
                ('unit_of_measure', models.CharField(blank=True, max_length=64, null=True)),
                ('wavelength', models.FloatField(blank=True, null=True)),
                ('significant_figures', models.PositiveSmallIntegerField(blank=True, null=True)),
                ('numbits', models.PositiveSmallIntegerField(blank=True, null=True)),
                ('signed', models.BooleanField(default=True)),
                ('is_float', models.BooleanField(default=False)),
                ('coverage_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='field_types', to='coverages.CoverageType')),
            ],
            options={
                'ordering': ('index',),
            },
        ),
        migrations.CreateModel(
            name='Frame',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('value', models.CharField(db_index=True, max_length=256, unique=True)),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='Grid',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=256, null=True, unique=True, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-z_][a-zA-Z0-9_]*$'), message=b'This field must contain a valid Name.')])),
                ('coordinate_reference_system', models.TextField()),
                ('axis_1_name', models.CharField(max_length=256)),
                ('axis_2_name', models.CharField(blank=True, max_length=256, null=True)),
                ('axis_3_name', models.CharField(blank=True, max_length=256, null=True)),
                ('axis_4_name', models.CharField(blank=True, max_length=256, null=True)),
                ('axis_1_type', models.SmallIntegerField(choices=[(0, b'spatial'), (1, b'elevation'), (2, b'temporal'), (3, b'other')])),
                ('axis_2_type', models.SmallIntegerField(blank=True, choices=[(0, b'spatial'), (1, b'elevation'), (2, b'temporal'), (3, b'other')], null=True)),
                ('axis_3_type', models.SmallIntegerField(blank=True, choices=[(0, b'spatial'), (1, b'elevation'), (2, b'temporal'), (3, b'other')], null=True)),
                ('axis_4_type', models.SmallIntegerField(blank=True, choices=[(0, b'spatial'), (1, b'elevation'), (2, b'temporal'), (3, b'other')], null=True)),
                ('axis_1_offset', models.CharField(blank=True, max_length=256, null=True)),
                ('axis_2_offset', models.CharField(blank=True, max_length=256, null=True)),
                ('axis_3_offset', models.CharField(blank=True, max_length=256, null=True)),
                ('axis_4_offset', models.CharField(blank=True, max_length=256, null=True)),
                ('resolution', models.PositiveIntegerField(blank=True, null=True)),
            ],
        ),
        migrations.CreateModel(
            name='Mask',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('location', models.CharField(max_length=1024)),
                ('format', models.CharField(blank=True, max_length=64, null=True)),
                ('geometry', django.contrib.gis.db.models.fields.GeometryField(blank=True, null=True, srid=4326)),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='MaskType',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=512, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-z_][a-zA-Z0-9_]*$'), message=b'This field must contain a valid Name.')])),
            ],
        ),
migrations.CreateModel(
name='MetaDataItem',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('location', models.CharField(max_length=1024)),
('format', models.CharField(blank=True, max_length=64, null=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='NilValue',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(max_length=512)),
('reason', models.CharField(choices=[(b'http://www.opengis.net/def/nil/OGC/0/inapplicable', b'Inapplicable (There is no value)'), (b'http://www.opengis.net/def/nil/OGC/0/missing', b'Missing'), (b'http://www.opengis.net/def/nil/OGC/0/template', b'Template (The value will be available later)'), (b'http://www.opengis.net/def/nil/OGC/0/unknown', b'Unknown'), (b'http://www.opengis.net/def/nil/OGC/0/withheld', b'Withheld (The value is not divulged)'), (b'http://www.opengis.net/def/nil/OGC/0/AboveDetectionRange', b'Above detection range'), (b'http://www.opengis.net/def/nil/OGC/0/BelowDetectionRange', b'Below detection range')], max_length=512)),
('field_types', models.ManyToManyField(blank=True, related_name='nil_values', to='coverages.FieldType')),
],
),
migrations.CreateModel(
name='OrbitNumber',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='ProcessingCenter',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='ProcessingMode',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='ProcessorName',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='ProductMetadata',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('parent_identifier', models.CharField(blank=True, db_index=True, max_length=256, null=True)),
('production_status', models.PositiveSmallIntegerField(blank=True, choices=[(0, b'ARCHIVED'), (1, b'ACQUIRED'), (2, b'CANCELLED')], db_index=True, null=True)),
('acquisition_type', models.PositiveSmallIntegerField(blank=True, choices=[(0, b'NOMINAL'), (1, b'CALIBRATION'), (2, b'OTHER')], db_index=True, null=True)),
('orbit_direction', models.PositiveSmallIntegerField(blank=True, choices=[(0, b'ASCENDING'), (1, b'DESCENDING')], db_index=True, null=True)),
('product_quality_status', models.PositiveSmallIntegerField(blank=True, choices=[(0, b'NOMINAL'), (1, b'DEGRADED')], db_index=True, null=True)),
('creation_date', models.DateTimeField(blank=True, db_index=True, null=True)),
('modification_date', models.DateTimeField(blank=True, db_index=True, null=True)),
('processing_date', models.DateTimeField(blank=True, db_index=True, null=True)),
('availability_time', models.DateTimeField(blank=True, db_index=True, null=True)),
('start_time_from_ascending_node', models.IntegerField(blank=True, db_index=True, null=True)),
('completion_time_from_ascending_node', models.IntegerField(blank=True, db_index=True, null=True)),
('illumination_azimuth_angle', models.FloatField(blank=True, db_index=True, null=True)),
('illumination_zenith_angle', models.FloatField(blank=True, db_index=True, null=True)),
('illumination_elevation_angle', models.FloatField(blank=True, db_index=True, null=True)),
('polarisation_mode', models.PositiveSmallIntegerField(blank=True, choices=[(0, b'single'), (1, b'dual'), (2, b'twin'), (3, b'quad'), (4, b'UNDEFINED')], db_index=True, null=True)),
('polarization_channels', models.PositiveSmallIntegerField(blank=True, choices=[(0, b'HV'), (1, b'HV, VH'), (2, b'VH'), (3, b'VV'), (4, b'HH, VV'), (5, b'HH, VH'), (6, b'HH, HV'), (7, b'VH, VV'), (8, b'VH, HV'), (9, b'VV, HV'), (10, b'VV, VH'), (11, b'HH'), (12, b'HH, HV, VH, VV'), (13, b'UNDEFINED')], db_index=True, null=True)),
('antenna_look_direction', models.PositiveSmallIntegerField(blank=True, choices=[(0, b'LEFT'), (1, b'RIGHT')], db_index=True, null=True)),
('minimum_incidence_angle', models.FloatField(blank=True, db_index=True, null=True)),
('maximum_incidence_angle', models.FloatField(blank=True, db_index=True, null=True)),
('doppler_frequency', models.FloatField(blank=True, db_index=True, null=True)),
('incidence_angle_variation', models.FloatField(blank=True, db_index=True, null=True)),
('cloud_cover', models.FloatField(blank=True, db_index=True, null=True)),
('snow_cover', models.FloatField(blank=True, db_index=True, null=True)),
('lowest_location', models.FloatField(blank=True, db_index=True, null=True)),
('highest_location', models.FloatField(blank=True, db_index=True, null=True)),
('acquisition_station', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.AcquisitionStation')),
('acquisition_sub_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.AcquisitionSubType')),
('archiving_center', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.ArchivingCenter')),
('frame', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.Frame')),
('orbit_number', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.OrbitNumber')),
('processing_center', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.ProcessingCenter')),
('processing_mode', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.ProcessingMode')),
('processor_name', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.ProcessorName')),
],
),
migrations.CreateModel(
name='ProductQualityDegredationTag',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='ProductType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=512, unique=True, validators=[django.core.validators.RegexValidator(re.compile(b'^[a-zA-Z_][a-zA-Z0-9_]*$'), message=b'This field must contain a valid Name.')])),
('allowed_coverage_types', models.ManyToManyField(blank=True, related_name='allowed_product_types', to='coverages.CoverageType')),
],
),
migrations.CreateModel(
name='ProductVersion',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='SensorMode',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='SwathIdentifier',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Track',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.CharField(db_index=True, max_length=256, unique=True)),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Collection',
fields=[
('eoobject_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='coverages.EOObject')),
],
bases=('coverages.eoobject',),
),
migrations.CreateModel(
name='Coverage',
fields=[
('eoobject_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='coverages.EOObject')),
('axis_1_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_2_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_3_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_4_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_1_size', models.PositiveIntegerField()),
('axis_2_size', models.PositiveIntegerField(blank=True, null=True)),
('axis_3_size', models.PositiveIntegerField(blank=True, null=True)),
('axis_4_size', models.PositiveIntegerField(blank=True, null=True)),
('collections', models.ManyToManyField(blank=True, related_name='coverages', to='coverages.Collection')),
('coverage_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='coverages', to='coverages.CoverageType')),
('grid', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='coverages.Grid')),
],
options={
'abstract': False,
},
bases=('coverages.eoobject', models.Model),
),
migrations.CreateModel(
name='Mosaic',
fields=[
('eoobject_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='coverages.EOObject')),
('axis_1_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_2_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_3_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_4_origin', models.CharField(blank=True, max_length=256, null=True)),
('axis_1_size', models.PositiveIntegerField()),
('axis_2_size', models.PositiveIntegerField(blank=True, null=True)),
('axis_3_size', models.PositiveIntegerField(blank=True, null=True)),
('axis_4_size', models.PositiveIntegerField(blank=True, null=True)),
('collections', models.ManyToManyField(blank=True, related_name='mosaics', to='coverages.Collection')),
('coverage_type', models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='mosaics', to='coverages.CoverageType')),
('grid', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='coverages.Grid')),
],
options={
'abstract': False,
},
bases=('coverages.eoobject', models.Model),
),
migrations.CreateModel(
name='Product',
fields=[
('eoobject_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='coverages.EOObject')),
('collections', models.ManyToManyField(blank=True, related_name='products', to='coverages.Collection')),
('package', models.OneToOneField(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, to='backends.Storage')),
('product_type', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='products', to='coverages.ProductType')),
],
bases=('coverages.eoobject',),
),
migrations.CreateModel(
name='ReservedID',
fields=[
('eoobject_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='coverages.EOObject')),
('until', models.DateTimeField(blank=True, null=True)),
('request_id', models.CharField(blank=True, max_length=256, null=True)),
],
bases=('coverages.eoobject',),
),
migrations.AddField(
model_name='productmetadata',
name='product_quality_degradation_tag',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.ProductQualityDegredationTag'),
),
migrations.AddField(
model_name='productmetadata',
name='product_version',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.ProductVersion'),
),
migrations.AddField(
model_name='productmetadata',
name='sensor_mode',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.SensorMode'),
),
migrations.AddField(
model_name='productmetadata',
name='swath_identifier',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.SwathIdentifier'),
),
migrations.AddField(
model_name='productmetadata',
name='track',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='metadatas', to='coverages.Track'),
),
migrations.AddField(
model_name='metadataitem',
name='eo_object',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='metadata_items', to='coverages.EOObject'),
),
migrations.AddField(
model_name='metadataitem',
name='storage',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='backends.Storage'),
),
migrations.AddField(
model_name='masktype',
name='product_type',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='mask_types', to='coverages.ProductType'),
),
migrations.AddField(
model_name='mask',
name='mask_type',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='coverages.MaskType'),
),
migrations.AddField(
model_name='mask',
name='storage',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='backends.Storage'),
),
migrations.AddField(
model_name='collectiontype',
name='allowed_coverage_types',
field=models.ManyToManyField(blank=True, related_name='allowed_collection_types', to='coverages.CoverageType'),
),
migrations.AddField(
model_name='collectiontype',
name='allowed_product_types',
field=models.ManyToManyField(blank=True, related_name='allowed_collection_types', to='coverages.ProductType'),
),
migrations.AddField(
model_name='browsetype',
name='product_type',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='browse_types', to='coverages.ProductType'),
),
migrations.AddField(
model_name='browse',
name='browse_type',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='coverages.BrowseType'),
),
migrations.AddField(
model_name='browse',
name='storage',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='backends.Storage'),
),
migrations.AddField(
model_name='arraydataitem',
name='coverage',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='arraydata_items', to='coverages.EOObject'),
),
migrations.AddField(
model_name='arraydataitem',
name='storage',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='backends.Storage'),
),
migrations.AddField(
model_name='allowedvaluerange',
name='field_type',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='allowed_value_ranges', to='coverages.FieldType'),
),
migrations.AddField(
model_name='productmetadata',
name='product',
field=models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='product_metadata', to='coverages.Product'),
),
migrations.AlterUniqueTogether(
name='masktype',
unique_together=set([('name', 'product_type')]),
),
migrations.AddField(
model_name='mask',
name='product',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='masks', to='coverages.Product'),
),
migrations.AlterUniqueTogether(
name='fieldtype',
unique_together=set([('identifier', 'coverage_type'), ('index', 'coverage_type')]),
),
migrations.AddField(
model_name='coveragemetadata',
name='coverage',
field=models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='coverage_metadata', to='coverages.Coverage'),
),
migrations.AddField(
model_name='coverage',
name='mosaics',
field=models.ManyToManyField(blank=True, related_name='coverages', to='coverages.Mosaic'),
),
migrations.AddField(
model_name='coverage',
name='parent_product',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='coverages', to='coverages.Product'),
),
migrations.AddField(
model_name='collectionmetadata',
name='collection',
field=models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='collection_metadata', to='coverages.Collection'),
),
migrations.AddField(
model_name='collection',
name='collection_type',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='collections', to='coverages.CollectionType'),
),
migrations.AddField(
model_name='collection',
name='grid',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='coverages.Grid'),
),
migrations.AlterUniqueTogether(
name='browsetype',
unique_together=set([('name', 'product_type')]),
),
migrations.AddField(
model_name='browse',
name='product',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='browses', to='coverages.Product'),
),
migrations.AlterUniqueTogether(
name='arraydataitem',
unique_together=set([('coverage', 'field_index')]),
),
migrations.AlterUniqueTogether(
name='browse',
unique_together=set([('product', 'browse_type', 'style')]),
),
]
| 58.783694 | 662 | 0.603329 | 3,708 | 35,329 | 5.595469 | 0.088997 | 0.052053 | 0.039907 | 0.047715 | 0.821911 | 0.80268 | 0.747638 | 0.710093 | 0.6845 | 0.64999 | 0 | 0.012765 | 0.248295 | 35,329 | 600 | 663 | 58.881667 | 0.768498 | 0.001925 | 0 | 0.611486 | 1 | 0 | 0.176896 | 0.036871 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.010135 | 0 | 0.016892 | 0.001689 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1a0c7d6651be870290aa14fcf39354328f296984 | 189 | py | Python | app/admin/views.py | djm4/quiz-button | 33f5cf2ce042df1b7ce1951c07a1415fe2fda34f | [
"MIT"
] | null | null | null | app/admin/views.py | djm4/quiz-button | 33f5cf2ce042df1b7ce1951c07a1415fe2fda34f | [
"MIT"
] | null | null | null | app/admin/views.py | djm4/quiz-button | 33f5cf2ce042df1b7ce1951c07a1415fe2fda34f | [
"MIT"
] | null | null | null | from flask import render_template
from flask_login import login_required
from . import admin
@admin.route('/')
@login_required
def index():
return render_template('admin/index.html')
| 18.9 | 46 | 0.777778 | 26 | 189 | 5.461538 | 0.5 | 0.126761 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126984 | 189 | 9 | 47 | 21 | 0.860606 | 0 | 0 | 0 | 0 | 0 | 0.089947 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | true | 0 | 0.428571 | 0.142857 | 0.714286 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
c519a01450f3249ebfadf6d46e0d4e37ef9f2b24 | 47,535 | py | Python | tests/test_core.py | mirca/bt | 0363e6fa100d9392dd18e32e3d8379d5e83c28fa | [
"MIT"
] | 1 | 2021-05-07T19:40:16.000Z | 2021-05-07T19:40:16.000Z | tests/test_core.py | mirca/bt | 0363e6fa100d9392dd18e32e3d8379d5e83c28fa | [
"MIT"
] | null | null | null | tests/test_core.py | mirca/bt | 0363e6fa100d9392dd18e32e3d8379d5e83c28fa | [
"MIT"
] | 3 | 2021-05-07T19:40:22.000Z | 2022-01-19T19:37:15.000Z | from __future__ import division
import copy
import bt
from bt.core import Node, StrategyBase, SecurityBase, AlgoStack, Strategy
import pandas as pd
import numpy as np
from nose.tools import assert_almost_equal as aae
import sys
if sys.version_info < (3, 3):
import mock
else:
from unittest import mock
def test_node_tree():
c1 = Node('c1')
c2 = Node('c2')
p = Node('p', children=[c1, c2])
c1 = p['c1']
c2 = p['c2']
assert len(p.children) == 2
assert 'c1' in p.children
assert 'c2' in p.children
assert p == c1.parent
assert p == c2.parent
m = Node('m', children=[p])
p = m['p']
c1 = p['c1']
c2 = p['c2']
assert len(m.children) == 1
assert 'p' in m.children
assert p.parent == m
assert len(p.children) == 2
assert 'c1' in p.children
assert 'c2' in p.children
assert p == c1.parent
assert p == c2.parent
def test_strategybase_tree():
s1 = SecurityBase('s1')
s2 = SecurityBase('s2')
s = StrategyBase('p', [s1, s2])
s1 = s['s1']
s2 = s['s2']
assert len(s.children) == 2
assert 's1' in s.children
assert 's2' in s.children
assert s == s1.parent
assert s == s2.parent
def test_node_members():
s1 = SecurityBase('s1')
s2 = SecurityBase('s2')
s = StrategyBase('p', [s1, s2])
s1 = s['s1']
s2 = s['s2']
actual = s.members
assert len(actual) == 3
assert s1 in actual
assert s2 in actual
assert s in actual
actual = s1.members
assert len(actual) == 1
assert s1 in actual
actual = s2.members
assert len(actual) == 1
assert s2 in actual
def test_node_full_name():
s1 = SecurityBase('s1')
s2 = SecurityBase('s2')
s = StrategyBase('p', [s1, s2])
# we cannot access s1 and s2 directly since they are copied
# we must therefore access through s
assert s.full_name == 'p'
assert s['s1'].full_name == 'p>s1'
assert s['s2'].full_name == 'p>s2'
def test_security_setup_prices():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[0]] = 105
data['c2'][dts[0]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.ix[dts[i]])
assert c1.price == 105
assert len(c1.prices) == 1
assert c1.prices[0] == 105
assert c2.price == 95
assert len(c2.prices) == 1
assert c2.prices[0] == 95
# now with setup
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[0]] = 105
data['c2'][dts[0]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.ix[dts[i]])
assert c1.price == 105
assert len(c1.prices) == 1
assert c1.prices[0] == 105
assert c2.price == 95
assert len(c2.prices) == 1
assert c2.prices[0] == 95
def test_strategybase_tree_setup():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
assert len(s.data) == 3
assert len(c1.data) == 3
assert len(c2.data) == 3
assert len(s._prices) == 3
assert len(c1._prices) == 3
assert len(c2._prices) == 3
assert len(s._values) == 3
assert len(c1._values) == 3
assert len(c2._values) == 3
def test_strategybase_tree_adjust():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
s.adjust(1000)
assert s.capital == 1000
assert s.value == 1000
assert c1.value == 0
assert c2.value == 0
assert c1.weight == 0
assert c2.weight == 0
def test_strategybase_tree_update():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
# children are copied on attachment, so re-fetch them through s
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.ix[dts[i]])
assert c1.price == 100
assert c2.price == 100
i = 1
s.update(dts[i], data.ix[dts[i]])
assert c1.price == 105
assert c2.price == 95
i = 2
s.update(dts[i], data.ix[dts[i]])
assert c1.price == 100
assert c2.price == 100
def test_update_fails_if_price_is_nan_and_position_open():
c1 = SecurityBase('c1')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1'], data=100)
data['c1'][dts[1]] = np.nan
c1.setup(data)
i = 0
# mock in position
c1._position = 100
c1.update(dts[i], data.ix[dts[i]])
# test normal case - position & non-nan price
assert c1._value == 100 * 100
i = 1
# this should fail, because we have non-zero position, and price is nan, so
# bt has no way of updating the _value
try:
c1.update(dts[i], data.ix[dts[i]])
assert False
except Exception as e:
assert str(e).startswith('Position is open')
# on the other hand, if position was 0, this should be fine, and update
# value to 0
c1._position = 0
c1.update(dts[i], data.ix[dts[i]])
assert c1._value == 0
def test_strategybase_tree_allocate():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.ix[dts[i]])
s.adjust(1000)
# since children have w == 0 this should stay in s
s.allocate(1000)
assert s.value == 1000
assert s.capital == 1000
assert c1.value == 0
assert c2.value == 0
# now allocate directly to child
c1.allocate(500)
assert c1.position == 5
assert c1.value == 500
assert s.capital == 1000 - 500
assert s.value == 1000
assert c1.weight == 500.0 / 1000
assert c2.weight == 0
def test_strategybase_tree_allocate_child_from_strategy():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.ix[dts[i]])
s.adjust(1000)
# since children have w == 0 this should stay in s
s.allocate(1000)
assert s.value == 1000
assert s.capital == 1000
assert c1.value == 0
assert c2.value == 0
# now allocate to c1
s.allocate(500, 'c1')
assert c1.position == 5
assert c1.value == 500
assert s.capital == 1000 - 500
assert s.value == 1000
assert c1.weight == 500.0 / 1000
assert c2.weight == 0
def test_strategybase_tree_allocate_level2():
c1 = SecurityBase('c1')
c12 = copy.deepcopy(c1)
c2 = SecurityBase('c2')
c22 = copy.deepcopy(c2)
s1 = StrategyBase('s1', [c1, c2])
s2 = StrategyBase('s2', [c12, c22])
m = StrategyBase('m', [s1, s2])
s1 = m['s1']
s2 = m['s2']
c1 = s1['c1']
c2 = s1['c2']
c12 = s2['c1']
c22 = s2['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
m.setup(data)
i = 0
m.update(dts[i], data.ix[dts[i]])
m.adjust(1000)
# since children have w == 0 this should stay in s
m.allocate(1000)
assert m.value == 1000
assert m.capital == 1000
assert s1.value == 0
assert s2.value == 0
assert c1.value == 0
assert c2.value == 0
# now allocate directly to child
s1.allocate(500)
assert s1.value == 500
assert m.capital == 1000 - 500
assert m.value == 1000
assert s1.weight == 500.0 / 1000
assert s2.weight == 0
# now allocate directly to child of child
c1.allocate(200)
assert s1.value == 500
assert s1.capital == 500 - 200
assert c1.value == 200
assert c1.weight == 200.0 / 500
assert c1.position == 2
assert m.capital == 1000 - 500
assert m.value == 1000
assert s1.weight == 500.0 / 1000
assert s2.weight == 0
assert c12.value == 0
def test_strategybase_tree_allocate_long_short():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.ix[dts[i]])
s.adjust(1000)
c1.allocate(500)
assert c1.position == 5
assert c1.value == 500
assert c1.weight == 500.0 / 1000
assert s.capital == 1000 - 500
assert s.value == 1000
c1.allocate(-200)
assert c1.position == 3
assert c1.value == 300
assert c1.weight == 300.0 / 1000
assert s.capital == 1000 - 500 + 200
assert s.value == 1000
c1.allocate(-400)
assert c1.position == -1
assert c1.value == -100
assert c1.weight == -100.0 / 1000
assert s.capital == 1000 - 500 + 200 + 400
assert s.value == 1000
# close up
c1.allocate(-c1.value)
assert c1.position == 0
assert c1.value == 0
assert c1.weight == 0
assert s.capital == 1000 - 500 + 200 + 400 - 100
assert s.value == 1000
def test_strategybase_tree_allocate_update():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.ix[dts[i]])
assert s.price == 100
s.adjust(1000)
assert s.price == 100
assert s.value == 1000
assert s._value == 1000
c1.allocate(500)
assert c1.position == 5
assert c1.value == 500
assert c1.weight == 500.0 / 1000
assert s.capital == 1000 - 500
assert s.value == 1000
assert s.price == 100
i = 1
s.update(dts[i], data.ix[dts[i]])
assert c1.position == 5
assert c1.value == 525
assert c1.weight == 525.0 / 1025
assert s.capital == 1000 - 500
assert s.value == 1025
assert np.allclose(s.price, 102.5)
def test_strategybase_universe():
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[0]] = 105
data['c2'][dts[0]] = 95
s.setup(data)
i = 0
s.update(dts[i])
assert len(s.universe) == 1
assert 'c1' in s.universe
assert 'c2' in s.universe
assert s.universe['c1'][dts[i]] == 105
assert s.universe['c2'][dts[i]] == 95
# should not have children unless allocated
assert len(s.children) == 0
def test_strategybase_allocate():
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[0]] = 100
data['c2'][dts[0]] = 95
s.setup(data)
i = 0
s.update(dts[i])
s.adjust(1000)
s.allocate(100, 'c1')
c1 = s['c1']
assert c1.position == 1
assert c1.value == 100
assert s.value == 1000
def test_strategybase_close():
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
s.setup(data)
i = 0
s.update(dts[i])
s.adjust(1000)
s.allocate(100, 'c1')
c1 = s['c1']
assert c1.position == 1
assert c1.value == 100
assert s.value == 1000
s.close('c1')
assert c1.position == 0
assert c1.value == 0
assert s.value == 1000
def test_strategybase_flatten():
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
s.setup(data)
i = 0
s.update(dts[i])
s.adjust(1000)
s.allocate(100, 'c1')
c1 = s['c1']
s.allocate(100, 'c2')
c2 = s['c2']
assert c1.position == 1
assert c1.value == 100
assert c2.position == 1
assert c2.value == 100
assert s.value == 1000
s.flatten()
assert c1.position == 0
assert c1.value == 0
assert s.value == 1000
def test_strategybase_multiple_calls():
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=5)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data.c2[dts[0]] = 95
data.c1[dts[1]] = 95
data.c2[dts[2]] = 95
data.c2[dts[3]] = 95
data.c2[dts[4]] = 95
data.c1[dts[4]] = 105
s.setup(data)
# define strategy logic
def algo(target):
# close out any open positions
target.flatten()
# get stock w/ lowest price
c = target.universe.ix[target.now].idxmin()
# allocate all capital to that stock
target.allocate(target.value, c)
# replace run logic
s.run = algo
# start w/ 1000
s.adjust(1000)
# loop through dates manually
i = 0
# update t0
s.update(dts[i])
assert len(s.children) == 0
assert s.value == 1000
# run t0
s.run(s)
assert len(s.children) == 1
assert s.value == 1000
assert s.capital == 50
c2 = s['c2']
assert c2.value == 950
assert c2.weight == 950.0 / 1000
assert c2.price == 95
# update out t0
s.update(dts[i])
assert c2 == s['c2']
assert len(s.children) == 1
assert s.value == 1000
assert s.capital == 50
assert c2.value == 950
assert c2.weight == 950.0 / 1000
assert c2.price == 95
# update t1
i = 1
s.update(dts[i])
assert s.value == 1050
assert s.capital == 50
assert len(s.children) == 1
assert 'c2' in s.children
assert c2 == s['c2']
assert c2.value == 1000
assert c2.weight == 1000.0 / 1050.0
assert c2.price == 100
# run t1 - close out c2, open c1
s.run(s)
assert len(s.children) == 2
assert s.value == 1050
assert s.capital == 5
c1 = s['c1']
assert c1.value == 1045
assert c1.weight == 1045.0 / 1050
assert c1.price == 95
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 100
# update out t1
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1050
assert s.capital == 5
assert c1 == s['c1']
assert c1.value == 1045
assert c1.weight == 1045.0 / 1050
assert c1.price == 95
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 100
# update t2
i = 2
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 5
assert c1.value == 1100
assert c1.weight == 1100.0 / 1105
assert c1.price == 100
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 95
# run t2
s.run(s)
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update out t2
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update t3
i = 3
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# run t3
s.run(s)
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update out t3
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update t4
i = 4
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
# accessing price should refresh - this child has been idle for a while -
# must make sure we can still get a fresh price
assert c1.price == 105
assert len(c1.prices) == 5
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# run t4
s.run(s)
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 105
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update out t4
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 105
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
def test_strategybase_multiple_calls_preset_secs():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('s', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=5)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data.c2[dts[0]] = 95
data.c1[dts[1]] = 95
data.c2[dts[2]] = 95
data.c2[dts[3]] = 95
data.c2[dts[4]] = 95
data.c1[dts[4]] = 105
s.setup(data)
# define strategy logic
def algo(target):
# close out any open positions
target.flatten()
# get stock w/ lowest price
c = target.universe.loc[target.now].idxmin()
# allocate all capital to that stock
target.allocate(target.value, c)
# replace run logic
s.run = algo
# start w/ 1000
s.adjust(1000)
# loop through dates manually
i = 0
# update t0
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1000
# run t0
s.run(s)
assert len(s.children) == 2
assert s.value == 1000
assert s.capital == 50
assert c2.value == 950
assert c2.weight == 950.0 / 1000
assert c2.price == 95
# update out t0
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1000
assert s.capital == 50
assert c2.value == 950
assert c2.weight == 950.0 / 1000
assert c2.price == 95
# update t1
i = 1
s.update(dts[i])
assert s.value == 1050
assert s.capital == 50
assert len(s.children) == 2
assert c2.value == 1000
assert c2.weight == 1000.0 / 1050.
assert c2.price == 100
# run t1 - close out c2, open c1
s.run(s)
assert c1.value == 1045
assert c1.weight == 1045.0 / 1050
assert c1.price == 95
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 100
assert len(s.children) == 2
assert s.value == 1050
assert s.capital == 5
# update out t1
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1050
assert s.capital == 5
assert c1.value == 1045
assert c1.weight == 1045.0 / 1050
assert c1.price == 95
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 100
# update t2
i = 2
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 5
assert c1.value == 1100
assert c1.weight == 1100.0 / 1105
assert c1.price == 100
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 95
# run t2
s.run(s)
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update out t2
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update t3
i = 3
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# run t3
s.run(s)
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update out t3
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update t4
i = 4
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
# accessing price should refresh - this child has been idle for a while -
# must make sure we can still get a fresh price
assert c1.price == 105
assert len(c1.prices) == 5
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# run t4
s.run(s)
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 105
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
# update out t4
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1105
assert s.capital == 60
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 105
assert c2.value == 1045
assert c2.weight == 1045.0 / 1105
assert c2.price == 95
def test_strategybase_multiple_calls_no_post_update():
s = StrategyBase('s')
s.set_commissions(lambda q, p: 1)
dts = pd.date_range('2010-01-01', periods=5)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data.c2[dts[0]] = 95
data.c1[dts[1]] = 95
data.c2[dts[2]] = 95
data.c2[dts[3]] = 95
data.c2[dts[4]] = 95
data.c1[dts[4]] = 105
s.setup(data)
# define strategy logic
def algo(target):
# close out any open positions
target.flatten()
# get stock w/ lowest price
c = target.universe.loc[target.now].idxmin()
# allocate all capital to that stock
target.allocate(target.value, c)
# replace run logic
s.run = algo
# start w/ 1000
s.adjust(1000)
# loop through dates manually
i = 0
# update t0
s.update(dts[i])
assert len(s.children) == 0
assert s.value == 1000
# run t0
s.run(s)
assert len(s.children) == 1
assert s.value == 999
assert s.capital == 49
c2 = s['c2']
assert c2.value == 950
assert c2.weight == 950.0 / 999
assert c2.price == 95
# update t1
i = 1
s.update(dts[i])
assert s.value == 1049
assert s.capital == 49
assert len(s.children) == 1
assert 'c2' in s.children
assert c2 == s['c2']
assert c2.value == 1000
assert c2.weight == 1000.0 / 1049.0
assert c2.price == 100
# run t1 - close out c2, open c1
s.run(s)
assert len(s.children) == 2
assert s.value == 1047
assert s.capital == 2
c1 = s['c1']
assert c1.value == 1045
assert c1.weight == 1045.0 / 1047
assert c1.price == 95
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 100
# update t2
i = 2
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1102
assert s.capital == 2
assert c1.value == 1100
assert c1.weight == 1100.0 / 1102
assert c1.price == 100
assert c2.value == 0
assert c2.weight == 0
assert c2.price == 95
# run t2
s.run(s)
assert len(s.children) == 2
assert s.value == 1100
assert s.capital == 55
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1100
assert c2.price == 95
# update t3
i = 3
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1100
assert s.capital == 55
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1100
assert c2.price == 95
# run t3
s.run(s)
assert len(s.children) == 2
assert s.value == 1098
assert s.capital == 53
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 100
assert c2.value == 1045
assert c2.weight == 1045.0 / 1098
assert c2.price == 95
# update t4
i = 4
s.update(dts[i])
assert len(s.children) == 2
assert s.value == 1098
assert s.capital == 53
assert c1.value == 0
assert c1.weight == 0
# accessing price should refresh - this child has been idle for a while -
# must make sure we can still get a fresh price
assert c1.price == 105
assert len(c1.prices) == 5
assert c2.value == 1045
assert c2.weight == 1045.0 / 1098
assert c2.price == 95
# run t4
s.run(s)
assert len(s.children) == 2
assert s.value == 1096
assert s.capital == 51
assert c1.value == 0
assert c1.weight == 0
assert c1.price == 105
assert c2.value == 1045
assert c2.weight == 1045.0 / 1096
assert c2.price == 95
def test_strategybase_prices():
dts = pd.date_range('2010-01-01', periods=21)
rawd = [13.555, 13.75, 14.16, 13.915, 13.655,
13.765, 14.02, 13.465, 13.32, 14.65,
14.59, 14.175, 13.865, 13.865, 13.89,
13.85, 13.565, 13.47, 13.225, 13.385,
12.89]
data = pd.DataFrame(index=dts, data=rawd, columns=['a'])
s = StrategyBase('s')
s.set_commissions(lambda q, p: 1)
s.setup(data)
# buy 100 shares on day 1 - hold until end
# just enough to buy 100 shares + 1$ commission
s.adjust(1356.50)
s.update(dts[0])
# allocate all capital to child a
# a should be dynamically created and should have
# 100 shares allocated. s.capital should be 0
s.allocate(s.value, 'a')
assert s.capital == 0
assert s.value == 1355.50
assert len(s.children) == 1
aae(s.price, 99.92628, 5)
a = s['a']
assert a.position == 100
assert a.value == 1355.50
assert a.weight == 1
assert a.price == 13.555
assert len(a.prices) == 1
# update through all dates and make sure price is ok
s.update(dts[1])
aae(s.price, 101.3638, 4)
s.update(dts[2])
aae(s.price, 104.3863, 4)
s.update(dts[3])
aae(s.price, 102.5802, 4)
# finish updates and make sure ok at end
for i in range(4, 21):
s.update(dts[i])
assert len(s.prices) == 21
aae(s.prices[-1], 95.02396, 5)
aae(s.prices[-2], 98.67306, 5)
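# The strategy "price" asserted above is just net asset value rescaled to a
# base of 100. A standalone sketch of the first assertion's arithmetic,
# inferred from the asserted numbers rather than taken from the bt source:

```python
initial_capital = 1356.50      # starting capital in the test above
value = 100 * 13.555           # 100 shares at 13.555; the $1 commission leaves no cash
price = value / initial_capital * 100
print(round(price, 5))  # 99.92628
```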
def test_fail_if_root_value_negative():
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[0]] = 100
data['c2'][dts[0]] = 95
s.setup(data)
s.adjust(-100)
# trigger update
s.update(dts[0])
assert s.bankrupt
# make sure only triggered if root negative
c1 = StrategyBase('c1')
s = StrategyBase('s', children=[c1])
c1 = s['c1']
s.setup(data)
s.adjust(1000)
c1.adjust(-100)
s.update(dts[0])
# now make it trigger
c1.adjust(-1000)
# trigger update
s.update(dts[0])
assert s.bankrupt
def test_fail_if_0_base_in_return_calc():
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[0]] = 100
data['c2'][dts[0]] = 95
# must setup tree because if not negative root error pops up first
c1 = StrategyBase('c1')
s = StrategyBase('s', children=[c1])
c1 = s['c1']
s.setup(data)
s.adjust(1000)
c1.adjust(100)
s.update(dts[0])
c1.adjust(-100)
s.update(dts[1])
try:
c1.adjust(-100)
s.update(dts[1])
assert False
except ZeroDivisionError as e:
if 'Could not update' not in str(e):
assert False
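# The try / assert False / except pattern above can be factored into a small
# reusable helper. A generic, self-contained sketch; the helper name and toy
# function are assumptions, not part of bt:

```python
def assert_raises(exc_type, substring, fn, *args, **kwargs):
    """Assert fn(*args, **kwargs) raises exc_type with substring in its message."""
    try:
        fn(*args, **kwargs)
    except exc_type as e:
        assert substring in str(e)
    else:
        raise AssertionError('expected %s to be raised' % exc_type.__name__)

def bad_update():
    # stand-in for the failing update in the test above
    raise ZeroDivisionError('Could not update: zero base in return calc')

assert_raises(ZeroDivisionError, 'Could not update', bad_update)
print('ok')  # → ok
```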
def test_strategybase_tree_rebalance():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
s.set_commissions(lambda q, p: 1)
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
s.adjust(1000)
assert s.value == 1000
assert s.capital == 1000
assert c1.value == 0
assert c2.value == 0
# now rebalance c1
s.rebalance(0.5, 'c1')
assert c1.position == 4
assert c1.value == 400
assert s.capital == 1000 - 401
assert s.value == 999
assert c1.weight == 400.0 / 999
assert c2.weight == 0
def test_strategybase_tree_decimal_position_rebalance():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
s.use_integer_positions(False)
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
s.adjust(1000.2)
s.rebalance(0.42, 'c1')
s.rebalance(0.58, 'c2')
aae(c1.value, 420.084)
aae(c2.value, 580.116)
aae(c1.value + c2.value, 1000.2)
def test_rebalance_child_not_in_tree():
s = StrategyBase('p')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i])
s.adjust(1000)
# rebalance to 0 w/ child that is not present - should ignore
s.rebalance(0, 'c2')
assert s.value == 1000
assert s.capital == 1000
assert len(s.children) == 0
def test_strategybase_tree_rebalance_to_0():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
s.adjust(1000)
assert s.value == 1000
assert s.capital == 1000
assert c1.value == 0
assert c2.value == 0
# now rebalance c1
s.rebalance(0.5, 'c1')
assert c1.position == 5
assert c1.value == 500
assert s.capital == 1000 - 500
assert s.value == 1000
assert c1.weight == 500.0 / 1000
assert c2.weight == 0
# now rebalance c1
s.rebalance(0, 'c1')
assert c1.position == 0
assert c1.value == 0
assert s.capital == 1000
assert s.value == 1000
assert c1.weight == 0
assert c2.weight == 0
def test_strategybase_tree_rebalance_level2():
c1 = SecurityBase('c1')
c12 = copy.deepcopy(c1)
c2 = SecurityBase('c2')
c22 = copy.deepcopy(c2)
s1 = StrategyBase('s1', [c1, c2])
s2 = StrategyBase('s2', [c12, c22])
m = StrategyBase('m', [s1, s2])
s1 = m['s1']
s2 = m['s2']
c1 = s1['c1']
c2 = s1['c2']
c12 = s2['c1']
c22 = s2['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
m.setup(data)
i = 0
m.update(dts[i], data.loc[dts[i]])
m.adjust(1000)
assert m.value == 1000
assert m.capital == 1000
assert s1.value == 0
assert s2.value == 0
assert c1.value == 0
assert c2.value == 0
# now rebalance child s1 - since its children are 0, no waterfall alloc
m.rebalance(0.5, 's1')
assert s1.value == 500
assert m.capital == 1000 - 500
assert m.value == 1000
assert s1.weight == 500.0 / 1000
assert s2.weight == 0
# now allocate directly to child of child
s1.rebalance(0.4, 'c1')
assert s1.value == 500
assert s1.capital == 500 - 200
assert c1.value == 200
assert c1.weight == 200.0 / 500
assert c1.position == 2
assert m.capital == 1000 - 500
assert m.value == 1000
assert s1.weight == 500.0 / 1000
assert s2.weight == 0
assert c12.value == 0
# now rebalance child s1 again and make sure c1 also gets proportional
# increase
m.rebalance(0.8, 's1')
assert s1.value == 800
aae(m.capital, 200, 1)
assert m.value == 1000
assert s1.weight == 800 / 1000
assert s2.weight == 0
assert c1.value == 300.0
assert c1.weight == 300.0 / 800
assert c1.position == 3
# now rebalance child s1 to 0 - should close out s1 and c1 as well
m.rebalance(0, 's1')
assert s1.value == 0
assert m.capital == 1000
assert m.value == 1000
assert s1.weight == 0
assert s2.weight == 0
assert c1.weight == 0
def test_strategybase_tree_rebalance_base():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
s.set_commissions(lambda q, p: 1)
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[1]] = 105
data['c2'][dts[1]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
s.adjust(1000)
assert s.value == 1000
assert s.capital == 1000
assert c1.value == 0
assert c2.value == 0
# check that 2 rebalances of equal weight lead to two different allocs
# since value changes after first call
s.rebalance(0.5, 'c1')
assert c1.position == 4
assert c1.value == 400
assert s.capital == 1000 - 401
assert s.value == 999
assert c1.weight == 400.0 / 999
assert c2.weight == 0
s.rebalance(0.5, 'c2')
assert c2.position == 4
assert c2.value == 400
assert s.capital == 1000 - 401 - 401
assert s.value == 998
assert c2.weight == 400.0 / 998
assert c1.weight == 400.0 / 998
# close out everything
s.flatten()
# adjust to get back to 1000
s.adjust(4)
assert s.value == 1000
assert s.capital == 1000
assert c1.value == 0
assert c2.value == 0
# now rebalance but set fixed base
base = s.value
s.rebalance(0.5, 'c1', base=base)
assert c1.position == 4
assert c1.value == 400
assert s.capital == 1000 - 401
assert s.value == 999
assert c1.weight == 400.0 / 999
assert c2.weight == 0
s.rebalance(0.5, 'c2', base=base)
assert c2.position == 4
assert c2.value == 400
assert s.capital == 1000 - 401 - 401
assert s.value == 998
assert c2.weight == 400.0 / 998
assert c1.weight == 400.0 / 998
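# The integer-position sizing asserted above (position 4 rather than 5 on a
# 500 allocation at price 100) follows from the fixed $1 commission; a
# hypothetical standalone sketch of that rule, not the bt implementation:

```python
def max_units(amount, price, fixed_commission):
    # largest whole quantity q with q * price + fixed_commission <= amount
    return int((amount - fixed_commission) // price)

print(max_units(500, 100, 1))  # → 4
```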
def test_algo_stack():
a1 = mock.MagicMock(return_value=True)
a2 = mock.MagicMock(return_value=False)
a3 = mock.MagicMock(return_value=True)
# no run_always for now
del a1.run_always
del a2.run_always
del a3.run_always
stack = AlgoStack(a1, a2, a3)
target = mock.MagicMock()
assert not stack(target)
assert a1.called
assert a2.called
assert not a3.called
# now test that run_always marked are run
a1 = mock.MagicMock(return_value=True)
a2 = mock.MagicMock(return_value=False)
a3 = mock.MagicMock(return_value=True)
# a3 will have run_always
del a1.run_always
del a2.run_always
stack = AlgoStack(a1, a2, a3)
target = mock.MagicMock()
assert not stack(target)
assert a1.called
assert a2.called
assert a3.called
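# The short-circuit behavior exercised above - algos run in order, the first
# False stops the rest, except algos flagged run_always, which run regardless -
# can be sketched minimally. Illustrative only, not bt's actual AlgoStack:

```python
class MiniAlgoStack(object):
    def __init__(self, *algos):
        self.algos = algos

    def __call__(self, target):
        res = True
        for algo in self.algos:
            # once res is False, only run_always algos still execute
            if res or getattr(algo, 'run_always', False):
                if not algo(target):
                    res = False
        return res

calls = []

def make_algo(name, ret, run_always=False):
    def algo(target):
        calls.append(name)
        return ret
    if run_always:
        algo.run_always = True
    return algo

stack = MiniAlgoStack(make_algo('a1', True),
                      make_algo('a2', False),
                      make_algo('a3', True, run_always=True))
assert not stack(None)
print(calls)  # → ['a1', 'a2', 'a3']
```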
def test_set_commissions():
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
s.set_commissions(lambda x, y: 1.0)
s.setup(data)
s.update(dts[0])
s.adjust(1000)
s.allocate(500, 'c1')
assert s.capital == 599
s.set_commissions(lambda x, y: 0.0)
s.allocate(-400, 'c1')
assert s.capital == 999
def test_strategy_tree_proper_return_calcs():
s1 = StrategyBase('s1')
s2 = StrategyBase('s2')
m = StrategyBase('m', [s1, s2])
s1 = m['s1']
s2 = m['s2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data.loc[dts[1], 'c1'] = 105
data.loc[dts[1], 'c2'] = 95
m.setup(data)
i = 0
m.update(dts[i], data.loc[dts[i]])
m.adjust(1000)
# since children have w == 0 this should stay in s
m.allocate(1000)
assert m.value == 1000
assert m.capital == 1000
assert m.price == 100
assert s1.value == 0
assert s2.value == 0
# now allocate directly to child
s1.allocate(500)
assert m.capital == 500
assert m.value == 1000
assert m.price == 100
assert s1.value == 500
assert s1.weight == 500.0 / 1000
assert s1.price == 100
assert s2.weight == 0
# allocate to child2 via master method
m.allocate(500, 's2')
assert m.capital == 0
assert m.value == 1000
assert m.price == 100
assert s1.value == 500
assert s1.weight == 500.0 / 1000
assert s1.price == 100
assert s2.value == 500
assert s2.weight == 500.0 / 1000
assert s2.price == 100
# now allocate and incur commission fee
s1.allocate(500, 'c1')
assert m.capital == 0
assert m.value == 1000
assert m.price == 100
assert s1.value == 500
assert s1.weight == 500.0 / 1000
assert s1.price == 100
assert s2.value == 500
assert s2.weight == 500.0 / 1000.0
assert s2.price == 100
def test_strategy_tree_proper_universes():
def do_nothing(x):
return True
child1 = Strategy('c1', [do_nothing], ['b', 'c'])
master = Strategy('m', [do_nothing], [child1, 'a'])
child1 = master['c1']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(
{'a': pd.Series(data=1, index=dts, name='a'),
'b': pd.Series(data=2, index=dts, name='b'),
'c': pd.Series(data=3, index=dts, name='c')})
master.setup(data)
assert len(master.children) == 2
assert 'c1' in master.children
assert 'a' in master.children
assert len(master._universe.columns) == 2
assert 'c1' in master._universe.columns
assert 'a' in master._universe.columns
assert len(child1._universe.columns) == 2
assert 'b' in child1._universe.columns
assert 'c' in child1._universe.columns
def test_strategy_tree_paper():
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['a'], data=100.)
data.loc[dts[1], 'a'] = 101
data.loc[dts[2], 'a'] = 102
s = Strategy('s',
[bt.algos.SelectWhere(data > 100),
bt.algos.WeighEqually(),
bt.algos.Rebalance()])
m = Strategy('m', [], [s])
s = m['s']
m.setup(data)
m.update(dts[0])
m.run()
assert m.price == 100
assert s.price == 100
assert s._paper_trade
assert s._paper.price == 100
s.update(dts[1])
m.run()
assert m.price == 100
assert m.value == 0
assert s.value == 0
assert s.price == 100
s.update(dts[2])
m.run()
assert m.price == 100
assert m.value == 0
assert s.value == 0
assert np.allclose(s.price, 100. * (102 / 101.))
def test_outlays():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
data['c1'][dts[0]] = 105
data['c2'][dts[0]] = 95
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
# allocate 1000 to strategy
s.adjust(1000)
# now let's see what happens when we allocate 500 to each child
c1.allocate(500)
c2.allocate(500)
# out update
s.update(dts[i])
assert c1.data['outlay'][dts[0]] == (4 * 105)
assert c2.data['outlay'][dts[0]] == (5 * 95)
i = 1
s.update(dts[i], data.loc[dts[i]])
c1.allocate(-400)
c2.allocate(100)
# out update
s.update(dts[i])
assert c1.data['outlay'][dts[1]] == (-4 * 100)
assert c2.data['outlay'][dts[1]] == 100
def test_child_weight_above_1():
# check for child weights not exceeding 1
s = StrategyBase('s')
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(np.random.randn(3, 2) + 100,
index=dts, columns=['c1', 'c2'])
s.setup(data)
i = 0
s.update(dts[i])
s.adjust(1e6)
s.allocate(1e6, 'c1')
c1 = s['c1']
assert c1.weight <= 1
def test_fixed_commissions():
c1 = SecurityBase('c1')
c2 = SecurityBase('c2')
s = StrategyBase('p', [c1, c2])
# fixed $1 commission per transaction
s.set_commissions(lambda q, p: 1)
c1 = s['c1']
c2 = s['c2']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1', 'c2'], data=100)
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
# allocate 1000 to strategy
s.adjust(1000)
# now let's see what happens when we allocate 500 to each child
c1.allocate(500)
c2.allocate(500)
# out update
s.update(dts[i])
assert c1.value == 400
assert c2.value == 400
assert s.capital == 198
# de-alloc 100 from c1. This should force c1 to sell 2 units to raise at
# least 100 (because of commissions)
c1.allocate(-100)
s.update(dts[i])
assert c1.value == 200
assert s.capital == 198 + 199
# allocate 100 to c2. This should leave things unchaged, since c2 cannot
# buy one unit since the commission will cause total outlay to exceed
# allocation
c2.allocate(100)
s.update(dts[i])
assert c2.value == 400
assert s.capital == 198 + 199
# ok try again w/ 101 allocation. This time, it should work
c2.allocate(101)
s.update(dts[i])
assert c2.value == 500
assert s.capital == 198 + 199 - 101
# ok now let's close the whole position. Since we are closing, we expect
# the allocation to go through, even though the outlay > amount
c2.allocate(-500)
s.update(dts[i])
assert c2.value == 0
assert s.capital == 198 + 199 - 101 + 499
# now we are going to go short c2
# we want to 'raise' 100 dollars. Since we need at a minimum 100, but we
# also have commissions, we will actually short 2 units in order to raise
# at least 100
c2.allocate(-100)
s.update(dts[i])
assert c2.value == -200
assert s.capital == 198 + 199 - 101 + 499 + 199
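# The shorting arithmetic above ("raise at least 100" forcing 2 units at a
# price of 100 with a $1 commission) can be sketched standalone; hypothetical
# helper, not the bt implementation:

```python
import math

def units_to_raise(amount, price, fixed_commission):
    # smallest whole quantity q with q * price - fixed_commission >= amount
    return int(math.ceil((amount + fixed_commission) / float(price)))

print(units_to_raise(100, 100, 1))  # → 2
```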
def test_degenerate_shorting():
# can have situation where you short infinitely if commission/share > share
# price
c1 = SecurityBase('c1')
s = StrategyBase('p', [c1])
# $1/share commission
s.set_commissions(lambda q, p: abs(q) * 1)
c1 = s['c1']
dts = pd.date_range('2010-01-01', periods=3)
# c1 trades at 0.01
data = pd.DataFrame(index=dts, columns=['c1'], data=0.01)
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
s.adjust(1000)
try:
c1.allocate(-10)
assert False
except Exception as e:
assert 'full_outlay should always be approaching amount' in str(e)
def test_securitybase_allocate():
c1 = SecurityBase('c1')
s = StrategyBase('p', [c1])
c1 = s['c1']
dts = pd.date_range('2010-01-01', periods=3)
data = pd.DataFrame(index=dts, columns=['c1'], data=100.)
# set the price
data['c1'][dts[0]] = 91.40246706608193
s.setup(data)
i = 0
s.update(dts[i], data.loc[dts[i]])
# allocate 100000 to strategy
original_capital = 100000.
s.adjust(original_capital)
# not integer positions
c1.integer_positions = False
# set the full_outlay and amount
full_outlay = 1999.693706988672
amount = 1999.6937069886717
c1.allocate(amount)
# the results that we want to be true
assert np.isclose(full_outlay, amount, rtol=0.)
# check that the quantity wasn't decreased and the full_outlay == amount
# we can get the full_outlay that was calculated by
# original capital - current capital
assert np.isclose(full_outlay, original_capital - s._capital, rtol=0.)
def test_securitybase_allocate_commisions():
date_span = pd.date_range(start='10/1/2017', end='10/11/2017', freq='B')
numper = len(date_span.values)
comms = 0.01
data = [[10, 15, 20, 25, 30, 35, 40, 45],
[10, 10, 10, 10, 20, 20, 20, 20],
[20, 20, 20, 30, 30, 30, 40, 40],
[20, 10, 20, 10, 20, 10, 20, 10]]
data = [[row[i] for row in data] for i in range(len(data[0]))] # Transpose
price = pd.DataFrame(data=data, index=date_span)
price.columns = ['a', 'b', 'c', 'd']
# price = price[['a', 'b']]
sig1 = pd.DataFrame(price['a'] >= price['b'] + 10, columns=['a'])
sig2 = pd.DataFrame(price['a'] < price['b'] + 10, columns=['b'])
signal = sig1.join(sig2)
signal1 = price.diff(1) > 0
signal2 = price.diff(1) < 0
tw = price.copy()
tw.loc[:, :] = 0  # initialize: set everything to 0
tw[signal1] = -1.0
tw[signal2] = 1.0
s1 = bt.Strategy('long_short', [bt.algos.WeighTarget(tw),
bt.algos.RunDaily(),
bt.algos.Rebalance()])
# now we create the Backtest with proportional commissions
t = bt.Backtest(s1, price, initial_capital=1000000, commissions=(lambda q, p: abs(p * q) * comms), progress_bar=False)
# and let's run it!
res = bt.run(t)
########################
# p_teste/main.py (gabriel-correia0408/Sala_Green_GabrielCorreia, Apache-2.0)

# testing from the very beginning
# importing the soma method from the p_teste package
from p_teste.calculadora import soma
print(soma(10, 10))

# covid19-dashboard/config.py (pradh/tools, Apache-2.0)

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# DataCommons Host Server.
DC_SERVER = "https://api.datacommons.org/"
# A dictionary of stat_vars where place_type->key->stat_var.
# There must be at least one stat_var per place_type.
# The key can be made up, this key will be used to output the data.
# Instead of using the long stat_var name.
# The key will be used to identify the stat_var.
STAT_VARS = {
    "Country": {
        "Cases": "CumulativeCount_MedicalConditionIncident_COVID_19_ConfirmedCase",
        "Deaths": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientDeceased",
        "Hospitalized": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientHospitalized",
        "Recovered": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientRecovered",
        "ICU": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientInICU",
    },
    "State": {
        "Cases": "CumulativeCount_MedicalConditionIncident_COVID_19_ConfirmedOrProbableCase",
        "Deaths": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientDeceased",
        "Hospitalized": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientHospitalized",
        "Recovered": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientRecovered",
        "ICU": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientInICU",
    },
    "County": {
        "Cases": "CumulativeCount_MedicalConditionIncident_COVID_19_ConfirmedOrProbableCase",
        "Deaths": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientDeceased",
        "Hospitalized": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientHospitalized",
        "Recovered": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientRecovered",
        "ICU": "CumulativeCount_MedicalConditionIncident_COVID_19_PatientInICU",
    },
}
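# A minimal sketch of how this place_type -> key -> stat_var mapping might be
# consumed. The helper name is an assumption, and only an excerpt of
# STAT_VARS is repeated here so the sketch is self-contained:

```python
STAT_VARS_EXCERPT = {
    "County": {
        "Cases":
            "CumulativeCount_MedicalConditionIncident_COVID_19_ConfirmedOrProbableCase",
    },
}

def get_stat_var(stat_vars, place_type, key):
    """Return the full stat_var name for (place_type, key), or None."""
    return stat_vars.get(place_type, {}).get(key)

print(get_stat_var(STAT_VARS_EXCERPT, "County", "Cases"))
```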
# main.py (lordjack/oficina_introducao_programacao_python, MIT)

import exercicios.ex01
# --- fibonacci.py (andremartins746/curso_de_PYTHON, MIT) ---
# Note: the function name is spelled "fibinacci" in the original source.
def fibinacci(quantidade, sequencia=(0, 1)):
    # Recursively extend the tuple until it holds `quantidade` terms,
    # appending the sum of the last two elements each call.
    return (sequencia if len(sequencia) == quantidade
            else fibinacci(quantidade, sequencia + (sum(sequencia[-2:]),)))


for fib in fibinacci(20):
    print(fib)
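The recursive `fibinacci()` above rebuilds a growing tuple on every call and is bounded by Python's recursion limit for large counts. A minimal iterative equivalent (a sketch, not from the original file) avoids both issues:

```python
def fibonacci_iter(quantidade):
    # Grow a list until it holds `quantidade` terms; no recursion needed.
    sequencia = [0, 1]
    while len(sequencia) < quantidade:
        sequencia.append(sequencia[-1] + sequencia[-2])
    return tuple(sequencia[:quantidade])


print(fibonacci_iter(10))  # (0, 1, 1, 2, 3, 5, 8, 13, 21, 34)
```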
# --- chainerex/dataset/__init__.py (corochann/chainerex, MIT) ---
from chainerex.dataset import dataset_mixin_ex  # NOQA
from chainerex.dataset import indexers # NOQA
from chainerex.dataset.dataset_mixin_ex import DatasetMixinEX # NOQA
from chainerex.dataset.dataset_mixin_ex import DatasetMixinEXFeatureIndexer # NOQA
from chainerex.dataset.indexers import indexer
from chainerex.dataset.indexers.feature_indexer import BaseFeatureIndexer
from chainerex.dataset.indexers.indexer import BaseIndexer # NOQA
from chainerex.dataset.indexers.indexer import ExtractBySliceNotSupportedError # NOQA
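The `__init__.py` above uses the common re-export pattern: importing names from submodules at the package top level so callers can write the shorter `from chainerex.dataset import DatasetMixinEX` instead of the full submodule path. A self-contained demo of the pattern, using a throwaway package `demo_pkg` written to a temporary directory (all names here are illustrative):

```python
import os
import sys
import tempfile

# Build a tiny package on disk: demo_pkg/inner.py defines Indexer, and
# demo_pkg/__init__.py re-exports it at the package top level.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "demo_pkg")
os.makedirs(pkg)
with open(os.path.join(pkg, "inner.py"), "w") as f:
    f.write("class Indexer:\n    pass\n")
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from demo_pkg.inner import Indexer  # NOQA\n")

sys.path.insert(0, root)
from demo_pkg import Indexer  # short path, thanks to the re-export

print(Indexer.__module__)  # demo_pkg.inner
```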
# --- app/admin.py (haibincoder/DjangoTensorflow, MIT) ---
from django.contrib import admin
# from .models import Article, Category, BlogComment, Tag
# from .models import Image
# Register your models here.
# admin.site.register([Article, Category, BlogComment, Tag])
# admin.site.register([Image])