hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fcf33f922f6d81e54f6c04c833d8d1b8882750df | 2,180 | py | Python | tests/integration_tests/test_kuendigung_maus.py | Hochfrequenz/mig_ahb_utility_stack | 85b97bc616508404c0178df3870cf4c0f06ccca1 | [
"MIT"
] | 5 | 2021-12-14T11:49:05.000Z | 2021-12-21T21:37:14.000Z | tests/integration_tests/test_kuendigung_maus.py | Hochfrequenz/mig_ahb_utility_stack | 85b97bc616508404c0178df3870cf4c0f06ccca1 | [
"MIT"
] | 11 | 2021-12-16T08:40:59.000Z | 2022-02-25T17:30:05.000Z | tests/integration_tests/test_kuendigung_maus.py | Hochfrequenz/mig_ahb_utility_stack | 85b97bc616508404c0178df3870cf4c0f06ccca1 | [
"MIT"
] | null | null | null | from pathlib import Path
import pytest # type:ignore[import]
from helpers import create_maus_and_assert, write_to_file_or_assert_equality # type:ignore[import]
class TestKuendigungMaus:
"""
A unit test that ensures that the 11016/17/18 MAUS.json are created.
"""
@pytest.mark.datafiles("./edifact-templates/edi/UTILMD/UTILMD5.2c.template")
@pytest.mark.datafiles("./edifact-templates/ahbs/FV2110/UTILMD/11016.csv")
@pytest.mark.datafiles("../unit_tests/migs/FV2204/segment_group_hierarchies/sgh_utilmd.json")
def test_maus_creation_11016(self, datafiles):
create_maus_and_assert(
csv_path=Path(datafiles) / "11016.csv",
sgh_path=Path(datafiles) / "sgh_utilmd.json",
template_path=Path(datafiles) / Path("UTILMD5.2c.template"),
maus_path=Path("edifact-templates/maus/FV2110/UTILMD/11016_maus.json"),
)
@pytest.mark.datafiles("./edifact-templates/edi/UTILMD/UTILMD5.2c.template")
@pytest.mark.datafiles("./edifact-templates/ahbs/FV2110/UTILMD/11017.csv")
@pytest.mark.datafiles("../unit_tests/migs/FV2204/segment_group_hierarchies/sgh_utilmd.json")
def test_maus_creation_11017(self, datafiles):
create_maus_and_assert(
csv_path=Path(datafiles) / "11017.csv",
sgh_path=Path(datafiles) / "sgh_utilmd.json",
template_path=Path(datafiles) / Path("UTILMD5.2c.template"),
maus_path=Path("edifact-templates/maus/FV2110/UTILMD/11017_maus.json"),
)
@pytest.mark.datafiles("./edifact-templates/edi/UTILMD/UTILMD5.2c.template")
@pytest.mark.datafiles("./edifact-templates/ahbs/FV2110/UTILMD/11018.csv")
@pytest.mark.datafiles("../unit_tests/migs/FV2204/segment_group_hierarchies/sgh_utilmd.json")
def test_maus_creation_11018(self, datafiles):
create_maus_and_assert(
csv_path=Path(datafiles) / "11018.csv",
sgh_path=Path(datafiles) / "sgh_utilmd.json",
template_path=Path(datafiles) / Path("UTILMD5.2c.template"),
maus_path=Path("edifact-templates/maus/FV2110/UTILMD/11018_maus.json"),
)
| 49.545455 | 100 | 0.68945 | 269 | 2,180 | 5.379182 | 0.204461 | 0.066344 | 0.118176 | 0.107809 | 0.811334 | 0.811334 | 0.811334 | 0.811334 | 0.811334 | 0.811334 | 0 | 0.065254 | 0.177523 | 2,180 | 43 | 101 | 50.697674 | 0.741774 | 0.05 | 0 | 0.441176 | 0 | 0 | 0.387867 | 0.32372 | 0 | 0 | 0 | 0 | 0.117647 | 1 | 0.088235 | false | 0 | 0.088235 | 0 | 0.205882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
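The `content` column of the row above is a pytest suite that builds every input path with `pathlib.Path` against the `datafiles` fixture directory. As an aside, that repeated path arithmetic can be sketched with the standard library alone; the helper name `resolve_inputs` below is hypothetical, not part of the repository:

```python
from pathlib import Path

def resolve_inputs(datafiles, message_id):
    """Hypothetical helper mirroring the path joins in the test above."""
    base = Path(datafiles)
    return {
        "csv_path": base / f"{message_id}.csv",
        "sgh_path": base / "sgh_utilmd.json",
        "template_path": base / "UTILMD5.2c.template",
    }

paths = resolve_inputs("/tmp/data", "11016")
print(paths["csv_path"])
```

Collapsing the three near-identical call sites into one helper like this is a common refactor for such tests, but the original keeps them inline, which is also idiomatic for generated per-message-type tests.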
1e0311795a2499325159d2045bf353165b96f6cb | 67 | py | Python | SerinasDecisionTree/__init__.py | serinamarie/SerinasDecisionTree | 7083309b149cbea3a13364d7e5139760f2f1122a | [
"MIT"
] | null | null | null | SerinasDecisionTree/__init__.py | serinamarie/SerinasDecisionTree | 7083309b149cbea3a13364d7e5139760f2f1122a | [
"MIT"
] | null | null | null | SerinasDecisionTree/__init__.py | serinamarie/SerinasDecisionTree | 7083309b149cbea3a13364d7e5139760f2f1122a | [
"MIT"
] | null | null | null | from SerinasDecisionTree.decisiontree import DecisionTreeClassifier | 67 | 67 | 0.940299 | 5 | 67 | 12.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044776 | 67 | 1 | 67 | 67 | 0.984375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1e5f59aa6822ab2cd944027932bdb4b70519d1df | 82 | py | Python | test/login.py | hyehyeonmoon/WelfareForEveryOne-1 | 7929b52e0235acb5208673b50214f0fe8130296a | [
"MIT"
] | null | null | null | test/login.py | hyehyeonmoon/WelfareForEveryOne-1 | 7929b52e0235acb5208673b50214f0fe8130296a | [
"MIT"
] | null | null | null | test/login.py | hyehyeonmoon/WelfareForEveryOne-1 | 7929b52e0235acb5208673b50214f0fe8130296a | [
"MIT"
] | null | null | null | from locust import HttpUser, between, task
from config import config
import json
| 16.4 | 42 | 0.817073 | 12 | 82 | 5.583333 | 0.666667 | 0.358209 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158537 | 82 | 4 | 43 | 20.5 | 0.971014 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1e75f6831b440fa30480edfaa13864fca0c509fc | 20,438 | py | Python | smartrecruiters_python_client/apis/postings_api.py | roksela/smartrecruiters-python-client | 6d0849d173a3d6718b5f0769098f4c76857f637d | [
"MIT"
] | 5 | 2018-03-27T08:20:13.000Z | 2022-03-30T06:23:38.000Z | smartrecruiters_python_client/apis/postings_api.py | roksela/smartrecruiters-python-client | 6d0849d173a3d6718b5f0769098f4c76857f637d | [
"MIT"
] | null | null | null | smartrecruiters_python_client/apis/postings_api.py | roksela/smartrecruiters-python-client | 6d0849d173a3d6718b5f0769098f4c76857f637d | [
"MIT"
] | 2 | 2018-12-05T04:48:37.000Z | 2020-12-17T12:12:12.000Z | # coding: utf-8
"""
Unofficial python library for the SmartRecruiters API
The SmartRecruiters API provides a platform to integrate services or applications, build apps and create fully customizable career sites. It exposes SmartRecruiters functionality and allows to connect and build software enhancing it.
OpenAPI spec version: 1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class PostingsApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def v1_get_posting(self, company_identifier, posting_id, **kwargs):
"""
Get posting by posting id for given company
Note: In order to update content of a job posting available via the Posting API, you need to re-post the job in SmartRecruiters application.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.v1_get_posting(company_identifier, posting_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str company_identifier: Identifier of a company (required)
:param str posting_id: Posting identifier (required)
:param str source_type_id: sourceTypeId can be retrieved using [get /configuration/sources](https://dev.smartrecruiters.com/customer-api/live-docs/#!/configuration/configuration_getSourceTypes) endpoint. Used together with **sourceId** to add source tracking parameter to **applyUrl**.
:param str source_id: sourceId can be retrieved using [get /configuration/sources/{sourceType}/values](https://dev.smartrecruiters.com/customer-api/live-docs/#!/configuration/configuration_getSources) endpoint. Used together with **sourceTypeId** to add source tracking parameter to **applyUrl**.
:return: Posting
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.v1_get_posting_with_http_info(company_identifier, posting_id, **kwargs)
else:
(data) = self.v1_get_posting_with_http_info(company_identifier, posting_id, **kwargs)
return data
def v1_get_posting_with_http_info(self, company_identifier, posting_id, **kwargs):
"""
Get posting by posting id for given company
Note: In order to update content of a job posting available via the Posting API, you need to re-post the job in SmartRecruiters application.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.v1_get_posting_with_http_info(company_identifier, posting_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str company_identifier: Identifier of a company (required)
:param str posting_id: Posting identifier (required)
:param str source_type_id: sourceTypeId can be retrieved using [get /configuration/sources](https://dev.smartrecruiters.com/customer-api/live-docs/#!/configuration/configuration_getSourceTypes) endpoint. Used together with **sourceId** to add source tracking parameter to **applyUrl**.
:param str source_id: sourceId can be retrieved using [get /configuration/sources/{sourceType}/values](https://dev.smartrecruiters.com/customer-api/live-docs/#!/configuration/configuration_getSources) endpoint. Used together with **sourceTypeId** to add source tracking parameter to **applyUrl**.
:return: Posting
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['company_identifier', 'posting_id', 'source_type_id', 'source_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_get_posting" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'company_identifier' is set
if ('company_identifier' not in params) or (params['company_identifier'] is None):
raise ValueError("Missing the required parameter `company_identifier` when calling `v1_get_posting`")
# verify the required parameter 'posting_id' is set
if ('posting_id' not in params) or (params['posting_id'] is None):
raise ValueError("Missing the required parameter `posting_id` when calling `v1_get_posting`")
collection_formats = {}
resource_path = '/v1/companies/{companyIdentifier}/postings/{postingId}'.replace('{format}', 'json')
path_params = {}
if 'company_identifier' in params:
path_params['companyIdentifier'] = params['company_identifier']
if 'posting_id' in params:
path_params['postingId'] = params['posting_id']
query_params = {}
if 'source_type_id' in params:
query_params['sourceTypeId'] = params['source_type_id']
if 'source_id' in params:
query_params['sourceId'] = params['source_id']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Posting',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def v1_list_departments(self, company_identifier, **kwargs):
"""
List departments for given company
List departments for given company.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.v1_list_departments(company_identifier, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str company_identifier: Identifier of a company (required)
:return: Departments
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.v1_list_departments_with_http_info(company_identifier, **kwargs)
else:
(data) = self.v1_list_departments_with_http_info(company_identifier, **kwargs)
return data
def v1_list_departments_with_http_info(self, company_identifier, **kwargs):
"""
List departments for given company
List departments for given company.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.v1_list_departments_with_http_info(company_identifier, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str company_identifier: Identifier of a company (required)
:return: Departments
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['company_identifier']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_list_departments" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'company_identifier' is set
if ('company_identifier' not in params) or (params['company_identifier'] is None):
raise ValueError("Missing the required parameter `company_identifier` when calling `v1_list_departments`")
collection_formats = {}
resource_path = '/v1/companies/{companyIdentifier}/departments'.replace('{format}', 'json')
path_params = {}
if 'company_identifier' in params:
path_params['companyIdentifier'] = params['company_identifier']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Departments',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def v1_list_postings(self, company_identifier, **kwargs):
"""
Lists active postings published by given company
Lists active postings published by given company. Return PostingList
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.v1_list_postings(company_identifier, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str company_identifier: Identifier of a company (required)
:param str q: full-text search query based on a job title, location
:param int limit: number of elements to return. max value is 100
:param int offset: number of elements to skip while processing result
:param str destination: Filter indicating which postings to return: * **PUBLIC**: response will include ONLY public postings * **INTERNAL**: response will include ONLY internal postings * **INTERNAL_OR_PUBLIC**: response will include internal postings or public postings, but not both for a single job. If a job has both types of postings, only internal postings will be returned
:param str country: country code filter (part of the location object)
:param str region: region filter (part of the location object)
:param str city: city filter (part of the location object)
:param str department: department filter (department id)
:param list[str] language: Exceptions to the language code ISO format: * \"en-GB\" - \"English - English (UK)\" * \"fr-CA\" - \"French - français (Canada)\" * \"pt-BR\" - \"Portugal - português (Brasil)\" * \"pt-PT\" - \"Portugal - português (Portugal)\" * \"zh-HK\" - \"Chinese (Traditional) - 中文 (香港)\" * \"zh-CN\" - \"Chinese (Simplified) - 中文 (简体)\"
:return: PostingList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.v1_list_postings_with_http_info(company_identifier, **kwargs)
else:
(data) = self.v1_list_postings_with_http_info(company_identifier, **kwargs)
return data
def v1_list_postings_with_http_info(self, company_identifier, **kwargs):
"""
Lists active postings published by given company
Lists active postings published by given company. Return PostingList
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.v1_list_postings_with_http_info(company_identifier, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str company_identifier: Identifier of a company (required)
:param str q: full-text search query based on a job title, location
:param int limit: number of elements to return. max value is 100
:param int offset: number of elements to skip while processing result
:param str destination: Filter indicating which postings to return: * **PUBLIC**: response will include ONLY public postings * **INTERNAL**: response will include ONLY internal postings * **INTERNAL_OR_PUBLIC**: response will include internal postings or public postings, but not both for a single job. If a job has both types of postings, only internal postings will be returned
:param str country: country code filter (part of the location object)
:param str region: region filter (part of the location object)
:param str city: city filter (part of the location object)
:param str department: department filter (department id)
:param list[str] language: Exceptions to the language code ISO format: * \"en-GB\" - \"English - English (UK)\" * \"fr-CA\" - \"French - français (Canada)\" * \"pt-BR\" - \"Portugal - português (Brasil)\" * \"pt-PT\" - \"Portugal - português (Portugal)\" * \"zh-HK\" - \"Chinese (Traditional) - 中文 (香港)\" * \"zh-CN\" - \"Chinese (Simplified) - 中文 (简体)\"
:return: PostingList
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['company_identifier', 'q', 'limit', 'offset', 'destination', 'country', 'region', 'city', 'department', 'language']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method v1_list_postings" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'company_identifier' is set
if ('company_identifier' not in params) or (params['company_identifier'] is None):
raise ValueError("Missing the required parameter `company_identifier` when calling `v1_list_postings`")
collection_formats = {}
resource_path = '/v1/companies/{companyIdentifier}/postings'.replace('{format}', 'json')
path_params = {}
if 'company_identifier' in params:
path_params['companyIdentifier'] = params['company_identifier']
query_params = {}
if 'q' in params:
query_params['q'] = params['q']
if 'limit' in params:
query_params['limit'] = params['limit']
if 'offset' in params:
query_params['offset'] = params['offset']
if 'destination' in params:
query_params['destination'] = params['destination']
if 'country' in params:
query_params['country'] = params['country']
if 'region' in params:
query_params['region'] = params['region']
if 'city' in params:
query_params['city'] = params['city']
if 'department' in params:
query_params['department'] = params['department']
if 'language' in params:
query_params['language'] = params['language']
collection_formats['language'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = []
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PostingList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 50.71464 | 388 | 0.617184 | 2,209 | 20,438 | 5.52603 | 0.125396 | 0.062669 | 0.019497 | 0.017695 | 0.876219 | 0.859753 | 0.85623 | 0.843369 | 0.828787 | 0.828787 | 0 | 0.002855 | 0.297436 | 20,438 | 402 | 389 | 50.840796 | 0.847273 | 0.434142 | 0 | 0.632653 | 0 | 0 | 0.196881 | 0.03444 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0 | 0.035714 | 0 | 0.122449 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
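The swagger-generated client in the row above repeats the same `**kwargs` validation loop in every `*_with_http_info` method: reject unexpected keyword arguments, then enforce the required ones. The core of that pattern can be isolated into a few lines; the function name `validate_kwargs` below is illustrative, not part of the library:

```python
def validate_kwargs(allowed, required, **kwargs):
    """Reject unexpected keyword arguments and enforce required ones,
    mirroring the parameter loop in v1_get_posting_with_http_info above."""
    for key in kwargs:
        if key not in allowed:
            raise TypeError(f"Got an unexpected keyword argument '{key}'")
    for name in required:
        if kwargs.get(name) is None:
            raise ValueError(f"Missing the required parameter `{name}`")
    return dict(kwargs)

params = validate_kwargs(
    {"company_identifier", "posting_id", "source_id"},
    ["company_identifier", "posting_id"],
    company_identifier="acme",
    posting_id="123",
)
print(params["posting_id"])  # 123
```

Generated clients inline this check rather than factoring it out so that each method's error message names the method itself; the sketch trades that for brevity.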
1e8ae610061a54d6f6bd433d333173f1d1966b15 | 45 | py | Python | server/app/budget/models/__init__.py | catvitalio/personal-budget | b4470115ebbfd185a8a781a2024787cbfe822639 | [
"MIT"
] | null | null | null | server/app/budget/models/__init__.py | catvitalio/personal-budget | b4470115ebbfd185a8a781a2024787cbfe822639 | [
"MIT"
] | null | null | null | server/app/budget/models/__init__.py | catvitalio/personal-budget | b4470115ebbfd185a8a781a2024787cbfe822639 | [
"MIT"
] | null | null | null | from .budget import *
from .signals import *
| 15 | 22 | 0.733333 | 6 | 45 | 5.5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177778 | 45 | 2 | 23 | 22.5 | 0.891892 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1ed77850e1e8a039da407c13103a9a493c679b8f | 96 | py | Python | venv/lib/python3.8/site-packages/pip/_vendor/chardet/enums.py | Retraces/UkraineBot | 3d5d7f8aaa58fa0cb8b98733b8808e5dfbdb8b71 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/pip/_vendor/chardet/enums.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/pip/_vendor/chardet/enums.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/02/29/b0/75bf5ab357492996853541f63a158854155de9990927f58ae6c358f1c5 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.510417 | 0 | 96 | 1 | 96 | 96 | 0.385417 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
949eafcedbc1f89900d25641fe35e55cf5b89341 | 27 | py | Python | src/models/cancer_ml.py | Tpool1/Aoede | e14317d1ff41a64586428240ec5f3156b381b503 | [
"MIT"
] | null | null | null | src/models/cancer_ml.py | Tpool1/Aoede | e14317d1ff41a64586428240ec5f3156b381b503 | [
"MIT"
] | null | null | null | src/models/cancer_ml.py | Tpool1/Aoede | e14317d1ff41a64586428240ec5f3156b381b503 | [
"MIT"
] | null | null | null |
def cancer_ml():
pass
| 6.75 | 16 | 0.592593 | 4 | 27 | 3.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.296296 | 27 | 3 | 17 | 9 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
94d99b79d2f16d592cb05e69e6cf2763fbf2267b | 370 | py | Python | abbreviation_client/Colors.py | riarheos/abbreviation_client | 73302daa2ae33c1bc81cbcfe1a1bd5c27655b92d | [
"0BSD"
] | 3 | 2021-01-27T09:50:48.000Z | 2021-07-11T23:59:26.000Z | abbreviation_client/Colors.py | riarheos/abbreviation_client | 73302daa2ae33c1bc81cbcfe1a1bd5c27655b92d | [
"0BSD"
] | null | null | null | abbreviation_client/Colors.py | riarheos/abbreviation_client | 73302daa2ae33c1bc81cbcfe1a1bd5c27655b92d | [
"0BSD"
] | null | null | null | from termcolor import colored
def green(text):
return colored(text, 'green', attrs=['bold'])
def yellow(text):
return colored(text, 'yellow', attrs=['bold'])
def blue(text):
return colored(text, 'blue', attrs=['bold'])
def red(text):
return colored(text, 'red', attrs=['bold'])
def gray(text):
return colored(text, 'white', attrs=['dark'])
| 16.818182 | 50 | 0.637838 | 49 | 370 | 4.816327 | 0.326531 | 0.211864 | 0.360169 | 0.444915 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172973 | 370 | 21 | 51 | 17.619048 | 0.771242 | 0 | 0 | 0 | 0 | 0 | 0.116216 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.454545 | false | 0 | 0.090909 | 0.454545 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
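`Colors.py` in the row above wraps `termcolor.colored` in named helpers. Where `termcolor` is not installed, the same effect can be approximated with raw ANSI SGR escape codes; the code table and attribute mapping below are assumptions about common terminal behavior, not part of the package:

```python
# Minimal ANSI stand-in for the termcolor-based helpers above.
_CODES = {"green": 32, "yellow": 33, "blue": 34, "red": 31, "white": 37}
_ATTRS = {"bold": 1, "dark": 2}  # SGR: 1 = bold, 2 = dim

def colored(text, color, attrs=()):
    parts = [str(_ATTRS[a]) for a in attrs] + [str(_CODES[color])]
    return f"\033[{';'.join(parts)}m{text}\033[0m"

def green(text):
    return colored(text, "green", attrs=["bold"])

def gray(text):
    return colored(text, "white", attrs=["dark"])

print(green("ok"))  # renders "ok" in bold green on ANSI terminals
```

The remaining helpers (`yellow`, `blue`, `red`) follow the same one-line pattern as `green`.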
a214b7c9247f5106a834cf7c560b04a021ba646e | 46 | py | Python | src/family_court/tracker/__init__.py | penseesface/client | f7522435b4de05175afd79c44cb5c384088a5e11 | [
"BSD-3-Clause"
] | null | null | null | src/family_court/tracker/__init__.py | penseesface/client | f7522435b4de05175afd79c44cb5c384088a5e11 | [
"BSD-3-Clause"
] | null | null | null | src/family_court/tracker/__init__.py | penseesface/client | f7522435b4de05175afd79c44cb5c384088a5e11 | [
"BSD-3-Clause"
] | null | null | null | from .iou_tracker import IoUTracker, iou_func
| 23 | 45 | 0.847826 | 7 | 46 | 5.285714 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108696 | 46 | 1 | 46 | 46 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a2201f144e0bde133019e94e1610d956647ddac3 | 16 | py | Python | Round #543 (Div 2 based on Technocup 2019 Final Round)/C.py | julianferres/Codeforces | ac80292a4d53b8078fc1a85e91db353c489555d9 | [
"MIT"
] | 4 | 2020-01-31T15:49:25.000Z | 2020-07-07T11:44:03.000Z | Round #543 (Div 2 based on Technocup 2019 Final Round)/C.py | julianferres/CodeForces | 14e8369e82a2403094183d6f7824201f681c9f65 | [
"MIT"
] | null | null | null | Round #543 (Div 2 based on Technocup 2019 Final Round)/C.py | julianferres/CodeForces | 14e8369e82a2403094183d6f7824201f681c9f65 | [
"MIT"
] | null | null | null | def C():
    pass
C()
| 2.666667 | 8 | 0.3125 | 3 | 16 | 1.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.375 | 16 | 5 | 9 | 3.2 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bf6ca7313cde8ca59bb015f63f49aa33e3c53218 | 151 | py | Python | clusterval/__init__.py | Nuno09/clusterval | 4844fd75a658b7ced6c78e4f79f0b308870f9adf | [
"BSD-2-Clause"
] | 3 | 2020-11-27T10:49:40.000Z | 2021-12-13T02:52:29.000Z | clusterval/__init__.py | Nuno09/clusterval | 4844fd75a658b7ced6c78e4f79f0b308870f9adf | [
"BSD-2-Clause"
] | null | null | null | clusterval/__init__.py | Nuno09/clusterval | 4844fd75a658b7ced6c78e4f79f0b308870f9adf | [
"BSD-2-Clause"
] | null | null | null | from clusterval.external import calculate_external
from clusterval.internal import calculate_internal
from clusterval.validation import Clusterval
| 18.875 | 50 | 0.874172 | 17 | 151 | 7.647059 | 0.411765 | 0.323077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10596 | 151 | 7 | 51 | 21.571429 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
44edfc7d42ff67573a7f6ddc8481e1c2f475fa46 | 13,963 | py | Python | Arbitrage_Future/Arbitrage_Future/test_future.py | ronaldzgithub/CryptoArbitrage | b4b7a12b7b11f3dcf950f9d2039dad4f1388530b | [
"MIT"
] | 1 | 2021-11-03T06:16:16.000Z | 2021-11-03T06:16:16.000Z | Arbitrage_Future/Arbitrage_Future/test_future.py | benno0810/CryptoArbitrage | b4b7a12b7b11f3dcf950f9d2039dad4f1388530b | [
"MIT"
] | null | null | null | Arbitrage_Future/Arbitrage_Future/test_future.py | benno0810/CryptoArbitrage | b4b7a12b7b11f3dcf950f9d2039dad4f1388530b | [
"MIT"
] | 2 | 2021-05-07T09:11:54.000Z | 2021-11-27T16:29:10.000Z | import OKCoin
import OKCoinFuture
import YunBi
import json
import threading
import Queue
import numpy
import time
def statis(a):
    print "number: %d mean: %f max: %f min: %f std : %f" % (a.shape[0], a.mean(), a.max(), a.min(), a.std())
numpy.set_printoptions(suppress=True)
history = open("log/historyPrice_%s.txt" % time.strftime('%Y_%m_%d_%H_%M_%S', time.localtime(time.time())), "a")
balance = open("log/balance%s.txt" % time.strftime('%Y_%m_%d %H_%M_%S', time.localtime(time.time())), 'a')
statuses = open("log/statuses.txt", 'a')
config = json.load(open("config_future.json", 'r'))
platform_list = {
    "current": {
        "OKCoin": OKCoin.OKCoinPlatform(config, "OKCoin")
        #"YunBi": YunBi.YunBiPlatform(config, "YunBi")
    },
    "future": {
        "OKCoinFuture": OKCoinFuture.OKCoinFuturePlatform(config, "OKCoinFuture")
    }
}
maxTransLimitation = int(config["MaxCoinTradeLimitation"])
tick = 0
btc_cny = ("BTC", "CNY")
btc_usd = ("BTC", "USD")
exchange_rate = platform_list["future"]["OKCoinFuture"].trade_accounts["OKCoinFuture"][btc_usd]["obj"].exchange_rate()
print exchange_rate
# status: 0:no long no short
# status: 1: long
# status: -1: short
status_json = json.load(open("status.json", "r"))
status = int(status_json["status"])
if status == 0:
    status_json["current_future"] = []
maxPosition = 20
gap = 10
current_sell_future_buy_thres = 1100
current_buy_future_sell_thres = 1100
close_current_sell_future_buy_thres = 600
close_current_buy_future_sell_thres = 600
current_sell_future_buy_diff = numpy.asarray([])
current_buy_future_sell_diff = numpy.asarray([])
money = 0
while True:
    time.sleep(2)
    tick += 1
    print "tick ", tick, " exchange rate ", exchange_rate
    for pltform_name, pltform in platform_list["current"].items():
        pltform.updateBalancePut()
        pltform.updateDepth({"BTC": 100 * maxTransLimitation / 4000.0})
        pltform.updateBalanceGet()
    for pltform_name, pltform in platform_list["future"].items():
        pltform.updateBalancePut()
        pltform.updateDepth({"BTC": 100 * maxTransLimitation / 4000.0})
        pltform.updateBalanceGet()
    if status <= 0 and status > -maxPosition:
        for key in platform_list["current"]["OKCoin"].trade_accounts.keys():
            print platform_list["current"]["OKCoin"].trade_accounts
            if platform_list["current"]["OKCoin"].accounts[key]["balances"]["CNY"] > maxTransLimitation * 100.0 * exchange_rate * 2:
                current_obj = platform_list["current"]["OKCoin"].trade_accounts[key][btc_cny]
                break
        future_obj = platform_list["future"]["OKCoinFuture"].trade_accounts["OKCoinFuture"][btc_usd]
        print "diff %f %f" % (future_obj["sell_price"] * exchange_rate - current_obj["buy_price"],
                              current_obj["sell_price"] - future_obj["buy_price"] * exchange_rate)
        if (future_obj["sell_price"] * exchange_rate - current_obj["buy_price"]) > current_buy_future_sell_thres + abs(
                status) * gap:
            current_obj["trade_account_input_queue"].put(("buy", maxTransLimitation * 100.0 * exchange_rate / current_obj["buy_price"], current_obj["buy_price"], -1))
            future_obj["trade_account_input_queue"].put(("short", maxTransLimitation, future_obj["sell_price"], -1))
            coin_remain_c, current_money = current_obj["trade_account_output_queue"].get()
            coin_remain_f, future_money = future_obj["trade_account_output_queue"].get()
            #coin_remain_f = 0
            #coin_remain_c = 0
            #current_money = -maxTransLimitation * 100.0 * exchange_rate
            #future_money = maxTransLimitation * 100
            print "current coin remain : %f, current_money : %f" % (coin_remain_c, current_money)
            print "future coin remain : %f, future_money : %f" % (coin_remain_f, future_money)
            status += -1
            status_tmp = {
                "current": {
                    "platform_name": "OKCoin",
                    "account_name": "OKCoin",
                    "type": "long",
                    "money": abs(current_money),
                    "amount": round(maxTransLimitation * 100.0 * exchange_rate / current_obj["buy_price"], 3),
                    "coin_remain_c": coin_remain_c,
                    "current_money": current_money
                },
                "future": {
                    "platform_name": "OKCoinFuture",
                    "account_name": "OKCoinFuture",
                    "type": "short",
                    "amount": maxTransLimitation,
                    "coin_remain_f": coin_remain_f,
                    "future_money": future_money
                }
            }
            money += future_money * exchange_rate + current_money
            status_json["current_future"].append(status_tmp)
            status_json["status"] = status
            status_json["time"] = time.strftime('%Y_%m_%d_%H_%M_%S', time.localtime(time.time()))
            print status_json
            fp = open("status.json", "w")
            json.dump(status_json, fp, indent=4)
            fp.flush()
            json.dump(status_json, statuses, indent=4)
    if status >= 0 and status < maxPosition:
        future_obj = platform_list["future"]["OKCoinFuture"].trade_accounts["OKCoinFuture"][btc_usd]
        for key in platform_list["current"]["OKCoin"].trade_accounts.keys():
            if platform_list["current"]["OKCoin"].accounts[key]["balances"]["BTC"] > round(maxTransLimitation * 100.0 / future_obj["buy_price"], 3) * 2:
                current_obj = platform_list["current"]["OKCoin"].trade_accounts[key][btc_cny]
                break
        print "diff %f %f" % (future_obj["sell_price"] * exchange_rate - current_obj["buy_price"],
                              current_obj["sell_price"] - future_obj["buy_price"] * exchange_rate)
        if (current_obj["sell_price"] - future_obj[
                "buy_price"] * exchange_rate) > current_sell_future_buy_thres + status * gap:
            status += 1
            current_obj["trade_account_input_queue"].put(("sell", round(maxTransLimitation * 100.0 / future_obj["buy_price"], 3), current_obj["sell_price"], -1))
            future_obj["trade_account_input_queue"].put(("long", maxTransLimitation, future_obj["buy_price"], -1))
            coin_remain_c, current_money = current_obj["trade_account_output_queue"].get()
            coin_remain_f, future_money = future_obj["trade_account_output_queue"].get()
            #coin_remain_f = 0
            #coin_remain_c = 0
            #current_money = round(maxTransLimitation * 100.0 / future_obj["buy_price"], 3) * current_obj["sell_price"]
            #future_money = -maxTransLimitation * 100
            print "current coin remain : %f, current_money : %f" % (coin_remain_c, current_money)
            print "future coin remain : %f, future_money : %f" % (coin_remain_f, future_money)
            status_tmp = {
                "current": {
                    "platform_name": "OKCoin",
                    "account_name": "OKCoin",
                    "type": "short",
                    "money": abs(current_money),
                    "amount": round(maxTransLimitation * 100.0 / future_obj["buy_price"], 3),
                    "coin_remain_c": coin_remain_c,
                    "current_money": current_money
                },
                "future": {
                    "platform_name": "OKCoinFuture",
                    "account_name": "OKCoinFuture",
                    "type": "long",
                    "amount": maxTransLimitation,
                    "coin_remain_f": coin_remain_f,
                    "future_money": future_money
                }
            }
            money += current_money + future_money * exchange_rate
            status_json["current_future"].append(status_tmp)
            status_json["status"] = status
            status_json["time"] = time.strftime('%Y_%m_%d_%H_%M_%S', time.localtime(time.time()))
            fp = open("status.json", "w")
            json.dump(status_json, fp, indent=4)
            fp.flush()
            json.dump(status_json, statuses, indent=4)
    if status < 0:
        current_platform_name = status_json["current_future"][0]["current"]["platform_name"]
        future_platform_name = status_json["current_future"][0]["future"]["platform_name"]
        current_account_name = status_json["current_future"][0]["current"]["account_name"]
        future_account_name = status_json["current_future"][0]["future"]["account_name"]
        for key in platform_list["current"]["OKCoin"].trade_accounts.keys():
            if platform_list["current"][current_platform_name].accounts[key]["balances"]["BTC"] > round(status_json["current_future"][0]["current"]["money"] / platform_list["current"][current_platform_name].trade_accounts[current_account_name][btc_cny]["sell_price"], 3) * 2:
                current_account_name = key
                break
        current_obj = platform_list["current"][current_platform_name].trade_accounts[current_account_name][btc_cny]
        future_obj = platform_list["future"][future_platform_name].trade_accounts[future_account_name][btc_usd]
        if (future_obj["buy_price"] * exchange_rate - current_obj["sell_price"]) < close_current_buy_future_sell_thres:
            status += 1
            sell_amount = round(status_json["current_future"][0]["current"]["money"] / current_obj["sell_price"], 3)
            if sell_amount < 0.01:
                sell_amount = 0.01
            current_obj["trade_account_input_queue"].put(("sell", sell_amount, current_obj["sell_price"], -1))
            future_obj["trade_account_input_queue"].put(("close_short", maxTransLimitation, future_obj["buy_price"], -1))
            coin_remain_c, current_money = current_obj["trade_account_output_queue"].get()
            coin_remain_f, future_money = future_obj["trade_account_output_queue"].get()
            #coin_remain_f = 0
            #coin_remain_c = 0
            #current_money = current_obj["sell_price"] * sell_amount
            #future_money = -maxTransLimitation * 100
            print "current coin remain : %f, current_money : %f" % (coin_remain_c, current_money)
            print "future coin remain : %f, future_money : %f" % (coin_remain_f, future_money)
            status_json["current_future"] = status_json["current_future"][1:]
            status_json["status"] = status
            money += future_money * exchange_rate + current_money
            status_json["time"] = time.strftime('%Y_%m_%d_%H_%M_%S', time.localtime(time.time()))
            fp = open("status.json", "w")
            json.dump(status_json, fp, indent=4)
            fp.flush()
            json.dump(status_json, statuses, indent=4)
    if status > 0:
        current_platform_name = status_json["current_future"][0]["current"]["platform_name"]
        future_platform_name = status_json["current_future"][0]["future"]["platform_name"]
        current_account_name = status_json["current_future"][0]["current"]["account_name"]
        future_account_name = status_json["current_future"][0]["future"]["account_name"]
        future_obj = platform_list["future"][future_platform_name].trade_accounts[future_account_name][btc_usd]
        for key in platform_list["current"]["OKCoin"].trade_accounts.keys():
            if platform_list["current"][current_platform_name].accounts[key]["balances"]["CNY"] > round(maxTransLimitation * 100.0 / future_obj["sell_price"] * platform_list["current"][current_platform_name].trade_accounts[current_account_name][btc_cny]["buy_price"], 3) * 2:
                current_account_name = key
                break
        current_obj = platform_list["current"][current_platform_name].trade_accounts[current_account_name][btc_cny]
        if (current_obj["buy_price"] - future_obj["sell_price"] * exchange_rate) < close_current_sell_future_buy_thres:
            status -= 1
            current_obj["trade_account_input_queue"].put(("buy", round(maxTransLimitation * 100.0 / future_obj["sell_price"], 3), current_obj["buy_price"], -1))
            future_obj["trade_account_input_queue"].put(("close_long", maxTransLimitation, future_obj["sell_price"], -1))
            coin_remain_c, current_money = current_obj["trade_account_output_queue"].get()
            coin_remain_f, future_money = future_obj["trade_account_output_queue"].get()
            #coin_remain_f = 0
            #coin_remain_c = 0
            #current_money = -current_obj["buy_price"] * maxCoinLimitation
            #future_money = future_obj["sell_price"] * maxCoinLimitation
            status_json["current_future"] = status_json["current_future"][1:]
            status_json["status"] = status
            money -= current_money + future_money * exchange_rate
            # money -= current_obj["buy_price"] - future_obj["sell_price"] * exchange_rate
            status_json["time"] = time.strftime('%Y_%m_%d_%H_%M_%S', time.localtime(time.time()))
            fp = open("status.json", "w")
            json.dump(status_json, fp, indent=4)
            fp.flush()
            json.dump(status_json, statuses, indent=4)
    current_buy_future_sell_diff = numpy.append(current_buy_future_sell_diff,
                                                (future_obj["sell_price"] * exchange_rate - current_obj["buy_price"]))
    current_sell_future_buy_diff = numpy.append(current_sell_future_buy_diff,
                                                (current_obj["sell_price"] - future_obj["buy_price"] * exchange_rate))
    if current_sell_future_buy_diff.shape[0] > 10000:
        current_sell_future_buy_diff = current_sell_future_buy_diff[-10000:]
    if current_buy_future_sell_diff.shape[0] > 10000:
        current_buy_future_sell_diff = current_buy_future_sell_diff[-10000:]
    statis(current_buy_future_sell_diff)
    statis(current_sell_future_buy_diff)
    print status_json["status"]
    #print json.dumps(status_json, indent=4)
    time.sleep(1)
    print "money", money
    print "\n"
| 55.40873 | 271 | 0.633818 | 1,672 | 13,963 | 4.934809 | 0.074761 | 0.052115 | 0.02933 | 0.047388 | 0.85626 | 0.812508 | 0.764635 | 0.739062 | 0.696643 | 0.642952 | 0 | 0.016699 | 0.232328 | 13,963 | 251 | 272 | 55.629482 | 0.753055 | 0.058225 | 0 | 0.525822 | 0 | 0 | 0.201478 | 0.034506 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.037559 | null | null | 0.079812 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
44f46578353ea4a0b5a415caf94d89890e374328 | 243 | py | Python | drnalpha/project_styleguide/templatetags/wagtailcore_tags.py | UKGovernmentBEIS/BRE_DigitalRegulationNavigator_Alpha | bfa6d08212bc18034b20b9c922a554a6e1ddd0f1 | [
"MIT"
] | 39 | 2020-02-24T08:58:22.000Z | 2022-03-25T08:48:29.000Z | drnalpha/project_styleguide/templatetags/wagtailcore_tags.py | UKGovernmentBEIS/BRE_DigitalRegulationNavigator_Alpha | bfa6d08212bc18034b20b9c922a554a6e1ddd0f1 | [
"MIT"
] | 6 | 2020-02-24T09:00:44.000Z | 2022-02-24T01:43:51.000Z | demo/core/templatetags/wagtailcore_tags.py | torchbox/storybook-django | f2bdde9d2edbcc975d64a247133a35344da76746 | [
"MIT"
] | 4 | 2020-02-26T10:07:33.000Z | 2022-01-13T09:56:43.000Z | from wagtail.core.templatetags.wagtailcore_tags import register
from pattern_library.monkey_utils import override_tag
override_tag(register, name="include_block")
override_tag(register, name="pageurl")
override_tag(register, name="slugurl")
| 30.375 | 63 | 0.843621 | 32 | 243 | 6.15625 | 0.59375 | 0.22335 | 0.28934 | 0.350254 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065844 | 243 | 7 | 64 | 34.714286 | 0.867841 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
44f9ced79ba63de4faff64c8f2c393cf94dc1622 | 68 | py | Python | tf_kit/lattice/core/__init__.py | lyssym/NER-toolkits | c6368c6fa33761cf6b82e4616cc6705aad052130 | [
"MIT"
] | 1 | 2019-06-18T07:22:57.000Z | 2019-06-18T07:22:57.000Z | tf_kit/lattice/core/__init__.py | lyssym/NER-toolkits | c6368c6fa33761cf6b82e4616cc6705aad052130 | [
"MIT"
] | null | null | null | tf_kit/lattice/core/__init__.py | lyssym/NER-toolkits | c6368c6fa33761cf6b82e4616cc6705aad052130 | [
"MIT"
] | 2 | 2019-06-18T07:22:58.000Z | 2019-11-28T07:49:34.000Z | # _*_ coding: utf-8 _*_
from . import cell
from . import embedding
| 13.6 | 23 | 0.691176 | 9 | 68 | 4.777778 | 0.777778 | 0.465116 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018519 | 0.205882 | 68 | 4 | 24 | 17 | 0.777778 | 0.308824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
780133d7fc5d7670d0542916a43583e96b58573b | 43 | py | Python | git_lfs/__init__.py | TouchSurgery/py-git-lfs | bac119874f6f10018a55360119bedc11254449b9 | [
"MIT"
] | null | null | null | git_lfs/__init__.py | TouchSurgery/py-git-lfs | bac119874f6f10018a55360119bedc11254449b9 | [
"MIT"
] | null | null | null | git_lfs/__init__.py | TouchSurgery/py-git-lfs | bac119874f6f10018a55360119bedc11254449b9 | [
"MIT"
] | null | null | null | from lfs import GitLFSServer, GitLFSObject
| 21.5 | 42 | 0.860465 | 5 | 43 | 7.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 43 | 1 | 43 | 43 | 0.973684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
783e4499a4a7aea4f70231077b89473825107e76 | 70 | py | Python | coffer/version.py | xeddmc/Coffer | dd50c6da7e4222aa860ac298aea74b04697f5fc5 | [
"MIT"
] | 90 | 2016-07-25T00:20:29.000Z | 2018-02-07T22:05:27.000Z | coffer/version.py | xeddmc/Coffer | dd50c6da7e4222aa860ac298aea74b04697f5fc5 | [
"MIT"
] | 3 | 2018-04-25T08:38:14.000Z | 2021-02-09T18:10:32.000Z | coffer/version.py | xeddmc/Coffer | dd50c6da7e4222aa860ac298aea74b04697f5fc5 | [
"MIT"
] | 6 | 2016-07-26T14:57:15.000Z | 2017-05-18T21:45:23.000Z | from coffer.utils import text
def version():
    print(text.version)
| 17.5 | 29 | 0.728571 | 10 | 70 | 5.1 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 70 | 3 | 30 | 23.333333 | 0.87931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
783ff60f9be975cbb91c53a6990f1fb9c9bacd38 | 28,428 | py | Python | cogs/train_commands.py | finneh4249/PTBot | adfc8693ced758e7918b3a7611367a57247baf47 | [
"MIT"
] | null | null | null | cogs/train_commands.py | finneh4249/PTBot | adfc8693ced758e7918b3a7611367a57247baf47 | [
"MIT"
] | 3 | 2021-01-04T15:02:02.000Z | 2022-01-20T09:28:13.000Z | cogs/train_commands.py | finneh4249/PTBot | adfc8693ced758e7918b3a7611367a57247baf47 | [
"MIT"
] | 1 | 2022-01-20T08:57:48.000Z | 2022-01-20T08:57:48.000Z | import discord
import os
import datetime
import json
import asyncio
import psycopg2
import pytz
from discord.ext import commands
from ptv.client import PTVClient
from classes.PTVFormatter import PTVFormatter
from enum import Enum
from dateutil.relativedelta import relativedelta
import sys
import traceback
from urllib.error import HTTPError
PTV = PTVFormatter(os.environ['PTV_DEV_ID'],os.environ['PTV_DEV_KEY'])
# PTVA = PTVApi(os.environ['PTV_DEV_ID'],os.environ['PTV_DEV_KEY'])
with open('./InfoFiles/stations.json', 'r') as f:
    stations = json.load(f)
with open('./InfoFiles/directions.json', 'r') as f:
    directionss = json.load(f)
with open('./InfoFiles/station-codes.json', 'r') as f:
    Station_Codes = json.load(f)
class RouteType(Enum):
    TRAIN = 0
    TRAM = 1
    BUS = 2
    VLINE = 3
    NIGHT_BUS = 4
class TrainCommands(commands.Cog):
def __init__(self,bot):
self.bot = bot
def _splicegen(self, maxchars, stringlist):
"""
Return a list of slices to print based on maxchars string-length boundary.
"""
runningcount = 0 # start at 0
tmpslice = [] # tmp list where we append slice numbers.
for item in stringlist:
runningcount += len(item)
if runningcount <= int(maxchars):
tmpslice.append(item)
else:
yield tmpslice
tmpslice = [item]
runningcount = len(item)
yield(tmpslice)
async def get_departures(self, ctx, route_type, station):
noinemoji = ["0️⃣", "1️⃣", "2️⃣", "3️⃣", "4️⃣", "5️⃣"]
await ctx.trigger_typing()
station = ' '.join([(word.upper() not in Station_Codes)*word or Station_Codes[word.upper()] for word in station.split(' ')])
nexttrains = PTV.search(station, route_types=route_type, include_addresses=False, include_outlets=False, match_stop_by_suburb=False, match_route_by_suburb=False)
i = 0
trainoptionsmessage = []
if len(nexttrains['stops']) > 1:
for i in range(0, min(len(nexttrains['stops']), 5)):
type_route = ['<:Metro:771920140207259659> (Metro)', '<:Tram:771921271998382140> (Tram)', '<:Bus:771921102335246346> (Bus)', '<:VLine:771920567959683102> (Vline)', '<:Bus:771921102335246346> (Night Bus)']
trainoptionsmessage.append(f"{noinemoji[i + 1]} {(type_route[nexttrains['stops'][i]['route_type']])} {nexttrains['stops'][i]['stop_name']} in {nexttrains['stops'][i]['stop_suburb']}")
trainoptionsmessage = await ctx.send("Which Station are you talking about?\n" + '\n'.join(trainoptionsmessage))
for i in range(0, min(len(nexttrains['stops']), 5)):
await trainoptionsmessage.add_reaction(noinemoji[i + 1])
def check(reaction, user):
return user == ctx.message.author and reaction.emoji in noinemoji and reaction.message.id == trainoptionsmessage.id
try:
reaction, user = await self.bot.wait_for('reaction_add', timeout=30.0, check=check)
except asyncio.TimeoutError:
await trainoptionsmessage.edit(content="Sorry, message timeout, please ask again")
if ctx.message.guild:
try:
await trainoptionsmessage.clear_reactions()
except discord.errors.Forbidden:
await ctx.send("This bot requires the `manage_messages` permission to work properly. This command will still work without it, but not as well.")
return
else:
await trainoptionsmessage.edit(content="Getting Information, Please wait")
if ctx.message.guild:
try:
await trainoptionsmessage.clear_reactions()
except discord.errors.Forbidden:
await ctx.send("This bot requires the `manage_messages` permission to work properly. This command will still work without it, but not as well.")
station = nexttrains['stops'][noinemoji.index(str(reaction)) - 1]
elif len(nexttrains['stops']) == 0:
await ctx.send('Sorry the station you specified was not found. Please make sure it is a station from Victoria, Australia as this bot only supports Victoria at this time.\nIf you believe this is an error, please dm me or lucky962')
return
else:
trainoptionsmessage = await ctx.send("Getting Information, Please wait")
station = nexttrains['stops'][0]
nexttrains = PTV.get_departures_from_stop(station['route_type'], station['stop_id'], max_results=3, expand=["all"])
embed = discord.Embed(title='Next Trains')
embed.set_author(name='VPT Bot', icon_url=self.bot.user.avatar_url)
nexttrains['directions'] = dict(sorted(nexttrains['directions'].items(), key=lambda p: int(p[0])))
for directions in nexttrains['directions'].values():
nexttrain = []
traincounter = 0
for train in nexttrains['departures']:
if train['direction_id'] == directions['direction_id'] and traincounter < 3:
traincounter += 1
train['scheduled_departure_utc'] = pytz.timezone("Etc/UTC").localize(datetime.datetime.strptime(train['scheduled_departure_utc'],'%Y-%m-%dT%H:%M:%SZ')).astimezone(pytz.timezone('Australia/Melbourne'))
try:
train['estimated_departure_utc'] = pytz.timezone("Etc/UTC").localize(datetime.datetime.strptime(train['estimated_departure_utc'],'%Y-%m-%dT%H:%M:%SZ')).astimezone(pytz.timezone('Australia/Melbourne'))
except TypeError:
pass
flags = ""
if "S_WCA" in train['flags']:
flags += ("<:WCA:780582086398181386> ")
if "S_VTR" in train['flags']:
flags += ("VLine Train")
if "S_VCH" in train['flags']:
flags += ("VLine Coach")
if train['scheduled_departure_utc'].date() == datetime.datetime.now().astimezone(pytz.timezone('Australia/Melbourne')).date():
nexttrain.append(f"{'Plat: ' + train['platform_number'] + '.' if train['platform_number'] else ''} {nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] if str(train['run_id']) in nexttrains['runs'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] else ''} \nTo: {nexttrains['runs'][str(train['run_id'])]['destination_name']}\nScheduled to leave at: {train['scheduled_departure_utc'].strftime('%I:%M%p')}. ETA: {(str(int(((train.get('estimated_departure_utc') - datetime.datetime.now().astimezone(pytz.timezone('Australia/Melbourne'))).total_seconds()/60))) + ' minutes' if type(train.get('estimated_departure_utc')) == datetime.datetime else train.get('estimated_departure_utc'))}\n" + ("Flags: " + flags + "\n" if flags else ""))
else:
nexttrain.append(f"{'Plat: ' + train['platform_number'] + '.' if train['platform_number'] else ''} {nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] if str(train['run_id']) in nexttrains['runs'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] else ''} \nTo: {nexttrains['runs'][str(train['run_id'])]['destination_name']}\nScheduled to leave at: {train['scheduled_departure_utc'].strftime('%I:%M%p %d %b')}. ETA: {(str(int(((train.get('estimated_departure_utc') - datetime.datetime.now().astimezone(pytz.timezone('Australia/Melbourne'))).total_seconds()/60))) + ' minutes' if type(train.get('estimated_departure_utc')) == datetime.datetime else train.get('estimated_departure_utc'))}\n" + ("Flags: " + flags + "\n" if flags else ""))
elif traincounter >= 3:
break
if len(nexttrain) > 0:
embed.add_field(name=directions['direction_name'] + (" (Route " + nexttrains['routes'][str(directions['route_id'])]['route_number'] + ")" if nexttrains['routes'][str(directions['route_id'])]['route_number'] else ""), value='\n'.join(nexttrain))
disruptionsmsng = discord.Embed(title='Potential Disruptions', description=f"Potential disruptions that might affect {station['stop_name']}", color=0x0072ce)
disruptionsmsng.set_author(name='VPT Bot', icon_url=self.bot.user.avatar_url)
if station['route_type'] == 0:
embed.title = "Next Trains"
embed.description = f"Trains departing from {station['stop_name']}.\nNote: Train Types are sourced from ptv however seem to be quite inaccurate in some cases. They may or may not be accurate."
embed.colour = 0x0072ce
disruptionsmsng.colour = 0x0072ce
TransportIcon = discord.File("./src/Icons/Metro.png", filename="Metro.png")
embed.set_thumbnail(url="attachment://Metro.png")
elif station['route_type'] == 1:
embed.title = "Next Trams"
embed.description = f"Trams departing from {station['stop_name']}"
embed.colour = 0x78be20
disruptionsmsng.colour = 0x78be20
TransportIcon = discord.File("./src/Icons/Tram.png", filename="Tram.png")
embed.set_thumbnail(url="attachment://Tram.png")
elif station['route_type'] == 2 or station['route_type'] == 4:
embed.title = "Next Buses"
embed.description = f"Buses departing from {station['stop_name']}"
embed.colour = 0xff8200
disruptionsmsng.colour = 0xff8200
TransportIcon = discord.File("./src/Icons/Bus.png", filename="Bus.png")
embed.set_thumbnail(url="attachment://Bus.png")
elif station['route_type'] == 3:
embed.title = "Next VLine Trains"
embed.description = f"VLine Trains departing from {station['stop_name']}"
embed.colour = 0x8f1a95
disruptionsmsng.colour = 0x8f1a95
TransportIcon = discord.File("./src/Icons/VLine.png", filename="VLine.png")
embed.set_thumbnail(url="attachment://VLine.png")
disruptions = nexttrains['disruptions']
listofdisruptions = [[]]
field = 0
length = 0
for disruption in disruptions.values():
if disruption['display_status'] == True:
if length + len(f"[{disruption['title']}]({disruption['url']})\n") > 1024:
listofdisruptions.append([])
field += 1
length = 0
length += len(f"[{disruption['title']}]({disruption['url']})\n")
listofdisruptions[field].append(f"[{disruption['title']}]({disruption['url']})")
for i in listofdisruptions:
disruptionsmsng.add_field(name="Potential Disruptions", value='\n'.join(i))
embed.set_footer(icon_url=self.bot.user.avatar_url, text='Source: Licensed from Public Transport Victoria under a Creative Commons Attribution 4.0 International Licence.')
disruptionsmsng.set_footer(icon_url=self.bot.user.avatar_url, text='Source: Licensed from Public Transport Victoria under a Creative Commons Attribution 4.0 International Licence.')
await ctx.send(content="", file=TransportIcon, embed=embed)
await trainoptionsmessage.delete()
if listofdisruptions[0]:
await ctx.send(embed=disruptionsmsng)
@commands.command(name='nexttrain', aliases=['nt', 'tn', 'tnext', 'nextt'])
async def do_nexttrain(self, ctx, *, station):
await self.get_departures(ctx, [RouteType.TRAIN.value, RouteType.VLINE.value], station)
@commands.command(name='nextbus', aliases=['nb', 'bn', 'bnext', 'nextb'])
async def do_nextbus(self, ctx, *, station):
await self.get_departures(ctx, [RouteType.BUS.value, RouteType.NIGHT_BUS.value], station)
@commands.command(name='nexttram', aliases=['ntr', 'trn', 'trnext', 'nexttr', 'nam', 'amn', 'amnext', 'nextam'])
async def do_nexttram(self, ctx, *, station):
await self.get_departures(ctx, [RouteType.TRAM.value], station)
@commands.command(name='nextvline', aliases=['vt', 'tv', 'vnext', 'nextv'])
async def do_nextvline(self, ctx, *, station):
await self.get_departures(ctx, [RouteType.VLINE.value], station)
@commands.command(name='next', aliases=['n'])
async def do_next(self, ctx, *, station):
await self.get_departures(ctx, [RouteType.TRAIN.value, RouteType.TRAM.value, RouteType.BUS.value, RouteType.VLINE.value, RouteType.NIGHT_BUS.value], station)
@do_nexttrain.error
@do_nextbus.error
@do_nexttram.error
@do_nextvline.error
@do_next.error
async def nexterror(self,ctx,error):
if isinstance(error, commands.MissingRequiredArgument):
await ctx.send('No station specified.')
elif isinstance(error, discord.errors.Forbidden):
await ctx.send("Sorry, I do not have enough permissions to complete your command.")
elif isinstance(error, HTTPError) and str(error.code).startswith("5"):
await ctx.send(f"Error: {error.code} (This is an error on PTV's side, please try again later.)")
else:
print('Ignoring exception in command {}:'.format(ctx.command), file=sys.stderr)
await ctx.send('An error has occured. This error has been reported and will be fixed as soon as possible.')
await self.bot.get_user(244596682531143680).send(f'ERROR\nIgnoring exception in command {ctx.command}\n Command Sent: {ctx.message.content}\n{type(error)}\n{error}\n{error.__traceback__}')
traceback.print_exception(type(error), error, error.__traceback__, file=sys.stderr)
@commands.command(name='dep')
async def do_dep(self, ctx, minutes, *, station):
noinemoji = ["0️⃣", "1️⃣", "2️⃣", "3️⃣", "4️⃣", "5️⃣"]
await ctx.trigger_typing()
nexttrains = PTV.search(station, route_types=[0, 1, 2, 3], include_addresses=False, include_outlets=False, match_stop_by_suburb=False, match_route_by_suburb=False)
i = 0
trainoptionsmessage = []
if len(nexttrains['stops']) > 1:
for i in range(0, min(len(nexttrains['stops']), 5)):
type_route = ['<:Metro:771920140207259659> (Metro)', '<:Tram:771921271998382140> (Tram)', '<:Bus:771921102335246346> (Bus)', '<:VLine:771920567959683102> (Vline)']
trainoptionsmessage.append(f"{noinemoji[i + 1]} {(type_route[nexttrains['stops'][i]['route_type']])} {nexttrains['stops'][i]['stop_name']} in {nexttrains['stops'][i]['stop_suburb']}")
trainoptionsmessage = await ctx.send("Which Station are you talking about?\n" + '\n'.join(trainoptionsmessage))
for i in range(0, min(len(nexttrains['stops']), 5)):
await trainoptionsmessage.add_reaction(noinemoji[i + 1])
def check(reaction, user):
return user == ctx.message.author and reaction.emoji in noinemoji and reaction.message.id == trainoptionsmessage.id
try:
reaction, user = await self.bot.wait_for('reaction_add', timeout=30.0, check=check)
except asyncio.TimeoutError:
await trainoptionsmessage.edit(content="Sorry, message timeout, please ask again")
if ctx.message.guild:
try:
await trainoptionsmessage.clear_reactions()
except discord.errors.Forbidden:
await ctx.send("This bot requires the `manage_messages` permission to work properly. This command will still work without it, but not as well.")
return
else:
await trainoptionsmessage.edit(content="Getting Information, Please wait")
if ctx.message.guild:
try:
await trainoptionsmessage.clear_reactions()
except discord.errors.Forbidden:
await ctx.send("This bot requires the `manage_messages` permission to work properly. This command will still work without it, but not as well.")
station = nexttrains['stops'][noinemoji.index(str(reaction)) - 1]
elif len(nexttrains['stops']) == 0:
await ctx.send('Sorry, the station you specified was not found. Please make sure it is a station in Victoria, Australia, as this bot only supports Victoria at this time.\nIf you believe this is an error, please DM me or lucky962')
return
else:
trainoptionsmessage = await ctx.send("Getting Information, Please wait")
station = nexttrains['stops'][0]
nexttrains = PTV.get_departures_from_stop(station['route_type'], station['stop_id'], date_utc=(datetime.datetime.now() + datetime.timedelta(minutes=int(minutes))), max_results=3, expand=["all"])
# DELETE IF WORKS WITHOUT THIS
# if nexttrains == "Error, station not found":
# await ctx.send("Sorry the station you specified was not found. Please make sure it is a station from Victoria, Australia as this bot only supports Victoria at this time.\nIf you believe this is an error, please dm me or lucky962")
# return
embed = discord.Embed(title='Next Trains')
embed.set_author(name='VPT Bot', icon_url=self.bot.user.avatar_url)
nexttrains['directions'] = dict(sorted(nexttrains['directions'].items(), key=lambda p: int(p[0])))
for directions in nexttrains['directions'].values():
nexttrain = []
traincounter = 0
for train in nexttrains['departures']:
if train['direction_id'] == directions['direction_id'] and traincounter < 3:
traincounter += 1
train['scheduled_departure_utc'] = pytz.timezone("Etc/UTC").localize(datetime.datetime.strptime(train['scheduled_departure_utc'],'%Y-%m-%dT%H:%M:%SZ')).astimezone(pytz.timezone('Australia/Melbourne'))
try:
train['estimated_departure_utc'] = pytz.timezone("Etc/UTC").localize(datetime.datetime.strptime(train['estimated_departure_utc'],'%Y-%m-%dT%H:%M:%SZ')).astimezone(pytz.timezone('Australia/Melbourne'))
except TypeError:
pass
if train['scheduled_departure_utc'].date() == datetime.datetime.now().astimezone(pytz.timezone('Australia/Melbourne')).date():
nexttrain.append(f"{'Plat: ' + train['platform_number'] + '.' if train['platform_number'] else ''} {nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] if str(train['run_id']) in nexttrains['runs'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] else ''} \nTo: {nexttrains['runs'][str(train['run_id'])]['destination_name']}\nScheduled to leave at: {train['scheduled_departure_utc'].strftime('%I:%M%p')}. ETA: {(str(int(((train.get('estimated_departure_utc') - datetime.datetime.now().astimezone(pytz.timezone('Australia/Melbourne'))).total_seconds()/60))) + ' minutes' if type(train.get('estimated_departure_utc')) == datetime.datetime else train.get('estimated_departure_utc'))}\n" + ("Flags: " + flags + "\n" if flags else ""))
else:
nexttrain.append(f"{'Plat: ' + train['platform_number'] + '.' if train['platform_number'] else ''} {nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] if str(train['run_id']) in nexttrains['runs'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor'] and nexttrains['runs'][str(train['run_id'])]['vehicle_descriptor']['description'] else ''} \nTo: {nexttrains['runs'][str(train['run_id'])]['destination_name']}\nScheduled to leave at: {train['scheduled_departure_utc'].strftime('%I:%M%p %d %b')}. ETA: {(str(int(((train.get('estimated_departure_utc') - datetime.datetime.now().astimezone(pytz.timezone('Australia/Melbourne'))).total_seconds()/60))) + ' minutes' if type(train.get('estimated_departure_utc')) == datetime.datetime else train.get('estimated_departure_utc'))}\n" + ("Flags: " + flags + "\n" if flags else ""))
elif traincounter >= 3:
break
if len(nexttrain) > 0:
embed.add_field(name=directions['direction_name'] + (" (Route " + nexttrains['routes'][str(directions['route_id'])]['route_number'] + ")" if nexttrains['routes'][str(directions['route_id'])]['route_number'] else ""), value='\n'.join(nexttrain))
disruptionsmsng = discord.Embed(title='Potential Disruptions', description=f"Potential disruptions that might affect {station['stop_name']}", color=0x0072ce)
disruptionsmsng.set_author(name='VPT Bot', icon_url=self.bot.user.avatar_url)
if station['route_type'] == 0:
embed.title = "Next Trains"
embed.description = f"Trains departing from {station['stop_name']}.\nNote: Train types are sourced from PTV, but they appear to be inaccurate in some cases."
embed.colour = 0x0072ce
disruptionsmsng.colour = 0x0072ce
TransportIcon = discord.File("./src/Icons/Metro.png", filename="Metro.png")
embed.set_thumbnail(url="attachment://Metro.png")
elif station['route_type'] == 1:
embed.title = "Next Trams"
embed.description = f"Trams departing from {station['stop_name']}"
embed.colour = 0x78be20
disruptionsmsng.colour = 0x78be20
TransportIcon = discord.File("./src/Icons/Tram.png", filename="Tram.png")
embed.set_thumbnail(url="attachment://Tram.png")
elif station['route_type'] == 2 or station['route_type'] == 4:
embed.title = "Next Buses"
embed.description = f"Buses departing from {station['stop_name']}"
embed.colour = 0xff8200
disruptionsmsng.colour = 0xff8200
TransportIcon = discord.File("./src/Icons/Bus.png", filename="Bus.png")
embed.set_thumbnail(url="attachment://Bus.png")
elif station['route_type'] == 3:
embed.title = "Next VLine Trains"
embed.description = f"VLine Trains departing from {station['stop_name']}"
embed.colour = 0x8f1a95
disruptionsmsng.colour = 0x8f1a95
TransportIcon = discord.File("./src/Icons/VLine.png", filename="VLine.png")
embed.set_thumbnail(url="attachment://VLine.png")
disruptions = nexttrains['disruptions']
listofdisruptions = [[]]
field = 0
length = 0
for disruption in disruptions.values():
if disruption['display_status'] == True:
if length + len(f"[{disruption['title']}]({disruption['url']})\n") > 1024:
listofdisruptions.append([])
# print(length)
field += 1
length = 0
length += len(f"[{disruption['title']}]({disruption['url']})\n")
listofdisruptions[field].append(f"[{disruption['title']}]({disruption['url']})")
for i in listofdisruptions:
# print(len('\n'.join(i)))
# print('\n'.join(i))
disruptionsmsng.add_field(name="Potential Disruptions", value='\n'.join(i))
embed.set_footer(icon_url=self.bot.user.avatar_url, text='Source: Licensed from Public Transport Victoria under a Creative Commons Attribution 4.0 International Licence.')
disruptionsmsng.set_footer(icon_url=self.bot.user.avatar_url, text='Source: Licensed from Public Transport Victoria under a Creative Commons Attribution 4.0 International Licence.')
# print(embed.fields)
await ctx.send(content="", file=TransportIcon, embed=embed)
await trainoptionsmessage.delete()
if listofdisruptions[0]:
await ctx.send(embed=disruptionsmsng)
@do_dep.error
async def dep_error(self, ctx, error):
if isinstance(error, commands.MissingRequiredArgument):
await ctx.send('No station specified.')
elif isinstance(error, discord.errors.Forbidden):
await ctx.send("Sorry, I do not have enough permissions to complete your command.")
elif isinstance(error, HTTPError) and str(error.code).startswith("5"):
await ctx.send(f"Error: {error.code} (This is an error on PTV's side, please try again later.)")
else:
print('Ignoring exception in command {}:'.format(ctx.command), file=sys.stderr)
await ctx.send('An error has occurred. This error has been reported and will be fixed as soon as possible.')
await self.bot.get_user(244596682531143680).send(f'ERROR\nIgnoring exception in command {ctx.command}\n Command Sent: {ctx.message.content}\n{type(error)}\n{error}\n{error.__traceback__}')
traceback.print_exception(type(error), error, error.__traceback__, file=sys.stderr)
@commands.command(name='setdisruptionschannel')
@commands.has_permissions(manage_channels=True)
async def do_disruptions(self, ctx, channel):
channel = channel.lstrip('<#').rstrip('>')
try:
int(channel)
except ValueError:
await ctx.send("Please mention a channel to set it as a disruptions channel.")
return
disruptionchannelslist = []
for j in range(1,18):
if j != 10:
messagerouteid = j
disruptionsmsg = discord.Embed(title="Disruptions", description=self.bot.routes[str(messagerouteid)]["route_name"], timestamp=datetime.datetime.now(), color=3447003)
PTV.disruptions_to_embed(disruptionsmsg, self.bot.disruptions[str(messagerouteid)], messagerouteid, self.bot)
disruptionsmsg.set_footer(icon_url=self.bot.user.avatar_url, text=f'Last Disruption Update ')
disruptionchannelslist.append((await self.bot.get_channel(int(channel)).send(embed=disruptionsmsg)).id)
disruptionchannelslist.append((await self.bot.get_channel(int(channel)).send(f"Last Checked for Disruptions at {(datetime.datetime.now().astimezone(pytz.timezone('Australia/Melbourne'))).strftime('%I:%M%p %d %b %Y')}\nThe side bar will be yellow if a Planned Work is currently active.\nSource: Licensed from Public Transport Victoria under a Creative Commons Attribution 4.0 International Licence.\nBe sure to join my discord server for official VPTBot support/feedback! https://discord.gg/KEhCS8U")).id)
DATABASE_URL = os.environ['DATABASE_URL']
conn = psycopg2.connect(DATABASE_URL, sslmode='require')
cur = conn.cursor()
cur.execute(f'DELETE FROM disruption_channels WHERE channel_id = {channel};')
cur.execute(f'INSERT INTO disruption_channels ("channel_id", "1", "2", "3", "4", "5", "6", "7", "8", "9", "11", "12", "13", "14", "15", "16", "17", "18") VALUES({channel}, {", ".join(str(v) for v in disruptionchannelslist)});')
conn.commit()
cur.close()
conn.close()
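The two `cur.execute` calls above build their SQL by f-string interpolation; `channel` has already been validated as an integer, so this works, but driver-side parameter binding is the safer habit. A minimal self-contained sketch of the same pattern (using sqlite3 here so it runs without a Postgres server; with psycopg2 the placeholder is `%s` instead of `?`):

```python
import sqlite3

# In-memory database so the sketch runs anywhere; the one-column table below
# is an illustrative stand-in, not the real disruption_channels schema
conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE disruption_channels (channel_id INTEGER)')
cur.execute('INSERT INTO disruption_channels VALUES (?)', (1234,))

# The driver binds the value itself, so nothing is interpolated into the SQL text
cur.execute('DELETE FROM disruption_channels WHERE channel_id = ?', (1234,))

cur.execute('SELECT COUNT(*) FROM disruption_channels')
remaining = cur.fetchone()[0]
conn.close()
```

The same placeholder style works for the INSERT above by passing the channel id and message ids as a parameter tuple.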
@do_disruptions.error
async def disruptionserror(self, ctx, error):
if isinstance(error, commands.MissingPermissions):
await ctx.send("Sorry, you are required to have the manage channels permission to run this command")
elif isinstance(error, commands.MissingRequiredArgument):
await ctx.send("Please specify a Channel")
else:
print('Ignoring exception in command {}:'.format(ctx.command), file=sys.stderr)
await ctx.send('An error has occurred. This error has been reported and will be fixed as soon as possible.')
await self.bot.get_user(244596682531143680).send(f'ERROR\nIgnoring exception in command {ctx.command}\n Command Sent: {ctx.message.content}\n{type(error)}\n{error}\n{error.__traceback__}')
traceback.print_exception(type(error), error, error.__traceback__, file=sys.stderr)
def setup(bot):
bot.add_cog(TrainCommands(bot))
# From lang/py/cookbook/v2/source/cb2_19_11_exm_1.py (MIT licence)
def logical_lines(physical_lines, joiner=''.join, separator=''):
    return joiner(physical_lines).replace('\\\n', separator).splitlines(True)
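As a quick illustration, `logical_lines` merges physical lines that end in a backslash continuation into single logical lines (the input below is made up):

```python
def logical_lines(physical_lines, joiner=''.join, separator=''):
    return joiner(physical_lines).replace('\\\n', separator).splitlines(True)

# A trailing backslash joins the first two physical lines into one logical line
physical = ["x = 1 + \\\n", "    2\n", "y = 3\n"]
logical = logical_lines(physical)
# -> ['x = 1 +     2\n', 'y = 3\n']
```

Passing `keepends=True` style results back (via `splitlines(True)`) preserves the trailing newlines, which is why each logical line still ends in `\n`.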
# HW_data_processing.py (from JuraK/Hotwire, Apache-2.0)
# -*- coding: utf-8 -*-
"""
Created on Wed Apr 29 15:47:01 2015
@author: jura
"""
from __future__ import division
import matplotlib.pyplot as plt
import HotWire_calc as hw
import os
import numpy as np
from multiprocessing import Pool
#######################x
U0=3.68 #[m/s] Uniform velocity
wire_temper=260 #[°C] Wire temperature
processor_no=4 #Number of processor cores on a PC
#######################x
prumery=[]
pozice_all=[]
cta=[]
rozmery=[]
#-------Gets all the files in all the subdirectories---------
directory='.\\Viric_data'
file_paths=[]
ax=[]
vir=[]
natoceni=[]
for root, directories, files in os.walk(directory):
for filename in files:
# Join the two strings in order to form the full filepath.
if '.txt' in filename and 'viric_' in root:
filepath = os.path.join(root, filename)
file_paths.append(filepath) # Add it to the list.
ax.append(root.split('\\')[-1]) #Get all the axial distances
vir.append(root.split('\\')[-2]) #Get all the swirler types
if '_' in filename:
natoceni.append(filename.split('_')[-1][:-4]) # Get all the rotation angles
#----------------------------------------------------------
#------Here you can restrict range of files to be processed--------------
file_paths=file_paths[:31]
ax=ax[:31]
vir=vir[:31]
natoceni=natoceni[:31]
#------------------------------------------------------------------------
#print file_paths
##---Gets all the data from all files------
#def read_cta(filenm):
# return hw.HotWireMeasurement(filenm,wire_temper)
#
#p=Pool(processes=processor_no)
#print len(file_paths)
#cta=p.map(read_cta,file_paths)
for f in file_paths:
#Load the data for file f into a HotWireMeasurement object and append it to the cta list
cta.append(hw.HotWireMeasurement(f,wire_temper=wire_temper))
for c in cta:
print c
peaks=c.fft_analysis()
#print 'Significant frequencies: ',peaks,'Hz'
##-----------------------------------------
#Eliminates repeates
ax=np.unique(ax)
vir=np.unique(vir)
natoceni=np.unique(natoceni)
#print natoceni
#swirl number
nazev_ind=[]
for v in vir:
for ax_pozice in ax:
if ax_pozice == '0,1D':
ax_pozice='01D'
elif ax_pozice == '1D':
ax_pozice='10D'
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind.append(n)
ccs=[cta[c] for c in nazev_ind ]
phi=[]
for nzv in nazev_ind:
phi.append(int(file_paths[nzv].split('_')[-1][:-4]))
S=hw.swirl_number(ccs, 150)
#phi=[int(c) for c in natoceni]
print file_paths[nazev_ind[-1]][:5]+': Swirl number:',S
if len(phi) == len(ccs):
V=hw.flow_rate(ccs,150,phi)
print file_paths[nazev_ind[-1]][:5]+': Flow rate:',V,"m3/s"
else:
print "Cannot calculate flow rate: phi list has different length from ccs list!"
nazev_ind=[]
for v in vir:
for ax_pozice in ax:
if ax_pozice == '0,1D':
ax_pozice='01D'
elif ax_pozice == '1D':
ax_pozice='10D'
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind.append(n)
# ccs=[cta[c] for c in nazev_ind ]
#U=[cta[c].U_mag for c in nazev_ind ]
U=cta[nazev_ind[0]].U_mag
if len(nazev_ind)>1:
for c in nazev_ind[1:]:
U=np.vstack((U,cta[c].U_mag))
fig, axis = plt.subplots(figsize=(5,5),subplot_kw=dict(projection='polar'))
phi=[]
for nzv in nazev_ind:
phi.append(int(file_paths[nzv].split('_')[-1][:-4]))
#phi=[int(c) for c in natoceni]
phi=np.radians(phi)
rad=150-cta[nazev_ind[-1]].x
phis=np.array(phi[:-2])
# rads=np.array(rad)
Us=np.copy(U[:-2])
for i in range(7):
phis=np.append(phis,phi[:-2]+np.pi*(i+1)/4)
Us=np.vstack((Us,U[:-2]))
phis=np.append(phis,phi[-1]+np.pi*7/4)
Us=np.vstack((Us,U[0]))
rad, phi=np.meshgrid(rad,phis)
CS=axis.contourf(phi,rad,Us)
plt.colorbar(CS)
#plt.rcParams.update({'font.size': 8})
plt.savefig(file_paths[nazev_ind[-1]][:-16]+'U_mag_2D.png',dpi=300,bbox_inches='tight')
nazev_ind=[]
plt.close()
nazev_ind=[]
for v in vir:
for ax_pozice in ax:
if ax_pozice == '0,1D':
ax_pozice='01D'
elif ax_pozice == '1D':
ax_pozice='10D'
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind.append(n)
# ccs=[cta[c] for c in nazev_ind ]
#U=[cta[c].U_aver for c in nazev_ind ]
U=cta[nazev_ind[0]].U_aver
if len(nazev_ind)>1:
for c in nazev_ind[1:]:
U=np.vstack((U,cta[c].U_aver))
fig, axis = plt.subplots(figsize=(5,5),subplot_kw=dict(projection='polar'))
phi=[]
for nzv in nazev_ind:
phi.append(int(file_paths[nzv].split('_')[-1][:-4]))
#phi=[int(c) for c in natoceni]
phi=np.radians(phi)
rad=150-cta[nazev_ind[-1]].x
phis=np.array(phi[:-2])
# rads=np.array(rad)
Us=np.copy(U[:-2])
for i in range(7):
phis=np.append(phis,phi[:-2]+np.pi*(i+1)/4)
Us=np.vstack((Us,U[:-2]))
phis=np.append(phis,phi[-1]+np.pi*7/4)
Us=np.vstack((Us,U[0]))
rad, phi=np.meshgrid(rad,phis)
CS=axis.contourf(phi,rad,Us)
plt.colorbar(CS)
#plt.rcParams.update({'font.size': 8})
plt.savefig(file_paths[nazev_ind[-1]][:-16]+'U_aver_2D.png',dpi=300,bbox_inches='tight')
nazev_ind=[]
plt.close()
nazev_ind=[]
for v in vir:
for ax_pozice in ax:
if ax_pozice == '0,1D':
ax_pozice='01D'
elif ax_pozice == '1D':
ax_pozice='10D'
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind.append(n)
# ccs=[cta[c] for c in nazev_ind ]
#U=[cta[c].U_mag for c in nazev_ind ]
U=cta[nazev_ind[0]].V_aver
if len(nazev_ind)>1:
for c in nazev_ind[1:]:
U=np.vstack((U,cta[c].V_aver))
fig, axis = plt.subplots(figsize=(5,5),subplot_kw=dict(projection='polar'))
phi=[]
for nzv in nazev_ind:
phi.append(int(file_paths[nzv].split('_')[-1][:-4]))
#phi=[int(c) for c in natoceni]
phi=np.radians(phi)
rad=150-cta[nazev_ind[-1]].x
phis=np.array(phi[:-2])
# rads=np.array(rad)
Us=np.copy(U[:-2])
for i in range(7):
phis=np.append(phis,phi[:-2]+np.pi*(i+1)/4)
Us=np.vstack((Us,U[:-2]))
phis=np.append(phis,phi[-1]+np.pi*7/4)
Us=np.vstack((Us,U[0]))
rad, phi=np.meshgrid(rad,phis)
CS=axis.contourf(phi,rad,Us)
plt.colorbar(CS)
#plt.rcParams.update({'font.size': 8})
plt.savefig(file_paths[nazev_ind[-1]][:-16]+'V_aver_2D.png',dpi=300,bbox_inches='tight')
nazev_ind=[]
plt.close()
nazev_ind=[]
for v in vir:
for ax_pozice in ax:
if ax_pozice == '0,1D':
ax_pozice='01D'
elif ax_pozice == '1D':
ax_pozice='10D'
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind.append(n)
# ccs=[cta[c] for c in nazev_ind ]
#U=[cta[c].U_aver for c in nazev_ind ]
if len(nazev_ind)>1:
U=cta[nazev_ind[0]].U_aver
V=cta[nazev_ind[0]].V_aver
kt=0.5*((U**2+V**2)**0.5)
for c in nazev_ind[1:]:
U=cta[c].U_aver
V=cta[c].V_aver
kt=np.vstack((kt,0.5*((U**2+V**2)**0.5)))
fig, axis = plt.subplots(figsize=(5,5),subplot_kw=dict(projection='polar'))
phi=[]
for nzv in nazev_ind:
phi.append(int(file_paths[nzv].split('_')[-1][:-4]))
#phi=[int(c) for c in natoceni]
phi=np.radians(phi)
rad=150-cta[nazev_ind[-1]].x
phis=np.array(phi[:-2])
# rads=np.array(rad)
kts=np.copy(kt[:-2])
for i in range(7):
phis=np.append(phis,phi[:-2]+np.pi*(i+1)/4)
kts=np.vstack((kts,kt[:-2]))
phis=np.append(phis,phi[-1]+np.pi*7/4)
kts=np.vstack((kts,kt[0]))
rad, phi=np.meshgrid(rad,phis)
CS=axis.contourf(phi,rad,kts)
plt.colorbar(CS)
#plt.rcParams.update({'font.size': 8})
plt.savefig(file_paths[nazev_ind[-1]][:-16]+'kt_2D.png',dpi=300,bbox_inches='tight')
nazev_ind=[]
plt.close()
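Each of the four contour blocks above repeats the same tiling step: the measured 45° sector is replicated around the full circle before `meshgrid`/`contourf`. A hedged sketch of a shared helper performing just that tiling (`mirror_sector` is an illustrative name, not part of `HotWire_calc`):

```python
import numpy as np

def mirror_sector(phi, values):
    """Tile one 45-degree sector of polar data around the full circle.

    phi    -- 1D array of measured angles (radians); the last two entries are
              the repeats dropped by the [:-2] slices in the blocks above
    values -- 2D array with one row of radial data per angle
    Returns (phis, vals) spanning 2*pi, mirroring the inline code above.
    """
    phis = np.array(phi[:-2])
    vals = np.copy(values[:-2])
    for i in range(7):
        phis = np.append(phis, phi[:-2] + np.pi * (i + 1) / 4)
        vals = np.vstack((vals, values[:-2]))
    phis = np.append(phis, phi[-1] + np.pi * 7 / 4)
    vals = np.vstack((vals, values[0]))
    return phis, vals

# Shapes only, with dummy data: 4 angles x 3 radial points in, 17 x 3 out
phis, vals = mirror_sector(np.linspace(0, np.pi / 4, 4),
                           np.arange(12.).reshape(4, 3))
```

With such a helper, each block would reduce to building `U`, calling `mirror_sector`, and plotting, removing four near-identical copies of the tiling loop.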
for v in vir:
for ax_pozice in ax:
if ax_pozice == '0,1D':
ax_pozice='01D'
elif ax_pozice == '1D':
ax_pozice='10D'
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind.append(n)
ccs=[cta[c] for c in nazev_ind ]
S=hw.swirl_number(ccs, 150)
print file_paths[nazev_ind[-1]][:5]+': Swirl number:',S
nazev_ind=[]
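`hw.swirl_number` is implemented in `HotWire_calc`, so its exact formula is not visible here. A common textbook definition, S = ∫ U W r² dr / (R ∫ U² r dr) for constant density, can be sketched numerically as below (this is an assumption about the definition, not a copy of `HotWire_calc`):

```python
import numpy as np

def _trapz(y, x):
    # Local trapezoidal rule, to stay independent of NumPy version details
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def swirl_number_sketch(r, U, W, R):
    # S = axial flux of angular momentum / (R * axial momentum flux),
    # assuming constant density across the traverse
    return _trapz(U * W * r ** 2, r) / (R * _trapz(U ** 2 * r, r))

# Sanity check: solid-body rotation W = r/R under uniform axial flow gives S = 0.5
r = np.linspace(0.0, 1.0, 1001)
S_demo = swirl_number_sketch(r, np.ones_like(r), r, 1.0)
```

Here `U` would be the axial and `W` the tangential velocity profile over the traverse radius, with R = 150 mm matching the second argument passed to `hw.swirl_number` above.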
#-------Define here which rotation angles will be plotted in the graphs------
natoceni=['00','30', '40']
#---------------------------------------------------------
#print ax, vir, natoceni
nazev_ind=0
#Composite plot of U_mag across rotation angles and axial distances
for v in vir:
fig, axes = plt.subplots(5, 1, figsize=(5,15),sharex=True, sharey=True)
for xi,ax_pozice in enumerate(ax):
xi=4-xi
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind=n
r_D=(150-cta[n].x)/300.
U_U0=cta[n].U_mag/U0
axes[xi].plot(r_D,U_U0,label=nat, marker='+')
if xi==4:
axes[xi].set_xlabel('r/D [-]',fontsize=12)
axes[xi].set_ylim([0,5.5])
axes[xi].legend(loc=2)
axes[xi].grid()
axes[xi].set_ylabel('U_mag/U0 [-]',fontsize=8)
text='x/D='+ax_pozice
yticks = [0,1,2,3,4,5]
axes[xi].text(0.15, 5, text,fontsize=8,bbox={'facecolor':'white', 'alpha':0.5, 'pad':10})
fig.subplots_adjust(hspace=0)
plt.setp([a.set_yticklabels(yticks) for a in fig.axes[:-2]], visible=True)
plt.rcParams.update({'font.size': 8})
print 'Writing file: '+file_paths[nazev_ind][:-21]+'U_mag.png' #('/').join([v,ax_pozice,'U_mag.png'])
plt.savefig(file_paths[nazev_ind][:-21]+'U_mag.png',dpi=500,bbox_inches='tight')
plt.close()
#Composite plot of U_aver across rotation angles and axial distances
for v in vir:
fig, axes = plt.subplots(5, 1, figsize=(5, 15),sharex=True, sharey=True)
for xi,ax_pozice in enumerate(ax):
xi=4-xi
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind=n
r_D=(150-cta[n].x)/300.
U_U0=cta[n].U_aver/U0
axes[xi].plot(r_D,U_U0,label=nat, marker='+')
if xi==4:
axes[xi].set_xlabel('r/D [-]',fontsize=12)
axes[xi].set_ylim([0,5])
axes[xi].legend(loc=2)
axes[xi].grid()
axes[xi].set_ylabel('U/U0 [-]',fontsize=8)
text='x/D='+ax_pozice
yticks = [0,1,2,3,4]
axes[xi].text(0.15, 4.45, text,fontsize=8,bbox={'facecolor':'white', 'alpha':0.5, 'pad':10})
fig.subplots_adjust(hspace=0)
plt.setp([a.set_yticklabels(yticks) for a in fig.axes[:-2]], visible=True)
plt.rcParams.update({'font.size': 8})
print 'Writing file: '+file_paths[nazev_ind][:-21]+'U_aver.png' #('/').join([v,ax_pozice,'U_mag.png'])
plt.savefig(file_paths[nazev_ind][:-21]+'U_aver.png',dpi=500,bbox_inches='tight')
plt.close()
#Composite plot of V_aver across rotation angles and axial distances
for v in vir:
fig, axes = plt.subplots(5, 1, figsize=(5, 15),sharex=True, sharey=True)
for xi,ax_pozice in enumerate(ax):
xi=4-xi
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind=n
r_D=(150-cta[n].x)/300.
V_U0=cta[n].V_aver/U0
axes[xi].plot(r_D,V_U0,label=nat, marker='+')
if xi==4:
axes[xi].set_xlabel('r/D [-]',fontsize=12)
axes[xi].set_ylim([-1,3])
axes[xi].legend(loc=2)
axes[xi].grid()
axes[xi].set_ylabel('V/U0 [-]',fontsize=8)
text='x/D='+ax_pozice
yticks = [-1,-0.5,0,0.5,1,1.5,2,2.5]
axes[xi].text(0.13, 2.5, text,fontsize=8,bbox={'facecolor':'white', 'alpha':0.5, 'pad':10})
fig.subplots_adjust(hspace=0)
plt.setp([a.set_yticklabels(yticks) for a in fig.axes[:-2]], visible=True)
plt.rcParams.update({'font.size': 8})
print 'Writing file: '+file_paths[nazev_ind][:-21]+'V_aver.png' #('/').join([v,ax_pozice,'U_mag.png'])
plt.savefig(file_paths[nazev_ind][:-21]+'V_aver.png',dpi=500,bbox_inches='tight')
plt.close()
#Composite plot of the flow angle across rotation angles and axial distances
for v in vir:
fig, axes = plt.subplots(5, 1, figsize=(5, 15),sharex=True, sharey=True)
for xi,ax_pozice in enumerate(ax):
xi=4-xi
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind=n
r_D=(150-cta[n].x)/300.
Angle=cta[n].phi
axes[xi].plot(r_D,Angle,label=nat, marker='+')
if xi==4:
axes[xi].set_xlabel('r/D [-]',fontsize=12)
axes[xi].set_ylim([-45,45])
axes[xi].legend(loc=3)
axes[xi].grid()
axes[xi].set_ylabel('Angle [deg]',fontsize=8)
text='x/D='+ax_pozice
yticks = [-45,-40,-20,0,20, 40]
axes[xi].text(0.02, 35, text,fontsize=8,bbox={'facecolor':'white', 'alpha':0.5, 'pad':10})
fig.subplots_adjust(hspace=0)
plt.setp([a.set_yticklabels(yticks) for a in fig.axes[:-2]], visible=True)
plt.rcParams.update({'font.size': 8})
print 'Writing file: '+file_paths[nazev_ind][:-21]+'Angle.png' #('/').join([v,ax_pozice,'U_mag.png'])
plt.savefig(file_paths[nazev_ind][:-21]+'Angle.png',dpi=500,bbox_inches='tight')
plt.close()
#Composite plot of the turbulent kinetic energy k_T across rotation angles and axial distances
for v in vir:
fig, axes = plt.subplots(5, 1, figsize=(5, 15),sharex=True, sharey=True)
for xi,ax_pozice in enumerate(ax):
xi=4-xi
for nat in natoceni:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind=n
r_D=(150-cta[n].x)/300.
k_T=(0.5*((cta[n].U_RMS**2+cta[n].V_RMS**2)**0.5))
axes[xi].plot(r_D,k_T,label=nat, marker='+')
if xi==4:
axes[xi].set_xlabel('r/D [-]',fontsize=12)
axes[xi].set_ylim([0,3])
axes[xi].legend(loc=2)
axes[xi].grid()
axes[xi].set_ylabel('k_T [m2/s2]',fontsize=8)
text='x/D='+ax_pozice
yticks = [0,0.5,1,1.5,2,2.5]
axes[xi].text(0.13,2.5, text,fontsize=8,bbox={'facecolor':'white', 'alpha':0.5, 'pad':10})
fig.subplots_adjust(hspace=0)
plt.setp([a.set_yticklabels(yticks) for a in fig.axes[:-2]], visible=True)
plt.rcParams.update({'font.size': 8})
print 'Writing file: '+file_paths[nazev_ind][:-21]+'k_T.png' #('/').join([v,ax_pozice,'U_mag.png'])
plt.savefig(file_paths[nazev_ind][:-21]+'k_T.png',dpi=500,bbox_inches='tight')
plt.close()
#plots for individual rotation angles only
#for v in vir:
#for ax_pozice in ax:
# if ax_pozice == '0,1D':
# ax_pozice='01D'
# elif ax_pozice == '1D':
# ax_pozice='10D'
#for nat in natoceni:
#txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
#n=[n for n,s in enumerate(file_paths) if txt in s]
#if len(n) > 0:
#n=n[0]
# nazev_ind=n
# r_D=(150-cta[n].x)/300.
# U_U0=cta[n].U_aver/U0
# plt.plot(r_D,U_U0,label=nat, marker='+')
#plt.xlabel('r/D [-]',fontsize=16)
#plt.ylabel('U/U0 [-]',fontsize=16)
#plt.legend(loc=2)
#plt.grid()
#plt.rcParams.update({'font.size': 12})
#plt.savefig(file_paths[nazev_ind][:-16]+'U_aver.png',dpi=500,bbox_inches='tight')
#print 'Writing file: '+file_paths[nazev_ind][:-16]+'U_aver.png' #('/').join([v,ax_pozice,'U_aver.png'])
#plt.close()
#for nat in natoceni:
# txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
# n=[n for n,s in enumerate(file_paths) if txt in s]
# if len(n) > 0:
# n=n[0]
# nazev_ind=n
# r_D=(150-cta[n].x)/300.
# V_U0=cta[n].V_aver/U0
# plt.plot(r_D,V_U0,label=nat, marker='+')
#plt.xlabel('r/D [-]',fontsize=16)
#plt.ylabel('V/U0 [-]',fontsize=16)
#plt.legend(loc=2)
#plt.grid()
#plt.rcParams.update({'font.size': 12})
#plt.savefig(file_paths[nazev_ind][:-16]+'V_aver.png',dpi=500,bbox_inches='tight')
#print 'Writing file: '+file_paths[nazev_ind][:-16]+'V_aver.png' #('/').join([v,ax_pozice,'V_aver.png'])
#plt.close()
#for nat in natoceni:
# txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
# n=[n for n,s in enumerate(file_paths) if txt in s]
# if len(n) > 0:
# n=n[0]
# nazev_ind=n
# r=(150-cta[n].x)
# Ek=(0.5*((cta[n].U_RMS**2+cta[n].U_RMS**2)**0.5))
# plt.plot(r,Ek,label=nat, marker='+')
#plt.xlabel('r [mm]',fontsize=16)
#plt.ylabel('Ek [m2/s2]',fontsize=16)
#plt.xlim([0,150.2])
#plt.legend(loc=2)
#plt.grid()
#plt.rcParams.update({'font.size': 12})
#plt.savefig(file_paths[nazev_ind][:-16]+'Ek.png',dpi=500,)
#print 'Writing file: '+file_paths[nazev_ind][:-16]+'Ek.png' #('/').join([v,ax_pozice,'V_aver.png'])
#plt.close()
#for nat in natoceni:
# txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
# n=[n for n,s in enumerate(file_paths) if txt in s]
# if len(n) > 0:
# n=n[0]
# nazev_ind=n
# r=(150-cta[n].x)
# phi=cta[n].phi
# plt.plot(r,phi,label=nat,marker='+')
#plt.xlabel('r [mm]',fontsize=16)
#plt.ylabel('angle [deg]',fontsize=16)
#plt.xlim([0,150.2])
#plt.legend(loc=2)
#plt.grid()
#plt.rcParams.update({'font.size': 12})
#plt.savefig(file_paths[nazev_ind][:-16]+'angle.png',dpi=500,)
#print 'Writing file: '+file_paths[nazev_ind][:-16]+'angle.png' #('/').join([v,ax_pozice,'V_aver.png'])
#plt.close()
##------------------------Plotting U_mag vs x/D graphs--------------------
natoceni=['00','30','40']
nn=[]
x=40 #Traverse point (radius) at which to plot the axial dependence
ap=np.array([])
un=np.array([])
phi=np.array([])
Ek1=np.array([])
Ek2=np.array([])
for v in vir:
for nat in natoceni:
for ax_pozice in ax:
if ax_pozice == '0,1D':
txt=('D\\'+v[6:]+'_'+'01'+'_'+nat+'.txt').replace(',','')
elif ax_pozice == '1D':
txt=('D\\'+v[6:]+'_'+'10'+'_'+nat+'.txt').replace(',','')
else:
txt=('D\\'+v[6:]+'_'+ax_pozice[:-1]+'_'+nat+'.txt').replace(',','')
n=[n for n,s in enumerate(file_paths) if txt in s]
if len(n) > 0:
n=n[0]
nazev_ind=n
ui=np.where(cta[n].x==x)
if len(ui)>0:
un=np.append(un,cta[n].U_mag[ui])
ap=np.append(ap,float(ax_pozice[:-1].replace(',','.')))
un_i=np.argsort(ap)
un=un[un_i]
ap=ap[un_i]
U_U0=un/U0
plt.plot(ap,U_U0,label=nat,marker='+')
ap=np.array([]) #TODO doesn't this erase the previous plot?
un=np.array([])
plt.xlabel('x/D [-]',fontsize=16)
plt.ylabel('U_mag/U0 [-]',fontsize=16)
plt.ylim([0,4])
plt.legend(loc=0)
text='r ='+str(150-x)
plt.text(0.2,3.5, text, fontsize=8,bbox={'facecolor':'white', 'alpha':0.5, 'pad':10})
plt.grid()
plt.rcParams.update({'font.size': 12})
plt.savefig(file_paths[nazev_ind][:26]+'U_mag_ax.png',dpi=500,bbox_inches='tight')
print 'Writing file: '+file_paths[nazev_ind][:26]+'U_mag_ax.png' #('/').join([v,ax_pozice,'U_mag_ax.png'])
plt.close()
for v in vir:
    for nat in natoceni:
        for ax_pozice in ax:
            if ax_pozice == '0,1D':
                txt = ('D\\' + v[6:] + '_' + '01' + '_' + nat + '.txt').replace(',', '')
            elif ax_pozice == '1D':
                txt = ('D\\' + v[6:] + '_' + '10' + '_' + nat + '.txt').replace(',', '')
            else:
                txt = ('D\\' + v[6:] + '_' + ax_pozice[:-1] + '_' + nat + '.txt').replace(',', '')
            n = [n for n, s in enumerate(file_paths) if txt in s]
            if len(n) > 0:
                n = n[0]
                nazev_ind = n
                ui = np.where(cta[n].x == x)
                if len(ui[0]) > 0:  # np.where returns a tuple of index arrays
                    phi = np.append(phi, cta[n].phi[ui])
                    ap = np.append(ap, float(ax_pozice[:-1].replace(',', '.')))
        phi_i = np.argsort(ap)
        phi = phi[phi_i]
        ap = ap[phi_i]
        plt.plot(ap, phi, label=nat, marker='+')
        ap = np.array([])  # TODO: won't this erase the previous graph?
        phi = np.array([])
    plt.xlabel('x/D [-]', fontsize=16)
    plt.ylabel('angle [deg]', fontsize=16)
    plt.ylim([-45, 45])
    plt.legend(loc=0)
    text = 'r =' + str(150 - x)
    plt.text(0.2, 35, text, fontsize=8, bbox={'facecolor': 'white', 'alpha': 0.5, 'pad': 10})
    plt.grid()
    plt.rcParams.update({'font.size': 12})
    plt.savefig(file_paths[nazev_ind][:26] + 'angle_x.png', dpi=500, bbox_inches='tight')
    print('Writing file: ' + file_paths[nazev_ind][:26] + 'angle_x.png')
    plt.close()
for v in vir:
    for nat in natoceni:
        for ax_pozice in ax:
            if ax_pozice == '0,1D':
                txt = ('D\\' + v[6:] + '_' + '01' + '_' + nat + '.txt').replace(',', '')
            elif ax_pozice == '1D':
                txt = ('D\\' + v[6:] + '_' + '10' + '_' + nat + '.txt').replace(',', '')
            else:
                txt = ('D\\' + v[6:] + '_' + ax_pozice[:-1] + '_' + nat + '.txt').replace(',', '')
            n = [n for n, s in enumerate(file_paths) if txt in s]
            if len(n) > 0:
                n = n[0]
                nazev_ind = n
                ui = np.where(cta[n].x == x)
                if len(ui[0]) > 0:  # np.where returns a tuple of index arrays
                    Ek1 = np.append(Ek1, cta[n].U_RMS[ui])
                    Ek2 = np.append(Ek2, cta[n].V_RMS[ui])
                    ap = np.append(ap, float(ax_pozice[:-1].replace(',', '.')))
        ap_i = np.argsort(ap)
        Ek1 = Ek1[ap_i]
        Ek2 = Ek2[ap_i]
        ap = ap[ap_i]
        Ek = 0.5 * ((Ek1 ** 2 + Ek2 ** 2) ** 0.5)
        plt.plot(ap, Ek, label=nat, marker='+')
        ap = np.array([])  # TODO: won't this erase the previous graph?
        Ek1 = np.array([])
        Ek2 = np.array([])
    plt.xlabel('x/D [-]', fontsize=16)
    plt.ylabel('k_T [m2/s2]', fontsize=16)
    plt.ylim([0, 3])
    plt.legend(loc=0)
    text = 'r =' + str(150 - x)
    plt.text(0.2, 2.7, text, fontsize=8, bbox={'facecolor': 'white', 'alpha': 0.5, 'pad': 10})
    plt.grid()
    plt.rcParams.update({'font.size': 12})
    plt.savefig(file_paths[nazev_ind][:26] + 'k_T-x.png', dpi=500, bbox_inches='tight')
    print('Writing file: ' + file_paths[nazev_ind][:26] + 'k_T-x.png')
    plt.close()
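The `Ek` expression above combines the two RMS components as 0.5 * sqrt(Ek1^2 + Ek2^2); a quick numeric check of that formula with made-up RMS values:

```python
# Made-up RMS values for a single measurement point
Ek1 = 3.0
Ek2 = 4.0

# Same expression as used in the loop above
Ek = 0.5 * ((Ek1 ** 2 + Ek2 ** 2) ** 0.5)
```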
## ----------------------------------------------------------------
def get_sizes(filename):
    '''
    Extracts the dimensions from the file name.
    '''
    if '.txt' in filename:
        fnm = os.path.basename(filename)
        rozmery = fnm[:-4].split('_')
        return rozmery
    elif 'viric_' in filename:
        dirnm = os.path.dirname(filename).split('\\')
        m = [n for n, s in enumerate(dirnm) if 'viric_' in s][0]
        vir = dirnm[m][6:-4]
        lopatky = dirnm[m][-2:-1]
        ax = dirnm[m + 1][:-2]
        if ax == '0,1':
            ax = '01'
        elif ax == '1':
            ax = '10'
        return [vir, lopatky, ax, '']
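A quick illustration of the `.txt` branch of `get_sizes`, using a hypothetical measurement file name (the parsing step is repeated here so the snippet is self-contained):

```python
import os


def parse_txt_name(filename):
    # Mirrors the '.txt' branch of get_sizes above:
    # strip the 4-character extension and split on underscores
    fnm = os.path.basename(filename)
    return fnm[:-4].split('_')


# Hypothetical file name following the D<vir>_<blades>_<ax>_<rotation> pattern
parts = parse_txt_name('data/D280_55_01_30.txt')
```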
# print lst
#print 'D/'+v[6:]+'_'+a[-2]+'_'
#n=[n for n,s in enumerate(file_paths) if '01_45.txt' in s]
#print n
#lst=[file_paths[i] for i in n]
#print lst
#for v in vir:
# virice=[s for s in file_paths if v in s]
# for a in ax:
# ax_pozice=[s for s in virice if '/'+a+'/' in s]
# for n in natoceni:
# nn=[s for s in ax_pozice if n+'.txt' in s]
# if len(nn) > 0:
# nazvy.append(nn[0])
# st=[v,'/'+a+'/',n+'.txt']
# print nazvy
# nazvy=[]
# print
#print nazvy
# print st
# if any(x in str for x in file_paths):
# soubor.append(file_paths)
#for fp in file_paths:
# if all(x in fp for x in st):
# print fp
#if any("45.txt" in s for s in file_paths):
# print "Je tam"
#
#matching = [s for s in file_paths if vir[0] in s]
#print matching
#matching = [fp for fp in file_paths if all(x in fp for x in st)]
#print st
#print matching
#
#matching = [fp for fp in file_paths if "280_55/" in fp]
##print matching
#print os.path.dirname(matching[0])
#file_paths=file_paths[0:1]
#print file_paths
#
#for file_in in file_paths:
# print file_in
# rozmery.append(os.path.basename(file_in)[:-4].split('_'))
#
# cta.append(hw.HotWireMeasurement(file_in)) # Load the data for file_in into a HotWireMeasurement instance and append it to the cta list
# print cta[-1] # Print the information about the given measurement
#
#for c,r in zip(cta,rozmery):
#
# D_v,uhel_lopatek,dist,natoceni=r
#
# plt.grid()
## plt.axis([-0.05,0.5,1,4.5]) #0.35,1.8
# plt.plot( c.x ,c.U_aver,label=natoceni) #prumer/U0
# r=1-c.x
# plt.xlabel('r [mm]')
# plt.ylabel('U [m/s]')
# plt.legend()
#
## plt.rcParams.update({'font.size': 22})
# plt.savefig(file_in[:-4]+'_U.png',bbox_inches='tight')
# plt.close()
#
#for c,r in zip(cta,rozmery):
#
# D_v,uhel_lopatek,dist,natoceni=r
#
# plt.ylabel('V [m/s]')
# plt.plot( c.x ,c.V_aver) #prumer/U0
## plt.rcParams.update({'font.size': 22})
# plt.savefig(file_in[:-4]+'_V.png',bbox_inches='tight')
# print "Ukládám obrázek:",file_in[:-4]+".png"
# plt.close()
| 31.476023 | 145 | 0.53805 | 4,360 | 26,912 | 3.202752 | 0.076376 | 0.05328 | 0.03108 | 0.03774 | 0.790318 | 0.773489 | 0.760384 | 0.74821 | 0.738685 | 0.729734 | 0 | 0.04334 | 0.241231 | 26,912 | 854 | 146 | 31.512881 | 0.640451 | 0.259661 | 0 | 0.698603 | 0 | 0 | 0.067325 | 0 | 0 | 0 | 0 | 0.001171 | 0 | 0 | null | null | 0 | 0.011976 | null | null | 0.025948 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
158ec66035194a9d5a3ff0125c8d3d98a6a7dc34 | 228 | py | Python | src/default/__init__.py | Nkzono99/emses_inp_generator | ca9ea33619f1425840a4ba14500705e7199f2356 | [
"MIT"
] | null | null | null | src/default/__init__.py | Nkzono99/emses_inp_generator | ca9ea33619f1425840a4ba14500705e7199f2356 | [
"MIT"
] | null | null | null | src/default/__init__.py | Nkzono99/emses_inp_generator | ca9ea33619f1425840a4ba14500705e7199f2356 | [
"MIT"
] | null | null | null | from default.gui import WindowCreator
from default.loader import create_default_loader
from default.saver import create_default_saver
from default.unit_conversion import create_conversion_window, to_emses_unit, to_physical_unit
| 45.6 | 93 | 0.894737 | 33 | 228 | 5.848485 | 0.424242 | 0.227979 | 0.196891 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 228 | 4 | 94 | 57 | 0.919048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
15912fa5b1b277b103cedef84576beaf92190b3d | 3,906 | py | Python | p8_test/test_local/test_eta5_execution/test_before_name_etaw.py | crazynayan/tpf1 | c81a15d88d4d1f3ed2cf043c90782a4b8509ef14 | [
"MIT"
] | 1 | 2020-01-27T10:10:40.000Z | 2020-01-27T10:10:40.000Z | p8_test/test_local/test_eta5_execution/test_before_name_etaw.py | crazynayan/tpf1 | c81a15d88d4d1f3ed2cf043c90782a4b8509ef14 | [
"MIT"
] | 4 | 2019-08-23T05:24:23.000Z | 2021-09-16T10:05:55.000Z | p8_test/test_local/test_eta5_execution/test_before_name_etaw.py | crazynayan/tpf1 | c81a15d88d4d1f3ed2cf043c90782a4b8509ef14 | [
"MIT"
] | null | null | null | import json
from config import config
from p8_test.test_local.test_eta5_execution import NameGeneral, fqtv_gld
class BeforeNameETAW(NameGeneral):
    def setUp(self) -> None:
        super().setUp()
        self.test_data.add_pnr_element(["1ZAVERI"], "name")

    def test_HFX_BA_TKV_SRT5(self):
        self.test_data.set_field("WA0ET6", bytes([self.wa0hfx]))
        self.test_data.partition = "VA"
        self.test_data.set_field("WA0US3", bytes([self.wa0tkv]))
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.IGR1_END, self.output.last_line)
        self.assertNotEqual("E2D9E3F5", test_data.get_field("EBX004"))  # SRT5

    def test_HFX_AA_TKV_SRT5(self):
        self.test_data.set_field("WA0ET6", bytes([self.wa0hfx]))
        self.test_data.partition = "AA"
        self.test_data.set_field("WA0US3", bytes([self.wa0tkv]))
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.IGR1_END, self.output.last_line)
        self.assertEqual("E2D9E3F5", test_data.get_field("EBX004"))  # SRT5

    def test_ASC_ITN_fqtv_ETAS(self):
        self.test_data.set_field("WA0ASC", bytes([0x01]))
        self.test_data.set_field("WA0ET2", bytes([self.wa0itn]))
        self.test_data.add_pnr_field_data(fqtv_gld, "fqtv", config.AAAPNR)
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.FMSG_END, self.output.last_line)
        # self.assertIn("INVLD ITIN", self.output.messages)
        self.assertEqual("C5E3C1E2", test_data.get_field("EBX008"))  # ETAS

    def test_ASC_FTN_ETAS(self):
        self.test_data.set_field("WA0ASC", bytes([0x01]))
        self.test_data.set_field("WA0XX3", bytes([self.wa0ftn]))
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.FMSG_END, self.output.last_line)
        # self.assertIn("INVLD ITIN", self.output.messages)
        self.assertNotEqual("C5E3C1E2", test_data.get_field("EBX008"))  # ETAS

    def test_ASC_fqtv_ETAS(self):
        self.test_data.set_field("WA0ASC", bytes([0x01]))
        self.test_data.add_pnr_field_data(fqtv_gld, "fqtv", config.AAAPNR)
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.FMSG_END, self.output.last_line, f"{self.output.last_node}---{self.output.dumps}")
        # self.assertIn("INVLD ITIN", self.output.messages)
        self.assertNotEqual("C5E3C1E2", test_data.get_field("EBX008"))  # ETAS

    def test_FTD_ETK2(self):
        self.test_data.set_field("WA0XX3", bytes([self.wa0ftd]))
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.FMSG_END, self.output.last_line, json.dumps(self.output.debug))
        self.assertEqual(13, self.output.regs["R6"])
        self.assertIn("VERIFY FREQUENT TRAVELER INFORMATION FOR CHANGED NAMES", self.output.messages)

    def test_AFU_subs_ETGN(self):
        self.test_data.set_field("WA0USE", bytes([self.wa0afu]))
        self.test_data.add_pnr_element(["TEST"], "subs_card_seg")
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.IGR1_END, self.output.last_line)
        self.assertEqual("C5E3C7D5", test_data.get_field("EBX012"))  # ETGN

    def test_AFU_ETGN(self):
        self.test_data.set_field("WA0USE", bytes([self.wa0afu]))
        test_data = self.tpf_server.run("ETA5", self.test_data)
        self.output = test_data.output
        self.assertEqual(self.IGR1_END, self.output.last_line, self.output.last_node)
        self.assertNotEqual("C5E3C7D5", test_data.get_field("EBX012"))  # ETGN
| 48.222222 | 112 | 0.685356 | 545 | 3,906 | 4.658716 | 0.176147 | 0.154391 | 0.122883 | 0.070894 | 0.821189 | 0.805435 | 0.784167 | 0.757385 | 0.737692 | 0.705396 | 0 | 0.033935 | 0.177675 | 3,906 | 80 | 113 | 48.825 | 0.756538 | 0.047107 | 0 | 0.523077 | 0 | 0 | 0.092428 | 0.012126 | 0 | 0 | 0.003234 | 0 | 0.261538 | 1 | 0.138462 | false | 0 | 0.046154 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
15afad7f4fccac1f9b9e8cb268da2dc7e9b0d53a | 42 | py | Python | Python/myprojects/views/__init__.py | iatanasov77/my-web-projects-guis | 833899dc99ed952ee813c49f0d5852da498d93eb | [
"MIT"
] | null | null | null | Python/myprojects/views/__init__.py | iatanasov77/my-web-projects-guis | 833899dc99ed952ee813c49f0d5852da498d93eb | [
"MIT"
] | null | null | null | Python/myprojects/views/__init__.py | iatanasov77/my-web-projects-guis | 833899dc99ed952ee813c49f0d5852da498d93eb | [
"MIT"
] | null | null | null | from .posts import *
from .polls import *
| 14 | 20 | 0.714286 | 6 | 42 | 5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 42 | 2 | 21 | 21 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
15b2f134da4e5e4600055497408da758a27881c7 | 120 | py | Python | sparse/bcoo/blinalg.py | zhcui/sparse | 00aa0c2561d3850b9afdf03d833d20662df8cd5b | [
"BSD-3-Clause"
] | 2 | 2018-06-12T18:48:48.000Z | 2018-07-01T18:09:33.000Z | sparse/bcoo/blinalg.py | zhcui/sparse | 00aa0c2561d3850b9afdf03d833d20662df8cd5b | [
"BSD-3-Clause"
] | null | null | null | sparse/bcoo/blinalg.py | zhcui/sparse | 00aa0c2561d3850b9afdf03d833d20662df8cd5b | [
"BSD-3-Clause"
] | 2 | 2018-06-11T20:52:16.000Z | 2018-06-12T18:48:59.000Z | from .bcore import BCOO
import numpy as np
def norm(x):
    # Euclidean norm of the BCOO array's stored data
    return np.linalg.norm(x.data)
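The intent of `norm` here is NumPy's Euclidean norm of the array's stored `.data`, so it can be exercised with a stand-in object (not a real BCOO):

```python
import numpy as np


class FakeBlockArray:
    """Stand-in exposing the .data attribute a BCOO array provides."""

    def __init__(self, data):
        self.data = data


x = FakeBlockArray(np.array([3.0, 4.0]))
result = float(np.linalg.norm(x.data))  # Euclidean norm of the stored data
```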
| 15 | 36 | 0.741667 | 21 | 120 | 4.238095 | 0.619048 | 0.247191 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 120 | 7 | 37 | 17.142857 | 0.89 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.6 | 0.2 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
15c0b932e92a431154f70ec7b02371bbd08f0ff3 | 100 | py | Python | aim/pytorch_lightning.py | admariner/aim | 4c143ea40acf3531abfa69f66503428d73d9fedc | [
"Apache-2.0"
] | 1 | 2021-07-19T19:21:30.000Z | 2021-07-19T19:21:30.000Z | aim/pytorch_lightning.py | admariner/aim | 4c143ea40acf3531abfa69f66503428d73d9fedc | [
"Apache-2.0"
] | 2 | 2021-08-25T16:17:16.000Z | 2022-02-10T05:49:55.000Z | aim/pytorch_lightning.py | admariner/aim | 4c143ea40acf3531abfa69f66503428d73d9fedc | [
"Apache-2.0"
] | 1 | 2021-01-29T02:10:14.000Z | 2021-01-29T02:10:14.000Z | # Alias to SDK PyTorch Lightning interface
from aim.sdk.adapters.pytorch_lightning import AimLogger
| 33.333333 | 56 | 0.85 | 14 | 100 | 6 | 0.785714 | 0.380952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11 | 100 | 2 | 57 | 50 | 0.94382 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ec9303b4de8aebbc39453c489748aa7abf422565 | 84 | py | Python | plugins/fmri_qa/api.py | bsavelev/medipy | f0da3750a6979750d5f4c96aedc89ad5ae74545f | [
"CECILL-B"
] | null | null | null | plugins/fmri_qa/api.py | bsavelev/medipy | f0da3750a6979750d5f4c96aedc89ad5ae74545f | [
"CECILL-B"
] | null | null | null | plugins/fmri_qa/api.py | bsavelev/medipy | f0da3750a6979750d5f4c96aedc89ad5ae74545f | [
"CECILL-B"
] | 1 | 2022-03-04T05:47:08.000Z | 2022-03-04T05:47:08.000Z | from gui import CrossSectionalPanel as cross_sectional
from gui import longitudinal
| 28 | 54 | 0.880952 | 11 | 84 | 6.636364 | 0.727273 | 0.191781 | 0.356164 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119048 | 84 | 2 | 55 | 42 | 0.986486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
eccbf23bdf59709dd1367a92543b3c77c21bc9e3 | 2,078 | py | Python | python-training/crawler.py | peace860226/python-practice | a85412a0ca24e82f95c6d94f5fe04c0c77d3e3f7 | [
"MIT"
] | null | null | null | python-training/crawler.py | peace860226/python-practice | a85412a0ca24e82f95c6d94f5fe04c0c77d3e3f7 | [
"MIT"
] | null | null | null | python-training/crawler.py | peace860226/python-practice | a85412a0ca24e82f95c6d94f5fe04c0c77d3e3f7 | [
"MIT"
] | null | null | null | # Fetch the raw HTML source
# import urllib.request as req
# url="https://www.ptt.cc/bbs/movie/index.html"
# request=req.Request(url, headers={
# "User-Agent":"Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.190 Mobile Safari/537.36"
# })
# with req.urlopen(request) as response:
# data=response.read().decode("utf-8")
# print(data)
# Parse the data
# import urllib.request as req
# url="https://www.ptt.cc/bbs/movie/index.html"
# request=req.Request(url, headers={
# "User-Agent":"Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.190 Mobile Safari/537.36"
# })
# with req.urlopen(request) as response:
# data=response.read().decode("utf-8")
# import bs4
# root=bs4.BeautifulSoup(data, "html.parser")
# #print(root.title)
# print(root.title.string)
# import urllib.request as req
# url="https://www.ptt.cc/bbs/movie/index.html"
# request=req.Request(url, headers={
# "User-Agent":"Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.190 Mobile Safari/537.36"
# })
# with req.urlopen(request) as response:
# data=response.read().decode("utf-8")
# import bs4
# root=bs4.BeautifulSoup(data, "html.parser")
# titles=root.find("div", class_="title")  # find the div tag with class="title"
# #print(titles)
# print(titles.a.string)
import urllib.request as req

url = "https://www.ptt.cc/bbs/movie/index.html"
# Build a Request object and attach the Request Headers info
request = req.Request(url, headers={
    "User-Agent": "Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.190 Mobile Safari/537.36"
})
with req.urlopen(request) as response:
    data = response.read().decode("utf-8")

# Parse the page source.
import bs4
root = bs4.BeautifulSoup(data, "html.parser")  # use BeautifulSoup to help parse the HTML document
titles = root.find_all("div", class_="title")  # find the div tags with class="title"
for title in titles:
    if title.a is not None:
        print(title.a.string)
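The `find_all` lookup above relies on BeautifulSoup; the same title extraction can be sketched with only the standard-library `html.parser` (a simplified version that ignores nested divs):

```python
from html.parser import HTMLParser


class TitleLinkParser(HTMLParser):
    """Collects the text of <a> tags that sit inside <div class="title">."""

    def __init__(self):
        super().__init__()
        self.in_title_div = False
        self.in_link = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div" and attrs.get("class") == "title":
            self.in_title_div = True
        elif tag == "a" and self.in_title_div:
            self.in_link = True

    def handle_endtag(self, tag):
        if tag == "div":
            self.in_title_div = False
        elif tag == "a":
            self.in_link = False

    def handle_data(self, data):
        if self.in_link:
            self.titles.append(data)


# Tiny made-up fragment mimicking the board's markup
html = '<div class="title"><a href="/p/1">Movie A</a></div><div class="meta">x</div>'
parser = TitleLinkParser()
parser.feed(html)
```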
| 38.481481 | 157 | 0.68768 | 319 | 2,078 | 4.470219 | 0.231975 | 0.050491 | 0.053296 | 0.058906 | 0.852034 | 0.852034 | 0.808555 | 0.808555 | 0.778401 | 0.778401 | 0 | 0.066106 | 0.141001 | 2,078 | 53 | 158 | 39.207547 | 0.732773 | 0.682387 | 0 | 0 | 0 | 0.076923 | 0.367491 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.153846 | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
ece7a5d3ec5462edf6f6f52761a629575000d9cf | 407 | py | Python | terrascript/signalfx/d.py | jackluo923/python-terrascript | ed4b626e6d28621ea1b02fc16f7277a094d89830 | [
"BSD-2-Clause"
] | 4 | 2022-02-07T21:08:14.000Z | 2022-03-03T04:41:28.000Z | terrascript/signalfx/d.py | jackluo923/python-terrascript | ed4b626e6d28621ea1b02fc16f7277a094d89830 | [
"BSD-2-Clause"
] | null | null | null | terrascript/signalfx/d.py | jackluo923/python-terrascript | ed4b626e6d28621ea1b02fc16f7277a094d89830 | [
"BSD-2-Clause"
] | 2 | 2022-02-06T01:49:42.000Z | 2022-02-08T14:15:00.000Z | # terrascript/signalfx/d.py
import terrascript
class signalfx_aws_services(terrascript.Data):
pass
class signalfx_azure_services(terrascript.Data):
pass
class signalfx_gcp_services(terrascript.Data):
pass
class signalfx_dimension_values(terrascript.Data):
pass
class signalfx_pagerduty_integration(terrascript.Data):
pass
class signalfx_data_link(terrascript.Data):
pass
| 15.074074 | 55 | 0.788698 | 48 | 407 | 6.4375 | 0.354167 | 0.252427 | 0.368932 | 0.38835 | 0.595469 | 0.38835 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142506 | 407 | 26 | 56 | 15.653846 | 0.885387 | 0.061425 | 0 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.461538 | 0.076923 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
01c6273423d9a1e68e7bf6ed19c67228727d38ee | 29 | py | Python | pystatsbomb/__init__.py | jeremyelster/PyStatsBomb | 584e4c22197b62ee064d429cf21d9abd773e2993 | [
"BSD-2-Clause"
] | null | null | null | pystatsbomb/__init__.py | jeremyelster/PyStatsBomb | 584e4c22197b62ee064d429cf21d9abd773e2993 | [
"BSD-2-Clause"
] | null | null | null | pystatsbomb/__init__.py | jeremyelster/PyStatsBomb | 584e4c22197b62ee064d429cf21d9abd773e2993 | [
"BSD-2-Clause"
] | 1 | 2019-03-22T03:58:16.000Z | 2019-03-22T03:58:16.000Z | from .sbclient import Client
| 14.5 | 28 | 0.827586 | 4 | 29 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bf27e177cd9b9ae3f0013a99a26f8e486ed6b4e0 | 96 | py | Python | venv/lib/python3.8/site-packages/poetry/core/packages/constraints/empty_constraint.py | GiulianaPola/select_repeats | 17a0d053d4f874e42cf654dd142168c2ec8fbd11 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/poetry/core/packages/constraints/empty_constraint.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/poetry/core/packages/constraints/empty_constraint.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/98/0a/c3/4a24f31722e9cecf93d6cd5039f15206ddfb4b02f09c6f873f026958a3 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.427083 | 0 | 96 | 1 | 96 | 96 | 0.46875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bf28d7b3502c03ba7600da1f31d704e77845a46d | 33 | py | Python | 3.GeoLRU/lru/__init__.py | Kelechukwu/kelechukwu_nwosu_test | 8c850129bc03c2bc28af150b46da74b05a592061 | [
"MIT"
] | null | null | null | 3.GeoLRU/lru/__init__.py | Kelechukwu/kelechukwu_nwosu_test | 8c850129bc03c2bc28af150b46da74b05a592061 | [
"MIT"
] | null | null | null | 3.GeoLRU/lru/__init__.py | Kelechukwu/kelechukwu_nwosu_test | 8c850129bc03c2bc28af150b46da74b05a592061 | [
"MIT"
] | null | null | null | from lru.LRUCache import LRUCache | 33 | 33 | 0.878788 | 5 | 33 | 5.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 33 | 1 | 33 | 33 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
174f8e08207dbc35237955c2adaaaaab1df89f15 | 105 | py | Python | public/cantusdata/signals/__init__.py | jacobsanz97/cantus | 37d139ae20972c36d4abb96a2a5ac5106b0c1b47 | [
"MIT"
] | 12 | 2015-01-08T14:34:55.000Z | 2021-06-03T06:53:04.000Z | public/cantusdata/signals/__init__.py | jacobsanz97/cantus | 37d139ae20972c36d4abb96a2a5ac5106b0c1b47 | [
"MIT"
] | 303 | 2015-01-14T17:10:32.000Z | 2022-02-14T20:27:21.000Z | public/cantusdata/signals/__init__.py | jacobsanz97/cantus | 37d139ae20972c36d4abb96a2a5ac5106b0c1b47 | [
"MIT"
] | 2 | 2019-10-07T21:21:27.000Z | 2019-10-20T16:58:22.000Z | from cantusdata.signals import solr_sync as _solr_sync
solr_synchronizer = _solr_sync.solr_synchronizer
| 26.25 | 54 | 0.87619 | 15 | 105 | 5.666667 | 0.533333 | 0.282353 | 0.282353 | 0.564706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 105 | 3 | 55 | 35 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
1799bb33c17bf0fc670eaf396a79b2dac3c978fd | 77 | py | Python | taotao-cloud-python/taotao-cloud-oldboy/day100-tornado/day100/uimethods.py | shuigedeng/taotao-cloud-paren | 3d281b919490f7cbee4520211e2eee5da7387564 | [
"Apache-2.0"
] | 47 | 2021-04-13T10:32:13.000Z | 2022-03-31T10:30:30.000Z | taotao-cloud-python/taotao-cloud-oldboy/day100-tornado/day100/uimethods.py | shuigedeng/taotao-cloud-paren | 3d281b919490f7cbee4520211e2eee5da7387564 | [
"Apache-2.0"
] | 1 | 2021-11-01T07:41:04.000Z | 2021-11-01T07:41:10.000Z | taotao-cloud-python/taotao-cloud-oldboy/day100-tornado/day100/uimethods.py | shuigedeng/taotao-cloud-paren | 3d281b919490f7cbee4520211e2eee5da7387564 | [
"Apache-2.0"
] | 21 | 2021-04-13T10:32:17.000Z | 2022-03-26T07:43:22.000Z | from tornado import escape
def tab(request, val):
    return '<h1>老村长</h1>'
| 12.833333 | 26 | 0.675325 | 12 | 77 | 4.333333 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031746 | 0.181818 | 77 | 5 | 27 | 15.4 | 0.793651 | 0 | 0 | 0 | 0 | 0 | 0.16 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
bd6caa064ef51b3de0aba74c89137746a417655c | 32 | py | Python | print_text/update.py | ErraticO/test_github_release_pypi | 1d0a31a19bb150178ad316dbe2be63bd49abe6d5 | [
"MIT"
] | 2 | 2022-02-21T01:13:44.000Z | 2022-02-21T06:31:53.000Z | print_text/update.py | ErraticO/test_github_release_pypi | 1d0a31a19bb150178ad316dbe2be63bd49abe6d5 | [
"MIT"
] | null | null | null | print_text/update.py | ErraticO/test_github_release_pypi | 1d0a31a19bb150178ad316dbe2be63bd49abe6d5 | [
"MIT"
] | null | null | null | def make():
    print("update")
| 10.666667 | 19 | 0.5625 | 4 | 32 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21875 | 32 | 2 | 20 | 16 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
bd723688db58c6804be4d7513aa87be3a16cc709 | 139 | py | Python | music/admin.py | garg10may/djangoTest | 199ca58a96904e8524e6e403eb8d830590dd6070 | [
"MIT"
] | null | null | null | music/admin.py | garg10may/djangoTest | 199ca58a96904e8524e6e403eb8d830590dd6070 | [
"MIT"
] | null | null | null | music/admin.py | garg10may/djangoTest | 199ca58a96904e8524e6e403eb8d830590dd6070 | [
"MIT"
] | null | null | null | from django.contrib import admin
# Register your models here.
from models import *
admin.site.register(Album)
admin.site.register(Song) | 15.444444 | 32 | 0.784173 | 20 | 139 | 5.45 | 0.6 | 0.201835 | 0.311927 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129496 | 139 | 9 | 33 | 15.444444 | 0.900826 | 0.18705 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
bd912ed1ed5703e55b4ed26f9ff56bae92eea58f | 136 | py | Python | models/cifar/__init__.py | GuanhuaWang/sensai_experiments | 7348f18d572b08a5b1dfe8e32f0e25a6022f6a5f | [
"Apache-2.0"
] | 61 | 2021-01-21T03:21:48.000Z | 2022-03-23T17:21:06.000Z | models/cifar/__init__.py | GuanhuaWang/sensai_experiments | 7348f18d572b08a5b1dfe8e32f0e25a6022f6a5f | [
"Apache-2.0"
] | null | null | null | models/cifar/__init__.py | GuanhuaWang/sensai_experiments | 7348f18d572b08a5b1dfe8e32f0e25a6022f6a5f | [
"Apache-2.0"
] | 6 | 2021-01-21T04:39:55.000Z | 2021-09-28T21:16:25.000Z | from __future__ import absolute_import
from .vgg import *
from .resnet import *
from .mobilenetv2 import *
from .shufflenetv2 import *
| 19.428571 | 38 | 0.786765 | 17 | 136 | 6 | 0.470588 | 0.392157 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017391 | 0.154412 | 136 | 6 | 39 | 22.666667 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
da0cb95dfcb978279fb120ae62b5f0e3cd5c95a3 | 37 | py | Python | python/testData/testRunner/env/createConfigurationTest/configurationByContext/bar/test_foo.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/testRunner/env/createConfigurationTest/configurationByContext/bar/test_foo.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/testRunner/env/createConfigurationTest/configurationByContext/bar/test_foo.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | def test_test():
    print("bar") | 18.5 | 20 | 0.540541 | 5 | 37 | 3.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.27027 | 37 | 2 | 20 | 18.5 | 0.703704 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6
da329e8cdad4c706f58d9d525cd330232e560e3b | 275 | py | Python | deprc_scaled_class_data.py | habichta/ETHZDeepReinforcementLearning | e1ae22159753724290f20068214bb3d94fcb7be4 | [
"BSD-3-Clause"
] | 7 | 2018-01-23T05:17:50.000Z | 2020-10-30T02:29:59.000Z | deprc_scaled_class_data.py | habichta/ETHZDeepReinforcementLearning | e1ae22159753724290f20068214bb3d94fcb7be4 | [
"BSD-3-Clause"
] | null | null | null | deprc_scaled_class_data.py | habichta/ETHZDeepReinforcementLearning | e1ae22159753724290f20068214bb3d94fcb7be4 | [
"BSD-3-Clause"
] | 2 | 2018-01-23T05:17:58.000Z | 2018-07-02T00:13:34.000Z |
import abb_deeplearning.abb_data_pipeline.abb_clouddrl_transformation_pipeline as abb_t
import abb_deeplearning.abb_data_pipeline.abb_clouddrl_constants as ac
"""
Not finished
"""
abb_t.create_scaled_classification_data(solar_station=ac.ABB_Solarstation.C,img_d_tup_l=None) | 34.375 | 93 | 0.883636 | 43 | 275 | 5.162791 | 0.581395 | 0.081081 | 0.189189 | 0.216216 | 0.423423 | 0.423423 | 0.423423 | 0.423423 | 0 | 0 | 0 | 0 | 0.050909 | 275 | 8 | 93 | 34.375 | 0.850575 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e573697de1aac73d160fac03d510ecb91fa20655 | 38 | py | Python | hyperbox/networks/ofa/__init__.py | marsggbo/hyperbox | 91dcd04ad30164bcb12209d818df18961fa3f347 | [
"MIT"
] | 1 | 2022-01-17T00:34:14.000Z | 2022-01-17T00:34:14.000Z | hyperbox/networks/ofa/__init__.py | marsggbo/hyperbox | 91dcd04ad30164bcb12209d818df18961fa3f347 | [
"MIT"
] | null | null | null | hyperbox/networks/ofa/__init__.py | marsggbo/hyperbox | 91dcd04ad30164bcb12209d818df18961fa3f347 | [
"MIT"
] | null | null | null | from .ofa_mbv3 import OFAMobileNetV3
| 19 | 37 | 0.842105 | 5 | 38 | 6.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060606 | 0.131579 | 38 | 1 | 38 | 38 | 0.878788 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e5cbcc5516d30fc74e2be178e12e93e0d95c7dfd | 157 | py | Python | utils/df_funcs.py | cltl/a-proof-zonmw | f6d1a83fc77223bf8b58c9d465aae301269bb679 | [
"Apache-2.0"
] | 2 | 2021-02-08T08:24:06.000Z | 2021-11-12T10:23:23.000Z | utils/df_funcs.py | cltl/a-proof-zonmw | f6d1a83fc77223bf8b58c9d465aae301269bb679 | [
"Apache-2.0"
] | null | null | null | utils/df_funcs.py | cltl/a-proof-zonmw | f6d1a83fc77223bf8b58c9d465aae301269bb679 | [
"Apache-2.0"
] | 2 | 2021-12-07T22:14:56.000Z | 2021-12-14T09:06:16.000Z | import pandas as pd
def remove_on_multikeys(df1, df2, keys):
    isin = df1.set_index(keys).index.isin(df2.set_index(keys).index)
return df1.loc[~isin] | 26.166667 | 68 | 0.726115 | 27 | 157 | 4.074074 | 0.592593 | 0.145455 | 0.218182 | 0.309091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.140127 | 157 | 6 | 69 | 26.166667 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
e5d2c3dc72e2fa5e6904bcb1e7d93d1e154c5ad3 | 73,267 | py | Python | jasmin/routing/test/test_router.py | pyghassen/jasmin | d6bf0b40bb72e406bcb0dd3a56064a28efd7c6b3 | [
"Apache-2.0"
] | null | null | null | jasmin/routing/test/test_router.py | pyghassen/jasmin | d6bf0b40bb72e406bcb0dd3a56064a28efd7c6b3 | [
"Apache-2.0"
] | null | null | null | jasmin/routing/test/test_router.py | pyghassen/jasmin | d6bf0b40bb72e406bcb0dd3a56064a28efd7c6b3 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
import copy
import glob
import os
import mock
import pickle
import time
import urllib
import string
import random
from hashlib import md5
from twisted.internet import reactor, defer
from twisted.trial import unittest
from twisted.spread import pb
from twisted.web import server
from twisted.web.client import getPage
from jasmin.protocols.smpp.test.smsc_simulator import *
from jasmin.redis.configs import RedisForJasminConfig
from jasmin.redis.client import ConnectionWithConfiguration
from jasmin.routing.router import RouterPB
from jasmin.routing.proxies import RouterPBProxy
from jasmin.routing.configs import RouterPBConfig
from jasmin.routing.test.http_server import AckServer
from jasmin.routing.configs import deliverSmHttpThrowerConfig, DLRThrowerConfig
from jasmin.routing.throwers import deliverSmHttpThrower, DLRThrower
from jasmin.protocols.http.server import HTTPApi
from jasmin.protocols.http.configs import HTTPApiConfig
from jasmin.protocols.smpp.configs import SMPPClientConfig
from jasmin.managers.proxies import SMPPClientManagerPBProxy
from jasmin.managers.clients import SMPPClientManagerPB
from jasmin.managers.configs import SMPPClientPBConfig
from jasmin.routing.Routes import DefaultRoute, StaticMTRoute
from jasmin.routing.Filters import GroupFilter
from jasmin.routing.jasminApi import *
from jasmin.queues.factory import AmqpFactory
from jasmin.queues.configs import AmqpConfig
from jasmin.vendor.smpp.pdu.pdu_types import EsmClass, EsmClassMode, MoreMessagesToSend
from twisted.cred import portal
from jasmin.tools.cred.portal import JasminPBRealm
from jasmin.tools.spread.pb import JasminPBPortalRoot
from twisted.cred.checkers import AllowAnonymousAccess, InMemoryUsernamePasswordDatabaseDontUse
from jasmin.routing.proxies import ConnectError
def composeMessage(characters, length):
if length <= len(characters):
return ''.join(random.sample(characters, length))
else:
s = ''
while len(s) < length:
s += ''.join(random.sample(characters, len(characters)))
return s[:length]
def id_generator(size=12, chars=string.ascii_uppercase + string.digits):
return ''.join(random.choice(chars) for x in range(size))
class RouterPBTestCase(unittest.TestCase):
def setUp(self, authentication = False):
        # Instantiating config objects without a filename
        # will load the defaults, which is what we
        # need to run the tests
self.RouterPBConfigInstance = RouterPBConfig()
# Launch the router server
self.pbRoot_f = RouterPB()
# Mock callbacks
# will be used for assertions
self.pbRoot_f.bill_request_submit_sm_resp_callback = mock.Mock(wraps = self.pbRoot_f.bill_request_submit_sm_resp_callback)
self.pbRoot_f.deliver_sm_callback = mock.Mock(wraps = self.pbRoot_f.deliver_sm_callback)
self.pbRoot_f.setConfig(self.RouterPBConfigInstance)
p = portal.Portal(JasminPBRealm(self.pbRoot_f))
if not authentication:
p.registerChecker(AllowAnonymousAccess())
else:
c = InMemoryUsernamePasswordDatabaseDontUse()
c.addUser('test_user', md5('test_password').digest())
p.registerChecker(c)
jPBPortalRoot = JasminPBPortalRoot(p)
self.PBServer = reactor.listenTCP(0, pb.PBServerFactory(jPBPortalRoot))
self.pbPort = self.PBServer.getHost().port
@defer.inlineCallbacks
def tearDown(self):
yield self.disconnect()
yield self.PBServer.stopListening()
self.pbRoot_f.cancelPersistenceTimer()
class HttpServerTestCase(RouterPBTestCase):
def setUp(self):
RouterPBTestCase.setUp(self)
        # Instantiating config objects without a filename
        # will load the defaults, which is what we
        # need to run the tests
httpApiConfigInstance = HTTPApiConfig()
SMPPClientPBConfigInstance = SMPPClientPBConfig()
SMPPClientPBConfigInstance.authentication = False
        # The SMPP client manager is required for HTTPApi instantiation
self.clientManager_f = SMPPClientManagerPB()
self.clientManager_f.setConfig(SMPPClientPBConfigInstance)
# Launch the http server
httpApi = HTTPApi(self.pbRoot_f, self.clientManager_f, httpApiConfigInstance)
self.httpServer = reactor.listenTCP(httpApiConfigInstance.port, server.Site(httpApi))
self.httpPort = httpApiConfigInstance.port
@defer.inlineCallbacks
def tearDown(self):
yield RouterPBTestCase.tearDown(self)
yield self.httpServer.stopListening()
class SMPPClientManagerPBTestCase(HttpServerTestCase):
@defer.inlineCallbacks
def setUp(self):
HttpServerTestCase.setUp(self)
        # Instantiating config objects without a filename
        # will load the defaults, which is what we
        # need to run the tests
AMQPServiceConfigInstance = AmqpConfig()
AMQPServiceConfigInstance.reconnectOnConnectionLoss = False
# Launch AMQP Broker
self.amqpBroker = AmqpFactory(AMQPServiceConfigInstance)
self.amqpBroker.preConnect()
self.amqpClient = reactor.connectTCP(AMQPServiceConfigInstance.host, AMQPServiceConfigInstance.port, self.amqpBroker)
# Wait for AMQP Broker connection to get ready
yield self.amqpBroker.getChannelReadyDeferred()
# Add the broker to the RouterPB
yield self.pbRoot_f.addAmqpBroker(self.amqpBroker)
# Setup smpp client manager pb
yield self.clientManager_f.addAmqpBroker(self.amqpBroker)
p = portal.Portal(JasminPBRealm(self.clientManager_f))
p.registerChecker(AllowAnonymousAccess())
jPBPortalRoot = JasminPBPortalRoot(p)
self.CManagerServer = reactor.listenTCP(0, pb.PBServerFactory(jPBPortalRoot))
self.CManagerPort = self.CManagerServer.getHost().port
# Start DLRThrower
DLRThrowerConfigInstance = DLRThrowerConfig()
self.DLRThrower = DLRThrower()
self.DLRThrower.setConfig(DLRThrowerConfigInstance)
yield self.DLRThrower.addAmqpBroker(self.amqpBroker)
# Connect to redis server
RedisForJasminConfigInstance = RedisForJasminConfig()
self.redisClient = yield ConnectionWithConfiguration(RedisForJasminConfigInstance)
# Authenticate and select db
if RedisForJasminConfigInstance.password is not None:
yield self.redisClient.auth(RedisForJasminConfigInstance.password)
yield self.redisClient.select(RedisForJasminConfigInstance.dbid)
# Connect CM with RC:
self.clientManager_f.addRedisClient(self.redisClient)
# Set a smpp client manager proxy instance
self.SMPPClientManagerPBProxy = SMPPClientManagerPBProxy()
@defer.inlineCallbacks
def tearDown(self):
yield HttpServerTestCase.tearDown(self)
yield self.SMPPClientManagerPBProxy.disconnect()
yield self.CManagerServer.stopListening()
yield self.amqpClient.disconnect()
yield self.redisClient.disconnect()
class AuthenticatedTestCases(RouterPBProxy, RouterPBTestCase):
@defer.inlineCallbacks
def setUp(self, authentication=False):
yield RouterPBTestCase.setUp(self, authentication=True)
@defer.inlineCallbacks
def test_connect_success(self):
yield self.connect('127.0.0.1', self.pbPort, 'test_user', 'test_password')
@defer.inlineCallbacks
def test_connect_failure(self):
try:
yield self.connect('127.0.0.1', self.pbPort, 'test_anyuser', 'test_wrongpassword')
        except ConnectError as e:
            self.assertEqual(str(e), 'Authentication error test_anyuser')
        except Exception as e:
            self.assertTrue(False, "ConnectError not raised, got a %s instead" % type(e))
else:
self.assertTrue(False, "ConnectError not raised")
self.assertFalse(self.isConnected)
@defer.inlineCallbacks
def test_connect_non_anonymous(self):
try:
yield self.connect('127.0.0.1', self.pbPort)
        except ConnectError as e:
            self.assertEqual(str(e), 'Anonymous connection is not authorized !')
        except Exception as e:
            self.assertTrue(False, "ConnectError not raised, got a %s instead" % type(e))
else:
self.assertTrue(False, "ConnectError not raised")
self.assertFalse(self.isConnected)
class RoutingTestCases(RouterPBProxy, RouterPBTestCase):
@defer.inlineCallbacks
def test_add_list_and_flush_mt_route(self):
yield self.connect('127.0.0.1', self.pbPort)
yield self.mtroute_add(StaticMTRoute([GroupFilter(Group(1))], SmppClientConnector(id_generator()), 0.0), 2)
yield self.mtroute_add(DefaultRoute(SmppClientConnector(id_generator())), 0)
listRet1 = yield self.mtroute_get_all()
listRet1 = pickle.loads(listRet1)
yield self.mtroute_flush()
listRet2 = yield self.mtroute_get_all()
listRet2 = pickle.loads(listRet2)
self.assertEqual(2, len(listRet1))
self.assertEqual(0, len(listRet2))
@defer.inlineCallbacks
def test_add_list_and_remove_mt_route(self):
yield self.connect('127.0.0.1', self.pbPort)
yield self.mtroute_add(StaticMTRoute([GroupFilter(Group(1))], SmppClientConnector(id_generator()), 0.0), 2)
yield self.mtroute_add(DefaultRoute(SmppClientConnector(id_generator())), 0)
listRet1 = yield self.mtroute_get_all()
listRet1 = pickle.loads(listRet1)
yield self.mtroute_remove(2)
listRet2 = yield self.mtroute_get_all()
listRet2 = pickle.loads(listRet2)
self.assertEqual(2, len(listRet1))
self.assertEqual(1, len(listRet2))
@defer.inlineCallbacks
def test_add_list_and_flush_mo_route(self):
yield self.connect('127.0.0.1', self.pbPort)
yield self.moroute_add(DefaultRoute(HttpConnector(id_generator(), 'http://127.0.0.1')), 0)
listRet1 = yield self.moroute_get_all()
listRet1 = pickle.loads(listRet1)
yield self.moroute_flush()
listRet2 = yield self.moroute_get_all()
listRet2 = pickle.loads(listRet2)
self.assertEqual(1, len(listRet1))
self.assertEqual(0, len(listRet2))
@defer.inlineCallbacks
def test_add_list_and_remove_mo_route(self):
yield self.connect('127.0.0.1', self.pbPort)
yield self.moroute_add(DefaultRoute(HttpConnector(id_generator(), 'http://127.0.0.1')), 0)
listRet1 = yield self.moroute_get_all()
listRet1 = pickle.loads(listRet1)
        yield self.moroute_remove(0)
        listRet2 = yield self.moroute_get_all()
        listRet2 = pickle.loads(listRet2)
self.assertEqual(1, len(listRet1))
self.assertEqual(0, len(listRet2))
class RoutingConnectorTypingCases(RouterPBProxy, RouterPBTestCase):
@defer.inlineCallbacks
def test_add_mt_route(self):
yield self.connect('127.0.0.1', self.pbPort)
r = yield self.mtroute_add(DefaultRoute(HttpConnector(id_generator(), 'http://127.0.0.1')), 0)
self.assertFalse(r)
r = yield self.mtroute_add(DefaultRoute(Connector(id_generator())), 0)
self.assertFalse(r)
r = yield self.mtroute_add(DefaultRoute(SmppClientConnector(id_generator())), 0)
self.assertTrue(r)
@defer.inlineCallbacks
def test_add_mo_route(self):
yield self.connect('127.0.0.1', self.pbPort)
r = yield self.moroute_add(DefaultRoute(HttpConnector(id_generator(), 'http://127.0.0.1')), 0)
self.assertTrue(r)
r = yield self.moroute_add(DefaultRoute(Connector(id_generator())), 0)
self.assertFalse(r)
r = yield self.moroute_add(DefaultRoute(SmppClientConnector(id_generator())), 0)
self.assertFalse(r)
class UserAndGroupTestCases(RouterPBProxy, RouterPBTestCase):
@defer.inlineCallbacks
def test_add_user_without_group(self):
yield self.connect('127.0.0.1', self.pbPort)
# This group will not be added to router
g1 = Group(1)
u1 = User(1, g1, 'username', 'password')
r = yield self.user_add(u1)
self.assertEqual(r, False)
@defer.inlineCallbacks
def test_authenticate(self):
yield self.connect('127.0.0.1', self.pbPort)
g1 = Group(1)
yield self.group_add(g1)
u1 = User(1, g1, 'username', 'password')
yield self.user_add(u1)
r = yield self.user_authenticate('username', 'password')
self.assertNotEqual(r, None)
r = pickle.loads(r)
self.assertEqual(u1.uid, r.uid)
self.assertEqual(u1.username, r.username)
self.assertEqual(u1.password, r.password)
self.assertEqual(u1.group, g1)
r = yield self.user_authenticate('username', 'incorrect')
self.assertEqual(r, None)
r = yield self.user_authenticate('incorrect', 'password')
self.assertEqual(r, None)
r = yield self.user_authenticate('incorrect', 'incorrect')
self.assertEqual(r, None)
@defer.inlineCallbacks
def test_add_list_and_remove_group(self):
yield self.connect('127.0.0.1', self.pbPort)
g1 = Group(1)
yield self.group_add(g1)
g2 = Group(2)
yield self.group_add(g2)
g3 = Group(3)
yield self.group_add(g3)
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(3, len(c))
yield self.group_remove(1)
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
yield self.group_remove_all()
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
@defer.inlineCallbacks
def test_remove_not_empty_group(self):
yield self.connect('127.0.0.1', self.pbPort)
g1 = Group(1)
yield self.group_add(g1)
u1 = User(1, g1, 'username1', 'password')
yield self.user_add(u1)
u2 = User(2, g1, 'username2', 'password')
yield self.user_add(u2)
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
yield self.group_remove_all()
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
@defer.inlineCallbacks
def test_add_list_and_remove_user(self):
yield self.connect('127.0.0.1', self.pbPort)
g1 = Group(1)
yield self.group_add(g1)
u1 = User(1, g1, 'username', 'password')
u2 = User(2, g1, 'username2', 'password')
yield self.user_add(u1)
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
yield self.user_add(u2)
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
yield self.user_remove(u1.uid)
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
yield self.user_add(u2)
yield self.user_remove_all()
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
@defer.inlineCallbacks
def test_add_list_user_with_groups(self):
yield self.connect('127.0.0.1', self.pbPort)
g1 = Group(1)
yield self.group_add(g1)
g2 = Group(2)
yield self.group_add(g2)
u1 = User(1, g1, 'username', 'password')
yield self.user_add(u1)
u2 = User(2, g2, 'username2', 'password')
yield self.user_add(u2)
# Get all users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# Get users from gid=1
c = yield self.user_get_all(1)
c = pickle.loads(c)
self.assertEqual(1, len(c))
@defer.inlineCallbacks
def test_user_unicity(self):
yield self.connect('127.0.0.1', self.pbPort)
g1 = Group(1)
yield self.group_add(g1)
        # Users are unique by uid or username:
        # the Router will replace a User if it finds an existing
        # one with the same uid or username, so the three samples
        # below will end up as one single user
u1 = User(1, g1, 'username', 'password')
u2 = User(2, g1, 'username', 'password')
u3 = User(2, g1, 'other', 'password')
yield self.user_add(u1)
yield self.user_add(u2)
yield self.user_add(u3)
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
u = yield self.user_authenticate('other', 'password')
u = pickle.loads(u)
self.assertEqual(u3.username, u.username)
class PersistenceTestCase(RouterPBProxy, RouterPBTestCase):
@defer.inlineCallbacks
def tearDown(self):
# Remove persisted configurations
filelist = glob.glob("%s/*" % self.RouterPBConfigInstance.store_path)
for f in filelist:
os.remove(f)
yield RouterPBTestCase.tearDown(self)
class ConfigurationPersistenceTestCases(PersistenceTestCase):
@defer.inlineCallbacks
def test_persist_default(self):
yield self.connect('127.0.0.1', self.pbPort)
persistRet = yield self.persist()
self.assertTrue(persistRet)
@defer.inlineCallbacks
def test_load_undefined_profile(self):
yield self.connect('127.0.0.1', self.pbPort)
loadRet = yield self.load()
self.assertFalse(loadRet)
@defer.inlineCallbacks
def test_add_users_and_groups_persist_and_load_default(self):
yield self.connect('127.0.0.1', self.pbPort)
# Add users and groups
g1 = Group(1)
yield self.group_add(g1)
u1 = User(1, g1, 'username', 'password')
yield self.user_add(u1)
u2 = User(2, g1, 'username2', 'password')
yield self.user_add(u2)
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# Persist
yield self.persist()
# Remove all users
yield self.user_remove_all()
# List and assert
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# Load
yield self.load()
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
@defer.inlineCallbacks
def test_add_all_persist_and_load_default(self):
yield self.connect('127.0.0.1', self.pbPort)
# Add users and groups
g1 = Group(1)
yield self.group_add(g1)
u1 = User(1, g1, 'username', 'password')
yield self.user_add(u1)
u2 = User(2, g1, 'username2', 'password')
yield self.user_add(u2)
# Add mo route
yield self.moroute_add(DefaultRoute(HttpConnector(id_generator(), 'http://127.0.0.1/any')), 0)
# Add mt route
yield self.mtroute_add(DefaultRoute(SmppClientConnector(id_generator())), 0)
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# List mo routes
c = yield self.moroute_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# List mt routes
c = yield self.mtroute_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# Persist
yield self.persist()
# Remove all users
yield self.user_remove_all()
# Remove all group
yield self.group_remove_all()
# Remove all mo routes
yield self.moroute_flush()
# Remove all mt routes
yield self.mtroute_flush()
# List and assert
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# List mo routes
c = yield self.moroute_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# List mt routes
c = yield self.mtroute_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# Load
yield self.load()
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# List mo routes
c = yield self.moroute_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# List mt routes
c = yield self.mtroute_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
@defer.inlineCallbacks
def test_add_persist_and_load_profile(self):
yield self.connect('127.0.0.1', self.pbPort)
# Add users and groups
g1 = Group(1)
yield self.group_add(g1)
u1 = User(1, g1, 'username', 'password')
yield self.user_add(u1)
u2 = User(2, g1, 'username2', 'password')
yield self.user_add(u2)
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# Persist
yield self.persist('profile')
# Remove all users
yield self.user_remove_all()
# List and assert
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# Load
yield self.load('profile')
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
@defer.inlineCallbacks
def test_persist_scope_groups(self):
yield self.connect('127.0.0.1', self.pbPort)
# Add users and groups
g1 = Group(1)
yield self.group_add(g1)
u1 = User(1, g1, 'username', 'password')
yield self.user_add(u1)
u2 = User(2, g1, 'username2', 'password')
yield self.user_add(u2)
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(2, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
# Persist groups only
yield self.persist(scope='groups')
# Remove all users
yield self.user_remove_all()
# Remove all groups
yield self.group_remove_all()
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# Load
yield self.load(scope='groups') # Load with scope=all may also work
# List users
c = yield self.user_get_all()
c = pickle.loads(c)
self.assertEqual(0, len(c))
# List groups
c = yield self.group_get_all()
c = pickle.loads(c)
self.assertEqual(1, len(c))
@defer.inlineCallbacks
    def test_persistence_flag(self):
yield self.connect('127.0.0.1', self.pbPort)
# Initially, all config is already persisted
isPersisted = yield self.is_persisted()
self.assertTrue(isPersisted)
# Make config modifications and assert is_persisted()
g1 = Group(1)
yield self.group_add(g1)
        # Config is not persisted, waiting for persistence
isPersisted = yield self.is_persisted()
self.assertFalse(isPersisted)
u1 = User(1, g1, 'username', 'password')
yield self.user_add(u1)
        # Config is not persisted, waiting for persistence
isPersisted = yield self.is_persisted()
self.assertFalse(isPersisted)
u2 = User(2, g1, 'username2', 'password')
yield self.user_add(u2)
# Persist
yield self.persist()
# Config is now persisted
isPersisted = yield self.is_persisted()
self.assertTrue(isPersisted)
# Add mo route
yield self.moroute_add(DefaultRoute(HttpConnector(id_generator(), 'http://127.0.0.1/any')), 0)
        # Config is not persisted, waiting for persistence
isPersisted = yield self.is_persisted()
self.assertFalse(isPersisted)
# Add mt route
yield self.mtroute_add(DefaultRoute(SmppClientConnector(id_generator())), 0)
# Persist
yield self.persist()
# Config is now persisted
isPersisted = yield self.is_persisted()
self.assertTrue(isPersisted)
# Remove all users
yield self.user_remove_all()
# Remove all group
yield self.group_remove_all()
# Remove all mo routes
yield self.moroute_flush()
# Remove all mt routes
yield self.mtroute_flush()
        # Config is not persisted, waiting for persistence
isPersisted = yield self.is_persisted()
self.assertFalse(isPersisted)
# Load
yield self.load()
# Config is now persisted
isPersisted = yield self.is_persisted()
self.assertTrue(isPersisted)
class QuotasUpdatedPersistenceTestCases(PersistenceTestCase):
@defer.inlineCallbacks
def test_manual_persist_sets_quotas_updated_to_false(self):
yield self.connect('127.0.0.1', self.pbPort)
# Add a group
g1 = Group(1)
yield self.group_add(g1)
# Add a user
mt_c = MtMessagingCredential()
mt_c.setQuota('balance', 2.0)
u1 = User(1, g1, 'username', 'password', mt_c)
yield self.user_add(u1)
        # Config is not persisted, waiting for persistence
isPersisted = yield self.is_persisted()
self.assertFalse(isPersisted)
# Check quotas_updated flag
self.assertFalse(self.pbRoot_f.users[0].mt_credential.quotas_updated)
# Update quota and check for quotas_updated
self.pbRoot_f.users[0].mt_credential.updateQuota('balance', -1.0)
self.assertTrue(self.pbRoot_f.users[0].mt_credential.quotas_updated)
self.assertEqual(self.pbRoot_f.users[0].mt_credential.getQuota('balance'), 1)
# Manual persistence and check for quotas_updated
persistRet = yield self.persist()
self.assertTrue(persistRet)
self.assertFalse(self.pbRoot_f.users[0].mt_credential.quotas_updated)
# Balance would not change after persistence
self.assertEqual(self.pbRoot_f.users[0].mt_credential.getQuota('balance'), 1)
@defer.inlineCallbacks
def test_manual_load_sets_quotas_updated_to_false(self):
yield self.connect('127.0.0.1', self.pbPort)
# Add a group
g1 = Group(1)
yield self.group_add(g1)
# Add a user
mt_c = MtMessagingCredential()
mt_c.setQuota('balance', 2.0)
u1 = User(1, g1, 'username', 'password', mt_c)
yield self.user_add(u1)
# Manual persistence and check for quotas_updated
persistRet = yield self.persist()
self.assertTrue(persistRet)
# Config is persisted
isPersisted = yield self.is_persisted()
self.assertTrue(isPersisted)
# Check quotas_updated flag
self.assertFalse(self.pbRoot_f.users[0].mt_credential.quotas_updated)
# Update quota and check for quotas_updated
self.pbRoot_f.users[0].mt_credential.updateQuota('balance', -1.0)
self.assertTrue(self.pbRoot_f.users[0].mt_credential.quotas_updated)
self.assertEqual(self.pbRoot_f.users[0].mt_credential.getQuota('balance'), 1)
# Manual load and check for quotas_updated
loadRet = yield self.load()
self.assertTrue(loadRet)
self.assertFalse(self.pbRoot_f.users[0].mt_credential.quotas_updated)
        # Balance is restored to its persisted value after loading
self.assertEqual(self.pbRoot_f.users[0].mt_credential.getQuota('balance'), 2.0)
@defer.inlineCallbacks
def test_automatic_persist_on_quotas_updated(self):
yield self.connect('127.0.0.1', self.pbPort)
# Mock perspective_persist for later assertions
self.pbRoot_f.perspective_persist = mock.Mock(self.pbRoot_f.perspective_persist)
# Reset persistence_timer_secs to shorten the test time
self.pbRoot_f.config.persistence_timer_secs = 0.1
self.pbRoot_f.activatePersistenceTimer()
# Add a group
g1 = Group(1)
yield self.group_add(g1)
# Add a user
mt_c = MtMessagingCredential()
mt_c.setQuota('balance', 2.0)
u1 = User(1, g1, 'username', 'password', mt_c)
yield self.user_add(u1)
# Update quota and check for quotas_updated
self.pbRoot_f.users[0].mt_credential.updateQuota('balance', -1.0)
self.assertTrue(self.pbRoot_f.users[0].mt_credential.quotas_updated)
self.assertEqual(self.pbRoot_f.users[0].mt_credential.getQuota('balance'), 1)
# Wait 3 seconds for automatic persistence to be done
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
        # Assert two calls to persist: groups then users
self.assertEqual(self.pbRoot_f.perspective_persist.call_count, 2)
self.assertEqual(self.pbRoot_f.perspective_persist.call_args_list, [mock.call(scope='groups'), mock.call(scope='users')])
class SimpleNonConnectedSubmitSmDeliveryTestCases(RouterPBProxy, SMPPClientManagerPBTestCase):
@defer.inlineCallbacks
def test_delivery(self):
yield self.connect('127.0.0.1', self.pbPort)
g1 = Group(1)
yield self.group_add(g1)
c1 = SmppClientConnector(id_generator())
u1_password = 'password'
u1 = User(1, g1, 'username', u1_password)
u2_password = 'password'
u2 = User(1, g1, 'username2', u2_password)
yield self.user_add(u1)
yield self.mtroute_add(DefaultRoute(c1), 0)
# Send a SMS MT through http interface
url_ko = 'http://127.0.0.1:1401/send?to=98700177&content=test&username=%s&password=%s' % (u2.username, u1_password)
url_ok = 'http://127.0.0.1:1401/send?to=98700177&content=test&username=%s&password=%s' % (u1.username, u2_password)
# Incorrect username/password will lead to '403 Forbidden' error
lastErrorStatus = 200
try:
yield getPage(url_ko)
        except Exception as e:
lastErrorStatus = e.status
self.assertEqual(lastErrorStatus, '403')
        # Since the connector doesn't really exist, the message will not be routed
        # to a queue, a 500 error will be returned, and more details will be written
        # to the smpp client manager log:
        # 'Trying to enqueue a SUBMIT_SM to a connector with an unknown cid: '
try:
yield getPage(url_ok)
        except Exception as e:
lastErrorStatus = e.status
self.assertEqual(lastErrorStatus, '500')
        # Now we'll create the connector and send an MT through it
yield self.SMPPClientManagerPBProxy.connect('127.0.0.1', self.CManagerPort)
c1Config = SMPPClientConfig(id=c1.cid)
yield self.SMPPClientManagerPBProxy.add(c1Config)
# We should receive a msg id
c = yield getPage(url_ok)
self.assertEqual(c[:7], 'Success')
        # @todo: should be tested against a real UUID pattern
self.assertApproximates(len(c), 40, 10)
class LastClientFactory(Factory):
lastClient = None
def buildProtocol(self, addr):
self.lastClient = Factory.buildProtocol(self, addr)
return self.lastClient
class HappySMSCTestCase(SMPPClientManagerPBTestCase):
protocol = ManualDeliveryReceiptHappySMSC
@defer.inlineCallbacks
def setUp(self):
yield SMPPClientManagerPBTestCase.setUp(self)
self.smsc_f = LastClientFactory()
self.smsc_f.protocol = self.protocol
self.SMSCPort = reactor.listenTCP(0, self.smsc_f)
@defer.inlineCallbacks
def tearDown(self):
yield SMPPClientManagerPBTestCase.tearDown(self)
yield self.SMSCPort.stopListening()
class SubmitSmTestCaseTools():
"""
Factorized methods for child classes
"""
@defer.inlineCallbacks
def prepareRoutingsAndStartConnector(self, bindOperation = 'transceiver', route_rate = 0.0,
user = None):
# Routing stuff
g1 = Group(1)
yield self.group_add(g1)
self.c1 = SmppClientConnector(id_generator())
user_password = 'password'
if user is None:
self.u1 = User(1, g1, 'username', user_password)
else:
self.u1 = user
yield self.user_add(self.u1)
yield self.mtroute_add(DefaultRoute(self.c1, route_rate), 0)
        # Now we'll create the connector
yield self.SMPPClientManagerPBProxy.connect('127.0.0.1', self.CManagerPort)
c1Config = SMPPClientConfig(id=self.c1.cid, port = self.SMSCPort.getHost().port,
bindOperation = bindOperation)
yield self.SMPPClientManagerPBProxy.add(c1Config)
# Start the connector
yield self.SMPPClientManagerPBProxy.start(self.c1.cid)
        # Wait for a bound state ('BOUND_*')
        while True:
            ssRet = yield self.SMPPClientManagerPBProxy.session_state(self.c1.cid)
            if ssRet[:6] == 'BOUND_':
                break
            else:
                time.sleep(0.2)
# Configuration
self.method = 'GET'
self.postdata = None
self.params = {'to': '98700177',
'username': self.u1.username,
'password': user_password,
'content': 'test'}
if hasattr(self, 'AckServer'):
# Send a SMS MT through http interface and set delivery receipt callback in url
self.dlr_url = 'http://127.0.0.1:%d/receipt' % (self.AckServer.getHost().port)
self.AckServerResource.render_POST = mock.Mock(wraps=self.AckServerResource.render_POST)
self.AckServerResource.render_GET = mock.Mock(wraps=self.AckServerResource.render_GET)
@defer.inlineCallbacks
def stopSmppClientConnectors(self):
# Disconnect the connector
yield self.SMPPClientManagerPBProxy.stop(self.c1.cid)
        # Wait for the connector to be unbound ('NONE' session state)
        while True:
            ssRet = yield self.SMPPClientManagerPBProxy.session_state(self.c1.cid)
            if ssRet == 'NONE':
                break
            else:
                time.sleep(0.2)
class DlrCallbackingTestCases(RouterPBProxy, HappySMSCTestCase, SubmitSmTestCaseTools):
@defer.inlineCallbacks
def setUp(self):
yield HappySMSCTestCase.setUp(self)
# Start http servers
self.AckServerResource = AckServer()
self.AckServer = reactor.listenTCP(0, server.Site(self.AckServerResource))
@defer.inlineCallbacks
def tearDown(self):
yield HappySMSCTestCase.tearDown(self)
yield self.AckServer.stopListening()
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level1(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 1
3. Wait for the level1 DLR (submit_sm_resp) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 1
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_POST.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_POST.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level2(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 2
3. Wait for the level2 DLR (deliver_sm) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
        self.params['dlr-level'] = 2
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm containing a DLR
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_POST.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_POST.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level3(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 3
3. Wait for the level1 & level2 DLRs and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 3
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_POST.call_count, 2)
# Message ID must be transmitted in the DLR
callArgs_level1 = self.AckServerResource.render_POST.call_args_list[0][0][0].args
callArgs_level2 = self.AckServerResource.render_POST.call_args_list[1][0][0].args
self.assertEqual(callArgs_level1['id'][0], msgId)
self.assertEqual(callArgs_level2['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level1_GET(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 1 using GET method
3. Wait for the level1 DLR (submit_sm_resp) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 1
self.params['dlr-method'] = 'GET'
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_GET.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level2_GET(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 2 using GET method
3. Wait for the level2 DLR (deliver_sm) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 2
self.params['dlr-method'] = 'GET'
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm containing a DLR
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_GET.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level3_GET(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 3 using GET method
3. Wait for the level1 & level2 DLRs and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 3
self.params['dlr-method'] = 'GET'
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_GET.call_count, 2)
# Message ID must be transmitted in the DLR
callArgs_level1 = self.AckServerResource.render_GET.call_args_list[0][0][0].args
callArgs_level2 = self.AckServerResource.render_GET.call_args_list[1][0][0].args
self.assertEqual(callArgs_level1['id'][0], msgId)
self.assertEqual(callArgs_level2['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_empty_content(self):
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['content'] = ''
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
class LongSmDlrCallbackingTestCases(RouterPBProxy, HappySMSCTestCase, SubmitSmTestCaseTools):
@defer.inlineCallbacks
def setUp(self):
yield HappySMSCTestCase.setUp(self)
# Start http servers
self.AckServerResource = AckServer()
self.AckServer = reactor.listenTCP(0, server.Site(self.AckServerResource))
@defer.inlineCallbacks
def tearDown(self):
yield HappySMSCTestCase.tearDown(self)
yield self.AckServer.stopListening()
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level1(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 1
3. Wait for the level1 DLR (submit_sm_resp) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 1
self.params['content'] = composeMessage({'_'}, 200)
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_POST.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_POST.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level2(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 2
3. Wait for the level2 DLR (deliver_sm) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 2
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm containing a DLR
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_POST.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_POST.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level3(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 3
3. Wait for the level1 & level2 DLRs and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 3
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_POST.call_count, 2)
# Message ID must be transmitted in the DLR
callArgs_level1 = self.AckServerResource.render_POST.call_args_list[0][0][0].args
callArgs_level2 = self.AckServerResource.render_POST.call_args_list[1][0][0].args
self.assertEqual(callArgs_level1['id'][0], msgId)
self.assertEqual(callArgs_level2['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level1_GET(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 1 using GET method
3. Wait for the level1 DLR (submit_sm_resp) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 1
self.params['dlr-method'] = 'GET'
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_GET.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level2_GET(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 2 using GET method
3. Wait for the level2 DLR (deliver_sm) and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 2
self.params['dlr-method'] = 'GET'
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm containing a DLR
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_GET.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
@defer.inlineCallbacks
def test_delivery_with_inurl_dlr_level3_GET(self):
"""Will:
1. Set a SMS-MT route to connector A
2. Send a SMS-MT to that route and set a DLR callback for level 3 using GET method
3. Wait for the level1 & level2 DLRs and run tests
"""
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 3
self.params['dlr-method'] = 'GET'
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
# We should receive a msg id
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
# Trigger a deliver_sm
yield self.SMSCPort.factory.lastClient.trigger_DLR()
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_GET.call_count, 2)
# Message ID must be transmitted in the DLR
callArgs_level1 = self.AckServerResource.render_GET.call_args_list[0][0][0].args
callArgs_level2 = self.AckServerResource.render_GET.call_args_list[1][0][0].args
self.assertEqual(callArgs_level1['id'][0], msgId)
self.assertEqual(callArgs_level2['id'][0], msgId)
class NoSubmitSmWhenReceiverIsBoundSMSCTestCases(SMPPClientManagerPBTestCase):
protocol = NoSubmitSmWhenReceiverIsBoundSMSC
@defer.inlineCallbacks
def setUp(self):
yield SMPPClientManagerPBTestCase.setUp(self)
self.smsc_f = LastClientFactory()
self.smsc_f.protocol = self.protocol
self.SMSCPort = reactor.listenTCP(0, self.smsc_f)
@defer.inlineCallbacks
def tearDown(self):
yield SMPPClientManagerPBTestCase.tearDown(self)
yield self.SMSCPort.stopListening()
class BOUND_RX_SubmitSmTestCases(RouterPBProxy, NoSubmitSmWhenReceiverIsBoundSMSCTestCases, SubmitSmTestCaseTools):
@defer.inlineCallbacks
def setUp(self):
yield NoSubmitSmWhenReceiverIsBoundSMSCTestCases.setUp(self)
# Start http servers
self.AckServerResource = AckServer()
self.AckServer = reactor.listenTCP(0, server.Site(self.AckServerResource))
@defer.inlineCallbacks
def tearDown(self):
yield NoSubmitSmWhenReceiverIsBoundSMSCTestCases.tearDown(self)
yield self.AckServer.stopListening()
@defer.inlineCallbacks
def test_delivery_using_incorrectly_bound_connector(self):
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector(bindOperation = 'receiver')
self.params['dlr-url'] = self.dlr_url
self.params['dlr-level'] = 1
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
c = yield getPage(baseurl, method = self.method, postdata = self.postdata)
msgStatus = c[:7]
msgId = c[9:45]
yield self.stopSmppClientConnectors()
# Run tests
self.assertEqual(msgStatus, 'Success')
# A DLR must be sent to dlr_url
self.assertEqual(self.AckServerResource.render_POST.call_count, 1)
# Message ID must be transmitted in the DLR
callArgs = self.AckServerResource.render_POST.call_args_list[0][0][0].args
self.assertEqual(callArgs['id'][0], msgId)
self.assertEqual(callArgs['message_status'][0], 'ESME_RINVBNDSTS')
class DeliverSmSMSCTestCase(SMPPClientManagerPBTestCase):
protocol = DeliverSmSMSC
@defer.inlineCallbacks
def setUp(self):
yield SMPPClientManagerPBTestCase.setUp(self)
self.smsc_f = LastClientFactory()
self.smsc_f.protocol = self.protocol
self.SMSCPort = reactor.listenTCP(0, self.smsc_f)
@defer.inlineCallbacks
def tearDown(self):
yield self.SMSCPort.stopListening()
yield SMPPClientManagerPBTestCase.tearDown(self)
class DeliverSmThrowingTestCases(RouterPBProxy, DeliverSmSMSCTestCase):
@defer.inlineCallbacks
def setUp(self):
yield DeliverSmSMSCTestCase.setUp(self)
# Start http servers
self.AckServerResource = AckServer()
self.AckServer = reactor.listenTCP(0, server.Site(self.AckServerResource))
# Initiating config objects without any filename
# will lead to setting defaults and that's what we
# need to run the tests
deliverSmHttpThrowerConfigInstance = deliverSmHttpThrowerConfig()
# Lower the timeout config to pass the timeout tests quickly
deliverSmHttpThrowerConfigInstance.timeout = 2
deliverSmHttpThrowerConfigInstance.retryDelay = 1
deliverSmHttpThrowerConfigInstance.maxRetries = 2
# Launch the deliverSmHttpThrower
self.deliverSmHttpThrower = deliverSmHttpThrower()
self.deliverSmHttpThrower.setConfig(deliverSmHttpThrowerConfigInstance)
# Add the broker to the deliverSmHttpThrower
yield self.deliverSmHttpThrower.addAmqpBroker(self.amqpBroker)
@defer.inlineCallbacks
def tearDown(self):
yield self.AckServer.stopListening()
yield self.deliverSmHttpThrower.stopService()
yield DeliverSmSMSCTestCase.tearDown(self)
@defer.inlineCallbacks
def prepareRoutingsAndStartConnector(self, connector):
self.AckServerResource.render_GET = mock.Mock(wraps=self.AckServerResource.render_GET)
# Prepare for routing
connector.port = self.SMSCPort.getHost().port
c2_destination = HttpConnector(id_generator(), 'http://127.0.0.1:%s/send' % self.AckServer.getHost().port)
# Set the route
yield self.moroute_add(DefaultRoute(c2_destination), 0)
# Now we'll create the connector 1
yield self.SMPPClientManagerPBProxy.connect('127.0.0.1', self.CManagerPort)
c1Config = SMPPClientConfig(id=connector.cid, port=connector.port)
yield self.SMPPClientManagerPBProxy.add(c1Config)
# Start the connector
yield self.SMPPClientManagerPBProxy.start(connector.cid)
# Wait for 'BOUND_TRX' state
while True:
ssRet = yield self.SMPPClientManagerPBProxy.session_state(connector.cid)
if ssRet == 'BOUND_TRX':
break
else:
time.sleep(0.2)
@defer.inlineCallbacks
def stopConnector(self, connector):
# Disconnect the connector
yield self.SMPPClientManagerPBProxy.stop(connector.cid)
# Wait for 'NONE' state
while True:
ssRet = yield self.SMPPClientManagerPBProxy.session_state(connector.cid)
if ssRet == 'NONE':
break
else:
time.sleep(0.2)
@defer.inlineCallbacks
def triggerDeliverSmFromSMSC(self, pdus):
for pdu in pdus:
yield self.SMSCPort.factory.lastClient.trigger_deliver_sm(pdu)
# Wait 3 seconds
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
@defer.inlineCallbacks
def test_delivery_HttpConnector(self):
yield self.connect('127.0.0.1', self.pbPort)
# Connect to SMSC
source_connector = Connector(id_generator())
yield self.prepareRoutingsAndStartConnector(source_connector)
# Send a deliver_sm from the SMSC
pdu = DeliverSM(
source_addr='1234',
destination_addr='4567',
short_message='any content',
)
yield self.triggerDeliverSmFromSMSC([pdu])
# Run tests
# Test callback in router
self.assertEqual(self.pbRoot_f.deliver_sm_callback.call_count, 1)
# Destination connector must receive the message one time (no retries)
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Assert received args
receivedHttpReq = self.AckServerResource.last_request.args
self.assertEqual(len(receivedHttpReq), 7)
self.assertEqual(receivedHttpReq['from'], [pdu.params['source_addr']])
self.assertEqual(receivedHttpReq['to'], [pdu.params['destination_addr']])
self.assertEqual(receivedHttpReq['content'], [pdu.params['short_message']])
self.assertEqual(receivedHttpReq['origin-connector'], [source_connector.cid])
# Disconnect from SMSC
yield self.stopConnector(source_connector)
@defer.inlineCallbacks
def test_long_content_delivery_SAR_HttpConnector(self):
yield self.connect('127.0.0.1', self.pbPort)
# Connect to SMSC
source_connector = Connector(id_generator())
yield self.prepareRoutingsAndStartConnector(source_connector)
# Send a deliver_sm from the SMSC
basePdu = DeliverSM(
source_addr = '1234',
destination_addr = '4567',
short_message = '',
sar_total_segments = 3,
sar_msg_ref_num = int(id_generator(size = 2, chars=string.digits)),
)
pdu_part1 = copy.deepcopy(basePdu)
pdu_part2 = copy.deepcopy(basePdu)
pdu_part3 = copy.deepcopy(basePdu)
pdu_part1.params['short_message'] = '__1st_part_with_153_char________________________________________________________________________________________________________________________________.'
pdu_part1.params['sar_segment_seqnum'] = 1
pdu_part2.params['short_message'] = '__2nd_part_with_153_char________________________________________________________________________________________________________________________________.'
pdu_part2.params['sar_segment_seqnum'] = 2
pdu_part3.params['short_message'] = '__3rd_part_end.'
pdu_part3.params['sar_segment_seqnum'] = 3
yield self.triggerDeliverSmFromSMSC([pdu_part1, pdu_part2, pdu_part3])
# Run tests
# Destination connector must receive the message one time (no retries)
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Assert received args
receivedHttpReq = self.AckServerResource.last_request.args
self.assertEqual(len(receivedHttpReq), 7)
self.assertEqual(receivedHttpReq['from'], [basePdu.params['source_addr']])
self.assertEqual(receivedHttpReq['to'], [basePdu.params['destination_addr']])
self.assertEqual(receivedHttpReq['content'], [pdu_part1.params['short_message'] + pdu_part2.params['short_message'] + pdu_part3.params['short_message']])
self.assertEqual(receivedHttpReq['origin-connector'], [source_connector.cid])
# Disconnect from SMSC
yield self.stopConnector(source_connector)
@defer.inlineCallbacks
def test_long_content_delivery_UDH_HttpConnector(self):
yield self.connect('127.0.0.1', self.pbPort)
# Connect to SMSC
source_connector = Connector(id_generator())
yield self.prepareRoutingsAndStartConnector(source_connector)
# Build a UDH
baseUdh = []
baseUdh.append(struct.pack('!B', 5)) # Length of User Data Header
baseUdh.append(struct.pack('!B', 0)) # Information Element Identifier, equal to 00 (Concatenated short messages, 8-bit reference number)
baseUdh.append(struct.pack('!B', 3)) # Length of the header, excluding the first two fields; equal to 03
baseUdh.append(struct.pack('!B', int(id_generator(size = 2, chars=string.digits)))) # msg_ref_num
baseUdh.append(struct.pack('!B', 3)) # total_segments
# Send a deliver_sm from the SMSC
basePdu = DeliverSM(
source_addr = '1234',
destination_addr = '4567',
short_message = '',
esm_class = EsmClass(EsmClassMode.DEFAULT, EsmClassType.DEFAULT, [EsmClassGsmFeatures.UDHI_INDICATOR_SET]),
)
pdu_part1 = copy.deepcopy(basePdu)
udh_part1 = copy.deepcopy(baseUdh)
pdu_part2 = copy.deepcopy(basePdu)
udh_part2 = copy.deepcopy(baseUdh)
pdu_part3 = copy.deepcopy(basePdu)
udh_part3 = copy.deepcopy(baseUdh)
udh_part1.append(struct.pack('!B', 1)) # segment_seqnum
pdu_part1.params['more_messages_to_send'] = MoreMessagesToSend.MORE_MESSAGES
pdu_part1.params['short_message'] = ''.join(udh_part1)+'__1st_part_with_153_char________________________________________________________________________________________________________________________________.'
udh_part2.append(struct.pack('!B', 2)) # segment_seqnum
pdu_part2.params['more_messages_to_send'] = MoreMessagesToSend.MORE_MESSAGES
pdu_part2.params['short_message'] = ''.join(udh_part2)+'__2nd_part_with_153_char________________________________________________________________________________________________________________________________.'
udh_part3.append(struct.pack('!B', 3)) # segment_seqnum
pdu_part3.params['more_messages_to_send'] = MoreMessagesToSend.NO_MORE_MESSAGES
pdu_part3.params['short_message'] = ''.join(udh_part3)+'__3rd_part_end.'
yield self.triggerDeliverSmFromSMSC([pdu_part1, pdu_part2, pdu_part3])
# Run tests
# Destination connector must receive the message one time (no retries)
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Assert received args
receivedHttpReq = self.AckServerResource.last_request.args
self.assertEqual(len(receivedHttpReq), 7)
self.assertEqual(receivedHttpReq['from'], [basePdu.params['source_addr']])
self.assertEqual(receivedHttpReq['to'], [basePdu.params['destination_addr']])
self.assertEqual(receivedHttpReq['content'], [pdu_part1.params['short_message'][6:] + pdu_part2.params['short_message'][6:] + pdu_part3.params['short_message'][6:]])
self.assertEqual(receivedHttpReq['origin-connector'], [source_connector.cid])
# Disconnect from SMSC
yield self.stopConnector(source_connector)
@defer.inlineCallbacks
def test_unordered_long_content_delivery_HttpConnector(self):
yield self.connect('127.0.0.1', self.pbPort)
# Connect to SMSC
source_connector = Connector(id_generator())
yield self.prepareRoutingsAndStartConnector(source_connector)
# Send a deliver_sm from the SMSC
basePdu = DeliverSM(
source_addr = '1234',
destination_addr = '4567',
short_message = '',
sar_total_segments = 3,
sar_msg_ref_num = int(id_generator(size = 2, chars=string.digits)),
)
pdu_part1 = copy.deepcopy(basePdu)
pdu_part2 = copy.deepcopy(basePdu)
pdu_part3 = copy.deepcopy(basePdu)
pdu_part1.params['short_message'] = '__1st_part_with_153_char________________________________________________________________________________________________________________________________.'
pdu_part1.params['sar_segment_seqnum'] = 1
pdu_part2.params['short_message'] = '__2nd_part_with_153_char________________________________________________________________________________________________________________________________.'
pdu_part2.params['sar_segment_seqnum'] = 2
pdu_part3.params['short_message'] = '__3rd_part_end.'
pdu_part3.params['sar_segment_seqnum'] = 3
yield self.triggerDeliverSmFromSMSC([pdu_part1, pdu_part3, pdu_part2])
# Run tests
# Destination connector must receive the message one time (no retries)
self.assertEqual(self.AckServerResource.render_GET.call_count, 1)
# Assert received args
receivedHttpReq = self.AckServerResource.last_request.args
self.assertEqual(len(receivedHttpReq), 7)
self.assertEqual(receivedHttpReq['from'], [basePdu.params['source_addr']])
self.assertEqual(receivedHttpReq['to'], [basePdu.params['destination_addr']])
self.assertEqual(receivedHttpReq['content'], [pdu_part1.params['short_message'] + pdu_part2.params['short_message'] + pdu_part3.params['short_message']])
self.assertEqual(receivedHttpReq['origin-connector'], [source_connector.cid])
# Disconnect from SMSC
yield self.stopConnector(source_connector)
def test_delivery_SmppClientConnector(self):
pass
test_delivery_SmppClientConnector.skip = 'TODO: enable when the SMPP Server is implemented'
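# Note: the fixed-delay wait used throughout these tests (building an
# exitDeferred, scheduling its callback with reactor.callLater and yielding
# it) could be factored into a small reactor-friendly helper. The sketch
# below is illustrative only; the name `waitFor` is an assumption and not
# part of the original suite.
def waitFor(seconds):
"""Return a Deferred that fires after `seconds` without blocking the reactor."""
waitDeferred = defer.Deferred()
reactor.callLater(seconds, waitDeferred.callback, None)
return waitDeferred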
class BillRequestSubmitSmRespCallbackingTestCases(RouterPBProxy, HappySMSCTestCase, SubmitSmTestCaseTools):
@defer.inlineCallbacks
def test_unrated_route(self):
yield self.connect('127.0.0.1', self.pbPort)
yield self.prepareRoutingsAndStartConnector()
# Mock callback
self.pbRoot_f.bill_request_submit_sm_resp_callback = mock.Mock(self.pbRoot_f.bill_request_submit_sm_resp_callback)
self.params['content'] = composeMessage({'_'}, 200)
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
yield getPage(baseurl, method = self.method, postdata = self.postdata)
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
yield self.stopSmppClientConnectors()
# Run tests
# Unrated route will not callback, nothing to bill
self.assertEqual(self.pbRoot_f.bill_request_submit_sm_resp_callback.call_count, 0)
@defer.inlineCallbacks
def test_rated_route(self):
yield self.connect('127.0.0.1', self.pbPort)
mt_c = MtMessagingCredential()
mt_c.setQuota('balance', 2.0)
mt_c.setQuota('early_decrement_balance_percent', 10)
user = User(1, Group(1), 'username', 'password', mt_c)
yield self.prepareRoutingsAndStartConnector(route_rate = 1.0, user = user)
self.params['content'] = composeMessage({'_'}, 10)
baseurl = 'http://127.0.0.1:1401/send?%s' % urllib.urlencode(self.params)
# Send a MT
yield getPage(baseurl, method = self.method, postdata = self.postdata)
# Wait 3 seconds for submit_sm_resp
exitDeferred = defer.Deferred()
reactor.callLater(3, exitDeferred.callback, None)
yield exitDeferred
yield self.stopSmppClientConnectors()
# Run tests
# Rated route will callback with a bill
self.assertEqual(self.pbRoot_f.bill_request_submit_sm_resp_callback.call_count, 1)
e5f545d10a6ae5f111dafc0ef17cc1c1804d867c | 83 | py | Python | optimization/common/base/__init__.py | AICryptoGroup/TorchSlim | 5e1a5eb994b7b22e226cce9ee3849a623ddaacb7 | [
"Apache-2.0"
] | 5 | 2022-03-11T09:35:33.000Z | 2022-03-26T14:47:03.000Z | optimization/common/base/__init__.py | AICryptoGroup/TorchSlim | 5e1a5eb994b7b22e226cce9ee3849a623ddaacb7 | [
"Apache-2.0"
] | null | null | null | optimization/common/base/__init__.py | AICryptoGroup/TorchSlim | 5e1a5eb994b7b22e226cce9ee3849a623ddaacb7 | [
"Apache-2.0"
] | 1 | 2022-03-11T09:47:28.000Z | 2022-03-11T09:47:28.000Z | from .compressor import Compressor, PrunerSchema, CompressorSchema, QuantizerSchema | 83 | 83 | 0.879518 | 7 | 83 | 10.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072289 | 83 | 1 | 83 | 83 | 0.948052 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
daa48f1949ff2ae283e978edc1131ac68cb8f40f | 35 | py | Python | tests/test_setup.py | iklasky/timemachines | 1820fa9453d31d4daaeff75274a935c7455febe3 | [
"MIT"
] | 253 | 2021-01-08T17:33:30.000Z | 2022-03-21T17:32:36.000Z | tests/test_setup.py | iklasky/timemachines | 1820fa9453d31d4daaeff75274a935c7455febe3 | [
"MIT"
] | 65 | 2021-01-20T16:43:35.000Z | 2022-03-30T19:07:22.000Z | tests/test_setup.py | iklasky/timemachines | 1820fa9453d31d4daaeff75274a935c7455febe3 | [
"MIT"
] | 28 | 2021-02-04T14:58:30.000Z | 2022-01-17T04:35:17.000Z |
def test_nothin():
assert True | 11.666667 | 18 | 0.685714 | 5 | 35 | 4.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.228571 | 35 | 3 | 19 | 11.666667 | 0.851852 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
dab04babcae8febc5b1a0dfc6ef3b1bba6029cb6 | 203 | py | Python | polyaxon/activitylogs/events/repo.py | elyase/polyaxon | 1c19f059a010a6889e2b7ea340715b2bcfa382a0 | [
"MIT"
] | null | null | null | polyaxon/activitylogs/events/repo.py | elyase/polyaxon | 1c19f059a010a6889e2b7ea340715b2bcfa382a0 | [
"MIT"
] | null | null | null | polyaxon/activitylogs/events/repo.py | elyase/polyaxon | 1c19f059a010a6889e2b7ea340715b2bcfa382a0 | [
"MIT"
] | null | null | null | import activitylogs
from event_manager.events import repo
activitylogs.subscribe(repo.RepoCreatedEvent)
activitylogs.subscribe(repo.RepoDownloadedEvent)
activitylogs.subscribe(repo.RepoNewCommitEvent)
| 25.375 | 48 | 0.881773 | 20 | 203 | 8.9 | 0.55 | 0.353933 | 0.421348 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054187 | 203 | 7 | 49 | 29 | 0.927083 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
972807d8b86be0ce50bfabea1120b9b0f8accbe8 | 17,380 | py | Python | contrib/0.挖宝行动/youzidata-机坪跑道航空器识别/src/utils/read_image_to_list.py | huaweicloud/ModelArts-Lab | 75d06fb70d81469cc23cd422200877ce443866be | [
"Apache-2.0"
] | 1,045 | 2019-05-09T02:50:43.000Z | 2022-03-31T06:22:11.000Z | contrib/0.挖宝行动/youzidata-机坪跑道航空器识别/src/utils/read_image_to_list.py | huaweicloud/ModelArts-Lab | 75d06fb70d81469cc23cd422200877ce443866be | [
"Apache-2.0"
] | 1,468 | 2019-05-16T00:48:18.000Z | 2022-03-08T04:12:44.000Z | contrib/0.挖宝行动/youzidata-机坪跑道航空器识别/src/utils/read_image_to_list.py | huaweicloud/ModelArts-Lab | 75d06fb70d81469cc23cd422200877ce443866be | [
"Apache-2.0"
] | 1,077 | 2019-05-09T02:50:53.000Z | 2022-03-27T11:05:32.000Z | # Copyright 2018 Deep Learning Service of Huawei Cloud. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
from moxing.framework import file
import random
def get_image_list(data_path, split_spec):
"""get image list
[[image_path, label_path]]
:param data_path: data store url
    :param split_spec: fraction of samples used for training when the data has no separate evaluation set
Returns:
train_data_list,
eval_data_list,
"""
image_list_train = []
image_list_eval = []
class_name = None
file_list = file.list_directory(data_path)
donot_have_directory = True
if 'cache' in file_list:
file_list.remove('cache')
for i in file_list:
if file.is_directory(os.path.join(data_path, i)):
donot_have_directory = False
break
    if 'Images' in file_list and 'Annotations' in file_list:
image_list_train, image_list_eval, class_name = \
get_image_images_annotation(data_path, split_spec)
    elif 'train' in file_list and 'eval' in file_list:
file_list = file.list_directory(os.path.join(data_path, 'train'))
is_raw = True
if 'cache' in file_list:
file_list.remove('cache')
for i in file_list:
if file.is_directory(os.path.join(data_path, 'train', i)):
is_raw = False
break
        if 'Images' in file_list and 'Annotations' in file_list:
image_list_train, image_list_eval = get_image_train_eval(data_path)
elif 'image_to_annotation.csv' in file_list:
image_list_train, image_list_eval = get_image_csv(data_path)
elif is_raw:
image_list_train, image_list_eval = \
get_image_train_eval_raw(data_path)
else:
image_list_train, image_list_eval, class_name = \
get_image_classese_train_eval(data_path)
elif donot_have_directory:
image_list_train, image_list_eval, class_name = get_image_raw_txt(data_path, split_spec)
else:
image_list_train, image_list_eval, class_name = get_image_classese_raw(data_path, split_spec)
return image_list_train, image_list_eval, class_name
def get_image_csv(data_path):
file_list = file.list_directory(os.path.join(data_path, 'train'))
image_list_train = []
image_list_eval = []
for i in file_list:
if '.txt' in file.list_directory(os.path.join(data_path, 'train', i)):
image_list_train = os.path.join(data_path, 'train', 'image_to_annotation.csv')
image_list_eval = os.path.join(data_path, 'eval', 'image_to_annotation.csv')
break
elif '.xml' in file.list_directory(os.path.join(data_path, 'train', i)):
with file.File(os.path.join(data_path, 'train', 'image_to_annotation.csv'), 'r') as f:
for line in f.readlines()[1:]:
image_path, image_label = line.strip().split(',')
image_list_train.append([os.path.join(data_path, image_path),
os.path.join(data_path, image_label)])
with file.File(os.path.join(data_path, 'eval', 'image_to_annotation.csv'), 'r') as f:
for line in f.readlines()[1:]:
image_path, image_label = line.strip().split(',')
image_list_eval.append([os.path.join(data_path, image_path),
os.path.join(data_path, image_label)])
break
return image_list_train, image_list_eval
def get_image_images_annotation(data_path, split_spec):
"""get image list when data struct is
{
|-- data_url
|-- Images
|-- a.jpg
|-- b.jpg
...
|-- Annotations
|-- a.txt (or a.xml)
|-- b.txt (or b.xml)
...
|-- label_map_dict (optional)
}
:param data_path: data store url
    :param split_spec: fraction of samples used for training when the data has no separate evaluation set
Returns:
train_data_list,
eval_data_list,
"""
image_set = []
label_dict = {}
label_num = 0
class_name = []
# get all labeled data
image_list_set = file.list_directory(os.path.join(data_path, 'Images'))
assert not image_list_set == [], 'there is no file in data url'
for i in image_list_set:
if file.exists(os.path.join(data_path, 'Annotations', os.path.splitext(i)[0] + '.xml')):
image_set.append([os.path.join(data_path, 'Images', i),
os.path.join(data_path, 'Annotations', os.path.splitext(i)[0] + '.xml')])
elif file.exists(os.path.join(data_path, 'Annotations', os.path.splitext(i)[0] + '.txt')):
label_name = file.read(os.path.join(data_path, 'Annotations',
os.path.splitext(i)[0] + '.txt'))
if label_name not in label_dict.keys():
label_dict[label_name] = label_num
class_name.append(label_name)
label_num = label_num + 1
image_set.append([os.path.join(data_path, 'Images', i),
label_dict[label_name]])
# split data to train and eval
num_examples = len(image_set)
train_num = int(num_examples * split_spec)
shuffle_list = list(range(num_examples))
random.shuffle(shuffle_list)
image_list_train = []
image_list_eval = []
for idx, item in enumerate(shuffle_list):
if idx < train_num:
image_list_train.append(image_set[item])
else:
image_list_eval.append(image_set[item])
return image_list_train, image_list_eval, class_name
def get_image_raw_txt(data_path, split_spec):
"""get image list when data struct is
{
|-- data_url
|-- a.jpg
|-- a.txt (or a.xml)
|-- b.jpg
|-- b.txt (or b.xml)
...
|-- label_map_dict (optional)
}
:param data_path: data store url
    :param split_spec: fraction of samples used for training when the data has no separate evaluation set
Returns:
train_data_list,
eval_data_list,
"""
image_list_set = []
image_set = []
# get all labeled data
image_list_set = file.list_directory(data_path)
label_dict = {}
label_num = 0
class_name = []
assert not image_list_set == [], 'there is no file in data url'
for i in image_list_set:
        if '.xml' not in i and '.txt' not in i:
if file.exists(os.path.join(data_path, os.path.splitext(i)[0] + '.xml')):
image_set.append([os.path.join(data_path, i),
os.path.join(data_path, os.path.splitext(i)[0] + '.xml')])
elif file.exists(os.path.join(data_path, os.path.splitext(i)[0] + '.txt')):
label_name = file.read(os.path.join(data_path,
os.path.splitext(i)[0] + '.txt'))
if label_name not in label_dict.keys():
label_dict[label_name] = label_num
class_name.append(label_name)
label_num = label_num + 1
image_set.append([os.path.join(data_path, i), label_dict[label_name]])
# split data to train and eval
num_examples = len(image_set)
train_num = int(num_examples * split_spec)
shuffle_list = list(range(num_examples))
random.shuffle(shuffle_list)
image_list_train = []
image_list_eval = []
for idx, item in enumerate(shuffle_list):
if idx < train_num:
image_list_train.append(image_set[item])
else:
image_list_eval.append(image_set[item])
return image_list_train, image_list_eval, class_name
def get_image_train_eval(data_path):
"""get image list when data struct is
{
|-- data_url
|-- train
|-- Images
|-- a.jpg
...
|-- Annotations
|-- a.txt (or a.xml)
|-- label_map_dict (optional)
|-- eval
|-- Images
|-- b.jpg
...
|-- Annotations
|-- b.txt (or b.xml)
...
|-- label_map_dict (optional)
|-- label_map_dict (optional)
}
:param data_path: data store url
Returns:
train_data_list,
eval_data_list,
"""
image_list_train = []
# get all labeled train data
image_list_set = file.list_directory(os.path.join(data_path, 'train', 'Images'))
assert not image_list_set == [], 'there is no file in data url'
for i in image_list_set:
if file.exists(os.path.join(data_path, 'train', 'Annotations', os.path.splitext(i)[0] + '.xml')):
image_list_train.append([os.path.join(data_path, 'train', 'Images', i),
os.path.join(data_path, 'train', 'Annotations', os.path.splitext(i)[0] + '.xml')])
elif file.exists(os.path.join(data_path, 'train', 'Annotations', os.path.splitext(i)[0] + '.txt')):
image_list_train.append([os.path.join(data_path, 'train', 'Images', i),
file.read(os.path.join(data_path, 'train',
'Annotations',
os.path.splitext(i)[0] + '.txt'))])
# get all labeled eval data
image_list_eval = []
image_list_set = []
image_list_set = file.list_directory(os.path.join(data_path, 'eval', 'Images'))
assert not image_list_set == [], 'there is no file in data url'
for i in image_list_set:
if file.exists(os.path.join(data_path, 'eval', 'Annotations', os.path.splitext(i)[0] + '.xml')):
image_list_eval.append([os.path.join(data_path, 'eval', 'Images', i),
os.path.join(data_path, 'eval', 'Annotations', os.path.splitext(i)[0] + '.xml')])
elif file.exists(os.path.join(data_path, 'eval', 'Annotations', os.path.splitext(i)[0] + '.txt')):
image_list_eval.append([os.path.join(data_path, 'eval', 'Images', i),
file.read(os.path.join(data_path, 'eval',
'Annotations',
os.path.splitext(i)[0] + '.txt'))])
return image_list_train, image_list_eval
def get_image_train_eval_raw(data_path):
"""get image list when data struct is
{
|-- data_url
|-- train
|-- a.jpg
|-- a.txt (or a.xml)
...
|-- label_map_dict (optional)
|-- eval
|-- b.jpg
|-- b.txt (or b.xml)
...
|-- label_map_dict (optional)
}
:param data_path: data store url
Returns:
train_data_list,
eval_data_list,
"""
image_list_train = []
# get all labeled train data
image_list_set = file.list_directory(os.path.join(data_path, 'train'))
assert not image_list_set == [], 'there is no file in data url'
for i in image_list_set:
        if '.xml' not in i and '.txt' not in i:
if file.exists(os.path.join(data_path, 'train', os.path.splitext(i)[0] + '.xml')):
image_list_train.append([os.path.join(data_path, 'train', i),
os.path.join(data_path, 'train', os.path.splitext(i)[0] + '.xml')])
elif file.exists(os.path.join(data_path, 'train', os.path.splitext(i)[0] + '.txt')):
image_list_train.append([os.path.join(data_path, 'train', i),
file.read(os.path.join(data_path,
'train',
os.path.splitext(i)[0] + '.txt'))])
# get all labeled eval data
image_list_eval = []
image_list_set = []
image_list_set = file.list_directory(os.path.join(data_path, 'eval'))
assert not image_list_set == [], 'there is no file in data url'
for i in image_list_set:
        if '.xml' not in i and '.txt' not in i:
if file.exists(os.path.join(data_path, 'eval', os.path.splitext(i)[0] + '.xml')):
image_list_eval.append([os.path.join(data_path, 'eval', i),
os.path.join(data_path, 'eval', os.path.splitext(i)[0] + '.xml')])
elif file.exists(os.path.join(data_path, 'eval', os.path.splitext(i)[0] + '.txt')):
image_list_eval.append([os.path.join(data_path, 'eval', i),
file.read(os.path.join(data_path, 'eval',
os.path.splitext(i)[0] + '.txt'))])
return image_list_train, image_list_eval
def get_image_classese_raw(data_path, split_spec):
"""get image list when data struct is
{
|-- data_url
|-- class_1
|-- a.jpg
|-- b.jpg
|-- class_2
|-- c.jpg
|-- d.jpg
...
|-- label_map_dict (optional)
}
:param data_path: data store url
Returns:
train_data_list,
eval_data_list,
"""
image_set = []
class_name = []
# get all labeled train data
    # Filter with a new list: calling remove() while iterating skips elements.
    image_list_set = [i for i in file.list_directory(data_path)
                      if file.is_directory(os.path.join(data_path, i))]
    assert not image_list_set == [], 'there is no file in data url'
    label_index = 0
    for i in image_list_set:
        if file.is_directory(os.path.join(data_path, i)):
            img_list = file.list_directory(os.path.join(data_path, i))
            # Record each class name once, not once per image.
            class_name.append(i)
            for j in img_list:
                if '.xml' not in j and '.txt' not in j:
                    image_set.append([os.path.join(data_path, i, j), label_index])
            label_index += 1
# split to train and eval
image_list_train = []
image_list_eval = []
start_examples = 0
for i in image_list_set:
image_list_set = file.list_directory(os.path.join(data_path, i))
num_examples = len(image_list_set)
train_num = int(num_examples * split_spec)
shuffle_list = list(range(start_examples, start_examples + num_examples))
random.shuffle(shuffle_list)
for idx, item in enumerate(shuffle_list):
if idx < train_num:
image_list_train.append(image_set[item])
else:
image_list_eval.append(image_set[item])
start_examples += num_examples
return image_list_train, image_list_eval, class_name
def get_image_classese_train_eval(data_path):
"""get image list when data struct is
{
|-- data_url
|-- train
|-- class_1
|-- a.jpg
...
|-- class_2
|-- b.jpg
...
...
|-- eval
|-- class_1
|-- c.jpg
...
|-- class_2
|-- d.jpg
}
:param data_path: data store url
Returns:
train_data_list,
eval_data_list,
"""
image_label_name = {}
image_list_train = []
label_index = 0
class_name = []
# get all labeled train data
image_list_set = file.list_directory(os.path.join(data_path, 'train'))
assert not image_list_set == [], 'there is no file in data url'
    for i in image_list_set:
        if file.is_directory(os.path.join(data_path, 'train', i)):
            img_list = file.list_directory(os.path.join(data_path, 'train', i))
            # Record the class name and its label once per class directory.
            class_name.append(i)
            image_label_name[i] = label_index
            for j in img_list:
                if '.xml' not in j and '.txt' not in j:
                    image_list_train.append([os.path.join(data_path, 'train', i, j), label_index])
            label_index += 1
# get all labeled eval data
image_list_eval = []
image_list_set = file.list_directory(os.path.join(data_path, 'eval'))
assert not image_list_set == [], 'there is no file in data url'
for i in image_list_set:
if file.is_directory(os.path.join(data_path, 'eval', i)):
img_list = file.list_directory(os.path.join(data_path, 'eval', i))
for j in img_list:
label = image_label_name[i]
                if '.xml' not in j and '.txt' not in j:
image_list_eval.append([os.path.join(data_path, 'eval', i, j), label])
return image_list_train, image_list_eval, class_name
| 41.08747 | 119 | 0.568297 | 2,306 | 17,380 | 4.023417 | 0.070252 | 0.105734 | 0.073292 | 0.102608 | 0.877452 | 0.865919 | 0.856866 | 0.830675 | 0.813322 | 0.789071 | 0 | 0.004092 | 0.310932 | 17,380 | 422 | 120 | 41.184834 | 0.770625 | 0.198101 | 0 | 0.654618 | 0 | 0 | 0.072872 | 0.008578 | 0 | 0 | 0 | 0 | 0.036145 | 1 | 0.032129 | false | 0 | 0.024096 | 0 | 0.088353 | 0.004016 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
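Each splitting helper above repeats the same pattern: shuffle index positions, then take the first `split_spec` fraction as training data and the rest as evaluation data. A standalone sketch of that split (the `seed` parameter is added here for reproducibility and is not in the original):

```python
import random


def split_train_eval(image_set, split_spec, seed=0):
    """Shuffle-split a list of samples into train/eval by fraction split_spec."""
    num_examples = len(image_set)
    train_num = int(num_examples * split_spec)
    shuffle_list = list(range(num_examples))
    random.Random(seed).shuffle(shuffle_list)
    # Indices before the cut go to train, the remainder to eval.
    train = [image_set[i] for i in shuffle_list[:train_num]]
    evaluation = [image_set[i] for i in shuffle_list[train_num:]]
    return train, evaluation


train, evaluation = split_train_eval(list(range(10)), 0.8)
```

Because `train_num` uses `int()`, the fraction is truncated, so e.g. `split_spec=0.85` over 10 samples still yields 8 training samples.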
976fec565c11ac99fdb1eb3dcd15c4a01a1b19d2 | 4,468 | py | Python | tests/unit/output/test_output_service.py | tttgm/basketball_reference_web_scraper | 2dbd9d7bacbcfee17f08bcf8629bd7d50893761d | [
"MIT"
] | 325 | 2015-10-27T03:15:49.000Z | 2022-03-16T06:49:12.000Z | tests/unit/output/test_output_service.py | tttgm/basketball_reference_web_scraper | 2dbd9d7bacbcfee17f08bcf8629bd7d50893761d | [
"MIT"
] | 173 | 2018-10-16T04:11:05.000Z | 2022-03-29T17:52:08.000Z | tests/unit/output/test_output_service.py | tttgm/basketball_reference_web_scraper | 2dbd9d7bacbcfee17f08bcf8629bd7d50893761d | [
"MIT"
] | 97 | 2016-04-09T19:11:28.000Z | 2022-03-21T09:57:50.000Z | from unittest import TestCase, mock
from basketball_reference_web_scraper.data import OutputType, OutputWriteOption
from basketball_reference_web_scraper.output.service import OutputService
from basketball_reference_web_scraper.output.writers import OutputOptions, FileOptions
class TestOutput(TestCase):
def setUp(self):
self.values = ["some values"]
self.output_file_path = "some output file path"
self.csv_writer = mock.Mock(write=mock.Mock())
self.json_writer = mock.Mock(write=mock.Mock())
self.output_service = OutputService(json_writer=self.json_writer, csv_writer=self.csv_writer)
def test_return_values_when_output_type_is_none(self):
self.assertEqual(
self.values,
self.output_service.output(
data=self.values,
options=OutputOptions(
file_options=FileOptions.of(),
formatting_options={},
output_type=None,
),
),
)
def test_output_json_when_output_write_option_is_none_and_no_custom_options(self):
options = OutputOptions(
output_type=OutputType.JSON,
file_options=FileOptions(path=self.output_file_path, mode=None),
formatting_options={},
)
self.output_service.output(data=self.values, options=options)
self.json_writer.write.assert_called_once_with(
data=self.values,
options=options
)
def test_output_json_when_output_write_option_is_append_and_no_custom_options(self):
options = OutputOptions(
output_type=OutputType.JSON,
file_options=FileOptions(path=self.output_file_path, mode=OutputWriteOption.APPEND),
formatting_options={}
)
self.output_service.output(data=self.values, options=options)
self.json_writer.write.assert_called_once_with(data=self.values, options=options)
def test_output_json_when_output_write_option_is_none_and_custom_options(self):
options = OutputOptions(
output_type=OutputType.JSON,
file_options=FileOptions(path=self.output_file_path, mode=None),
formatting_options={
"jae": "baebae",
"bae": "jadley",
}
)
self.output_service.output(data=self.values, options=options)
self.json_writer.write.assert_called_once_with(data=self.values, options=options)
def test_output_json_when_output_write_option_is_append_and_custom_options(self):
options = OutputOptions(
output_type=OutputType.JSON,
file_options=FileOptions(path=self.output_file_path, mode=OutputWriteOption.APPEND),
formatting_options={
"jae": "baebae",
"bae": "jadley",
}
)
self.output_service.output(data=self.values, options=options)
self.json_writer.write.assert_called_once_with(data=self.values, options=options)
def test_output_csv_when_output_write_option_is_none_and_no_custom_options(self):
options = OutputOptions(
output_type=OutputType.CSV,
file_options=FileOptions(path=self.output_file_path, mode=OutputWriteOption.WRITE),
formatting_options={}
)
self.output_service.output(data=self.values, options=options)
self.csv_writer.write.assert_called_once_with(data=self.values, options=options)
def test_output_csv_when_output_write_option_is_append_and_no_custom_options(self):
options = OutputOptions(
output_type=OutputType.CSV,
file_options=FileOptions(path=self.output_file_path, mode=OutputWriteOption.APPEND),
formatting_options={}
)
self.output_service.output(data=self.values, options=options)
self.csv_writer.write.assert_called_once_with(data=self.values, options=options)
def test_raise_error_when_outputting_csv_but_unable_to_write_to_file(self):
options = OutputOptions(
output_type="jaebaebae",
file_options=FileOptions(path=self.output_file_path, mode=OutputWriteOption.APPEND),
formatting_options={}
)
self.assertRaisesRegex(
ValueError,
"Unknown output type: jaebaebae",
self.output_service.output,
data=self.values,
options=options,
)
| 40.618182 | 101 | 0.675246 | 501 | 4,468 | 5.670659 | 0.129741 | 0.059838 | 0.06899 | 0.103485 | 0.804294 | 0.780711 | 0.753256 | 0.731433 | 0.715945 | 0.697994 | 0 | 0 | 0.241719 | 4,468 | 109 | 102 | 40.990826 | 0.838548 | 0 | 0 | 0.516129 | 0 | 0 | 0.023948 | 0 | 0 | 0 | 0 | 0 | 0.086022 | 1 | 0.096774 | false | 0 | 0.043011 | 0 | 0.150538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
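The tests above exercise an `OutputService` that dispatches to a JSON or CSV writer depending on the requested output type, and they verify the dispatch with mock writers. A minimal dispatcher with the same shape (the `MiniOutputService` class is a sketch, not the scraper's real `OutputService`):

```python
from unittest import mock


class MiniOutputService:
    """Routes data to a writer chosen by output type; returns data unchanged."""

    def __init__(self, json_writer, csv_writer):
        self.writers = {"json": json_writer, "csv": csv_writer}

    def output(self, data, output_type):
        if output_type is None:
            return data  # no writer requested: hand the values back
        if output_type not in self.writers:
            raise ValueError(f"Unknown output type: {output_type}")
        self.writers[output_type].write(data=data)
        return data


# Mock writers let a test assert on dispatch without touching the filesystem.
json_writer = mock.Mock()
csv_writer = mock.Mock()
service = MiniOutputService(json_writer, csv_writer)
service.output(data=[1, 2], output_type="json")
```

As in the tests above, `assert_called_once_with` can then confirm exactly one write call with the expected arguments.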
979f7c583ab516192c8e28645c003490bc85df0b | 45 | py | Python | tube_dl/__init__.py | creepysta/tube_dl | 6c7acb96ac3f119ca58ceed8abcfe65254317a41 | [
"MIT"
] | null | null | null | tube_dl/__init__.py | creepysta/tube_dl | 6c7acb96ac3f119ca58ceed8abcfe65254317a41 | [
"MIT"
] | null | null | null | tube_dl/__init__.py | creepysta/tube_dl | 6c7acb96ac3f119ca58ceed8abcfe65254317a41 | [
"MIT"
] | null | null | null | from tube_dl.__main__ import Youtube,Playlist | 45 | 45 | 0.888889 | 7 | 45 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 45 | 1 | 45 | 45 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
97a407a83c6f4685110f4f619bf7041fcd958967 | 49 | py | Python | gpcrmining/gpcrdb/__init__.py | drorlab/GPCR-mining | e1f3b2980629d83ac04e1904fdcd770c54e0d69c | [
"MIT"
] | 11 | 2021-04-22T21:41:58.000Z | 2022-01-13T14:57:34.000Z | gpcrmining/gpcrdb/__init__.py | drorlab/GPCR-mining | e1f3b2980629d83ac04e1904fdcd770c54e0d69c | [
"MIT"
] | null | null | null | gpcrmining/gpcrdb/__init__.py | drorlab/GPCR-mining | e1f3b2980629d83ac04e1904fdcd770c54e0d69c | [
"MIT"
] | 2 | 2022-02-04T01:03:43.000Z | 2022-02-04T01:59:13.000Z | from .sequence import *
from .structure import *
| 16.333333 | 24 | 0.755102 | 6 | 49 | 6.166667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 49 | 2 | 25 | 24.5 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
97a76984528ad8ccf561827f612cc558cb3a17f9 | 29,654 | py | Python | skvectors/tests/test_cartesian_3d_vector_methods_3d.py | t-o-k/scikit-vectors | 67ac86807160253b09cb5461f718b3f772ada263 | [
"BSD-3-Clause"
] | 4 | 2019-03-27T20:41:30.000Z | 2020-07-28T19:02:03.000Z | skvectors/tests/test_cartesian_3d_vector_methods_3d.py | t-o-k/scikit-vectors | 67ac86807160253b09cb5461f718b3f772ada263 | [
"BSD-3-Clause"
] | null | null | null | skvectors/tests/test_cartesian_3d_vector_methods_3d.py | t-o-k/scikit-vectors | 67ac86807160253b09cb5461f718b3f772ada263 | [
"BSD-3-Clause"
] | 1 | 2019-09-08T23:09:15.000Z | 2019-09-08T23:09:15.000Z | """
Copyright (c) 2017 Tor Olav Kristensen, http://subcube.com
https://github.com/t-o-k/scikit-vectors
Use of this source code is governed by a BSD-license that can be found in the LICENSE file.
"""
import math
import unittest
import skvectors
class Test_Case_cartesian_3d_vector(unittest.TestCase):
create_vector_class = staticmethod(skvectors.create_class_Cartesian_3D_Vector)
pi_1_4 = math.pi/4.0
pi_1_2 = math.pi/2.0
pi_3_4 = math.pi*3.0/4.0
pi_1_1 = math.pi
sqrt_2 = math.sqrt(2.0)
sqrt_3 = math.sqrt(3.0)
sqrt_6 = math.sqrt(6.0)
sqrt_8 = math.sqrt(8.0)
polar_to_cartesian = \
[
((0.0, pi_1_4, 0.0), ( 0.0, 0.0, 0.0)),
((0.0, 0.0, -pi_1_4), ( 0.0, 0.0, 0.0)),
((2.0, 0.0, 0.0), ( 2.0, 0.0, 0.0)),
((2.0, -pi_1_1, 0.0), ( -2.0, 0.0, 0.0)),
((2.0, pi_1_1, 0.0), ( -2.0, 0.0, 0.0)),
((2.0, 0.0, -pi_1_1), ( -2.0, 0.0, 0.0)),
((2.0, 0.0, pi_1_1), ( -2.0, 0.0, 0.0)),
((2.0, -pi_1_1, -pi_1_1), ( 2.0, 0.0, 0.0)),
((2.0, -pi_1_1, pi_1_1), ( 2.0, 0.0, 0.0)),
((2.0, pi_1_1, -pi_1_1), ( 2.0, 0.0, 0.0)),
((2.0, pi_1_1, pi_1_1), ( 2.0, 0.0, 0.0)),
((2.0, -pi_1_2, 0.0), ( 0.0, -2.0, 0.0)),
((2.0, pi_1_2, 0.0), ( 0.0, 2.0, 0.0)),
((2.0, 0.0, -pi_1_2), ( 0.0, 0.0, -2.0)),
((2.0, 0.0, pi_1_2), ( 0.0, 0.0, 2.0)),
((2.0, -pi_1_2, -pi_1_2), ( 0.0, 0.0, -2.0)),
((2.0, -pi_1_2, pi_1_2), ( 0.0, 0.0, 2.0)),
((2.0, pi_1_2, -pi_1_2), ( 0.0, 0.0, -2.0)),
((2.0, pi_1_2, pi_1_2), ( 0.0, 0.0, 2.0)),
((2.0, -pi_1_4, 0.0), ( sqrt_2, -sqrt_2, 0.0)),
((2.0, pi_1_4, 0.0), ( sqrt_2, sqrt_2, 0.0)),
((2.0, 0.0, -pi_1_4), ( sqrt_2, 0.0, -sqrt_2)),
((2.0, 0.0, pi_1_4), ( sqrt_2, 0.0, sqrt_2)),
((2.0, -pi_1_4, -pi_1_4), ( 1.0, -1.0, -sqrt_2)),
((2.0, -pi_1_4, pi_1_4), ( 1.0, -1.0, sqrt_2)),
((2.0, pi_1_4, -pi_1_4), ( 1.0, 1.0, -sqrt_2)),
((2.0, pi_1_4, pi_1_4), ( 1.0, 1.0, sqrt_2)),
((2.0, -pi_1_2, -pi_1_4), ( 0.0, -sqrt_2, -sqrt_2)),
((2.0, -pi_1_2, pi_1_4), ( 0.0, -sqrt_2, sqrt_2)),
((2.0, pi_1_2, -pi_1_4), ( 0.0, sqrt_2, -sqrt_2)),
((2.0, pi_1_2, pi_1_4), ( 0.0, sqrt_2, sqrt_2)),
((2.0, -pi_1_4, -pi_1_2), ( 0.0, 0.0, -2.0)),
((2.0, -pi_1_4, pi_1_2), ( 0.0, 0.0, 2.0)),
((2.0, pi_1_4, -pi_1_2), ( 0.0, 0.0, -2.0)),
((2.0, pi_1_4, pi_1_2), ( 0.0, 0.0, 2.0)),
((5.0, math.atan2(-4.0, -3.0), 0.0), (-3.0, -4.0, 0.0)),
((5.0, math.atan2(-4.0, 3.0), 0.0), ( 3.0, -4.0, 0.0)),
((5.0, math.atan2( 4.0, -3.0), 0.0), (-3.0, 4.0, 0.0)),
((5.0, math.atan2( 4.0, 3.0), 0.0), ( 3.0, 4.0, 0.0)),
((13.0, 0.0, math.atan2(-5.0, -12.0)), (-12.0, 0.0, -5.0)),
((13.0, 0.0, math.atan2(-5.0, 12.0)), ( 12.0, 0.0, -5.0)),
((13.0, 0.0, math.atan2( 5.0, -12.0)), (-12.0, 0.0, 5.0)),
((13.0, 0.0, math.atan2( 5.0, 12.0)), ( 12.0, 0.0, 5.0))
]
cartesian_to_polar = \
[
(( 0.0, 0.0, 0.0), ( 0.0, 0.0, 0.0)),
((-2.0, 0.0, 0.0), ( 2.0, pi_1_1, 0.0)),
(( 2.0, 0.0, 0.0), ( 2.0, 0.0, 0.0)),
(( 0.0, -2.0, 0.0), ( 2.0, -pi_1_2, 0.0)),
(( 0.0, 2.0, 0.0), ( 2.0, pi_1_2, 0.0)),
(( 0.0, 0.0, -2.0), ( 2.0, 0.0, -pi_1_2)),
(( 0.0, 0.0, 2.0), ( 2.0, 0.0, pi_1_2)),
((-2.0, -2.0, 0.0), (sqrt_8, -pi_3_4, 0.0)),
((-2.0, 2.0, 0.0), (sqrt_8, pi_3_4, 0.0)),
(( 2.0, -2.0, 0.0), (sqrt_8, -pi_1_4, 0.0)),
(( 2.0, 2.0, 0.0), (sqrt_8, pi_1_4, 0.0)),
((-2.0, 0.0, -2.0), (sqrt_8, pi_1_1, -pi_1_4)),
((-2.0, 0.0, 2.0), (sqrt_8, pi_1_1, pi_1_4)),
(( 2.0, 0.0, -2.0), (sqrt_8, 0.0, -pi_1_4)),
(( 2.0, 0.0, 2.0), (sqrt_8, 0.0, pi_1_4)),
(( 0.0, -2.0, -2.0), (sqrt_8, -pi_1_2, -pi_1_4)),
(( 0.0, -2.0, 2.0), (sqrt_8, -pi_1_2, pi_1_4)),
(( 0.0, 2.0, -2.0), (sqrt_8, pi_1_2, -pi_1_4)),
(( 0.0, 2.0, 2.0), (sqrt_8, pi_1_2, pi_1_4))
]
@classmethod
def setUpClass(cls):
cls.V3D = \
cls.create_vector_class(
name = 'V3D',
component_names = 'xyz',
brackets = '<>',
sep = ', ',
cnull = 0,
cunit = 1,
functions = None
)
@classmethod
def tearDownClass(cls):
del cls.V3D
def test_from_polar(self):
fail_msg = "Problem with class method 'from_polar'"
v = \
self.V3D.from_polar(
radius = 5.0,
azimuth = math.atan2(-4.0, 3.0),
inclination = -math.pi/4.0
)
x, y, z = v.component_values()
self.assertAlmostEqual(x, math.sqrt(4.5), msg=fail_msg)
self.assertAlmostEqual(y, -math.sqrt(8.0), msg=fail_msg)
self.assertAlmostEqual(z, -math.sqrt(12.5), msg=fail_msg)
def verify_from_polar(test_data):
for polar_coord, cartesian_coord in test_data:
expected_x, expected_y, expected_z = cartesian_coord
v = self.V3D.from_polar(*polar_coord)
x, y, z = v.component_values()
with self.subTest(polar_coord=polar_coord, cartesian_coord=cartesian_coord):
self.assertAlmostEqual(x, expected_x, msg=fail_msg)
self.assertAlmostEqual(y, expected_y, msg=fail_msg)
self.assertAlmostEqual(z, expected_z, msg=fail_msg)
verify_from_polar(self.polar_to_cartesian)
### TODO: Add test with negative radius. Should fail.
def test_cross(self):
fail_msg = "Problem with method 'cross'"
u = self.V3D(0.0, -1.0, 2.0)
w = self.V3D(-3.0, 4.0, -5.0)
v = self.V3D.cross(u, w)
self.assertListEqual(v.component_values(), [ -3.0, -6.0, -3.0 ], msg=fail_msg)
u = self.V3D(0, 0, 0)
w = self.V3D(0, 0, 0)
v = self.V3D.cross(u, w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(0, 0, 0)
w = self.V3D(0, -1, 2)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(-3, 4, -5)
w = self.V3D(0, 0, 0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(-1, 0, 0)
w = self.V3D(3, 0, 0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(0, 2, 0)
w = self.V3D(0, 1, 0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(0, 0, 2)
w = self.V3D(0, 0, -3)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(2, 0, 0)
w = self.V3D(0, 3, 0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, 0, 6 ], msg=fail_msg)
u = self.V3D(0, -3, 0)
w = self.V3D(-2, 0, 0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, 0, -6 ], msg=fail_msg)
u = self.V3D(0, -2, 0)
w = self.V3D(0, 0, 3)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ -6, 0, 0 ], msg=fail_msg)
u = self.V3D(0, 0, 3)
w = self.V3D(0, 2, 0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ -6, 0, 0 ], msg=fail_msg)
u = self.V3D(0, 0, -3)
w = self.V3D(2, 0, 0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, -6, 0 ], msg=fail_msg)
u = self.V3D(-3, 0, 0)
w = self.V3D(0, 0, -2)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 0, -6, 0 ], msg=fail_msg)
u = self.V3D(-2.0, -3.0, -5.0)
w = self.V3D(7.0, 13.0, 11.0)
v = u.cross(w)
self.assertListEqual(v.component_values(), [ 32.0, -13.0, -5.0 ], msg=fail_msg)
u = self.V3D(-2.0, -3.0, -5.0)
v = u.cross(-1.5)
self.assertListEqual(v.component_values(), [ -3.0, 4.5, -1.5 ], msg=fail_msg)
u = self.V3D(0, 0, 0)
id_u_before = id(u)
w = self.V3D(0, 0, 0)
v = u.cross(w)
id_v_after = id(v)
self.assertNotEqual(id_u_before, id_v_after, msg=fail_msg)
def test_stp(self):
fail_msg = "Problem with method 'stp'"
u = self.V3D(0, 0, 0)
v = self.V3D(0, 0, 0)
w = self.V3D(0, 0, 0)
r = self.V3D.stp(u, v, w)
self.assertEqual(r, 0, msg=fail_msg)
u = self.V3D(0, -1, 2)
v = self.V3D(-3, 4, -5)
w = self.V3D(3, 1, 2)
r = self.V3D.stp(u, v, w)
self.assertEqual(r, -21, msg=fail_msg)
u = self.V3D(3, 1, 2)
v = self.V3D(0, 0, 0)
w = self.V3D(-3, 4, -5)
r = u.stp(v, w)
self.assertEqual(r, 0, msg=fail_msg)
u = self.V3D(3.0, 1.0, 2.0)
v = self.V3D(0.0, -1.0, 2.0)
w = self.V3D(0.0, 0.0, 0.0)
r = u.stp(v, w)
self.assertEqual(r, 0.0, msg=fail_msg)
u = self.V3D(2.0, 1.0, 3.0)
v = self.V3D(0.0, -1.0, 2.0)
w = self.V3D(0.0, 0.0, 0.0)
r = u.stp(v, w)
self.assertEqual(r, 0.0, msg=fail_msg)
u = self.V3D(3.0, 1.0, 2.0)
v = self.V3D(-3.5, 4.5, -5.0)
w = self.V3D(0.0, 1.0, -2.5)
r = u.stp(v, w)
self.assertEqual(r, -34.5, msg=fail_msg)
u = self.V3D(3.0, 1.0, 2.0)
v = self.V3D(-3.5, 4.5, -5.0)
r = u.stp(-2.0, v)
self.assertEqual(r, 22.0, msg=fail_msg)
u = self.V3D(3.0, 1.0, 2.0)
v = self.V3D(-3.5, 4.5, -5.0)
r = u.stp(v, 3.0)
self.assertEqual(r, 33.0, msg=fail_msg)
def test_vtp(self):
fail_msg = "Problem with method 'vtp'"
u = self.V3D(0, 0, 0)
w = self.V3D(0, 0, 0)
v = self.V3D.vtp(u, w, self.V3D(0, 0, 0))
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(-3, 4, -5)
w = self.V3D(3, 1, 2)
v = self.V3D.vtp(u, w, self.V3D(0, -1, 2))
self.assertListEqual(v.component_values(), [-42, -29, 2], msg=fail_msg)
u = self.V3D(0, -1, 2)
w = self.V3D(-3, 4, 5)
v = self.V3D.vtp(u, w, self.V3D(3, 1, 2))
self.assertListEqual(v.component_values(), [ -27, 6, 3 ], msg=fail_msg)
u = self.V3D(0, 0, 0)
w = self.V3D(-3, 4, -5)
v = self.V3D(3, 1, 2).vtp(u, w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(0.0, -1.0, 2.0)
w = self.V3D(0.0, 0.0, 0.0)
v = self.V3D(3.0, 1.0, 2.0).vtp(u, w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(0.0, -1.0, 2.0)
w = self.V3D(0.0, 0.0, 0.0)
v = self.V3D(2.0, 1.0, 3.0).vtp(u, w)
self.assertListEqual(v.component_values(), [ 0, 0, 0 ], msg=fail_msg)
u = self.V3D(-3.5, 4.5, -5.0)
w = self.V3D(0.0, 1.0, -2.5)
v = self.V3D(3.0, 1.0, 2.0).vtp(u, w)
self.assertListEqual(v.component_values(), [14.0, -2.0, -20.0], msg=fail_msg)
u = self.V3D(-3.5, 4.5, -5.0)
w = self.V3D(3.0, 1.0, 2.0)
v = u.vtp(2.5, w)
self.assertListEqual(v.component_values(), [-10.0, -30.0, -20.0], msg=fail_msg)
u = self.V3D(3.0, 1.0, 2.0)
w = self.V3D(-3.5, 4.5, 5.0)
v = u.vtp(w, -3.0)
self.assertListEqual(v.component_values(), [75.0, -69.0, -78.0], msg=fail_msg)
u = self.V3D(0, 0, 0)
id_u_before = id(u)
w = self.V3D(0, 0, 0)
v = u.vtp(w, self.V3D(0, 0, 0))
id_v_after = id(v)
self.assertNotEqual(id_u_before, id_v_after, msg=fail_msg)
    def test_sin(self):
        fail_msg = "Problem with method 'sin'"
        u = self.V3D(2, 0, 0)
        w = self.V3D(3, 0, 0)
        r = self.V3D.sin(u, w)
        self.assertAlmostEqual(r, 0.0, msg=fail_msg)
        u = self.V3D(0, -3, 0)
        w = self.V3D(0, 2, 0)
        r = self.V3D.sin(u, w)
        self.assertAlmostEqual(r, 0.0, msg=fail_msg)
        u = self.V3D(-3, -4, 0)
        w = self.V3D(4, -3, 0)
        r = u.sin(w)
        self.assertAlmostEqual(r, 1.0, msg=fail_msg)
        u = self.V3D(4.5, -3.0, -1.5)
        r = u.sin(-3.0)
        self.assertAlmostEqual(r, 1.0, msg=fail_msg)
        u = self.V3D(2.0, 0.0, 0.0)
        w = self.V3D(3.0, 0.0, -3.0)
        r = u.sin(w)
        self.assertAlmostEqual(r, math.sqrt(2.0)/2.0, msg=fail_msg)
        u = self.V3D(0.0, 2.0, 0.0)
        w = self.V3D(0.0, -1.0, -math.sqrt(3.0))
        r = u.sin(w)
        self.assertAlmostEqual(r, math.sqrt(3.0)/2, msg=fail_msg)
        u = self.V3D(0, 0, 0)
        w = self.V3D(0, 0, 0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            r = u.sin(w)
        u = self.V3D(0, 0, 0)
        w = self.V3D(-3, 4, -5)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            r = u.sin(w)
        u = self.V3D(0, -1, 2)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            r = u.sin(0)
    def test_rotate_x(self):
        fail_msg = "Problem with method 'rotate_x'"
        u = self.V3D(0.0, 0.0, 0.0)
        v = self.V3D.rotate_x(u, 2*math.pi)
        self.assertListEqual(v.component_values(), [ 0.0, 0.0, 0.0 ], msg=fail_msg)
        u = self.V3D(1.0, -1.5, 2.0)
        v = u.rotate_x(0.0)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 1.0, msg=fail_msg)
        self.assertAlmostEqual(y, -1.5, msg=fail_msg)
        self.assertAlmostEqual(z, 2.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        v = u.rotate_x(math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, -3.0, msg=fail_msg)
        self.assertAlmostEqual(y, -4.0, msg=fail_msg)
        self.assertAlmostEqual(z, 5.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        v = u.rotate_x(3*math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, -3.0, msg=fail_msg)
        self.assertAlmostEqual(y, -4.0, msg=fail_msg)
        self.assertAlmostEqual(z, 5.0, msg=fail_msg)
        u = self.V3D(-3.5, 4.5, -5.5)
        v = u.rotate_x(-5*math.pi/2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, -3.5, msg=fail_msg)
        self.assertAlmostEqual(y, -5.5, msg=fail_msg)
        self.assertAlmostEqual(z, -4.5, msg=fail_msg)
        ### TODO: Add more tests
    def test_rotate_y(self):
        fail_msg = "Problem with method 'rotate_y'"
        u = self.V3D(0.0, 0.0, 0.0)
        v = self.V3D.rotate_y(u, 2*math.pi)
        self.assertListEqual(v.component_values(), [ 0.0, 0.0, 0.0 ], msg=fail_msg)
        u = self.V3D(1.0, -1.5, 2.0)
        v = u.rotate_y(0.0)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 1.0, msg=fail_msg)
        self.assertAlmostEqual(y, -1.5, msg=fail_msg)
        self.assertAlmostEqual(z, 2.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        v = u.rotate_y(math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 3.0, msg=fail_msg)
        self.assertAlmostEqual(y, 4.0, msg=fail_msg)
        self.assertAlmostEqual(z, 5.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        v = u.rotate_y(3*math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 3.0, msg=fail_msg)
        self.assertAlmostEqual(y, 4.0, msg=fail_msg)
        self.assertAlmostEqual(z, 5.0, msg=fail_msg)
        u = self.V3D(-3.5, 4.5, -5.5)
        v = u.rotate_y(-5*math.pi/2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 5.5, msg=fail_msg)
        self.assertAlmostEqual(y, 4.5, msg=fail_msg)
        self.assertAlmostEqual(z, -3.5, msg=fail_msg)
        ### TODO: Add more tests
    def test_rotate_z(self):
        fail_msg = "Problem with method 'rotate_z'"
        u = self.V3D(0.0, 0.0, 0.0)
        v = self.V3D.rotate_z(u, 2*math.pi)
        self.assertListEqual(v.component_values(), [ 0.0, 0.0, 0.0 ], msg=fail_msg)
        u = self.V3D(1.0, -1.5, 2.0)
        v = u.rotate_z(0.0)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 1.0, msg=fail_msg)
        self.assertAlmostEqual(y, -1.5, msg=fail_msg)
        self.assertAlmostEqual(z, 2.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        v = u.rotate_z(math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 3.0, msg=fail_msg)
        self.assertAlmostEqual(y, -4.0, msg=fail_msg)
        self.assertAlmostEqual(z, -5.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        v = u.rotate_z(3*math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 3.0, msg=fail_msg)
        self.assertAlmostEqual(y, -4.0, msg=fail_msg)
        self.assertAlmostEqual(z, -5.0, msg=fail_msg)
        u = self.V3D(-3.5, 4.5, -5.5)
        v = u.rotate_z(-5*math.pi/2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 4.5, msg=fail_msg)
        self.assertAlmostEqual(y, 3.5, msg=fail_msg)
        self.assertAlmostEqual(z, -5.5, msg=fail_msg)
        ### TODO: Add more tests
    def test_axis_rotate(self):
        fail_msg = "Problem with method 'axis_rotate'"
        u = self.V3D(0.0, 0.0, 0.0)
        w = self.V3D(0.0, 1.0, 0.0)
        v = self.V3D.axis_rotate(u, w, 0.0)
        self.assertListEqual(v.component_values(), [ 0.0, 0.0, 0.0 ], msg=fail_msg)
        u = self.V3D(2.0, 0.0, 0.0)
        w = self.V3D(0.0, -3.0, 0.0)
        v = self.V3D.axis_rotate(u, w, math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, -2.0, msg=fail_msg)
        self.assertAlmostEqual(y, 0.0, msg=fail_msg)
        self.assertAlmostEqual(z, 0.0, msg=fail_msg)
        u = self.V3D(0.0, 0.0, 3.0)
        w = self.V3D(-2.0, 0.0, 0.0)
        v = u.axis_rotate(w, -math.pi/2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 0.0, msg=fail_msg)
        self.assertAlmostEqual(y, -3.0, msg=fail_msg)
        self.assertAlmostEqual(z, 0.0, msg=fail_msg)
        u = self.V3D(-2.0, 3.0, 0.0)
        w = self.V3D(0.0, 0.0, 5.0)
        v = u.axis_rotate(w, math.pi/2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, -3.0, msg=fail_msg)
        self.assertAlmostEqual(y, -2.0, msg=fail_msg)
        self.assertAlmostEqual(z, 0.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        w = self.V3D(0.0, -1.0, 2.0)
        v = u.axis_rotate(w, math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 3.0, msg=fail_msg)
        self.assertAlmostEqual(y, 1.6, msg=fail_msg)
        self.assertAlmostEqual(z, -6.2, msg=fail_msg)
        u = self.V3D(3.0, -4.0, 5.0)
        w = self.V3D(0.0, 2.5, 2.0)
        v = u.axis_rotate(w, 4*math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 3.0, msg=fail_msg)
        self.assertAlmostEqual(y, -4.0, msg=fail_msg)
        self.assertAlmostEqual(z, 5.0, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        w = self.V3D(0.0, -2.5, -2.0)
        v = u.axis_rotate(w, -3*math.pi)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 3.0, msg=fail_msg)
        self.assertAlmostEqual(y, -4.0, msg=fail_msg)
        self.assertAlmostEqual(z, 5.0, msg=fail_msg)
        u = self.V3D(0.0, 0.0, 0.0)
        id_u_before = id(u)
        w = self.V3D(0.0, 1.0, 0.0)
        v = u.axis_rotate(w, 0.0)
        id_v_after = id(v)
        self.assertNotEqual(id_u_before, id_v_after, msg=fail_msg)
        u = self.V3D(0.0, 0.0, 0.0)
        w = self.V3D(0.0, 0.0, 0.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = u.axis_rotate(w, 0.0)
        u = self.V3D(-3.0, 4.0, -5.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = u.axis_rotate(0.0, 0.0)
        ### TODO: Add more tests
    def test_reorient(self):
        fail_msg = "Problem with method 'reorient'"
        u = self.V3D(0.0, 0.0, 0.0)
        w1 = self.V3D(1.0, 0.0, 0.0)
        w2 = self.V3D(0.0, 1.0, 0.0)
        v = self.V3D.reorient(u, w1, w2)
        self.assertListEqual(v.component_values(), [ 0.0, 0.0, 0.0 ], msg=fail_msg)
        u = self.V3D(2.0, -3.0, 4.0)
        w1 = self.V3D(0.0, -0.5, 0.0)
        w2 = self.V3D(0.0, -1.5, 0.0)
        v = self.V3D.reorient(u, w1, w2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 2.0, msg=fail_msg)
        self.assertAlmostEqual(y, -3.0, msg=fail_msg)
        self.assertAlmostEqual(z, 4.0, msg=fail_msg)
        u = self.V3D(2.0, -3.0, 4.0)
        w1 = self.V3D(-2.5, 0.0, 0.0)
        w2 = self.V3D(0.0, 0.0, 1.0)
        v = u.reorient(w1, w2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 4.0, msg=fail_msg)
        self.assertAlmostEqual(y, -3.0, msg=fail_msg)
        self.assertAlmostEqual(z, -2.0, msg=fail_msg)
        u = self.V3D(-1.0, 1.5, 2.0)
        w1 = self.V3D(-2.5, 2.5, 0.0)
        w2 = self.V3D(0.0, 0.5, 0.5)
        v = u.reorient(w1, w2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, 1.0, msg=fail_msg)
        self.assertAlmostEqual(y, 0.0, msg=fail_msg)
        self.assertAlmostEqual(z, 2.5, msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        w1 = self.V3D(0.5, -1.5, 2.0)
        w2 = self.V3D(0.5, -1.5, 2.0)
        v = u.reorient(w1, w2)
        x, y, z = v.component_values()
        self.assertAlmostEqual(x, -3.0, msg=fail_msg)
        self.assertAlmostEqual(y, 4.0, msg=fail_msg)
        self.assertAlmostEqual(z, -5.0, msg=fail_msg)
        u = self.V3D(2.0, -3.0, 4.0)
        w1 = self.V3D(0.0, 0.0, 0.0)
        w2 = self.V3D(1.0, 0.0, 0.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = self.V3D.reorient(u, w1, w2)
        u = self.V3D(-1.0, -1.5, 2.0)
        w1 = self.V3D(0.0, 1.0, 0.0)
        w2 = self.V3D(0.0, 0.0, 0.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = self.V3D.reorient(u, w1, w2)
        u = self.V3D(-3, 4, -5)
        w1 = self.V3D(0, 0, 0)
        w2 = self.V3D(0, 0, 0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = self.V3D.reorient(u, w1, w2)
        u = self.V3D(-3.0, 4.0, -5.0)
        w = self.V3D(0.5, -1.5, 2.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = u.reorient(w, -w)
        u = self.V3D(-3.0, 4.0, -5.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = u.reorient(0, 0)
        u = self.V3D(-3.0, 4.0, -5.0)
        w = self.V3D(0.5, -1.5, 2.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = u.reorient(0, w)
        u = self.V3D(-3.0, 4.0, -5.0)
        w = self.V3D(0.5, -1.5, 2.0)
        with self.assertRaises(ZeroDivisionError, msg=fail_msg):
            v = u.reorient(w, 0)
        ### TODO: Add more tests
    def test_are_parallel(self):
        fail_msg = "Problem with method 'are_parallel'"
        u = self.V3D(0, 0, 0)
        w = self.V3D(0, 0, 0)
        self.assertTrue(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        w = self.V3D(0.0, 0.0, 0.0)
        self.assertTrue(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(0.0, 0.0, 0.0)
        w = self.V3D(-1.0, -1.5, 2.0)
        self.assertTrue(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(-3, 4, -5)
        w = self.V3D(-3.0, 4.0, -5.0)
        self.assertTrue(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(1.5, -2.0, 2.5)
        w = self.V3D(-3.0, 4.0, -5.0)
        self.assertTrue(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(1.0, 0.0, 0.0)
        w = self.V3D(0.0, 1.0, 0.0)
        self.assertFalse(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(1.0, 0.0, 0.0)
        w = self.V3D(0.0, 0.0, 1.0)
        self.assertFalse(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(0.0, 1.0, 0.0)
        w = self.V3D(0.0, 0.0, 1.0)
        self.assertFalse(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(-3.0, 4.0, -5.0)
        w = self.V3D(-3.0, -4.0, -5.0)
        self.assertFalse(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(1.5, -2.0, 2.5)
        w = self.V3D(1.5, 2.5, -2.0)
        self.assertFalse(u.are_parallel(w), msg=fail_msg)
        u = self.V3D(1.5, 2.5, -2.0)
        w = self.V3D(-3.0, -4.0, -5.0)
        self.assertFalse(u.are_parallel(w), msg=fail_msg)
        ### TODO: Add more tests
    def test_polar_as_dict(self):
        fail_msg = "Problem with method 'polar_as_dict'"
        v = self.V3D(math.sqrt(4.5), -math.sqrt(8.0), -math.sqrt(12.5))
        polar = v.polar_as_dict()
        radius = polar['radius']
        azimuth = polar['azimuth']
        inclination = polar['inclination']
        self.assertAlmostEqual(radius, 5.0, msg=fail_msg)
        self.assertAlmostEqual(azimuth, math.atan2(-4.0, 3.0), msg=fail_msg)
        self.assertAlmostEqual(inclination, -math.pi/4, msg=fail_msg)

        def verify_polar_as_dict(test_data):
            for cartesian_coord, polar_coord in test_data:
                expected_radius, expected_azimuth, expected_inclination = polar_coord
                v = self.V3D(*cartesian_coord)
                polar = v.polar_as_dict()
                radius = polar['radius']
                azimuth = polar['azimuth']
                inclination = polar['inclination']
                with self.subTest(cartesian_coord=cartesian_coord, polar_coord=polar_coord):
                    self.assertAlmostEqual(radius, expected_radius, msg=fail_msg)
                    self.assertAlmostEqual(azimuth, expected_azimuth, msg=fail_msg)
                    self.assertAlmostEqual(inclination, expected_inclination, msg=fail_msg)

        verify_polar_as_dict(self.cartesian_to_polar)
    def test_radius(self):
        fail_msg = "Problem with property method 'radius'"
        v = self.V3D(-3.0, 0.0, 4.0)
        r = v.radius
        self.assertAlmostEqual(r, 5.0, msg=fail_msg)
        v = self.V3D(1.0, -1.0, 2.0)
        r = v.radius
        self.assertAlmostEqual(r, math.sqrt(6.0), msg=fail_msg)
        v = self.V3D(math.sqrt(4.5), -math.sqrt(8.0), -math.sqrt(12.5))
        r = v.radius
        self.assertAlmostEqual(r, 5.0, msg=fail_msg)

        def verify_radius(test_data):
            for cartesian_coord, polar_coord in test_data:
                expected_radius = polar_coord[0]
                v = self.V3D(*cartesian_coord)
                radius = v.radius
                with self.subTest(cartesian_coord=cartesian_coord, radius=expected_radius):
                    self.assertAlmostEqual(radius, expected_radius, msg=fail_msg)

        verify_radius(self.cartesian_to_polar)
    def test_azimuth(self):
        fail_msg = "Problem with property method 'azimuth'"
        v = self.V3D(1.0, -1.0, 2.0)
        r = v.azimuth
        self.assertAlmostEqual(r, -math.pi/4, msg=fail_msg)
        v = self.V3D(math.sqrt(4.5), -math.sqrt(8.0), -math.sqrt(12.5))
        r = v.azimuth
        self.assertAlmostEqual(r, math.atan2(-4.0, 3.0), msg=fail_msg)

        def verify_azimuth(test_data):
            for cartesian_coord, polar_coord in test_data:
                expected_azimuth = polar_coord[1]
                v = self.V3D(*cartesian_coord)
                azimuth = v.azimuth
                with self.subTest(cartesian_coord=cartesian_coord, azimuth=expected_azimuth):
                    self.assertAlmostEqual(azimuth, expected_azimuth, msg=fail_msg)

        verify_azimuth(self.cartesian_to_polar)
    def test_inclination(self):
        fail_msg = "Problem with property method 'inclination'"
        v = self.V3D(1.0, -1.0, 2.0)
        r = v.inclination
        self.assertAlmostEqual(r, math.atan2(2, math.sqrt(2)), msg=fail_msg)
        v = self.V3D(math.sqrt(4.5), -math.sqrt(8.0), -math.sqrt(12.5))
        r = v.inclination
        self.assertAlmostEqual(r, -math.pi/4, msg=fail_msg)

        def verify_inclination(test_data):
            for cartesian_coord, polar_coord in test_data:
                expected_inclination = polar_coord[2]
                v = self.V3D(*cartesian_coord)
                inclination = v.inclination
                with self.subTest(cartesian_coord=cartesian_coord, inclination=expected_inclination):
                    self.assertAlmostEqual(inclination, expected_inclination, msg=fail_msg)

        verify_inclination(self.cartesian_to_polar)
class Test_Case_tolerant_cartesian_3d_vector(Test_Case_cartesian_3d_vector):
    create_vector_class = staticmethod(skvectors.create_class_Tolerant_Cartesian_3D_Vector)
    # def test_are_parallel(self):


if __name__ == "__main__":
    unittest.main()
| 40.018893 | 101 | 0.514433 | 5,118 | 29,654 | 2.852872 | 0.03009 | 0.072598 | 0.061434 | 0.041093 | 0.886378 | 0.863845 | 0.821998 | 0.756729 | 0.716321 | 0.681186 | 0 | 0.115833 | 0.305659 | 29,654 | 740 | 102 | 40.072973 | 0.593298 | 0.013354 | 0 | 0.52126 | 0 | 0 | 0.018647 | 0 | 0 | 0 | 0 | 0.001351 | 0.247244 | 1 | 0.034646 | false | 0 | 0.004724 | 0 | 0.061417 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
97cfff6b2e6eb7114e4d9140f561e840bfe1b6a6 | 82 | py | Python | sdt/__init__.py | helvecioneto/sdt | e10ee2696dc767a7fe3ad0cac00af29f38ef1609 | [
"Apache-2.0"
] | 1 | 2020-07-13T17:26:30.000Z | 2020-07-13T17:26:30.000Z | sdt/__init__.py | helvecioneto/sdt | e10ee2696dc767a7fe3ad0cac00af29f38ef1609 | [
"Apache-2.0"
] | null | null | null | sdt/__init__.py | helvecioneto/sdt | e10ee2696dc767a7fe3ad0cac00af29f38ef1609 | [
"Apache-2.0"
] | null | null | null | from modules.main_menu import mainMenu
from modules.quali.dqc import main_qualify
| 27.333333 | 42 | 0.865854 | 13 | 82 | 5.307692 | 0.692308 | 0.318841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 82 | 2 | 43 | 41 | 0.932432 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c1924cf853318e438219a37bab192c346b63d0bf | 202 | py | Python | application.py | Lwando92/flaskappteam | aedd2aba15a8a6cf5bab8dd6942064301cfc0ba1 | [
"MIT"
] | null | null | null | application.py | Lwando92/flaskappteam | aedd2aba15a8a6cf5bab8dd6942064301cfc0ba1 | [
"MIT"
] | null | null | null | application.py | Lwando92/flaskappteam | aedd2aba15a8a6cf5bab8dd6942064301cfc0ba1 | [
"MIT"
] | 1 | 2019-04-11T13:54:33.000Z | 2019-04-11T13:54:33.000Z |
from src.flaskbasic import application ,db
from src.flaskbasic.form import *
from src.flaskbasic.models import *
from src.flaskbasic.db import *
if __name__ == '__main__':
    application.run(debug=True)
| 25.25 | 42 | 0.782178 | 28 | 202 | 5.357143 | 0.5 | 0.186667 | 0.453333 | 0.306667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118812 | 202 | 7 | 43 | 28.857143 | 0.842697 | 0 | 0 | 0 | 0 | 0 | 0.039801 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
c1a8da0dd4b734a70e119ea6f3eec14aff34a7e2 | 144 | py | Python | holite/GAMES/__init__.py | light-technology/holite | 29706b87b0bebf58f1ad9a3b069106c5132d00c0 | [
"MIT"
] | null | null | null | holite/GAMES/__init__.py | light-technology/holite | 29706b87b0bebf58f1ad9a3b069106c5132d00c0 | [
"MIT"
] | null | null | null | holite/GAMES/__init__.py | light-technology/holite | 29706b87b0bebf58f1ad9a3b069106c5132d00c0 | [
"MIT"
] | null | null | null | from .main import *
from .cows_and_bulls import cows_and_bulls
from .tic_tac_toe import tic_tac_toe
__all__ = ['cows_and_bulls', 'tic_tac_toe'] | 28.8 | 43 | 0.805556 | 26 | 144 | 3.846154 | 0.384615 | 0.21 | 0.36 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 144 | 5 | 43 | 28.8 | 0.78125 | 0 | 0 | 0 | 0 | 0 | 0.172414 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c1cac2e69cb764360f5ecd86e5456a4005b8bec5 | 18,806 | py | Python | ami2py/ami_symbol_facade.py | raaghulr/ami2py | a8ba8e83f91760e8ec6c3337199eef9800a80547 | [
"MIT"
] | 10 | 2020-04-30T03:22:57.000Z | 2021-09-07T09:33:20.000Z | ami2py/ami_symbol_facade.py | raaghulr/ami2py | a8ba8e83f91760e8ec6c3337199eef9800a80547 | [
"MIT"
] | 4 | 2020-09-17T01:21:53.000Z | 2022-03-27T01:49:11.000Z | ami2py/ami_symbol_facade.py | raaghulr/ami2py | a8ba8e83f91760e8ec6c3337199eef9800a80547 | [
"MIT"
] | 4 | 2020-04-07T11:47:05.000Z | 2021-12-28T16:17:28.000Z | from construct import (
Struct,
Bytes,
GreedyRange,
PaddedString,
swapbitsinbytes,
BitsSwapped,
bytes2bits,
bits2bytes,
)
from .consts import (
    DATEPACKED,
    DAY,
    MONTH,
    YEAR,
    VOLUME,
    CLOSE,
    OPEN,
    HIGH,
    LOW,
    FUT,
    RESERVED,
    MICRO_SEC,
    MILLI_SEC,
    SECOND,
    MINUTE,
    HOUR,
    AUX_1,
    AUX_2,
    TERMINATOR,
)
from .ami_bitstructs import EntryChunk
from .ami_construct import SymbolHeader
import struct
entry_map = [
    DAY,
    MONTH,
    YEAR,
    MICRO_SEC,
    MILLI_SEC,
    SECOND,
    MINUTE,
    HOUR,
    VOLUME,
    AUX_1,
    AUX_2,
    TERMINATOR,
    CLOSE,
    OPEN,
    HIGH,
    LOW,
    FUT,
]
NUM_HEADER_BYTES = 0x4A0
OVERALL_ENTRY_BYTES = 40
TERMINATOR_DOUBLE_WORD_LENGTH = 4
Master = Struct(
    "Header" / Bytes(0x4A0),
    "Symbols"
    / GreedyRange(
        Struct("Symbol" / PaddedString(5, "ASCII"), "Rest" / Bytes(1172 - 5))
    ),
)

SymbolConstruct = Struct(
    "Header" / Bytes(0x4A0), "Entries" / GreedyRange(BitsSwapped(EntryChunk))
)
class AmiSymbolFacade:
    def __init__(self, binary):
        self.data = binary

    def __setitem__(self, key, item):
        self.__dict__[key] = item

    def __getitem__(self, key):
        return self.__dict__[key]

    def __repr__(self):
        return repr(self.__dict__)

    def __len__(self):
        return len(self.__dict__)

    def __delitem__(self, key):
        del self.__dict__[key]

    def clear(self):
        return self.__dict__.clear()

    def copy(self):
        return self.__dict__.copy()

    def has_key(self, k):
        return k in self.__dict__

    def update(self, *args, **kwargs):
        return self.__dict__.update(*args, **kwargs)

    def keys(self):
        return self.__dict__.keys()

    def values(self):
        return self.__dict__.values()

    def items(self):
        return self.__dict__.items()

    def pop(self, *args):
        return self.__dict__.pop(*args)

    def __cmp__(self, dict_):
        return self.__cmp__(self.__dict__, dict_)

    def __contains__(self, item):
        return item in self.__dict__

    def __iter__(self):
        return iter(self.__dict__)

    # def __unicode__(self):
    #     return unicode(repr(self.__dict__))
class AmiHeaderFacade:
    def __init__(self):
        pass


def reverse_bits(byte_data):
    return int("{:08b}".format(byte_data)[::-1], 2)
def read_date(date_tuple):
    values = int.from_bytes(bytes(date_tuple), "little")
    return {
        YEAR: values >> 52,
        MONTH: (values >> 48) & 0x0F,
        DAY: (values >> 43) & 0x1F,
        HOUR: (values >> 38) & 0x1F,
        MINUTE: (values >> 32) & 0x3F,
        SECOND: (values >> 26) & 0x3F,
        MILLI_SEC: (values >> 16) & 0x3FF,
        MICRO_SEC: (values >> 6) & 0x3FF,
        RESERVED: values & 0xE,
        FUT: values & 0x1,
    }
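The shifts and masks above imply a fixed little-endian bit layout for the 8-byte date word: year in the top 12 bits, then month (4 bits), day (5), hour (5), minute (6), second (6), millisecond (10), microsecond (10), a reserved field (3) and a futures flag (1). The following standalone sketch (an editor's illustration, not part of the original module) packs the date fields and decodes them back with the same masks, confirming the layout:

```python
# Hedged sketch of the packed-date layout mirrored by read_date() above.
# Standalone on purpose: it re-derives the masks instead of importing the module.
year, month, day = 2020, 6, 15
packed = ((year << 52) | (month << 48) | (day << 43)).to_bytes(8, "little")
values = int.from_bytes(packed, "little")
assert values >> 52 == year            # top 12 bits: year
assert (values >> 48) & 0x0F == month  # next 4 bits: month
assert (values >> 43) & 0x1F == day    # next 5 bits: day
```

Intraday fields (hour, minute, ...) follow the same scheme at the lower shift positions listed in `read_date`.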
def read_date_data(entrybin):
    stride = 40
    start = 0
    datapackbytes = zip(
        entrybin[start::stride],
        entrybin[start + 1 :: stride],
        entrybin[start + 2 :: stride],
        entrybin[start + 3 :: stride],
        entrybin[start + 4 :: stride],
        entrybin[start + 5 :: stride],
        entrybin[start + 6 :: stride],
        entrybin[start + 7 :: stride],
    )
    result = [el for el in map(read_date, datapackbytes)]
    return result
def create_float(float_tuple):
    return struct.unpack("<f", bytes(float_tuple))[0]


def float_to_bin(data):
    return bytearray(struct.pack("<f", data))
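The price/volume fields are stored as 4-byte little-endian IEEE-754 floats, which is exactly what `create_float`/`float_to_bin` wrap with the `"<f"` format. A small standalone round-trip (editor's sketch, not part of the original module):

```python
import struct

# Pack a value as a little-endian float32 and read it back, as the two
# helpers above do. 1.5 is exactly representable in float32, so the
# round-trip is lossless.
raw = struct.pack("<f", 1.5)
assert len(raw) == 4
assert struct.unpack("<f", raw)[0] == 1.5
```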
def date_to_bin(day, month, year, hour=0, minute=0, second=0, mic_sec=0, milli_sec=0):
    result = bytearray(8)
    result[7] = year >> 4
    result[6] = (result[6] & 0x0F) + (year << 4) & 0xF0
    result[6] = (result[6] & 0xF0) + month
    result[5] = (day << 3) + result[5] & 0xF8
    return result
# Currently reading intraday data is very difficult
# YEAR: (date_tuple[7] << 4) + ((date_tuple[6] & 0xF0) >> 4),
# MONTH: date_tuple[6] & 0x0F,
# DAY: (date_tuple[5] & 0xF8) >> 3,
# # Unreversed !!!
# HOUR: (reverse_bits(date_tuple[5]) & 0x7)
# + ((reverse_bits(date_tuple[4]) & 0xC0) >> 6),
# MINUTE: (reverse_bits(date_tuple[4]) & 0x3F),
# SECOND: (reverse_bits(date_tuple[3]) & 0xFC) >> 2,
# MILLI_SEC: (reverse_bits(date_tuple[3]) & 0x3)
# + (reverse_bits(date_tuple[2]) & 0xFF),
# MICRO_SEC: (reverse_bits(date_tuple[1]) & 0xFF)
# + ((reverse_bits(date_tuple[0]) & 0xC0) >> 6),
# RESERVED: ((reverse_bits(date_tuple[0]) & 0x1E) >> 1),
# FUT: (reverse_bits(date_tuple[0]) & 0x1),
pass
class AmiSymbolDataFacade:
    def __init__(self, binary=None):
        self.binary = binary
        self._empty = False
        self.stride = OVERALL_ENTRY_BYTES
self.default_header = b"BROKDAt5SPCE\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00
\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0
0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00X\x02\x00\x00"
        self.header = SymbolHeader.parse(self.default_header)
        self.default_header = bytearray(self.default_header)
        if not binary:
            self._empty = True
            self.binary = self.default_header + bytearray(TERMINATOR_DOUBLE_WORD_LENGTH)
            self.binentries = self.binary[NUM_HEADER_BYTES:]
            self.length = 0
            self.set_length_in_header()
            return
        enough_bytes = len(binary) >= (NUM_HEADER_BYTES + TERMINATOR_DOUBLE_WORD_LENGTH)
        if not enough_bytes:
            self._empty = True
            self.length = 0
            self.set_length_in_header()
            self.binary = self.default_header + bytearray(TERMINATOR_DOUBLE_WORD_LENGTH)
            self.binentries = self.binary[NUM_HEADER_BYTES:]
            return
        self.binary = bytearray(self.binary)
        self.header = SymbolHeader.parse(self.binary)
        self.default_header = SymbolHeader.build(self.header)
        self.binentries = bytearray(binary[NUM_HEADER_BYTES:])
        self.length = (
            len(self.binentries) - TERMINATOR_DOUBLE_WORD_LENGTH
        ) // OVERALL_ENTRY_BYTES
        self.set_length_in_header()

    def set_length_in_header(self):
        self.header["Length"] = self.length
        self.default_header = SymbolHeader.build(self.header)
        # self.default_header[-4] = self.length & 0x00ff
        # self.default_header[-3] = (self.length & 0xff00) >> 8
        # self.default_header[-2] = (self.length & 0xff0000) >> 16
        # self.default_header[-1] = (self.length & 0xff000000) >> 24
    def _create_blank_header(self):
        pass

    def __len__(self):
        return self.length

    def __getitem__(self, item):
        if self._empty:
            return []
        if type(item) == int:
            return self._get_item_by_index(item)
        if type(item) == slice:
            result = []
            start = self._convert_to_index(item.start)
            stop = self._convert_to_index(item.stop)
            step = item.step
            if item.step == None:
                step = 1
            for i in range(start, stop, step):
                result.append(self._get_item_by_index(i))
            return result

    def _convert_to_index(self, index):
        if index >= 0:
            return index
        if index < 0:
            return self.length + index

    def _get_item_by_index(self, item):
        index = item
        if item < 0:
            index = self.length + item
        start = index * self.stride
        date_tuple = self.binentries[start : (start + 8)]
        return {
            **read_date(date_tuple),
            CLOSE: create_float(self.binentries[(start + 8) : (start + 12)]),
            OPEN: create_float(self.binentries[(start + 12) : (start + 16)]),
            HIGH: create_float(self.binentries[(start + 16) : (start + 20)]),
            LOW: create_float(self.binentries[(start + 20) : (start + 24)]),
            VOLUME: create_float(self.binentries[(start + 24) : (start + 28)]),
            AUX_1: create_float(self.binentries[(start + 28) : (start + 32)]),
            AUX_2: create_float(self.binentries[(start + 32) : (start + 36)]),
            TERMINATOR: create_float(self.binentries[(start + 36) : (start + 40)]),
        }
    def __iter__(self):
        pass

    def __iadd__(self, other):
        # assert all (k in entry_map for k in other)
        minute, hour, second, micro_second, milli_second = 0, 0, 0, 0, 0
        if MINUTE in other:
            minute = other[MINUTE]
        if HOUR in other:
            hour = other[HOUR]
        if SECOND in other:
            second = other[SECOND]
        if MICRO_SEC in other:
            micro_second = other[MICRO_SEC]
        if MILLI_SEC in other:
            milli_second = other[MILLI_SEC]
        append_bin = date_to_bin(
            other[DAY],
            other[MONTH],
            other[YEAR],
            hour,
            minute,
            second,
            micro_second,
            milli_second,
        )
        append_bin += float_to_bin(other[CLOSE])
        append_bin += float_to_bin(other[OPEN])
        append_bin += float_to_bin(other[HIGH])
        append_bin += float_to_bin(other[LOW])
        if VOLUME in other:
            append_bin += float_to_bin(other[VOLUME])
        else:
            append_bin += float_to_bin(0)
        if AUX_1 in other:
            append_bin += float_to_bin(other[AUX_1])
        else:
            append_bin += float_to_bin(0)
        if AUX_2 in other:
            append_bin += float_to_bin(other[AUX_2])
        else:
            append_bin += float_to_bin(0)
        if TERMINATOR in other:
            append_bin += float_to_bin(other[TERMINATOR])
        else:
            append_bin += float_to_bin(0)
        self.binentries[
            -TERMINATOR_DOUBLE_WORD_LENGTH:-TERMINATOR_DOUBLE_WORD_LENGTH
        ] = append_bin
        self.length = (
            len(self.binentries) - TERMINATOR_DOUBLE_WORD_LENGTH
        ) // OVERALL_ENTRY_BYTES
        self.set_length_in_header()
        self.binary = self.default_header + self.binentries
        return self
class SymbolConstructFast:
    header = "Header" / Bytes(0x4A0)
    entry_chunk = BitsSwapped(EntryChunk)

    @classmethod
    def parse(cls, bin):
        binentries = bin[0x4A0:]
        num_bytes = len(binentries)
        numits, offset = divmod(num_bytes, 0x488)  # bytes
        result = {}
        result["Header"] = cls.header.parse(bin[0:0x4A0])
        result["Entries"] = []
        start = 0x4A0 - offset
        numits = numits + 1
        result["Entries"].append(cls.entry_chunk.parse(bin[0x4A0:]))
        entrybin = bin[start:]
        for i in range(numits):
            # Same stride pattern as the original unrolled list: chunk k of
            # iteration i starts at offset k * i * 40, for k = 1..29.
            result["Entries"].append(
                [
                    cls.entry_chunk.parse(entrybin[(k * i * 40) : (k * i * 40 + 40)])
                    for k in range(1, 30)
                ]
            )
        return result
c1f377f5d5742e02a9d8304693b1cd83bfc9f163 | 205 | py | Python | login.py | LauraForde/FitKit-API | 05faf0a738b410c01b02999e47a465c58864c426 | [
"MIT"
] | null | null | null | login.py | LauraForde/FitKit-API | 05faf0a738b410c01b02999e47a465c58864c426 | [
"MIT"
] | null | null | null | login.py | LauraForde/FitKit-API | 05faf0a738b410c01b02999e47a465c58864c426 | [
"MIT"
] | null | null | null | from injector import inject  # original `from flask_injector import flask_injector` imports a nonexistent name
from providers.CouchProvider import CouchProvider
@inject(data_provider=CouchProvider)
def read_users(data_provider) -> str:
return data_provider.read_login() | 34.166667 | 49 | 0.843902 | 26 | 205 | 6.384615 | 0.576923 | 0.216867 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092683 | 205 | 6 | 50 | 34.166667 | 0.892473 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
a9af4231a4d60dc35a88275b729e2fee904d5829 | 135 | py | Python | AutoMark/InstaPy/instapy/settings.py | tyanakiev/AutoMark | 4e44a9f7c448f02bc4abc05c7a45a67fc71aa3f9 | [
"MIT"
] | 1 | 2018-02-25T06:43:13.000Z | 2018-02-25T06:43:13.000Z | AutoMark/InstaPy/instapy/settings.py | tyanakiev/AutoMark | 4e44a9f7c448f02bc4abc05c7a45a67fc71aa3f9 | [
"MIT"
] | 4 | 2021-04-17T03:55:49.000Z | 2022-02-10T10:29:08.000Z | AutoMark/InstaPy/instapy/settings.py | tyanakiev/AutoMark | 4e44a9f7c448f02bc4abc05c7a45a67fc71aa3f9 | [
"MIT"
] | null | null | null | class Settings:
    database_location = "AutoMark/InstaPy/db/instapy.db"
    browser_location = "AutoMark/InstaPy/assets/chromedriver"
| 33.75 | 61 | 0.777778 | 15 | 135 | 6.866667 | 0.666667 | 0.31068 | 0.446602 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118519 | 135 | 3 | 62 | 45 | 0.865546 | 0 | 0 | 0 | 0 | 0 | 0.488889 | 0.488889 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
a9e43b4a5da880d01d115596187f9fe0b5bcabee | 1,884 | py | Python | app/forms.py | ivan4oto/balkanultra | 02d7a1b99e0ca018ae4733b2a4d64fd0832399cc | [
"MIT"
] | null | null | null | app/forms.py | ivan4oto/balkanultra | 02d7a1b99e0ca018ae4733b2a4d64fd0832399cc | [
"MIT"
] | null | null | null | app/forms.py | ivan4oto/balkanultra | 02d7a1b99e0ca018ae4733b2a4d64fd0832399cc | [
"MIT"
] | null | null | null | from django import forms
from .models import UltraAthlete, SkyAthlete
class UltraAthleteForm(forms.ModelForm):
    class Meta:
        model = UltraAthlete
        fields = [
            'first_name',
            'last_name',
            'email',
            'phone',
            'gender',
            'first_link',
            'second_link',
        ]
        widgets = {
            'first_name': forms.TextInput(attrs={'class': "form-control", 'id': 'post-first-name'}),
            'last_name': forms.TextInput(attrs={'class': "form-control", 'id': 'post-last-name'}),
            'email': forms.EmailInput(attrs={'class': "form-control", 'id': 'post-mail'}),
            'phone': forms.TextInput(attrs={'class': "form-control", 'id': 'post-phone'}),
            'gender': forms.Select(attrs={'class': 'form-select', 'id': 'post-gender'}, choices=[('male', 'male'), ('female', 'female')]),
            'first_link': forms.TextInput(attrs={'class': "form-control", 'id': 'post-first-link'}),
            'second_link': forms.TextInput(attrs={'class': "form-control", 'id': 'post-second-link'}),
        }


class SkyAthleteForm(forms.ModelForm):
    class Meta:
        model = SkyAthlete
        fields = [
            'first_name',
            'last_name',
            'email',
            'phone',
            'gender',
        ]
        widgets = {
            'first_name': forms.TextInput(attrs={'class': "form-control", 'id': 'post-first-name'}),
            'last_name': forms.TextInput(attrs={'class': "form-control", 'id': 'post-last-name'}),
            'email': forms.EmailInput(attrs={'class': "form-control", 'id': 'post-mail'}),
            'phone': forms.TextInput(attrs={'class': "form-control", 'id': 'post-phone'}),
            'gender': forms.Select(attrs={'class': 'form-select', 'id': 'post-gender'}, choices=[('male', 'male'), ('female', 'female')]),
} | 43.813953 | 138 | 0.535032 | 189 | 1,884 | 5.269841 | 0.185185 | 0.120482 | 0.168675 | 0.210843 | 0.834337 | 0.778112 | 0.778112 | 0.778112 | 0.699799 | 0.604418 | 0 | 0 | 0.2638 | 1,884 | 43 | 139 | 43.813953 | 0.718097 | 0 | 0 | 0.55 | 0 | 0 | 0.316711 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05 | 0 | 0.15 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e71d48b698f9219d9a3bf699fbbc8cdaf94e83e6 | 21 | py | Python | valheimlog/__init__.py | pstap/valheim-log-parser | ef73364f52304bce3573d65850d1e492d5061664 | [
"MIT"
] | 90 | 2020-06-18T05:32:06.000Z | 2022-03-28T13:05:17.000Z | valheimlog/__init__.py | pstap/valheim-log-parser | ef73364f52304bce3573d65850d1e492d5061664 | [
"MIT"
] | 5 | 2020-07-02T02:25:16.000Z | 2022-03-24T05:50:30.000Z | valheimlog/__init__.py | pstap/valheim-log-parser | ef73364f52304bce3573d65850d1e492d5061664 | [
"MIT"
] | 13 | 2020-06-27T07:01:54.000Z | 2022-01-18T07:31:01.000Z | from .parse import *
| 10.5 | 20 | 0.714286 | 3 | 21 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 21 | 1 | 21 | 21 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e73adf6fbb075a3c5a30a2805ec4141518509a72 | 232 | py | Python | backend/rest/MetadataInterface/plasmidClass.py | gaarangoa/ARG-inspect | 245d577f783c7ce395a730741987910097fc1981 | [
"BSD-2-Clause"
] | 6 | 2018-10-11T09:31:05.000Z | 2022-01-27T10:22:41.000Z | backend/rest/MetadataInterface/plasmidClass.py | gaarangoa/ARG-inspect | 245d577f783c7ce395a730741987910097fc1981 | [
"BSD-2-Clause"
] | 3 | 2018-05-24T22:40:09.000Z | 2021-10-12T06:53:45.000Z | backend/rest/MetadataInterface/plasmidClass.py | gaarangoa/ARG-inspect | 245d577f783c7ce395a730741987910097fc1981 | [
"BSD-2-Clause"
] | 1 | 2021-01-25T05:26:25.000Z | 2021-01-25T05:26:25.000Z | class Plasmid:
    def __init__(self, database):
        self.info = ""
        self.database = database
        self.table = 'plasmids'

    def getById(self, gene_id):
        return self.database.get_by_id(self.table, gene_id)
| 25.777778 | 59 | 0.62069 | 29 | 232 | 4.689655 | 0.517241 | 0.264706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.262931 | 232 | 8 | 60 | 29 | 0.795322 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0.142857 | 0.571429 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
99b2a3364e5e7fe60b321b895a4a47ebd53d2096 | 26 | py | Python | terrascript/mysql/__init__.py | GarnerCorp/python-terrascript | ec6c2d9114dcd3cb955dd46069f8ba487e320a8c | [
"BSD-2-Clause"
] | null | null | null | terrascript/mysql/__init__.py | GarnerCorp/python-terrascript | ec6c2d9114dcd3cb955dd46069f8ba487e320a8c | [
"BSD-2-Clause"
] | null | null | null | terrascript/mysql/__init__.py | GarnerCorp/python-terrascript | ec6c2d9114dcd3cb955dd46069f8ba487e320a8c | [
"BSD-2-Clause"
] | 1 | 2018-11-15T16:23:05.000Z | 2018-11-15T16:23:05.000Z | """2019-05-28 10:49:59"""
| 13 | 25 | 0.538462 | 6 | 26 | 2.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.583333 | 0.076923 | 26 | 1 | 26 | 26 | 0 | 0.730769 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
99db1343bad2a8698d209a71c58983c507dad561 | 179 | py | Python | service-workers/service-worker/resources/scope2/worker_interception_redirect_webworker.py | wanderview/web-platform-tests | d516af2f3ea9422f10f744225a64c50b9438ff68 | [
"BSD-3-Clause"
] | 1 | 2021-01-07T18:46:45.000Z | 2021-01-07T18:46:45.000Z | service-workers/service-worker/resources/scope2/worker_interception_redirect_webworker.py | wanderview/web-platform-tests | d516af2f3ea9422f10f744225a64c50b9438ff68 | [
"BSD-3-Clause"
] | null | null | null | service-workers/service-worker/resources/scope2/worker_interception_redirect_webworker.py | wanderview/web-platform-tests | d516af2f3ea9422f10f744225a64c50b9438ff68 | [
"BSD-3-Clause"
] | null | null | null | import os
import sys
# Use the file from the parent directory.
sys.path.append(os.path.dirname(os.path.dirname(__file__)))
from worker_interception_redirect_webworker import main
| 29.833333 | 59 | 0.826816 | 28 | 179 | 5.035714 | 0.607143 | 0.113475 | 0.184397 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094972 | 179 | 5 | 60 | 35.8 | 0.87037 | 0.217877 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
99dec279d6c45806759d2c24f00d939238834daa | 259 | py | Python | django_messages_drf/apps.py | kamikaz1k/django-messages-drf | 1e30d5d7c767d83d70c076f8b5f73d7095aabb25 | [
"MIT"
] | 9 | 2020-12-25T14:15:31.000Z | 2022-03-27T16:53:53.000Z | django_messages_drf/apps.py | kamikaz1k/django-messages-drf | 1e30d5d7c767d83d70c076f8b5f73d7095aabb25 | [
"MIT"
] | 7 | 2020-12-09T04:18:39.000Z | 2022-03-27T17:49:22.000Z | django_messages_drf/apps.py | kamikaz1k/django-messages-drf | 1e30d5d7c767d83d70c076f8b5f73d7095aabb25 | [
"MIT"
] | 7 | 2021-06-11T03:22:49.000Z | 2022-03-27T17:49:30.000Z | from django.apps import AppConfig as BaseAppConfig
from django.utils.translation import gettext_lazy as _
class MessagesDrfConfig(BaseAppConfig):
    name = "django_messages_drf"
    label = "django_messages_drf"
    verbose_name = _("Django Messages DRF")
| 28.777778 | 54 | 0.783784 | 31 | 259 | 6.290323 | 0.580645 | 0.215385 | 0.261538 | 0.215385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150579 | 259 | 8 | 55 | 32.375 | 0.886364 | 0 | 0 | 0 | 0 | 0 | 0.220077 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8242fe2fe15586d26bdb3323c0a74155fa59c8b1 | 30 | py | Python | tglog/__init__.py | sdurivau25/logfunc | cce962742385c460e49e3c5b7cf43d358149ee25 | [
"MIT"
] | null | null | null | tglog/__init__.py | sdurivau25/logfunc | cce962742385c460e49e3c5b7cf43d358149ee25 | [
"MIT"
] | null | null | null | tglog/__init__.py | sdurivau25/logfunc | cce962742385c460e49e3c5b7cf43d358149ee25 | [
"MIT"
] | null | null | null | from tglog.tglog import logger | 30 | 30 | 0.866667 | 5 | 30 | 5.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 30 | 1 | 30 | 30 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
412d7ae3f47c612954c00be34e4b2287a8f5c4c4 | 47 | py | Python | web/src/main/utils/__init__.py | sidneijp/zedev | 75d6a83d08febb795f862627811925ea18f89fca | [
"BSD-3-Clause"
] | null | null | null | web/src/main/utils/__init__.py | sidneijp/zedev | 75d6a83d08febb795f862627811925ea18f89fca | [
"BSD-3-Clause"
] | null | null | null | web/src/main/utils/__init__.py | sidneijp/zedev | 75d6a83d08febb795f862627811925ea18f89fca | [
"BSD-3-Clause"
] | null | null | null | from .cnpj import *
from .env_parsers import *
| 15.666667 | 26 | 0.744681 | 7 | 47 | 4.857143 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170213 | 47 | 2 | 27 | 23.5 | 0.871795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
68b91c2434b150fc63a891cde5200064d884b436 | 36 | py | Python | Lib/site-packages/QTileLayout/__init__.py | fochoao/cpython | 3dc84b260e5bced65ebc2c45c40c8fa65f9b5aa9 | [
"bzip2-1.0.6",
"0BSD"
] | null | null | null | Lib/site-packages/QTileLayout/__init__.py | fochoao/cpython | 3dc84b260e5bced65ebc2c45c40c8fa65f9b5aa9 | [
"bzip2-1.0.6",
"0BSD"
] | 20 | 2021-05-03T18:02:23.000Z | 2022-03-12T12:01:04.000Z | Lib/site-packages/QTileLayout/__init__.py | fochoao/cpython | 3dc84b260e5bced65ebc2c45c40c8fa65f9b5aa9 | [
"bzip2-1.0.6",
"0BSD"
] | null | null | null | from .tileLayout import QTileLayout
| 18 | 35 | 0.861111 | 4 | 36 | 7.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 36 | 1 | 36 | 36 | 0.96875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
68f4eed2de75d2eceff0dd4416be24f7128c746c | 191 | py | Python | server/mturk/forms.py | BarracudaPff/code-golf-data-pythpn | 42e8858c2ebc6a061012bcadb167d29cebb85c5e | [
"MIT"
] | null | null | null | server/mturk/forms.py | BarracudaPff/code-golf-data-pythpn | 42e8858c2ebc6a061012bcadb167d29cebb85c5e | [
"MIT"
] | null | null | null | server/mturk/forms.py | BarracudaPff/code-golf-data-pythpn | 42e8858c2ebc6a061012bcadb167d29cebb85c5e | [
"MIT"
] | null | null | null | from django import forms
class MTurkAssignmentForm(forms.Form):
data = forms.CharField(required=True)
action_log = forms.CharField(required=True)
feedback = forms.CharField(required=False) | 38.2 | 44 | 0.811518 | 24 | 191 | 6.416667 | 0.625 | 0.272727 | 0.428571 | 0.337662 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089005 | 191 | 5 | 45 | 38.2 | 0.885057 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
68f76296263aeef2158e6f391c0c70c9aee47795 | 5,968 | py | Python | tests/test_cors.py | ausecocloud/pyramid_cors | c79785aff56456bf7543dd17e766492b3953f037 | [
"Apache-2.0"
] | null | null | null | tests/test_cors.py | ausecocloud/pyramid_cors | c79785aff56456bf7543dd17e766492b3953f037 | [
"Apache-2.0"
] | null | null | null | tests/test_cors.py | ausecocloud/pyramid_cors | c79785aff56456bf7543dd17e766492b3953f037 | [
"Apache-2.0"
] | null | null | null | import re
import unittest

from pyramid import testing
from pyramid.request import Request


class TestCORS(unittest.TestCase):

    def setUp(self):
        self.config = testing.setUp()
        self.config.registry.settings.update({
            'cors.Access-Control-Allow-Origin': '*'
        })
        self.config.include('pyramid_cors')
        # This should come first to intercept preflight requests
        self.config.add_cors_preflight_handler()

        # add a test view
        def view(request):
            return 'Hello'

        self.config.add_view(view, name='cors', cors=True, renderer='json', request_method='POST')
        self.config.add_view(view, name='nocors', renderer='json', request_method='GET')

    def tearDown(self):
        del self.config
        testing.tearDown()

    def test_deriver_registered(self):
        from pyramid.interfaces import IViewDerivers
        derivers = self.config.registry.getUtility(IViewDerivers)
        dlist = {d for (d, _) in derivers.sorted()}
        self.assertIn('cors_view', dlist)

    def test_default_headers(self):
        origin = 'http://example.com'
        app = self.config.make_wsgi_app()
        request = Request.blank('/cors', base_url=origin)
        request.method = 'POST'
        request.headers['Origin'] = origin
        response = request.get_response(app)
        self.assertTrue(response.headers.get('Access-Control-Allow-Methods'))
        self.assertTrue(response.headers.get('Access-Control-Allow-Headers'))
        self.assertTrue(response.headers.get('Access-Control-Allow-Origin'))
        self.assertEqual(response.headers['Access-Control-Allow-Origin'], '*')

    def test_preflight_default_headers(self):
        origin = 'http://example.com'
        app = self.config.make_wsgi_app()
        request = Request.blank('/cors', base_url=origin)
        request.method = 'OPTIONS'
        request.headers['Origin'] = origin
        request.headers['Access-Control-Request-Method'] = 'GET'
        response = request.get_response(app)
        self.assertTrue(response.headers.get('Access-Control-Allow-Methods'))
        self.assertTrue(response.headers.get('Access-Control-Allow-Headers'))
        self.assertTrue(response.headers.get('Access-Control-Allow-Origin'))
        self.assertEqual(response.headers['Access-Control-Allow-Origin'], '*')

    def test_preflight_allowed_origins_ok(self):
        origin = 'http://example.com'
        self.config.registry.settings['cors.allowed_origins'] = re.compile('http://example.com')
        app = self.config.make_wsgi_app()
        request = Request.blank('/cors', base_url=origin)
        request.method = 'OPTIONS'
        request.headers['Origin'] = origin
        request.headers['Access-Control-Request-Method'] = 'GET'
        response = request.get_response(app)
        self.assertEqual(response.headers['Access-Control-Allow-Origin'], 'http://example.com')

    def test_preflight_allowed_origins_fail(self):
        origin = 'http://example.org'
        self.config.registry.settings['cors.allowed_origins'] = re.compile('http://example.com')
        app = self.config.make_wsgi_app()
        request = Request.blank('/cors', base_url=origin)
        request.method = 'OPTIONS'
        request.headers['Origin'] = origin
        request.headers['Access-Control-Request-Method'] = 'GET'
        response = request.get_response(app)
        self.assertNotIn('Access-Control-Allow-Origin', response.headers)

    def test_preflight_allowed_sub_domain_ok(self):
        origin = 'http://sub.example.com'
        self.config.registry.settings['cors.allowed_origins'] = re.compile('http://.*example.com')
        app = self.config.make_wsgi_app()
        request = Request.blank('/cors', base_url=origin)
        request.method = 'OPTIONS'
        request.headers['Origin'] = origin
        request.headers['Access-Control-Request-Method'] = 'GET'
        response = request.get_response(app)
        self.assertEqual(response.headers['Access-Control-Allow-Origin'], 'http://sub.example.com')

    def test_preflight_allowed_sub_domain_fail(self):
        origin = 'http://sub.example.com'
        self.config.registry.settings['cors.allowed_origins'] = re.compile('http://example.com')
        app = self.config.make_wsgi_app()
        request = Request.blank('/cors', base_url=origin)
        request.method = 'OPTIONS'
        request.headers['Origin'] = origin
        request.headers['Access-Control-Request-Method'] = 'GET'
        response = request.get_response(app)
        self.assertNotIn('Access-Control-Allow-Origin', response.headers)

    def test_nocors_view_preflight(self):
        origin = 'http://example.com'
        app = self.config.make_wsgi_app()
        request = Request.blank('/nocors', base_url=origin)
        request.method = 'OPTIONS'
        request.headers['Origin'] = origin
        request.headers['Access-Control-Request-Method'] = 'GET'
        response = request.get_response(app)
        self.assertIn('Access-Control-Allow-Methods', response.headers)
        self.assertEqual(response.headers.get('Access-Control-Allow-Origin'), '*')

    def test_nocors_view(self):
        origin = 'http://example.com'
        app = self.config.make_wsgi_app()
        request = Request.blank('/nocors', base_url=origin)
        request.method = 'GET'
        request.headers['Origin'] = origin
        response = request.get_response(app)
        self.assertNotIn('Access-Control-Allow-Methods', response.headers)
        self.assertNotIn('Access-Control-Allow-Origin', response.headers)

    def test_no_origin(self):
        origin = 'http://example.com'
        app = self.config.make_wsgi_app()
        request = Request.blank('/cors', base_url=origin)
        request.method = 'POST'
        response = request.get_response(app)
        self.assertNotIn('Access-Control-Allow-Methods', response.headers)
        self.assertNotIn('Access-Control-Allow-Origin', response.headers)
| 43.562044 | 99 | 0.663706 | 688 | 5,968 | 5.638081 | 0.125 | 0.083784 | 0.088167 | 0.074246 | 0.818252 | 0.801237 | 0.760763 | 0.738592 | 0.738592 | 0.738592 | 0 | 0 | 0.199397 | 5,968 | 136 | 100 | 43.882353 | 0.811846 | 0.011729 | 0 | 0.649123 | 0 | 0 | 0.218151 | 0.118575 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.114035 | false | 0 | 0.04386 | 0.008772 | 0.175439 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
68fc0b0a4a53f74371362673c48196aee1187a2e | 12,641 | py | Python | scripts/noise/unit_impulse.py | gferragu/gmprocess_scratchpaper | b267f2883b0de9260471296116009d359bc02c8a | [
"MIT"
] | null | null | null | scripts/noise/unit_impulse.py | gferragu/gmprocess_scratchpaper | b267f2883b0de9260471296116009d359bc02c8a | [
"MIT"
] | null | null | null | scripts/noise/unit_impulse.py | gferragu/gmprocess_scratchpaper | b267f2883b0de9260471296116009d359bc02c8a | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from scipy import signal
import numpy as np
import matplotlib.pyplot as plt
# def make_test_signal(A=1, f=5, t_min=0, t_max=60, samp_interv=0.001):
# # sampling_rate = 1 / samp_interv
# nsamp = int((t_max - t_min) / samp_interv)
# t = np.linspace(t_min, t_max, nsamp)
# signal = A*np.sin((2*np.pi)*t)
# return t, signal
def get_impulse(tmin=0, tmax=100, fs=1, delta=0.01):
    imp = signal.unit_impulse((tmax - tmin), 'mid')
    t = np.arange(tmin, tmax)
    return t, imp
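`get_impulse` above wraps `scipy.signal.unit_impulse`; a discrete delta has a perfectly flat magnitude spectrum, and shifting it (the `'mid'` placement) changes only the phase. A minimal sketch checking that property:

```python
import numpy as np
from scipy import signal

# Delta at the midpoint (sample 50 of 100)
imp = signal.unit_impulse(100, 'mid')
spectrum = np.abs(np.fft.fft(imp))

# |FFT| of a shifted delta is exactly 1 in every frequency bin
assert np.allclose(spectrum, 1.0)
```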
def get_psd(series, fs=1.0, nfft=None, nperseg=10, noverlap=None, scaling='density'):
    if nfft is None and noverlap is None:
        nfft = len(series)
        nperseg = int(len(series) / 100)
        noverlap = int(nperseg / 2)
    freqs, psd = signal.welch(series, fs=fs, nfft=nfft, nperseg=nperseg, noverlap=noverlap, scaling=scaling)
    return freqs, psd
def get_ps(series, fs=1.0, nfft=None, nperseg=10, noverlap=None, scaling='spectrum'):
    freqs, ps = signal.welch(series, fs=fs, nfft=nfft, nperseg=nperseg, noverlap=noverlap, scaling=scaling)
    return freqs, ps
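The difference between the two Welch scalings above is the window normalization: integrating a `'density'` PSD over frequency recovers the signal's mean-square power, while `'spectrum'` reads peak power directly. A quick sketch checking the density scaling (the test signal and tolerance are illustrative, not from the original script):

```python
import numpy as np
from scipy import signal

fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 5 * t)  # unit-amplitude 5 Hz sine, mean power A**2 / 2 = 0.5

freqs, psd = signal.welch(x, fs=fs, nperseg=256, scaling='density')
power = np.trapz(psd, freqs)   # integrate the one-sided PSD over frequency

# Integrated PSD approximates the time-domain mean-square power
assert abs(power - 0.5) < 0.05
```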
def psd_to_fft_sqrt(spectrum):
    noise = np.zeros(len(spectrum))
    for idx, power in enumerate(spectrum):
        noise[idx] = np.sqrt(power)
    return noise
def psd_to_fft_normalize(spectrum, fs=1):
    nfft = len(spectrum)
    delta = 1 / fs
    norm = nfft / (2 * delta)
    fft_norm = np.zeros(len(spectrum))
    for idx, power in enumerate(spectrum):
        fft_norm[idx] = np.sqrt(norm * power)
    return fft_norm
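The `nfft / (2 * delta)` factor above inverts the one-sided periodogram normalization `Pxx = (2 * delta / N) * |X_k|**2`, in which the DC and Nyquist bins are not doubled. A sketch verifying that normalization against `scipy.signal.periodogram` with a boxcar window (the random test signal is illustrative):

```python
import numpy as np
from scipy import signal

fs = 100.0
n = 1024
rng = np.random.default_rng(0)
x = rng.standard_normal(n)

freqs, pxx = signal.periodogram(x, fs=fs, window='boxcar', detrend=False)

# One-sided periodogram: Pxx = (2 / (fs * n)) * |X_k|**2 for interior bins
X = np.fft.rfft(x)
pxx_manual = (2.0 / (fs * n)) * np.abs(X) ** 2
pxx_manual[0] /= 2.0    # DC bin is not doubled
pxx_manual[-1] /= 2.0   # Nyquist bin (n even) is not doubled

assert np.allclose(pxx, pxx_manual)
```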
def fft_to_t(spectrum):
    t_series = np.fft.ifft(spectrum)
    return t_series
# def ifft_of_psd(series, fs=1.0, scaling='density'):
# reqs, psd = signal.welch(series,fs,scaling=scaling)
# np.fft.fft()
# ifft_psd = np.fft.ifft(psd)
# return ifft_psd
def plot_unit_amplitude_impulse(t=None, series=None, save=False,
                                path='/Users/gabriel/Documents/Research/USGS_Work/gmprocess/figs/impulse/',
                                filename='impulse.png'):
    """
    Plot a unit impulse in the time domain.

    Parameters
    ----------
    t : array_like, optional
        Sample axis; generated along with a default impulse if None.
    series : array_like, optional
        Impulse series; a default impulse is generated if None.
    save : bool, optional
        Save the figure under `path`. The default is False.
    path : str, optional
        Output directory for the saved figure.
    filename : str, optional
        Output file name. The default is 'impulse.png'.

    Returns
    -------
    None.

    """
    if series is None or t is None:
        t, imp = get_impulse()
    else:
        imp = series
    if len(t) != len(imp):
        print("signal and time arrays have differing lengths, exiting ...")
        return
    plt.plot(t, imp)
    plt.title("Unit Impulse")
    plt.margins(0.1, 0.1)
    plt.xlabel('Time [samples]')
    plt.ylabel('Amplitude')
    plt.grid(True)
    # plt.show()
    if save:
        plt.savefig(fname=path + filename, dpi=500)
    plt.close()
def plot_unit_amplitude_impulse_fft(t=None, series=None, save=False,
                                    path='/Users/gabriel/Documents/Research/USGS_Work/gmprocess/figs/impulse/',
                                    filename='impulse_fft.png'):
    """
    Plot the magnitude spectrum of a unit impulse.

    Parameters
    ----------
    t : array_like, optional
        Sample axis; generated along with a default impulse if None.
    series : array_like, optional
        Impulse series; a default impulse is generated if None.
    save : bool, optional
        Save the figure under `path`. The default is False.
    path : str, optional
        Output directory for the saved figure.
    filename : str, optional
        Output file name. The default is 'impulse_fft.png'.

    Returns
    -------
    None.

    """
    if series is None or t is None:
        t, imp = get_impulse()
    else:
        imp = series
    fft_imp = np.abs(np.fft.fft(imp).real)
    freq = np.fft.fftfreq(len(fft_imp))
    plt.plot(freq, fft_imp)
    plt.title("NumPy FFT of Unit Impulse")
    plt.margins(0.1, 0.1)
    plt.xlabel('Period (s)')
    plt.ylabel('Amplitude')
    ax = plt.gca()
    ax.set_xscale('log')
    plt.grid(True)
    # plt.show()
    if save:
        plt.savefig(fname=path + filename, dpi=500)
    plt.close()
def plot_unit_amplitude_impulse_psd(series, fs=1.0, scaling='density', save=False,
                                    path='/Users/gabriel/Documents/Research/USGS_Work/gmprocess/figs/impulse/',
                                    filename='impulse_psd.png'):
    """
    Uses the SciPy signal method `welch` to construct the power spectral
    density of a time series using the method of Welch (1967).

    Units are in (m/s)^2 / Hz.

    Parameters
    ----------
    series : array_like
        Input time series.
    fs : float, optional
        Sampling frequency. The default is 1.0.
    scaling : str, optional
        Welch scaling mode. The default is 'density'.
    save : bool, optional
        Save the figure under `path`. The default is False.
    path : str, optional
        Output directory for the saved figure.
    filename : str, optional
        Output file name. The default is 'impulse_psd.png'.

    Returns
    -------
    None.

    """
    freqs, psd = signal.welch(series, fs, scaling=scaling)
    plt.semilogx(freqs, psd)
    plt.margins(0.1, 0.1)
    plt.title('PSD: power spectral density - Scaling: "density"')
    plt.xlabel('Frequency')
    plt.ylabel('PSD [V**2/Hz]')
    # plt.tight_layout()
    plt.grid(True)
    # plt.show()
    if save:
        plt.savefig(fname=path + filename, dpi=500)
    plt.close()
def plot_unit_amplitude_impulse_ps(series, fs=1.0, scaling='spectrum', save=False,
                                   path='/Users/gabriel/Documents/Research/USGS_Work/gmprocess/figs/impulse/',
                                   filename='impulse_ps.png'):
    """
    Uses the SciPy signal method `welch` to construct the power spectrum
    of a time series using the method of Welch (1967).

    Units are in (m/s)^2 / Hz.

    Parameters
    ----------
    series : array_like
        Input time series.
    fs : float, optional
        Sampling frequency. The default is 1.0.
    scaling : str, optional
        Welch scaling mode. The default is 'spectrum'.
    save : bool, optional
        Save the figure under `path`. The default is False.
    path : str, optional
        Output directory for the saved figure.
    filename : str, optional
        Output file name. The default is 'impulse_ps.png'.

    Returns
    -------
    None.

    """
    freqs, psd = signal.welch(series, fs, scaling=scaling)
    plt.semilogx(freqs, psd)
    plt.margins(0.1, 0.1)
    plt.title('Power - Scaling: "spectrum"')
    plt.xlabel('Frequency')
    plt.ylabel('Linear Spectrum [V RMS]')
    # plt.tight_layout()
    plt.grid(True)
    # plt.show()
    if save:
        plt.savefig(fname=path + filename, dpi=500)
    plt.close()
# def plot_noise(t = None, series = None, save=False, path='./', filename='impulse_fft.png'):
# t, imp = get_impulse()
# fft_imp = np.abs(np.fft.fft(imp).real)
# freq = np.fft.fftfreq(len(fft_imp))
# plt.plot(freq, fft_imp)
# plt.margins(0.1, 0.1)
# plt.xlabel('Period (s)')
# plt.ylabel('Amplitude')
# ax = plt.gca()
# ax.set_xscale('log')
# plt.grid(True)
# plt.show()
# if save:
# plt.savefig(fname='/Users/gabriel/Documents/Research/USGS_Work/gmprocess/figs/impulse/' + filename, dpi=500)
# plt.close()
# def plot_noise_fft(spectrum = None, save=False, path='./', filename='impulse_fft.png'):
# if noise is None:
# noise_fft = np.abs(np.fft.fft(imp).real)
# freq = np.fft.fftfreq(len(fft_imp))
# plt.plot(freq, fft_imp)
# plt.margins(0.1, 0.1)
# plt.xlabel('Period (s)')
# plt.ylabel('Amplitude')
# ax = plt.gca()
# ax.set_xscale('log')
# plt.grid(True)
# plt.show()
# if save:
# plt.savefig(fname='/Users/gabriel/Documents/Research/USGS_Work/gmprocess/figs/impulse/' + filename, dpi=500)
# plt.close()
# %%
# plt.close()
# # Make a test signal
# np.random.seed(0)
# delta = .01
# fs = 1/ delta
# t = np.arange(0, 70, delta)
# # A signal with a small frequency chirp
# sig = np.sin(0.5 * np.pi * t * (1 + .1 * t))
# # A unit impulse
# t, imp = get_impulse(tmin=0,tmax=100)
# plot_unit_amplitude_impulse(save=True)
# plot_unit_amplitude_impulse_fft(save=True)
# plot_unit_amplitude_impulse_psd(imp, save=True)
# plot_unit_amplitude_impulse_ps(imp, save=True)
# # plot_unit_amplitude_impulse_psd(sig,fs)
#%%
# # PSD of the impulse
# freqs_psd, psd = get_psd(imp)
# plt.semilogx(freqs_psd, psd)
# plt.title("Test plot of Impulse PSD")
# plt.show()
# plt.close()
# # PS of the impulse
# freqs_ps, ps = get_ps(imp)
# plt.semilogx(freqs_ps, ps)
# plt.title("Test plot of Impulse PS")
# plt.show()
# plt.close()
# # What does the ifft of the PSD look like?
# convert_psd_to_fft = psd_to_fft(psd)
# plt.semilogx(convert_psd_to_fft)
# plt.title("PSD to FFT by just using sqrt()")
# plt.show()
# plt.close()
# ifft_impulse_psd = fft_to_t(convert_psd_to_fft)
# plt.plot(ifft_impulse_psd)
# plt.title("Inverse FFT of sqrt(PSD)")
# plt.show()
# plt.close()
# # Convert directly from PSD to time domain
# psd_to_time = np.fft.ifft(psd)
# plt.plot(psd_to_time)
# plt.title("Inverse FFT of PSD (no square root)")
# plt.show()
# plt.close()
#%%
# Make a test signal
tmin=0
tmax=100
np.random.seed(0)
delta = .01
fs = 1/ delta
t = np.arange(tmin, tmax, delta)
wndw_factor=500
overlap_factor=2
nfft = len(t)
nperseg=len(t)/wndw_factor
noverlap=nperseg/overlap_factor
# A signal with a small frequency chirp
sig = np.sin(0.5 * np.pi * t * (1 + .1 * t))
# Just plot the input signal
plt.plot(t,sig)
plt.title("Original sine() test signal")
plt.show()
plt.close()
#%%
# Make unit impulse
tmin=0
tmax=100
delta = .01
fs = 1/ delta
t = np.arange(tmin, tmax, delta)
overlap_factor=2
nfft = len(t)
nperseg=len(t)/wndw_factor
noverlap=nperseg/overlap_factor
# A unit impulse test signal
t, sig = get_impulse(tmin=tmin, tmax=tmax, fs=fs, delta= delta)
# Just plot the input signal
plt.plot(t,sig)
plt.title("Unit Impulse")
plt.show()
plt.close()
#%%
# FFT of the test signal
# sine_fft = np.abs(np.fft.fft(np.sin(t)))
sine_fft = np.abs(np.fft.fft(np.sin(t)).real)
sine_freqs = np.fft.fftfreq(t.shape[-1], delta)
plt.semilogx(sine_freqs, sine_fft)
plt.title("Taking only real, positive components of the standard FFT")
plt.show()
# rFFT of the test signal (real)
# rsine_fft = np.abs(np.fft.rfft(np.sin(t)))
rsine_fft = np.abs(np.fft.rfft(np.sin(t)).real)
rsine_freqs = np.fft.rfftfreq(t.shape[-1], delta)
plt.semilogx(rsine_freqs, rsine_fft)
plt.title("Taking only real, positive components of the rFFT (redundant)")
plt.show()
# # hFFT of the test signal (hermitian)
# hsine_fft = np.fft.hfft(np.sin(t), n=nfft) # Need to specify n
# hsine_freqs = np.fft.fftfreq(t.shape[-1], delta)
# plt.semilogx(hsine_freqs, hsine_fft)
# plt.show()
#%% Plot together
# PSD of the test signal
freqs_psd, psd = get_psd(sig, fs=fs, nfft=nfft, nperseg=nperseg, noverlap=noverlap)
plt.semilogx(freqs_psd, psd, label="PSD")
# PS of the impulse
freqs_ps, ps = get_ps(sig, fs=fs, nfft=nfft, nperseg=nperseg, noverlap=noverlap)
plt.semilogx(freqs_ps, ps, label="PS")
plt.title("PSD and PS of the unit impulse signal")
plt.legend()
plt.show()
# What does the ifft of the PSD look like?
convert_psd_to_fft = psd_to_fft_sqrt(psd)
plt.semilogx(convert_psd_to_fft)
plt.title("PSD to FFT by just using sqrt()")
plt.show()
plt.close()
# Using the PSD -> FFT normalization
convert_psd_to_fft_norm = psd_to_fft_normalize(psd, fs=fs)
plt.semilogx(convert_psd_to_fft_norm)
plt.title("PSD to FFT by just using normalized sqrt()")
plt.show()
plt.close()
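`get_psd` and `psd_to_fft_normalize` are helpers defined earlier in this script, so their exact scaling is assumed here. As a standalone sanity check of the density scaling such a conversion relies on, integrating a Welch PSD over frequency recovers the signal's mean power (all names below are illustrative, using `scipy.signal.welch` directly):

```python
import numpy as np
from scipy import signal

# Welch PSD with the default "density" scaling: its integral over frequency
# equals the signal's mean power, so a sine of amplitude A comes back via
# power = A**2 / 2.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
amp = 3.0
sig = amp * np.sin(2 * np.pi * 50.0 * t)

freqs, psd = signal.welch(sig, fs=fs, nperseg=1024)
power = np.sum(psd) * (freqs[1] - freqs[0])  # numeric integral of the PSD
amp_est = np.sqrt(2.0 * power)               # close to amp
```

Spectral leakage spreads the peak across bins, but the integral conserves power, which is why `amp_est` lands near 3.0 even with a non-integer number of cycles per segment.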
# What does the ifft of the PSD look like?
ifft_impulse_psd = fft_to_t(convert_psd_to_fft)
plt.plot(ifft_impulse_psd)
plt.title("Inverse FFT of sqrt(PSD)")
plt.show()
plt.close()
ifft_impulse_psd_norm = fft_to_t(convert_psd_to_fft_norm)
plt.plot(ifft_impulse_psd_norm)
plt.title("Inverse FFT of normalized sqrt(PSD)")
plt.show()
plt.close()
# Convert directly from PSD to time domain
psd_to_time = np.fft.ifft(psd).real  # ifft of a one-sided PSD is complex; keep the real part for plotting
plt.plot(psd_to_time)
plt.title("Inverse FFT of PSD (no square root)")
plt.show()
plt.close()
# --- Emoji.py (Danilo-Xaxa/python_curso_em_video, MIT) ---
import emoji
print(emoji.emojize("Cu gostoso :yum:", use_aliases=True))  # emoji < 2.0 API; emoji >= 2.0 uses language="alias" instead
# starting to use the emoji package and emoji.emojize
# --- pvn3d/lib/utils/warmup_scheduler/__init__.py (JiazeWang/PVN3D, MIT) ---
from .scheduler import GradualWarmupScheduler, WarmupCosScheduler, CyclicLR
# --- idaes/models_extra/power_generation/unit_models/soc_submodels/tests/test_solid_oxide_cell.py (aladshaw3/idaes-pse) ---
#################################################################################
# The Institute for the Design of Advanced Energy Systems Integrated Platform
# Framework (IDAES IP) was produced under the DOE Institute for the
# Design of Advanced Energy Systems (IDAES), and is copyright (c) 2018-2021
# by the software owners: The Regents of the University of California, through
# Lawrence Berkeley National Laboratory, National Technology & Engineering
# Solutions of Sandia, LLC, Carnegie Mellon University, West Virginia University
# Research Corporation, et al. All rights reserved.
#
# Please see the files COPYRIGHT.md and LICENSE.md for full copyright and
# license information.
#################################################################################
__author__ = "Douglas Allan"
import pytest
import numpy as np
import pyomo.environ as pyo
from idaes.core import FlowsheetBlock
from idaes.models.unit_models.heat_exchanger import HeatExchangerFlowPattern
import idaes.models_extra.power_generation.unit_models.soc_submodels as soc
import idaes.models_extra.power_generation.unit_models.soc_submodels.testing as soc_testing
solver = pyo.SolverFactory("ipopt")
def build_tester(cell, nt, nz):
soc_testing._build_test_utility(
cell,
comp_dict={
pyo.Var: {
"current_density": nz * nt,
"potential": nt,
"temperature_z": nz * nt,
"length_z": 1,
"length_y": 1,
},
pyo.Constraint: {
"mean_temperature_eqn": nz * nt,
"potential_eqn": nz * nt,
"no_heat_flux_fuel_interconnect_eqn": nz * nt,
"no_heat_flux_oxygen_interconnect_eqn": nz * nt,
},
pyo.Expression: {
"voltage_drop_contact": nz * nt,
"voltage_drop_ohmic": nz * nt,
"electrical_work": 1,
},
},
)
@pytest.fixture
def model():
time_set = [0]
zfaces = np.linspace(0, 1, 4).tolist()
xfaces_electrode = np.linspace(0, 1, 6).tolist()
xfaces_electrolyte = np.linspace(0, 1, 8).tolist()
fuel_comps = ["H2", "H2O", "N2"]
fuel_triple_phase_boundary_stoich_dict = {
"H2": -0.5,
"H2O": 0.5,
"Vac": 0.5,
"O^2-": -0.5,
"e^-": 1.0,
}
oxygen_comps = ["O2", "N2"]
oxygen_triple_phase_boundary_stoich_dict = {
"O2": -0.25,
"N2": 0,
"Vac": -0.5,
"O^2-": 0.5,
"e^-": -1.0,
}
m = pyo.ConcreteModel()
m.fs = FlowsheetBlock(
default={
"dynamic": False,
"time_set": time_set,
"time_units": pyo.units.s,
}
)
m.fs.cell = soc.SolidOxideCell(
default={
"has_holdup": True,
"control_volume_zfaces": zfaces,
"control_volume_xfaces_fuel_electrode": xfaces_electrode,
"control_volume_xfaces_oxygen_electrode": xfaces_electrode,
"control_volume_xfaces_electrolyte": xfaces_electrolyte,
"fuel_component_list": fuel_comps,
"fuel_triple_phase_boundary_stoich_dict": fuel_triple_phase_boundary_stoich_dict,
"inert_fuel_species_triple_phase_boundary": ["N2"],
"oxygen_component_list": oxygen_comps,
"oxygen_triple_phase_boundary_stoich_dict": oxygen_triple_phase_boundary_stoich_dict,
"inert_oxygen_species_triple_phase_boundary": ["N2"],
"flow_pattern": HeatExchangerFlowPattern.countercurrent,
"include_contact_resistance": True,
}
)
return m
@pytest.fixture
def model_no_contact_resistance():
time_set = [0]
zfaces = np.linspace(0, 1, 6).tolist()
xfaces_electrode = np.linspace(0, 1, 4).tolist()
xfaces_electrolyte = np.linspace(0, 1, 12).tolist()
fuel_comps = ["H2", "H2O"]
fuel_triple_phase_boundary_stoich_dict = {
"H2": -0.5,
"H2O": 0.5,
"Vac": 0.5,
"O^2-": -0.5,
"e^-": 1,
}
oxygen_comps = ["O2", "N2", "H2O"]
oxygen_triple_phase_boundary_stoich_dict = {
"O2": -0.25,
"H2O": 0,
"Vac": -0.5,
"O^2-": 0.5,
"e^-": -1,
}
m = pyo.ConcreteModel()
m.fs = FlowsheetBlock(
default={
"dynamic": False,
"time_set": time_set,
"time_units": pyo.units.s,
}
)
m.fs.cell = soc.SolidOxideCell(
default={
"has_holdup": False,
"control_volume_zfaces": zfaces,
"control_volume_xfaces_fuel_electrode": xfaces_electrode,
"control_volume_xfaces_oxygen_electrode": xfaces_electrode,
"control_volume_xfaces_electrolyte": xfaces_electrolyte,
"fuel_component_list": fuel_comps,
"fuel_triple_phase_boundary_stoich_dict": fuel_triple_phase_boundary_stoich_dict,
# "inert_fuel_species_triple_phase_boundary": [], Test default
"oxygen_component_list": oxygen_comps,
"oxygen_triple_phase_boundary_stoich_dict": oxygen_triple_phase_boundary_stoich_dict,
"inert_oxygen_species_triple_phase_boundary": ["N2", "H2O"],
"flow_pattern": HeatExchangerFlowPattern.cocurrent,
"include_contact_resistance": False,
}
)
return m
@pytest.mark.build
@pytest.mark.unit
def test_build(model):
cell = model.fs.cell
nt = len(model.fs.time)
nz = len(cell.zfaces) - 1
build_tester(cell, nt, nz)
channels = [cell.fuel_channel, cell.oxygen_channel]
for chan in channels:
assert cell.temperature_z is chan.temperature_z.referent
assert cell.length_y is chan.length_y.referent
assert cell.length_z is chan.length_z.referent
contact_resistors = [
cell.contact_interconnect_fuel_flow_mesh,
cell.contact_interconnect_oxygen_flow_mesh,
cell.contact_flow_mesh_fuel_electrode,
cell.contact_flow_mesh_oxygen_electrode,
]
for unit in contact_resistors:
assert cell.temperature_z is unit.temperature_z.referent
assert cell.current_density is unit.current_density.referent
assert cell.length_y is unit.length_y.referent
assert cell.length_z is unit.length_z.referent
assert (
cell.fuel_channel.heat_flux_x0
is cell.contact_interconnect_fuel_flow_mesh.heat_flux_x1.referent
)
assert (
cell.fuel_channel.temperature_deviation_x0
is cell.contact_interconnect_fuel_flow_mesh.temperature_deviation_x.referent
)
assert (
cell.fuel_channel.heat_flux_x1
is cell.contact_flow_mesh_fuel_electrode.heat_flux_x0.referent
)
assert (
cell.fuel_channel.temperature_deviation_x1
is cell.contact_flow_mesh_fuel_electrode.temperature_deviation_x.referent
)
assert (
cell.oxygen_channel.heat_flux_x1
is cell.contact_interconnect_oxygen_flow_mesh.heat_flux_x0.referent
)
assert (
cell.oxygen_channel.temperature_deviation_x1
is cell.contact_interconnect_oxygen_flow_mesh.temperature_deviation_x.referent
)
assert (
cell.oxygen_channel.heat_flux_x0
is cell.contact_flow_mesh_oxygen_electrode.heat_flux_x1.referent
)
assert (
cell.oxygen_channel.temperature_deviation_x0
is cell.contact_flow_mesh_oxygen_electrode.temperature_deviation_x.referent
)
electrodes = [cell.fuel_electrode, cell.oxygen_electrode]
for trode in electrodes:
assert cell.temperature_z is trode.temperature_z.referent
assert cell.current_density is trode.current_density.referent
assert cell.length_y is trode.length_y.referent
assert cell.length_z is trode.length_z.referent
assert (
cell.fuel_channel.temperature_deviation_x1
is cell.fuel_electrode.temperature_deviation_x0.referent
)
assert (
cell.contact_flow_mesh_fuel_electrode.heat_flux_x1
is cell.fuel_electrode.heat_flux_x0.referent
)
assert (
cell.fuel_channel.conc_mol_comp
is cell.fuel_electrode.conc_mol_comp_ref.referent
)
assert (
cell.fuel_channel.conc_mol_comp_deviation_x1
is cell.fuel_electrode.conc_mol_comp_deviation_x0.referent
)
assert (
cell.fuel_channel.dconc_mol_compdt
is cell.fuel_electrode.dconc_mol_comp_refdt.referent
)
assert (
cell.fuel_channel.material_flux_x1
is cell.fuel_electrode.material_flux_x0.referent
)
assert (
cell.oxygen_channel.temperature_deviation_x0
is cell.oxygen_electrode.temperature_deviation_x1.referent
)
assert (
cell.contact_flow_mesh_oxygen_electrode.heat_flux_x0
is cell.oxygen_electrode.heat_flux_x1.referent
)
assert (
cell.oxygen_channel.conc_mol_comp
is cell.oxygen_electrode.conc_mol_comp_ref.referent
)
assert (
cell.oxygen_channel.conc_mol_comp_deviation_x0
is cell.oxygen_electrode.conc_mol_comp_deviation_x1.referent
)
assert (
cell.oxygen_channel.dconc_mol_compdt
is cell.oxygen_electrode.dconc_mol_comp_refdt.referent
)
assert (
cell.oxygen_channel.material_flux_x0
is cell.oxygen_electrode.material_flux_x1.referent
)
tpb_list = [cell.fuel_triple_phase_boundary, cell.oxygen_triple_phase_boundary]
for tpb in tpb_list:
assert cell.temperature_z is tpb.temperature_z.referent
assert cell.current_density is tpb.current_density.referent
assert cell.length_y is tpb.length_y.referent
assert cell.length_z is tpb.length_z.referent
assert (
cell.fuel_triple_phase_boundary.temperature_deviation_x.referent
is cell.fuel_electrode.temperature_deviation_x1
)
assert (
cell.fuel_triple_phase_boundary.heat_flux_x0.referent
is cell.fuel_electrode.heat_flux_x1
)
assert (
cell.fuel_triple_phase_boundary.conc_mol_comp_ref.referent
is cell.fuel_channel.conc_mol_comp
)
assert (
cell.fuel_triple_phase_boundary.conc_mol_comp_deviation_x.referent
is cell.fuel_electrode.conc_mol_comp_deviation_x1
)
assert (
cell.oxygen_triple_phase_boundary.temperature_deviation_x.referent
is cell.oxygen_electrode.temperature_deviation_x0
)
assert (
cell.oxygen_triple_phase_boundary.heat_flux_x1.referent
is cell.oxygen_electrode.heat_flux_x0
)
assert (
cell.oxygen_triple_phase_boundary.conc_mol_comp_ref.referent
is cell.oxygen_channel.conc_mol_comp
)
assert (
cell.oxygen_triple_phase_boundary.conc_mol_comp_deviation_x.referent
is cell.oxygen_electrode.conc_mol_comp_deviation_x0
)
assert cell.temperature_z is cell.electrolyte.temperature_z.referent
assert cell.current_density is cell.electrolyte.current_density.referent
assert cell.length_y is cell.electrolyte.length_y.referent
assert cell.length_z is cell.electrolyte.length_z.referent
assert (
cell.fuel_electrode.temperature_deviation_x1
is cell.electrolyte.temperature_deviation_x0.referent
)
assert (
cell.oxygen_electrode.temperature_deviation_x0
is cell.electrolyte.temperature_deviation_x1.referent
)
assert (
cell.fuel_triple_phase_boundary.heat_flux_x1
is cell.electrolyte.heat_flux_x0.referent
)
assert (
cell.oxygen_triple_phase_boundary.heat_flux_x0
is cell.electrolyte.heat_flux_x1.referent
)
@pytest.mark.build
@pytest.mark.unit
def test_build_no_contact_resistance(model_no_contact_resistance):
cell = model_no_contact_resistance.fs.cell
nt = len(model_no_contact_resistance.fs.time)
nz = len(cell.zfaces) - 1
build_tester(cell, nt, nz)
channels = [cell.fuel_channel, cell.oxygen_channel]
for chan in channels:
assert cell.temperature_z is chan.temperature_z.referent
assert cell.length_y is chan.length_y.referent
assert cell.length_z is chan.length_z.referent
electrodes = [cell.fuel_electrode, cell.oxygen_electrode]
for trode in electrodes:
assert cell.temperature_z is trode.temperature_z.referent
assert cell.current_density is trode.current_density.referent
assert cell.length_y is trode.length_y.referent
assert cell.length_z is trode.length_z.referent
assert (
cell.fuel_channel.temperature_deviation_x1
is cell.fuel_electrode.temperature_deviation_x0.referent
)
assert cell.fuel_channel.heat_flux_x1 is cell.fuel_electrode.heat_flux_x0.referent
assert (
cell.fuel_channel.conc_mol_comp
is cell.fuel_electrode.conc_mol_comp_ref.referent
)
assert (
cell.fuel_channel.conc_mol_comp_deviation_x1
is cell.fuel_electrode.conc_mol_comp_deviation_x0.referent
)
assert (
cell.fuel_channel.dconc_mol_compdt
is cell.fuel_electrode.dconc_mol_comp_refdt.referent
)
assert (
cell.fuel_channel.material_flux_x1
is cell.fuel_electrode.material_flux_x0.referent
)
assert (
cell.oxygen_channel.temperature_deviation_x0
is cell.oxygen_electrode.temperature_deviation_x1.referent
)
assert (
cell.oxygen_channel.heat_flux_x0 is cell.oxygen_electrode.heat_flux_x1.referent
)
assert (
cell.oxygen_channel.conc_mol_comp
is cell.oxygen_electrode.conc_mol_comp_ref.referent
)
assert (
cell.oxygen_channel.conc_mol_comp_deviation_x0
is cell.oxygen_electrode.conc_mol_comp_deviation_x1.referent
)
assert (
cell.oxygen_channel.dconc_mol_compdt
is cell.oxygen_electrode.dconc_mol_comp_refdt.referent
)
assert (
cell.oxygen_channel.material_flux_x0
is cell.oxygen_electrode.material_flux_x1.referent
)
tpb_list = [cell.fuel_triple_phase_boundary, cell.oxygen_triple_phase_boundary]
for tpb in tpb_list:
assert cell.temperature_z is tpb.temperature_z.referent
assert cell.current_density is tpb.current_density.referent
assert cell.length_y is tpb.length_y.referent
assert cell.length_z is tpb.length_z.referent
assert (
cell.fuel_triple_phase_boundary.temperature_deviation_x.referent
is cell.fuel_electrode.temperature_deviation_x1
)
assert (
cell.fuel_triple_phase_boundary.heat_flux_x0.referent
is cell.fuel_electrode.heat_flux_x1
)
assert (
cell.fuel_triple_phase_boundary.conc_mol_comp_ref.referent
is cell.fuel_channel.conc_mol_comp
)
assert (
cell.fuel_triple_phase_boundary.conc_mol_comp_deviation_x.referent
is cell.fuel_electrode.conc_mol_comp_deviation_x1
)
assert (
cell.oxygen_triple_phase_boundary.temperature_deviation_x.referent
is cell.oxygen_electrode.temperature_deviation_x0
)
assert (
cell.oxygen_triple_phase_boundary.heat_flux_x1.referent
is cell.oxygen_electrode.heat_flux_x0
)
assert (
cell.oxygen_triple_phase_boundary.conc_mol_comp_ref.referent
is cell.oxygen_channel.conc_mol_comp
)
assert (
cell.oxygen_triple_phase_boundary.conc_mol_comp_deviation_x.referent
is cell.oxygen_electrode.conc_mol_comp_deviation_x0
)
assert cell.temperature_z is cell.electrolyte.temperature_z.referent
assert cell.current_density is cell.electrolyte.current_density.referent
assert cell.length_y is cell.electrolyte.length_y.referent
assert cell.length_z is cell.electrolyte.length_z.referent
assert (
cell.fuel_electrode.temperature_deviation_x1
is cell.electrolyte.temperature_deviation_x0.referent
)
assert (
cell.oxygen_electrode.temperature_deviation_x0
is cell.electrolyte.temperature_deviation_x1.referent
)
assert (
cell.fuel_triple_phase_boundary.heat_flux_x1
is cell.electrolyte.heat_flux_x0.referent
)
assert (
cell.oxygen_triple_phase_boundary.heat_flux_x0
is cell.electrolyte.heat_flux_x1.referent
)
# --- base/__init__.py (FelixFu520/classification, MIT) ---
from .base_dataloader import *
from .base_dataset import *
from .base_trainer import *
# --- src/ripe_rainbow/domain/logic/ripe_white.py (ripe-tech/ripe-rainbow, Apache-2.0) ---
#!/usr/bin/python
# -*- coding: utf-8 -*-
import appier
from .. import parts
class RipeWhitePart(parts.Part):
def authorize(self):
self.id.authorize()
def select_size(self, size, gender = None, scale = None, open = True, wait_closed = True):
"""
Opens the size selection window, selects the proper scale and size and
applies that configuration by clicking 'Apply' and closing the window.
Notice that if the "open" flag is unset the window is not opened.
:type size: String
:param size: The size to be picked.
:type gender: String
:param gender: The gender that is going to be picked.
:type scale: String
:param scale: The scale that is going to be picked.
:type open: Boolean
:param open: If the size modal window should be opened before selection.
:type wait_closed: Boolean
:param wait_closed: Whether it should wait for the size modal to be closed,
not waiting for the closing of the modal should improve performance.
"""
if open: self.interactions.click(".content .size:not(.disabled) > .button-size")
if gender: self.interactions.click(".size .button-gender", text = gender)
if scale: self.interactions.click(".size .button-scale", text = str(scale))
self.interactions.click(".size .sizes .button-size", text = str(size))
self.interactions.click(".content .size .button.button-apply")
if wait_closed: self.waits.not_visible(".content .size .modal")
def select_size_mobile(self, size, gender = None, scale = None, wait_closed = True):
"""
Opens the size selection window, selects the proper scale and size and
applies that configuration by clicking 'Apply' and closing the window.
This method is aimed at the mobile version of the size selector.
:type size: String
:param size: The size to be picked.
:type gender: String
:param gender: The gender that is going to be picked.
:type scale: String
:param scale: The scale that is going to be picked.
:type wait_closed: Boolean
:param wait_closed: Whether it should wait for the size modal to be closed,
not waiting for the closing of the modal should improve performance.
"""
if gender: self.interactions.click(".size .button-gender", text = gender)
if scale: self.interactions.click(".size .button-scale", text = str(scale))
self.interactions.click(".size .sizes .button-size", text = str(size))
self.interactions.click(".content-mobile .size .button.button-apply")
if wait_closed: self.waits.not_visible(".content-mobile .size .modal")
def select_part(self, part):
self.interactions.click(".content .pickers .button-part[data-part='%s']" % part)
def select_part_mobile(self, part):
self.interactions.click(".content-mobile .pickers .button-part[data-part='%s']" % part)
def select_material(self, material):
self.interactions.click(".content .pickers .button-material[data-material='%s']" % material)
def select_material_mobile(self, material):
self.interactions.click(".content-mobile .pickers .button-material[data-material='%s']" % material)
def select_color(self, material, color):
self.interactions.click(".content .pickers .button-color-option[data-material='%s'][data-color='%s']" % (material, color))
def select_color_mobile(self, material, color):
self.interactions.click(".content-mobile .pickers .button-color-option[data-material='%s'][data-color='%s']" % (material, color))
def assert_no_part(self, part, timeout = None):
self.waits.not_visible(
".content .pickers .button-part > p",
text = self._capitalize_words(part),
message = "The selector for the part '%s' didn't disappear" % part,
timeout = timeout
)
def assert_no_part_mobile(self, part, timeout = None):
self.waits.not_visible(
".content-mobile .pickers .button-part > p",
text = self._capitalize_words(part),
message = "The selector for the part '%s' didn't disappear" % part,
timeout = timeout
)
def assert_no_material(self, part, material):
self.select_part(part)
self.waits.not_visible(".material li[data-material='%s']" % material)
self.waits.not_visible(".content .pickers .button-color[data-material='%s']" % material)
def assert_no_material_mobile(self, part, material):
self.select_part_mobile(part)
self.waits.not_visible(".material li[data-material='%s']" % material)
self.waits.not_visible(".content-mobile .pickers .button-color[data-material='%s']" % material)
def assert_no_color(self, part, color):
self.select_part(part)
self.waits.not_visible(".content .pickers .button-color[data-color='%s']" % color)
def assert_no_color_mobile(self, part, color):
self.select_part_mobile(part)
self.waits.not_visible(".content-mobile .pickers .button-color[data-color='%s']" % color)
def set_part(
self,
brand,
model,
part,
material,
color,
part_text = None,
material_text = None,
color_text = None,
verify = True,
has_swatch = True
):
"""
Makes a change to the customization of a part and checks that the pages
mutates correctly, picking the right active parts, materials and colors,
as well as properly switching the swatches.
If the text parameters are passed an extra set of assertions are going
to be performed to validate expected behaviour.
:type brand: String
:param brand: The brand of the model being customized.
:type model: String
:param model: The model being customized.
:type part: String
:param part: The technical name of the part being changed.
:type material: String
:param material: The technical name of the material to use for the part.
:type color: String
:param color: The technical name of the color to use for the part.
:type part_text: String
:param part_text: The expected label for the part after clicking.
:type material_text: String
:param material_text: The expected label for the material after clicking.
:type color_text: String
:param color_text: The expected label for the color after clicking.
:type verify: bool
:param verify: If a final assertion should be performed after the selection
has been done (to verify the final status).
:type has_swatch: Boolean
:param has_swatch: Whether there should be a swatch.
"""
self.select_part(part)
self.select_material(material)
self.select_color(material, color)
if verify:
self.assert_part(
brand,
model,
part,
material,
color,
part_text = part_text,
material_text = material_text,
color_text = color_text,
has_swatch = has_swatch,
select_part = False
)
def set_part_mobile(
self,
brand,
model,
part,
material,
color,
part_text = None,
material_text = None,
color_text = None,
verify = True,
has_swatch = True
):
"""
Makes a change to the customization of a part and checks that the pages
mutates correctly, picking the right active parts, materials and colors,
as well as properly switching the swatches.
If the text parameters are passed an extra set of assertions are going
to be performed to validate expected behaviour.
This method is aimed at the mobile version of the parts selector.
:type brand: String
:param brand: The brand of the model being customized.
:type model: String
:param model: The model being customized.
:type part: String
:param part: The technical name of the part being changed.
:type material: String
:param material: The technical name of the material to use for the part.
:type color: String
:param color: The technical name of the color to use for the part.
:type part_text: String
:param part_text: The expected label for the part after clicking.
:type material_text: String
:param material_text: The expected label for the material after clicking.
:type color_text: String
:param color_text: The expected label for the color after clicking.
:type verify: bool
:param verify: If a final assertion should be performed after the selection
has been done (to verify the final status).
:type has_swatch: Boolean
:param has_swatch: Whether there should be a swatch.
"""
self.select_part_mobile(part)
self.select_material_mobile(material)
self.select_color_mobile(material, color)
if verify:
self.assert_part_mobile(
brand,
model,
part,
material,
color,
part_text = part_text,
material_text = material_text,
color_text = color_text,
has_swatch = has_swatch,
select_part = False
)
def assert_part(
self,
brand,
model,
part,
material,
color,
part_text = None,
material_text = None,
color_text = None,
has_swatch = True,
select_part = True
):
"""
Checks that the part pickers have the expected state, meaning that the
complete set of assertions are properly filled.
If the text parameters are passed an extra set of assertions are going
to be performed to validate expected behaviour.
Notice that this assertion requires the changing of the current visual
state, in the sense that the part tab is going to be switched to the
one that is going to be asserted.
:type brand: String
:param brand: The brand of the model being customized.
:type model: String
:param model: The model being customized.
:type part: String
:param part: The technical name of the part being checked.
:type material: String
:param material: The technical name of the material used in the part.
:type color: String
:param color: The technical name of the color used in the part.
:type part_text: String
:param part_text: The expected label for the part.
:type material_text: String
:param material_text: The expected label for the material.
:type color_text: String
:param color_text: The expected label for the color.
:type has_swatch: Boolean
:param has_swatch: Whether there should be a swatch.
:type select_part: Boolean
:param select_part: If it's true then the part that is being asserted
is clicked before the assertions start. This is mandatory when the part
is not selected, but unnecessary otherwise. Using this option may imply
performance degradation as the part selection incurs animation.
"""
if select_part: self.select_part(part)
if part_text: self.waits.visible(".button-part.active", text = part_text)
if color_text: self.waits.visible(".button-color-option.active", text = color_text)
if material_text: self.waits.visible(".button-material.active", text = material_text)
if has_swatch:
self.waits.until(
lambda d: self.core.assert_swatch(
".content .pickers .button-part.active .swatch > img",
brand, model, material, color
),
"Part swatch didn't have the expected image"
)
self.waits.until(
lambda d: self.core.assert_swatch(
".content .pickers .button-color-option.active .swatch > img",
brand, model, material, color
),
"Color swatch didn't have the expected image"
)
def assert_part_mobile(
self,
brand,
model,
part,
material,
color,
part_text = None,
material_text = None,
color_text = None,
has_swatch = True,
select_part = True
):
"""
Checks that the part pickers have the expected state, meaning that the
complete set of assertions are properly filled.
If the text parameters are passed an extra set of assertions are going
to be performed to validate expected behaviour.
Notice that this assertion requires the changing of the current visual
state, in the sense that the part tab is going to be switched to the
one that is going to be asserted.
This method is aimed at the mobile version of the parts selector.
:type brand: String
:param brand: The brand of the model being customized.
:type model: String
:param model: The model being customized.
:type part: String
:param part: The technical name of the part being checked.
:type material: String
:param material: The technical name of the material used in the part.
:type color: String
:param color: The technical name of the color used in the part.
:type part_text: String
:param part_text: The expected label for the part.
:type material_text: String
:param material_text: The expected label for the material.
:type color_text: String
:param color_text: The expected label for the color.
:type has_swatch: Boolean
:param has_swatch: Whether there should be a swatch.
:type select_part: Boolean
        :param select_part: If true, the part being asserted is clicked
        before the assertions start. This is mandatory when the part is
        not selected, but unnecessary otherwise. Using this option may
        degrade performance, as the part selection triggers an animation.
"""
if select_part: self.select_part_mobile(part)
if part_text: self.waits.visible(".content-mobile .button-part.active", text = part_text)
        if color_text: self.waits.visible(".content-mobile .button-color-option.active", text = color_text)
if material_text: self.waits.visible(".content-mobile .button-material.active", text = material_text)
if has_swatch:
self.waits.until(
lambda d: self.core.assert_swatch(
".content-mobile .pickers .button-part.active .swatch > img",
brand, model, material, color
),
"Part swatch didn't have the expected image"
)
self.waits.until(
lambda d: self.core.assert_swatch(
".content-mobile .pickers .button-color-option.active .swatch > img",
brand, model, material, color
),
"Color swatch didn't have the expected image"
)
def url_model(self, model, brand):
return "%s/?model=%s&brand=%s" % (self.white_url, model, brand)
def url_product_id(self, product_id):
return "%s/?product_id=%s" % (self.white_url, product_id)
@property
def base_url(self):
return self.white_url
@property
def home_url(self):
return "%s/" % self.white_url
@property
def next_url(self):
return self.home_url
@property
def white_url(self):
ripe_suffix = appier.conf("RIPE_SUFFIX", None)
if ripe_suffix: white_url = "https://ripe-white-%s.platforme.com" % ripe_suffix
else: white_url = "http://localhost:3000"
white_url = appier.conf("BASE_URL", white_url)
white_url = appier.conf("WHITE_URL", white_url)
white_url = appier.conf("RIPE_WHITE_URL", white_url)
return white_url
def _capitalize_words(self, sentence):
return " ".join(map(lambda s: s.capitalize(), sentence.split(" ")))
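The `_capitalize_words` helper above can be sketched as a standalone function (a minimal sketch, not the page object's actual method); note that `str.capitalize()` also lowercases the remaining characters of each word:

```python
def capitalize_words(sentence):
    # Capitalize every whitespace-separated word; "dARK bROWN" becomes
    # "Dark Brown" because capitalize() lowercases the tail of each word.
    return " ".join(word.capitalize() for word in sentence.split(" "))

print(capitalize_words("dark brown leather"))  # → Dark Brown Leather
```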
# bt_ig/__init__.py (lma26756/My_bt_IG_store, MIT)
from .igstore import *
from .igdata import *
from .igbroker import *
# library/final-projects/shorts.py (kausalyamahadevan/knitout-frontend-py, MIT)
import sys
sys.path.append('../')
import knitout
import gabrielle
k = knitout.Writer('1 2 3 4 5 6')
fileName = 'shorts-test-3-2'
# imagePath = '../graphics/cactus-gripper-2.png'
imagePath = '../graphics/shorts-shape-2.png'
stitchPatImgPathF = '../graphics/shorts-st-2.png'
stitchPatternsF = { 'jersey': ['cyan', (128, 128, 128), 'blue'], 'garter': ['yellow', 'green'], 'rib': ['fuchsia', 'red'] }
colorArgsF = {'cyan': { 'features': {'plaiting': True} }, (128, 128, 128): { 'features': {'plaiting': True} }, 'blue': { 'extensions': {'stitchNumber': 6} }, 'yellow': {'patternRows': 2, 'passes': 1.5}, 'green': {'patternRows': 2, 'passes': 1.5}, 'fuchsia': {'sequence': 'fb'}, 'red': {'sequence': 'fb'} }
stitchPatImgPathB = '../graphics/shorts-st-2.png'
stitchPatternsB = { 'jersey': ['cyan', (128, 128, 128), 'blue'], 'garter': ['yellow', 'green'], 'rib': ['fuchsia', 'red'] }
colorArgsB = {'cyan': { 'features': {'plaiting': True} }, (128, 128, 128): { 'features': {'plaiting': True} }, 'blue': { 'extensions': {'stitchNumber': 6} }, 'yellow': {'patternRows': 2, 'passes': 1.5}, 'green': {'patternRows': 2, 'passes': 1.5}, 'fuchsia': {'sequence': 'fb'}, 'red': {'sequence': 'fb'} }
# gabrielle.shapeImgToKnitout(k, imagePath=imagePath, gauge=2, maxShortrowCount=4, addBindoff=True, excludeCarriers=[], addBorder=True, closedCaston=False, openBindoff=True)
# gabrielle.shapeImgToKnitout(k, imagePath=imagePath, gauge=2, maxShortrowCount=4, addBindoff=False, excludeCarriers=[], addBorder=True, stitchPatternsFront={'imgPath': stitchPatImgPathF, 'patterns': stitchPatternsF, 'colorArgs': colorArgsF}, stitchPatternsBack={'imgPath': stitchPatImgPathB, 'patterns': stitchPatternsB, 'colorArgs': colorArgsB})
gabrielle.shapeImgToKnitout(k, imagePath=imagePath, gauge=2, maxShortrowCount=4, addBindoff=False, excludeCarriers=['4'], addBorder=True, stitchPatternsFront={'imgPath': stitchPatImgPathF, 'patterns': stitchPatternsF, 'colorArgs': colorArgsF}, stitchPatternsBack={'imgPath': stitchPatImgPathB, 'patterns': stitchPatternsB, 'colorArgs': colorArgsB}, closedCaston=False)
k.write(f'{fileName}.k')
# samples/simple/forin.py (ajlopez/JPyScript, MIT)
for k in [1,2,3,4,5]:
print(k)
# app/ch09_sqlalchemy/starter/pypi_org/models/_all_models.py (rwalden1993/FlaskTutorial, MIT)
from models.package import Package
# Democode/evoltier-master/evoltier/selection/__init__.py (Asurada2015/Multi-objective-evolution-strategy, MIT)
from .pbil_selection import PBILSelection
from .cma_large_popsize_selection import CMALargePopSizeSelection
from .cma_selection import CMASelection
from .nes_selection import NESSelection
# plugins/plugin_lz4/__init__.py (OscarMaireles/experiment-notebook, MIT)
from . import lz4_codec
# tests/algorithms/graphs/search_unit_test.py (vertexproject/pyalgs, BSD-3-Clause)
import unittest
from pyalgs.algorithms.graphs.search import DepthFirstSearch, BreadthFirstSearch
from tests.algorithms.graphs.util import create_graph, create_digraph
class DepthFirstSearchUnitTest(unittest.TestCase):
def test_dfs(self):
g = create_graph() # or create_digraph
s = 0
dfs = DepthFirstSearch(g, s)
for v in range(1, g.vertex_count()):
if dfs.hasPathTo(v):
print(str(s) + ' is connected to ' + str(v))
print('path is ' + ' => '.join([str(i) for i in dfs.pathTo(v)]))
def test_dfs_digraph(self):
g = create_digraph()
s = 0
dfs = DepthFirstSearch(g, s)
for v in range(1, g.vertex_count()):
if dfs.hasPathTo(v):
print(str(s) + ' is connected to ' + str(v))
print('path is ' + ' => '.join([str(i) for i in dfs.pathTo(v)]))
class BreadthFirstSearchUnitTest(unittest.TestCase):
def test_dfs(self):
g = create_graph() # or create_digraph
s = 0
dfs = BreadthFirstSearch(g, s)
for v in range(1, g.vertex_count()):
if dfs.hasPathTo(v):
print(str(s) + ' is connected to ' + str(v))
print('path is ' + ' => '.join([str(i) for i in dfs.pathTo(v)]))
def test_dfs_digraph(self):
g = create_digraph()
s = 0
dfs = BreadthFirstSearch(g, s)
for v in range(1, g.vertex_count()):
if dfs.hasPathTo(v):
print(str(s) + ' is connected to ' + str(v))
print('path is ' + ' => '.join([str(i) for i in dfs.pathTo(v)]))
if __name__ == '__main__':
unittest.main()
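The tests above only exercise `hasPathTo`/`pathTo`; as a minimal sketch of the idea behind `BreadthFirstSearch` (assuming a plain adjacency-dict graph, not the library's `Graph` type), predecessors recorded during the traversal are enough to rebuild a shortest path:

```python
from collections import deque

def bfs_paths(adj, s):
    # Explore level by level, remembering each vertex's predecessor.
    edge_to = {s: None}
    queue = deque([s])
    while queue:
        v = queue.popleft()
        for w in adj.get(v, []):
            if w not in edge_to:
                edge_to[w] = v
                queue.append(w)
    return edge_to

def path_to(edge_to, v):
    # Walk predecessors back to the source, then reverse.
    if v not in edge_to:
        return None
    path = []
    while v is not None:
        path.append(v)
        v = edge_to[v]
    return path[::-1]

adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(path_to(bfs_paths(adj, 0), 3))  # → [0, 1, 3]
```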
# poc/tests/test_AuxContext.py (bookofproofs/fpl, MIT)
import unittest
from poc.classes.AuxContext import AuxContext
"""
Tests if the parsing context works properly.
"""
class AuxContextTests(unittest.TestCase):
# test parsing context
def test_is_parsing_context(self):
"""
Test if parsing context returns correct results
"""
sem = AuxContext()
sem.push_context("t1")
sem.push_context("t2")
sem.push_context("t3")
sem.push_context("t4")
self.assertTrue(sem.is_parsing_context(["t4"]))
self.assertTrue(sem.is_parsing_context(["t3", "t4"]))
self.assertTrue(sem.is_parsing_context(["t2", "t3", "t4"]))
self.assertTrue(sem.is_parsing_context(["t1", "t2", "t3", "t4"]))
self.assertFalse(sem.is_parsing_context(["t1"]))
self.assertFalse(sem.is_parsing_context(["t2"]))
self.assertFalse(sem.is_parsing_context(["t3"]))
self.assertFalse(sem.is_parsing_context(["t2", "t3"]))
self.assertFalse(sem.is_parsing_context(["t0", "t1", "t2", "t3", "t4"]))
self.assertFalse(sem.is_parsing_context(["t0", "t2", "t3", "t4"]))
self.assertFalse(sem.is_parsing_context(["t1", "t2", "t3", "t5"]))
self.assertFalse(sem.is_parsing_context(["t1", "t2", "t", "t4"]))
self.assertFalse(sem.is_parsing_context(["t1", "t", "t3", "t4"]))
def test_pop_context(self):
"""
Test if parsing context returns correct results
"""
sem = AuxContext()
sem.push_context("t1")
sem.push_context("t2")
sem.push_context("t3")
sem.push_context("t4")
with self.assertRaises(AssertionError) as t:
sem.pop_context(["t1"])
with self.assertRaises(AssertionError) as t:
sem.pop_context(["t2"])
with self.assertRaises(AssertionError) as t:
sem.pop_context(["t3"])
sem.pop_context(["t4"])
sem.pop_context(["t1", "t2", "t3"])
self.assertEqual(0, len(sem.get_context()))
if __name__ == '__main__':
unittest.main()
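The expectations above pin down `AuxContext`'s semantics: `is_parsing_context` must match a suffix of the pushed stack ending at the top. A minimal stack-based sketch consistent with those assertions (the real `AuxContext` implementation may differ):

```python
class Context:
    # Hedged sketch of a parsing-context stack matching the tests above.
    def __init__(self):
        self._stack = []

    def push_context(self, name):
        self._stack.append(name)

    def is_parsing_context(self, names):
        # True only when `names` equals the top slice of the stack.
        return self._stack[-len(names):] == names

    def pop_context(self, names):
        # Only the exact top of the stack may be popped.
        assert self.is_parsing_context(names)
        del self._stack[-len(names):]

ctx = Context()
for name in ("t1", "t2", "t3", "t4"):
    ctx.push_context(name)
print(ctx.is_parsing_context(["t3", "t4"]))  # → True
print(ctx.is_parsing_context(["t3"]))        # → False
```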
cf66f1c7d7135517ee8b586361395419ff34c473 | 736 | py | Python | find_next_smaller_integer/tests/test_find_next_smaller_integer.py | Tomasz-Kluczkowski/Coding_exercises | be4ff36ee42735d6cf02640cc1f77e842fdc78e7 | [
"MIT"
] | null | null | null | find_next_smaller_integer/tests/test_find_next_smaller_integer.py | Tomasz-Kluczkowski/Coding_exercises | be4ff36ee42735d6cf02640cc1f77e842fdc78e7 | [
"MIT"
] | null | null | null | find_next_smaller_integer/tests/test_find_next_smaller_integer.py | Tomasz-Kluczkowski/Coding_exercises | be4ff36ee42735d6cf02640cc1f77e842fdc78e7 | [
"MIT"
] | null | null | null | import pytest
from find_next_smaller_integer.find_next_smaller_integer import next_smaller
class TestFindSmallerInt:
def test_too_short_num(self):
assert next_smaller(5) == -1
assert next_smaller(10) == -1
def test_simple_number(self):
assert next_smaller(21) == 12
assert next_smaller(315) == 153
assert next_smaller(2071) == 2017
assert next_smaller(1207) == 1072
def test_mid_complexity_numbers(self):
assert next_smaller(1234567908) == 1234567890
assert next_smaller(12345679008) == 12345678900
assert next_smaller(12345679118) == 12345678911
def test_more_complex_numbers(self):
assert next_smaller(12345679658) == 12345679568
| 28.307692 | 76 | 0.706522 | 89 | 736 | 5.52809 | 0.47191 | 0.29065 | 0.345528 | 0.170732 | 0.113821 | 0 | 0 | 0 | 0 | 0 | 0 | 0.203125 | 0.217391 | 736 | 25 | 77 | 29.44 | 0.651042 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.588235 | 1 | 0.235294 | false | 0 | 0.117647 | 0 | 0.411765 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
cf835950ff02ef156b5105803349197d01c1c7a6 | 23,972 | py | Python | insights/tests/client/apps/test_compliance.py | TZ3070/insights-core | 13f4fc6bfcb89d76f0255c6259902360a298d619 | [
"Apache-2.0"
] | null | null | null | insights/tests/client/apps/test_compliance.py | TZ3070/insights-core | 13f4fc6bfcb89d76f0255c6259902360a298d619 | [
"Apache-2.0"
] | null | null | null | insights/tests/client/apps/test_compliance.py | TZ3070/insights-core | 13f4fc6bfcb89d76f0255c6259902360a298d619 | [
"Apache-2.0"
] | null | null | null | # -*- coding: UTF-8 -*-
from insights.client.apps.compliance import ComplianceClient, COMPLIANCE_CONTENT_TYPE
from mock.mock import patch, Mock, mock_open
from pytest import raises
import os
import six
PATH = '/usr/share/xml/scap/ref_id.xml'
@patch("insights.client.apps.compliance.ComplianceClient._assert_oscap_rpms_exist")
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None, compressor='gz', obfuscate=False)
def test_oscap_scan(config, assert_rpms):
compliance_client = ComplianceClient(config)
compliance_client._get_inventory_id = lambda: ''
compliance_client.get_initial_profiles = lambda: [{'attributes': {'ref_id': 'foo', 'tailored': False}}]
compliance_client.get_profiles_matching_os = lambda: []
compliance_client.find_scap_policy = lambda ref_id: '/usr/share/xml/scap/foo.xml'
compliance_client.run_scan = lambda ref_id, policy_xml, output_path, tailoring_file_path: None
compliance_client.archive.archive_tmp_dir = '/tmp'
compliance_client.archive.archive_name = 'insights-compliance-test'
archive, content_type = compliance_client.oscap_scan()
assert archive == '/tmp/insights-compliance-test.tar.gz'
assert content_type == COMPLIANCE_CONTENT_TYPE
@patch("insights.client.apps.compliance.ComplianceClient._assert_oscap_rpms_exist")
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None, compressor='gz', obfuscate=True)
def test_oscap_scan_with_obfuscation(config, assert_rpms, tmpdir):
results_file = tmpdir.mkdir('results').join('result.xml')
results_file.write("""
<xml>
<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2">
<target-address>obfuscate</target-address>
<target-facts>
<fact name="urn:xccdf:fact:asset:identifier:ipv4" type="string">obfuscate</fact>
<fact name="urn:xccdf:fact:asset:identifier:ipv6" type="string">obfuscate</fact>
</target-facts>
</TestResult>
<arf xmlns="http://scap.nist.gov/schema/asset-identification/1.1">
<ip-address>
<ip-v4>obfuscate</ip-v4>
</ip-address>
<ip-address>
<ip-v6>obfuscate</ip-v6>
</ip-address>
<mac-address>obfuscate</mac-address>
</arf>
<oval xmlns="http://oval.mitre.org/XMLSchema/oval-system-characteristics-5">
<system-info>
<interfaces>
<interface>
<ip_address>obfuscate</ip_address>
<mac_address>obfuscate</mac_address>
</interface>
</interfaces>
</system-info>
</oval>
</xml>
""")
compliance_client = ComplianceClient(config)
compliance_client._get_inventory_id = lambda: ''
compliance_client.get_initial_profiles = lambda: [{'attributes': {'ref_id': 'foo', 'tailored': False}}]
compliance_client.get_profiles_matching_os = lambda: []
compliance_client.find_scap_policy = lambda ref_id: '/usr/share/xml/scap/foo.xml'
compliance_client._results_file = lambda archive_dir, profile: str(results_file)
compliance_client.run_scan = lambda ref_id, policy_xml, output_path, tailoring_file_path: None
compliance_client.archive.archive_tmp_dir = '/tmp'
compliance_client.archive.archive_name = 'insights-compliance-test'
archive, content_type = compliance_client.oscap_scan()
assert archive == '/tmp/insights-compliance-test.tar.gz'
assert content_type == COMPLIANCE_CONTENT_TYPE
obfuscated_results = open(str(results_file)).read()
assert '<target-address>obfuscate</target-address>' not in obfuscated_results
assert '<fact name="urn:xccdf:fact:asset:identifier:ipv4" type="string">obfuscate</fact>' not in obfuscated_results
assert '<fact name="urn:xccdf:fact:asset:identifier:ipv6" type="string">obfuscate</fact>' not in obfuscated_results
assert '<ip-v4>obfuscate</ip-v4>' not in obfuscated_results
assert '<ip-v6>obfuscate</ip-v6>' not in obfuscated_results
assert '<mac-address>obfuscate</mac-address>' not in obfuscated_results
assert '<ip_address>obfuscate</ip_address>' not in obfuscated_results
assert '<mac_address>obfuscate</mac_address>' not in obfuscated_results
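The assertions above check that address-bearing elements no longer carry their original text. A minimal sketch of such a scrub (a regex-based illustration only; the client's real obfuscation logic, element list, and function name are assumptions):

```python
import re

def obfuscate_addresses(xml_text):
    # Blank out the text content of elements that can leak addresses.
    for tag in ("target-address", "ip-v4", "ip-v6", "mac-address",
                "ip_address", "mac_address"):
        xml_text = re.sub(r"<%s>[^<]*</%s>" % (tag, tag),
                          "<%s></%s>" % (tag, tag),
                          xml_text)
    return xml_text

print(obfuscate_addresses("<ip-v4>10.0.0.1</ip-v4>"))  # → <ip-v4></ip-v4>
```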
@patch("insights.client.apps.compliance.ComplianceClient._assert_oscap_rpms_exist")
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None, compressor='gz', obfuscate=True, obfuscate_hostname=True)
def test_oscap_scan_with_hostname_obfuscation(config, assert_rpms, tmpdir):
results_file = tmpdir.mkdir('results').join('result.xml')
results_file.write("""
<xml>
<TestResult xmlns="http://checklists.nist.gov/xccdf/1.2">
<target>obfuscate</target>
<target-address>obfuscate</target-address>
<target-facts>
<fact name="urn:xccdf:fact:asset:identifier:fqdn" type="string">obfuscate</fact>
<fact name="urn:xccdf:fact:asset:identifier:host_name" type="string">obfuscate</fact>
</target-facts>
</TestResult>
<arf xmlns="http://scap.nist.gov/schema/asset-identification/1.1">
<ip-address>
<ip-v4>obfuscate</ip-v4>
</ip-address>
<ip-address>
<ip-v6>obfuscate</ip-v6>
</ip-address>
<mac-address>obfuscate</mac-address>
</arf>
<oval xmlns="http://oval.mitre.org/XMLSchema/oval-system-characteristics-5">
<system_info>
<interfaces>
<interface>
<ip_address>obfuscate</ip_address>
<mac_address>obfuscate</mac_address>
</interface>
</interfaces>
<primary_host_name>obfuscate</primary_host_name>
<node_name>obfuscate</node_name>
</system_info>
</oval>
<ai xmlns="http://scap.nist.gov/schema/asset-identification/1.1">
<hostname>obfuscate</hostname>
</ai>
</xml>
""")
compliance_client = ComplianceClient(config)
compliance_client._get_inventory_id = lambda: ''
compliance_client.get_initial_profiles = lambda: [{'attributes': {'ref_id': 'foo', 'tailored': False}}]
compliance_client.get_profiles_matching_os = lambda: []
compliance_client.find_scap_policy = lambda ref_id: '/usr/share/xml/scap/foo.xml'
compliance_client._results_file = lambda archive_dir, profile: str(results_file)
compliance_client.run_scan = lambda ref_id, policy_xml, output_path, tailoring_file_path: None
compliance_client.archive.archive_tmp_dir = '/tmp'
compliance_client.archive.archive_name = 'insights-compliance-test'
archive, content_type = compliance_client.oscap_scan()
assert archive == '/tmp/insights-compliance-test.tar.gz'
assert content_type == COMPLIANCE_CONTENT_TYPE
obfuscated_results = open(str(results_file)).read()
assert '<target-address>obfuscate</target-address>' not in obfuscated_results
assert '<fact name="urn:xccdf:fact:asset:identifier:fqdn" type="string">obfuscate</fact>' not in obfuscated_results
assert '<fact name="urn:xccdf:fact:asset:identifier:host_name" type="string">obfuscate</fact>' not in obfuscated_results
assert '<ip-v4>obfuscate</ip-v4>' not in obfuscated_results
assert '<ip-v6>obfuscate</ip-v6>' not in obfuscated_results
assert '<mac-address>obfuscate</mac-address>' not in obfuscated_results
assert '<ip_address>obfuscate</ip_address>' not in obfuscated_results
assert '<mac_address>obfuscate</mac_address>' not in obfuscated_results
assert '<fqdn>obfuscate</fqdn>' not in obfuscated_results
assert '<hostname>obfuscate</hostname>' not in obfuscated_results
assert '<target>obfuscate</target>' not in obfuscated_results
assert '<primary_host_name>obfuscate</primary_host_name>' not in obfuscated_results
assert '<node_name>obfuscate</node_name>' not in obfuscated_results
@patch("insights.client.apps.compliance.ComplianceClient._assert_oscap_rpms_exist")
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None, compressor='gz')
def test_oscap_scan_with_results_repaired(config, assert_rpms, tmpdir):
results_file = tmpdir.mkdir('results').join('result.xml')
results_file.write("""
<xml>
<version>0.9</version>
</xml>
""")
compliance_client = ComplianceClient(config)
compliance_client._ssg_version = '0.1.25'
compliance_client._get_inventory_id = lambda: ''
compliance_client.get_initial_profiles = lambda: [{'attributes': {'ref_id': 'foo', 'tailored': False}}]
compliance_client.get_profiles_matching_os = lambda: []
compliance_client.find_scap_policy = lambda ref_id: '/usr/share/xml/scap/foo.xml'
compliance_client._results_file = lambda archive_dir, profile: str(results_file)
compliance_client.run_scan = lambda ref_id, policy_xml, output_path, tailoring_file_path: None
compliance_client.archive.archive_tmp_dir = '/tmp'
compliance_client.archive.archive_name = 'insights-compliance-test'
archive, content_type = compliance_client.oscap_scan()
assert archive == '/tmp/insights-compliance-test.tar.gz'
assert content_type == COMPLIANCE_CONTENT_TYPE
repaired_results = open(str(results_file)).read()
assert '<version>0.1.25</version>' in repaired_results
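The repair checked above replaces the report's `<version>` text with the installed SSG version. A hedged one-liner sketch of that substitution (illustrative only; the client's actual repair routine is not shown here):

```python
import re

def repair_ssg_version(xml_text, ssg_version):
    # Rewrite the <version> element to the locally installed SSG version.
    return re.sub(r"<version>[^<]*</version>",
                  "<version>%s</version>" % ssg_version,
                  xml_text)

print(repair_ssg_version("<version>0.9</version>", "0.1.25"))
# → <version>0.1.25</version>
```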
@patch("insights.client.apps.compliance.call", return_value=(0, ''))
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_missing_packages(config, call):
compliance_client = ComplianceClient(config)
compliance_client._get_inventory_id = lambda: ''
compliance_client.get_initial_profiles = lambda: [{'attributes': {'ref_id': 'foo'}}]
compliance_client.get_profiles_matching_os = lambda: []
compliance_client.find_scap_policy = lambda ref_id: '/usr/share/xml/scap/foo.xml'
compliance_client.run_scan = lambda ref_id, policy_xml: None
with raises(SystemExit):
compliance_client.oscap_scan()
@patch("insights.client.apps.compliance.call", return_value=(1, ''))
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_errored_rpm_call(config, call):
compliance_client = ComplianceClient(config)
compliance_client._get_inventory_id = lambda: ''
compliance_client.get_initial_profiles = lambda: [{'attributes': {'ref_id': 'foo'}}]
compliance_client.get_profiles_matching_os = lambda: []
compliance_client.find_scap_policy = lambda ref_id: '/usr/share/xml/scap/foo.xml'
compliance_client.run_scan = lambda ref_id, policy_xml: None
with raises(SystemExit):
compliance_client.oscap_scan()
@patch("insights.client.apps.compliance.call", return_value=(0, '1.2.3'))
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_get_ssg_version(config, call):
ssg_version = ComplianceClient(config).ssg_version
assert ssg_version == '1.2.3'
call.assert_called_with('rpm -qa --qf "%{VERSION}" scap-security-guide', keep_rc=True)
@patch("insights.client.apps.compliance.call", return_value=(1, '0.0.0'))
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_get_ssg_version_with_failure(config, call):
ssg_version = ComplianceClient(config).ssg_version
assert not ssg_version
call.assert_called_with('rpm -qa --qf "%{VERSION}" scap-security-guide', keep_rc=True)
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_get_profiles(config):
compliance_client = ComplianceClient(config)
compliance_client.inventory_id = '068040f1-08c8-43e4-949f-7d6470e9111c'
compliance_client.conn.session.get = Mock(return_value=Mock(status_code=200, json=Mock(return_value={'data': [{'attributes': 'data'}]})))
assert compliance_client.get_profiles('search string') == [{'attributes': 'data'}]
compliance_client.conn.session.get.assert_called_with('https://localhost/app/compliance/profiles', params={'search': 'search string', 'relationships': 'false'})
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_get_profiles_no_profiles(config):
compliance_client = ComplianceClient(config)
compliance_client.inventory_id = '068040f1-08c8-43e4-949f-7d6470e9111c'
    compliance_client.conn.session.get = Mock(return_value=Mock(status_code=200, json=Mock(return_value={'data': []})))
    assert compliance_client.get_profiles('search string') == []
    compliance_client.conn.session.get.assert_called_with('https://localhost/app/compliance/profiles', params={'search': 'search string', 'relationships': 'false'})


@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_get_profiles_error(config):
    compliance_client = ComplianceClient(config)
    compliance_client.inventory_id = '068040f1-08c8-43e4-949f-7d6470e9111c'
    compliance_client.conn.session.get = Mock(return_value=Mock(status_code=500))
    assert compliance_client.get_profiles('search string') == []
    compliance_client.conn.session.get.assert_called_with('https://localhost/app/compliance/profiles', params={'search': 'search string', 'relationships': 'false'})


@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_get_initial_profiles(config):
    compliance_client = ComplianceClient(config)
    compliance_client.inventory_id = '068040f1-08c8-43e4-949f-7d6470e9111c'
    compliance_client.conn.session.get = Mock(return_value=Mock(status_code=200, json=Mock(return_value={'data': [{'attributes': 'data'}]})))
    assert compliance_client.get_initial_profiles() == [{'attributes': 'data'}]
    compliance_client.conn.session.get.assert_called_with('https://localhost/app/compliance/profiles', params={'search': 'system_ids=068040f1-08c8-43e4-949f-7d6470e9111c canonical=false external=false', 'relationships': 'false'})


@patch("insights.client.apps.compliance.os_release_info", return_value=(None, '6.5'))
@patch("insights.client.config.InsightsConfig", base_url='localhost/app', systemid='', proxy=None)
def test_get_profiles_matching_os(config, os_release_info_mock):
    compliance_client = ComplianceClient(config)
    compliance_client.inventory_id = '068040f1-08c8-43e4-949f-7d6470e9111c'
    compliance_client.conn.session.get = Mock(return_value=Mock(status_code=200, json=Mock(return_value={'data': [{'attributes': 'data'}]})))
    assert compliance_client.get_profiles_matching_os() == [{'attributes': 'data'}]
    compliance_client.conn.session.get.assert_called_with('https://localhost/app/compliance/profiles', params={'search': 'system_ids=068040f1-08c8-43e4-949f-7d6470e9111c canonical=false os_minor_version=5', 'relationships': 'false'})


@patch("insights.client.apps.compliance.os_release_info", return_value=(None, '6.5'))
@patch("insights.client.config.InsightsConfig")
def test_os_release(config, os_release_info_mock):
    compliance_client = ComplianceClient(config)
    assert compliance_client.os_release() == '6.5'


@patch("insights.client.apps.compliance.os_release_info", return_value=(None, '6.5'))
@patch("insights.client.config.InsightsConfig")
def test_os_minor_version(config, os_release_info_mock):
    compliance_client = ComplianceClient(config)
    assert compliance_client.os_minor_version() == '5'


@patch("insights.client.apps.compliance.os_release_info", return_value=(None, '6.5'))
@patch("insights.client.config.InsightsConfig")
def test_os_major_version(config, os_release_info_mock):
    compliance_client = ComplianceClient(config)
    assert compliance_client.os_major_version() == '6'


@patch("insights.client.config.InsightsConfig")
def test_profile_files(config):
    compliance_client = ComplianceClient(config)
    compliance_client.os_release = lambda: '7'
    assert compliance_client.profile_files() == []


@patch("insights.client.apps.compliance.call", return_value=(0, PATH))
@patch("insights.client.config.InsightsConfig")
def test_find_scap_policy(config, call):
    compliance_client = ComplianceClient(config)
    compliance_client.profile_files = lambda: ['/something']
    assert compliance_client.find_scap_policy('ref_id') == PATH


@patch("insights.client.config.InsightsConfig")
def test_find_scap_policy_with_one_datastream_file(config, tmpdir):
    compliance_client = ComplianceClient(config)
    dir1 = tmpdir.mkdir('scap')
    file = dir1.join('test_file.xml')
    file.write("""
<xccdf-1.2:Profile id="xccdf_org.ssgproject.content_profile_anssi_bp28_high">
</xccdf-1.2:Profile>
""")
    compliance_client.profile_files = lambda: [str(file)]
    with patch("insights.client.apps.compliance.SCAP_DATASTREAMS_PATH", str(dir1) + "/"):
        assert compliance_client.find_scap_policy('content_profile_anssi_bp28_high') == file


@patch("insights.client.config.InsightsConfig")
def test_find_scap_policy_with_two_datastream_file(config, tmpdir):
    compliance_client = ComplianceClient(config)
    dir1 = tmpdir.mkdir('scap')
    file1 = dir1.join('test_file1.xml')
    file1.write("""
<xccdf-1.2:Profile id="xccdf_org.ssgproject.content_profile_anssi_bp28_high">
</xccdf-1.2:Profile>
""")
    file2 = dir1.join('test_file2.xml')
    file2.write("""
<xccdf-1.2:Profile id="xccdf_org.ssgproject.content_profile_anssi_bp28_high">
</xccdf-1.2:Profile>
""")
    compliance_client.profile_files = lambda: [str(file1), str(file2)]
    with patch("insights.client.apps.compliance.SCAP_DATASTREAMS_PATH", str(dir1) + "/"):
        assert compliance_client.find_scap_policy('content_profile_anssi_bp28_high') == file1


@patch("insights.client.apps.compliance.call", return_value=(1, 'bad things happened'.encode('utf-8')))
@patch("insights.client.config.InsightsConfig")
def test_find_scap_policy_not_found(config, call):
    compliance_client = ComplianceClient(config)
    compliance_client.profile_files = lambda: ['/something']
    assert compliance_client.find_scap_policy('ref_id') is None


@patch("insights.client.apps.compliance.call", return_value=(0, ''.encode('utf-8')))
@patch("insights.client.config.InsightsConfig")
def test_run_scan(config, call):
    compliance_client = ComplianceClient(config)
    output_path = '/tmp/oscap_results-ref_id.xml'
    env = os.environ
    env.update({'TZ': 'UTC'})
    compliance_client.run_scan('ref_id', '/nonexistent', output_path)
    if six.PY3:
        call.assert_called_with(("oscap xccdf eval --profile ref_id --results " + output_path + ' /nonexistent'), keep_rc=True, env=env)
    else:
        call.assert_called_with(("oscap xccdf eval --profile ref_id --results " + output_path + ' /nonexistent').encode(), keep_rc=True, env=env)


@patch("insights.client.apps.compliance.call", return_value=(1, 'bad things happened'.encode('utf-8')))
@patch("insights.client.config.InsightsConfig")
def test_run_scan_fail(config, call):
    compliance_client = ComplianceClient(config)
    output_path = '/tmp/oscap_results-ref_id.xml'
    env = os.environ
    env.update({'TZ': 'UTC'})
    with raises(SystemExit):
        compliance_client.run_scan('ref_id', '/nonexistent', output_path)
    if six.PY3:
        call.assert_called_with(("oscap xccdf eval --profile ref_id --results " + output_path + ' /nonexistent'), keep_rc=True, env=env)
    else:
        call.assert_called_with(("oscap xccdf eval --profile ref_id --results " + output_path + ' /nonexistent').encode(), keep_rc=True, env=env)


@patch("insights.client.apps.compliance.call", return_value=(0, ''.encode('utf-8')))
@patch("insights.client.config.InsightsConfig")
def test_run_scan_missing_profile(config, call):
    compliance_client = ComplianceClient(config)
    output_path = '/tmp/oscap_results-ref_id.xml'
    env = os.environ
    env.update({'TZ': 'UTC'})
    assert compliance_client.run_scan('ref_id', None, output_path) is None
    call.assert_not_called()


@patch("insights.client.config.InsightsConfig")
def test_tailored_file_is_not_downloaded_if_not_needed(config):
    compliance_client = ComplianceClient(config)
    assert compliance_client.download_tailoring_file({'attributes': {'tailored': False}}) is None


@patch("insights.client.config.InsightsConfig")
def test_tailored_file_is_not_downloaded_if_tailored_is_missing(config):
    compliance_client = ComplianceClient(config)
    assert compliance_client.download_tailoring_file({'id': 'foo', 'attributes': {'ref_id': 'aaaaa'}}) is None


@patch("insights.client.apps.compliance.open", new_callable=mock_open)
@patch("insights.client.config.InsightsConfig")
def test_tailored_file_is_downloaded_from_initial_profile_if_os_minor_version_is_missing(config, call):
    compliance_client = ComplianceClient(config)
    compliance_client.conn.session.get = Mock(return_value=Mock(status_code=200, json=Mock(return_value={'data': [{'attributes': 'data'}]})))
    assert 'oscap_tailoring_file-aaaaa' in compliance_client.download_tailoring_file({'id': 'foo', 'attributes': {'tailored': True, 'ref_id': 'aaaaa'}})
    assert compliance_client.download_tailoring_file({'id': 'foo', 'attributes': {'tailored': False, 'ref_id': 'aaaaa'}}) is None


@patch("insights.client.apps.compliance.os_release_info", return_value=(None, '6.5'))
@patch("insights.client.config.InsightsConfig")
def test_tailored_file_is_not_downloaded_if_os_minor_version_mismatches(config, os_release_info_mock):
    compliance_client = ComplianceClient(config)
    compliance_client.conn.session.get = Mock(return_value=Mock(status_code=200, json=Mock(return_value={'data': [{'attributes': 'data'}]})))
    assert compliance_client.download_tailoring_file({'id': 'foo', 'attributes': {'tailored': True, 'ref_id': 'aaaaa', 'os_minor_version': '2'}}) is None
    assert compliance_client.download_tailoring_file({'id': 'foo', 'attributes': {'tailored': False, 'ref_id': 'aaaaa', 'os_minor_version': '2'}}) is None


@patch("insights.client.apps.compliance.os_release_info", return_value=(None, '6.5'))
@patch("insights.client.apps.compliance.open", new_callable=mock_open)
@patch("insights.client.config.InsightsConfig")
def test_tailored_file_is_downloaded_if_needed(config, call, os_release_info_mock):
    compliance_client = ComplianceClient(config)
    compliance_client.conn.session.get = Mock(return_value=Mock(status_code=200, json=Mock(return_value={'data': [{'attributes': 'data'}]})))
    assert 'oscap_tailoring_file-aaaaa' in compliance_client.download_tailoring_file({'id': 'foo', 'attributes': {'tailored': True, 'ref_id': 'aaaaa', 'os_minor_version': '5'}})
    assert compliance_client.download_tailoring_file({'id': 'foo', 'attributes': {'tailored': False, 'ref_id': 'aaaaa', 'os_minor_version': '5'}}) is None


@patch("insights.client.config.InsightsConfig")
def test_build_oscap_command_does_not_append_tailoring_path(config):
    compliance_client = ComplianceClient(config)
    expected_command = 'oscap xccdf eval --profile aaaaa --results output_path xml_sample'
    assert expected_command == compliance_client.build_oscap_command('aaaaa', 'xml_sample', 'output_path', None)


@patch("insights.client.config.InsightsConfig")
def test_build_oscap_command_append_tailoring_path(config):
    compliance_client = ComplianceClient(config)
    expected_command = 'oscap xccdf eval --profile aaaaa --tailoring-file tailoring_path --results output_path xml_sample'
    assert expected_command == compliance_client.build_oscap_command('aaaaa', 'xml_sample', 'output_path', 'tailoring_path')


@patch("insights.client.config.InsightsConfig")
def test__get_inventory_id(config):
    compliance_client = ComplianceClient(config)
    compliance_client.conn._fetch_system_by_machine_id = lambda: []
    with raises(SystemExit):
        compliance_client._get_inventory_id()
    compliance_client.conn._fetch_system_by_machine_id = lambda: [{}]
    with raises(SystemExit):
        compliance_client._get_inventory_id()
    compliance_client.conn._fetch_system_by_machine_id = lambda: [{'id': '12345'}]
    assert compliance_client._get_inventory_id() == '12345'
| 52.570175 | 233 | 0.746871 | 3,046 | 23,972 | 5.60998 | 0.072226 | 0.124532 | 0.061154 | 0.046816 | 0.935686 | 0.921231 | 0.916023 | 0.89677 | 0.892556 | 0.876404 | 0 | 0.015814 | 0.113674 | 23,972 | 455 | 234 | 52.685714 | 0.788441 | 0.000876 | 0 | 0.690289 | 0 | 0.034121 | 0.365401 | 0.211282 | 0 | 0 | 0 | 0 | 0.2021 | 1 | 0.08399 | false | 0 | 0.013123 | 0 | 0.097113 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d855c4b53eb2892169fa68eb4f918eff434f721f | 104 | py | Python | ZenPacks/GoVanguard/Support/interfaces/Gvit.py | kurtwuckertjr/ZenPacks.GoVanguard.Support | 33f3e9ff9f5a40614f14514b69b8e2271eeb51de | [
"Apache-2.0"
] | null | null | null | ZenPacks/GoVanguard/Support/interfaces/Gvit.py | kurtwuckertjr/ZenPacks.GoVanguard.Support | 33f3e9ff9f5a40614f14514b69b8e2271eeb51de | [
"Apache-2.0"
] | null | null | null | ZenPacks/GoVanguard/Support/interfaces/Gvit.py | kurtwuckertjr/ZenPacks.GoVanguard.Support | 33f3e9ff9f5a40614f14514b69b8e2271eeb51de | [
"Apache-2.0"
] | null | null | null | from Products.ZenUI3.navigation.interfaces import IZenossNav
class IGvitSkin(IZenossNav):
    pass
| 20.8 | 61 | 0.788462 | 11 | 104 | 7.454545 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011364 | 0.153846 | 104 | 4 | 62 | 26 | 0.920455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
d87322788ca16718ad274a8565139dd07eb5ccf7 | 47 | py | Python | src/routes/__init__.py | lokaimoma/Flask-QR-Code-Web-APP | 5789753757aa1939119a799cbc6bda023ea75bbc | [
"MIT"
] | 2 | 2022-03-05T18:54:15.000Z | 2022-03-24T12:19:22.000Z | src/routes/__init__.py | lokaimoma/Flask-QR-Code-Web-APP | 5789753757aa1939119a799cbc6bda023ea75bbc | [
"MIT"
] | null | null | null | src/routes/__init__.py | lokaimoma/Flask-QR-Code-Web-APP | 5789753757aa1939119a799cbc6bda023ea75bbc | [
"MIT"
] | null | null | null | # Created by Kelvin_Clark on 3/4/2022, 3:39 PM
| 23.5 | 46 | 0.723404 | 11 | 47 | 3 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 0.170213 | 47 | 1 | 47 | 47 | 0.615385 | 0.93617 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d8778cf57514605a84addd03558e2564febc5ff9 | 7,023 | py | Python | cogs/players.py | xshokuninx/werewolf | 2958bac2e95bc5947555fdf089c5bae53c46523c | [
"MIT"
] | null | null | null | cogs/players.py | xshokuninx/werewolf | 2958bac2e95bc5947555fdf089c5bae53c46523c | [
"MIT"
] | null | null | null | cogs/players.py | xshokuninx/werewolf | 2958bac2e95bc5947555fdf089c5bae53c46523c | [
"MIT"
] | null | null | null | from discord.ext import commands

from cogs.utils.player import Player


class PlayersCog(commands.Cog):
    def __init__(self, bot):
        self.bot = bot

    @commands.command()
    async def join(self, ctx):
        """Command to join the game."""
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        member = ctx.author
        for p in self.bot.game.players:
            if member.id == p.id:
                return await ctx.send("You have already joined the game.")
        player = Player(member.id)
        self.bot.game.players.append(player)
        name = member.mention
        player.set_name(name)
        await ctx.send(f"{member.mention} has joined the game.")
        self.bot.game.playct += 1

    @commands.command()
    async def leave(self, ctx):
        """Command to leave the game."""
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("You cannot leave because the game has already started.")
        member = ctx.author
        for p in self.bot.game.players:
            if member.id == p.id:
                self.bot.game.players.remove(p)
                self.bot.game.playct -= 1
                return await ctx.send("You have left the game.")
        return await ctx.send("You are not in the game.")

    @commands.command()
    async def casting(self, ctx):
        # Reset the role casting string and counter.
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = ''
        self.bot.game.castct = 0

    # Each role command below appends a single marker character to the
    # casting string and bumps the cast counter.
    @commands.command()
    async def murabito(self, ctx):
        # 'あ' marks a villager (murabito).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'あ'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def uranaishi(self, ctx):
        # 'い' marks a seer (uranaishi).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'い'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def reibaishi(self, ctx):
        # 'う' marks a medium (reibaishi).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'う'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def panya(self, ctx):
        # 'え' marks a baker (panya).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'え'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def kariudo(self, ctx):
        # 'お' marks a hunter (kariudo).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'お'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def jinro(self, ctx):
        # 'ア' marks a werewolf (jinro).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'ア'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def kyojin(self, ctx):
        # 'イ' marks a madman (kyojin).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'イ'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def kyosin(self, ctx):
        # 'ウ' marks a fanatic (kyosin).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + 'ウ'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def youko(self, ctx):
        # '狐' marks a fox (youko).
        self.bot.game.casting = self.bot.game.casting + '狐'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def haitoku(self, ctx):
        # '背' marks a heretic (haitokusha).
        if self.bot.game.status == "nothing":
            return await ctx.send("There is no game at the moment.")
        elif self.bot.game.status == "playing":
            return await ctx.send("A game is currently in progress.")
        self.bot.game.casting = self.bot.game.casting + '背'
        self.bot.game.castct += 1
        await ctx.send(f"Current casting: {self.bot.game.casting}")
        await ctx.send(f"Roles cast: {self.bot.game.castct}  Players joined: {self.bot.game.playct}")

    @commands.command()
    async def CO(self, ctx, arg):
        co = arg + 'CO'
        player = self.bot.game.players.get(ctx.author.id)
        player.set_co(co)
        await ctx.send(f"{player.name} has declared {player.co}.")

    @commands.command()
    async def resetb(self, ctx):
        # Reset state back to the defaults.
        self.role = 'No role'
        self.name = 'No name'
        self.co = 'No CO'
        self.is_dead = False
        self.vote_target = None
        self.raid_target = None
        self.fortune_target = None


def setup(bot):
    bot.add_cog(PlayersCog(bot))
| 39.234637 | 82 | 0.587498 | 917 | 7,023 | 4.487459 | 0.107961 | 0.161604 | 0.248603 | 0.135601 | 0.794411 | 0.776428 | 0.776428 | 0.776428 | 0.767679 | 0.767679 | 0 | 0.002508 | 0.261854 | 7,023 | 178 | 83 | 39.455056 | 0.791281 | 0 | 0 | 0.632258 | 0 | 0 | 0.201117 | 0.128829 | 0 | 0 | 0 | 0 | 0 | 1 | 0.012903 | false | 0 | 0.012903 | 0 | 0.206452 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d8842f2ae723d7d0bad80554467f2aa3267d4f0a | 204 | py | Python | CA117/Lab_6/vowels_41.py | PRITI1999/OneLineWonders | 91a7368e0796e5a3b5839c9165f9fbe5460879f5 | [
"MIT"
] | 6 | 2016-02-04T00:15:20.000Z | 2019-10-13T13:53:16.000Z | CA117/Lab_6/vowels_41.py | PRITI1999/OneLineWonders | 91a7368e0796e5a3b5839c9165f9fbe5460879f5 | [
"MIT"
] | 2 | 2016-03-14T04:01:36.000Z | 2019-10-16T12:45:34.000Z | CA117/Lab_6/vowels_41.py | PRITI1999/OneLineWonders | 91a7368e0796e5a3b5839c9165f9fbe5460879f5 | [
"MIT"
] | 10 | 2016-02-09T14:38:32.000Z | 2021-05-25T08:16:26.000Z | (lambda L:[print("{} : {:>3}".format(l,L.count(l)))for l in sorted(set("aeiou"),key=L.count,reverse=True)])([l.lower()for r in __import__('sys').stdin for w in r.split()for l in w if l.lower()in"aeiou"])
| 102 | 203 | 0.642157 | 42 | 204 | 3.02381 | 0.52381 | 0.094488 | 0.094488 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005405 | 0.093137 | 204 | 1 | 204 | 204 | 0.681081 | 0 | 0 | 0 | 0 | 0 | 0.112745 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 6 |
d886d5c3a9dd346bf97134d2be636a3a9306cd9b | 112 | py | Python | src/ck_tools/__init__.py | knu2xs/safegraph-data-utilities | 4c6f4f45df3876d849f9560bedbad92d962b2ae1 | [
"Apache-2.0"
] | null | null | null | src/ck_tools/__init__.py | knu2xs/safegraph-data-utilities | 4c6f4f45df3876d849f9560bedbad92d962b2ae1 | [
"Apache-2.0"
] | null | null | null | src/ck_tools/__init__.py | knu2xs/safegraph-data-utilities | 4c6f4f45df3876d849f9560bedbad92d962b2ae1 | [
"Apache-2.0"
] | null | null | null | __all__ = ['add_group', 'create_local_data_resources']
from .main import add_group, create_local_data_resources | 37.333333 | 56 | 0.830357 | 16 | 112 | 5.0625 | 0.625 | 0.197531 | 0.345679 | 0.469136 | 0.790123 | 0.790123 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080357 | 112 | 3 | 56 | 37.333333 | 0.786408 | 0 | 0 | 0 | 0 | 0 | 0.318584 | 0.238938 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
d8d970b418af171513ef4719cdafc221f2d0c31b | 7,106 | py | Python | backend/wod_board/tests/routers/test_wod.py | GuillaumeOj/P13-WOD-Board | 36df7979e63c354507edb56eabdfc548b1964d08 | [
"MIT"
] | null | null | null | backend/wod_board/tests/routers/test_wod.py | GuillaumeOj/P13-WOD-Board | 36df7979e63c354507edb56eabdfc548b1964d08 | [
"MIT"
] | 82 | 2021-01-17T18:12:23.000Z | 2021-06-12T21:46:49.000Z | backend/wod_board/tests/routers/test_wod.py | GuillaumeOj/P13-WOD-Board | 36df7979e63c354507edb56eabdfc548b1964d08 | [
"MIT"
] | null | null | null | import datetime
import json

import pytest

from wod_board.models import wod
from wod_board.schemas import wod_schemas


@pytest.mark.asyncio
async def test_create_wod(db, client, db_user, token):
    wod_json = {
        "title": "Murph",
        "description": "Murph Day!",
        "date": "2021-03-24T14:42:46.580110",
        "authorId": db_user.id,
    }
    response = await client.post(
        "/api/wod",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    expected_response = wod_json | {
        "id": 1,
        "isComplete": False,
        "wodTypeId": None,
        "rounds": [],
        "wodType": None,
    }
    assert response.status_code == 200
    assert response.json() == expected_response
    assert db.query(wod.Wod).count() == 1

    response = await client.post(
        "/api/wod",
        json=wod_json,
    )
    assert response.status_code == 401
    assert response.json() == {"detail": "Not authenticated"}
    assert db.query(wod.Wod).count() == 1

    wod_json = {
        "title": "Murph",
        "description": "Murph Day!",
        "date": "2021-03-24T14:42:46.580110",
        "authorId": 2,
    }
    response = await client.post(
        "/api/wod",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 422
    assert response.json() == {"detail": "Author don't match with authenticated user"}
    assert db.query(wod.Wod).count() == 1

    wod_json = {
        "title": "Murph",
        "description": "Murph Day!",
        "date": "2021-03-24T14:42:46.580110",
        "authorId": db_user.id,
    }
    response = await client.post(
        "/api/wod",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 422
    assert response.json() == {"detail": "Title already used"}
    assert db.query(wod.Wod).count() == 1

    wod_json = {
        "title": "Cindy",
        "date": "2021-03-24T14:42:46.580110",
        "authorId": db_user.id,
        "wodTypeId": 2,
    }
    response = await client.post(
        "/api/wod",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 422
    assert response.json() == {"detail": "This type doesn't exist"}
    assert db.query(wod.Wod).count() == 1


@pytest.mark.asyncio
async def test_update_wod(db, client, db_user, db_wod, token):
    db.add(
        wod.Wod(
            title="Cindy",
            is_complete=True,
            author_id=db_user.id,
            date=datetime.datetime.utcnow(),
        )
    )
    db.commit()
    assert db.query(wod.Wod).count() == 2

    wod_json = {
        "title": "Karen",
        "authorId": db_user.id,
    }
    response = await client.put(
        f"/api/wod/{db_wod.id}",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 200
    assert response.json()["title"] == wod_json["title"]
    assert db.query(wod.Wod).count() == 2

    response = await client.put(
        "/api/wod/3",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 422
    assert response.json() == {"detail": "This WOD doesn't exist"}
    assert db.query(wod.Wod).count() == 2

    wod_json = {
        "title": "Karen",
        "authorId": db_user.id,
        "wodTypeId": 1,
    }
    response = await client.put(
        f"/api/wod/{db_wod.id}",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 422
    assert response.json() == {"detail": "This type doesn't exist"}
    assert db.query(wod.Wod).count() == 2

    wod_json = {
        "title": "Karen",
        "authorId": 2,
    }
    response = await client.put(
        f"/api/wod/{db_wod.id}",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 422
    assert response.json() == {"detail": "Author don't match with authenticated user"}
    assert db.query(wod.Wod).count() == 2

    response = await client.put(f"/api/wod/{db_wod.id}", json=wod_json)
    assert response.status_code == 401
    assert response.json() == {"detail": "Not authenticated"}
    assert db.query(wod.Wod).count() == 2

    wod_json = {
        "title": "Cindy",
        "authorId": db_user.id,
    }
    response = await client.put(
        f"/api/wod/{db_wod.id}",
        json=wod_json,
        headers={"Authorization": f"Bearer {token.access_token}"},
    )
    assert response.status_code == 422
    assert response.json() == {"detail": "Title already used"}
    assert db.query(wod.Wod).count() == 2


@pytest.mark.asyncio
async def test_get_wod_by_id(db, client, db_wod):
    assert db.query(wod.Wod).count() == 1

    response = await client.get(f"/api/wod/{db_wod.id}")
    assert response.status_code == 200
    assert response.json()["id"] == db_wod.id
    assert db.query(wod.Wod).count() == 1

    response = await client.get("/api/wod/2")
    assert response.status_code == 422
    assert response.json() == {"detail": "This WOD doesn't exist"}
    assert db.query(wod.Wod).count() == 1


@pytest.mark.asyncio
async def test_get_wod_incomplete(db, client, db_wod, token, token_admin):
    assert db.query(wod.Wod).count() == 1

    response = await client.get(
        "/api/wod/incomplete/",
        headers={"Authorization": f"{token.token_type} {token.access_token}"},
    )
    assert response.status_code == 200
    assert response.json() == json.loads(
        wod_schemas.Wod.from_orm(db_wod).json(by_alias=True)
    )
    assert db.query(wod.Wod).count() == 1

    response = await client.get(
        "/api/wod/incomplete/",
    )
    assert response.status_code == 401
    assert response.json() == {"detail": "Not authenticated"}
    assert db.query(wod.Wod).count() == 1

    response = await client.get(
        "/api/wod/incomplete/",
        headers={
            "Authorization": f"{token_admin.token_type} {token_admin.access_token}"
        },
    )
    assert response.status_code == 200
    assert response.json() is None
    assert db.query(wod.Wod).count() == 1


@pytest.mark.asyncio
async def test_get_wods_by_user(db, client, token, db_wod):
    assert db.query(wod.Wod).count() == 1

    response = await client.get(
        "/api/wod/wods/",
        headers={"Authorization": f"{token.token_type} {token.access_token}"},
    )
    assert response.status_code == 200
    assert response.json() == []

    db_wod.is_complete = True
    db.commit()
    response = await client.get(
        "/api/wod/wods/",
        headers={"Authorization": f"{token.token_type} {token.access_token}"},
    )
    assert response.status_code == 200
    assert response.json() == [
        json.loads(wod_schemas.Wod.from_orm(db_wod).json(by_alias=True))
    ]
    assert db.query(wod.Wod).count() == 1
| 29.485477 | 86 | 0.596679 | 899 | 7,106 | 4.592881 | 0.101224 | 0.122063 | 0.069266 | 0.085251 | 0.885687 | 0.873093 | 0.862679 | 0.856382 | 0.825381 | 0.82296 | 0 | 0.030337 | 0.243878 | 7,106 | 240 | 87 | 29.608333 | 0.738135 | 0 | 0 | 0.669856 | 0 | 0 | 0.221081 | 0.021672 | 0 | 0 | 0 | 0 | 0.277512 | 1 | 0 | false | 0 | 0.023923 | 0 | 0.023923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2b4a490f2788e929a62eeb2ba85e063566747546 | 1,262 | py | Python | 3. Python Advanced (September 2021)/3.2 Python OOP (October 2021)/12. Exercise - Polymorphism and Abstraction/04_wild_farm/project/animals/mammals.py | kzborisov/SoftUni | ccb2b8850adc79bfb2652a45124c3ff11183412e | [
"MIT"
] | 1 | 2021-02-07T07:51:12.000Z | 2021-02-07T07:51:12.000Z | 3. Python Advanced (September 2021)/3.2 Python OOP (October 2021)/12. Exercise - Polymorphism and Abstraction/04_wild_farm/project/animals/mammals.py | kzborisov/softuni | 9c5b45c74fa7d9748e9b3ea65a5ae4e15c142751 | [
"MIT"
] | null | null | null | 3. Python Advanced (September 2021)/3.2 Python OOP (October 2021)/12. Exercise - Polymorphism and Abstraction/04_wild_farm/project/animals/mammals.py | kzborisov/softuni | 9c5b45c74fa7d9748e9b3ea65a5ae4e15c142751 | [
"MIT"
] | null | null | null | from project.animals.animal import Mammal
from project.food import Food, Vegetable, Fruit, Meat


class Mouse(Mammal):
    @staticmethod
    def make_sound():
        return "Squeak"

    def feed(self, food: Food):
        if not isinstance(food, Vegetable) and not isinstance(food, Fruit):
            return f"{self.__class__.__name__} does not eat {food.__class__.__name__}!"
        self.eat(food, 0.1)


class Dog(Mammal):
    @staticmethod
    def make_sound():
        return "Woof!"

    def feed(self, food: Food):
        if not isinstance(food, Meat):
            return f"{self.__class__.__name__} does not eat {food.__class__.__name__}!"
        self.eat(food, 0.4)


class Cat(Mammal):
    @staticmethod
    def make_sound():
        return "Meow"

    def feed(self, food: Food):
        if not isinstance(food, Vegetable) and not isinstance(food, Meat):
            return f"{self.__class__.__name__} does not eat {food.__class__.__name__}!"
        self.eat(food, 0.3)


class Tiger(Mammal):
    @staticmethod
    def make_sound():
        return "ROAR!!!"

    def feed(self, food: Food):
        if not isinstance(food, Meat):
            return f"{self.__class__.__name__} does not eat {food.__class__.__name__}!"
        self.eat(food, 1.0) | 27.434783 | 87 | 0.630745 | 164 | 1,262 | 4.439024 | 0.231707 | 0.098901 | 0.14011 | 0.137363 | 0.803571 | 0.803571 | 0.605769 | 0.605769 | 0.605769 | 0.605769 | 0 | 0.008439 | 0.248811 | 1,262 | 46 | 88 | 27.434783 | 0.759494 | 0 | 0 | 0.529412 | 0 | 0 | 0.223278 | 0.16152 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.058824 | 0.117647 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6
99878cfa20939e8c5a45d1cb78de0dacea19ad7c | 170 | py | Python | data/strategies/publishers/jama.py | jamesrharwood/journal-guidelines | fe6c0a6d3c0443df6fc816b9503fad24459ddb4a | [
"MIT"
] | null | null | null | data/strategies/publishers/jama.py | jamesrharwood/journal-guidelines | fe6c0a6d3c0443df6fc816b9503fad24459ddb4a | [
"MIT"
] | null | null | null | data/strategies/publishers/jama.py | jamesrharwood/journal-guidelines | fe6c0a6d3c0443df6fc816b9503fad24459ddb4a | [
"MIT"
] | null | null | null | url = "jamanetwork.com/journals/{ID}/issue$"
extractor_args = dict(restrict_text=[r"for\s*authors"])
template = "https://jamanetwork.com/journals/{ID}/pages/for-authors"
| 42.5 | 68 | 0.747059 | 24 | 170 | 5.208333 | 0.75 | 0.224 | 0.352 | 0.384 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052941 | 170 | 3 | 69 | 56.666667 | 0.776398 | 0 | 0 | 0 | 0 | 0 | 0.611765 | 0.211765 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
998f250c6b66904b47a90985a8e56c9a319a4dbb | 135 | bzl | Python | jazelle/workspace-rules.bzl | foodforarabbit/fusionjs | 1aeeb69c2b9d7e7d46ecef41c4b418a88e61a9b7 | [
"MIT"
] | 3 | 2020-02-18T15:33:54.000Z | 2021-06-15T11:08:20.000Z | jazelle/workspace-rules.bzl | foodforarabbit/fusionjs | 1aeeb69c2b9d7e7d46ecef41c4b418a88e61a9b7 | [
"MIT"
] | null | null | null | jazelle/workspace-rules.bzl | foodforarabbit/fusionjs | 1aeeb69c2b9d7e7d46ecef41c4b418a88e61a9b7 | [
"MIT"
] | 2 | 2019-12-16T11:45:54.000Z | 2020-08-03T19:11:08.000Z | load("//:rules/jazelle-dependencies.bzl", _jazelle_dependencies = "jazelle_dependencies")
jazelle_dependencies = _jazelle_dependencies | 45 | 89 | 0.837037 | 13 | 135 | 8.230769 | 0.384615 | 0.88785 | 0.728972 | 1.065421 | 0.71028 | 0.71028 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051852 | 135 | 3 | 90 | 45 | 0.835938 | 0 | 0 | 0 | 0 | 0 | 0.389706 | 0.242647 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
41ef07e87e8b9a4493adb9ce3c260da8574ca507 | 8,084 | py | Python | rocketlaunches/rocketapp/migrations/0001_initial.py | leemac/rocketlaunches | f2a2bd9915e2d3778bba669a76abb57285de7aa6 | [
"MIT"
] | 2 | 2015-01-03T00:22:20.000Z | 2016-05-08T23:45:22.000Z | rocketlaunches/rocketapp/migrations/0001_initial.py | leemac/rocketlaunches | f2a2bd9915e2d3778bba669a76abb57285de7aa6 | [
"MIT"
] | 11 | 2015-01-25T23:02:52.000Z | 2015-04-30T23:53:56.000Z | rocketlaunches/rocketapp/migrations/0001_initial.py | leemac/rocketlaunches | f2a2bd9915e2d3778bba669a76abb57285de7aa6 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
import datetime
class Migration(migrations.Migration):
dependencies = [
]
operations = [
migrations.CreateModel(
name='Launch',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('country', models.CharField(max_length=10, default='')),
('remarks', models.CharField(max_length=2000, null=True, blank=True)),
('customer', models.CharField(max_length=2000, null=True, blank=True)),
('customer_url', models.CharField(max_length=2000, null=True, blank=True)),
('payload', models.CharField(max_length=2000, null=True, blank=True)),
('payload_purpose', models.CharField(max_length=2000, null=True, blank=True)),
('status', models.CharField(max_length=2000, default='')),
('status_url', models.CharField(max_length=2000, null=True, blank=True)),
('launch_url', models.CharField(max_length=2000, null=True, blank=True)),
('orbit', models.CharField(max_length=100, null=True, blank=True)),
('launch_date', models.DateTimeField(null=True, verbose_name='date launched', blank=True)),
('launch_date_tbd', models.NullBooleanField()),
('created_date', models.DateTimeField(verbose_name='date created', default=datetime.datetime(2015, 4, 29, 12, 14, 3, 443781))),
('updated_date', models.DateTimeField(null=True, verbose_name='date updated', blank=True)),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='LaunchArticle',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('text', models.CharField(max_length=200)),
('url', models.CharField(max_length=2000)),
('launch', models.ForeignKey(default='', to='rocketapp.Launch')),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='LaunchVideo',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('text', models.CharField(max_length=200)),
('url', models.CharField(max_length=2000)),
('description', models.CharField(max_length=2000)),
('launch', models.ForeignKey(default='', to='rocketapp.Launch')),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Manufacturer',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('country', models.CharField(max_length=10)),
('url', models.CharField(max_length=2000)),
('wiki_url', models.CharField(max_length=2000)),
('description', models.CharField(max_length=2000)),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Organization',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('url', models.CharField(max_length=2000)),
('wiki_url', models.CharField(max_length=2000)),
('description', models.CharField(max_length=2000)),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Payload',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('url', models.CharField(max_length=2000)),
('wiki_url', models.CharField(max_length=2000)),
('description', models.CharField(max_length=2000)),
('manufacturers', models.ManyToManyField(to='rocketapp.Manufacturer')),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='PayloadVideo',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('text', models.CharField(max_length=200)),
('url', models.CharField(max_length=2000)),
('description', models.CharField(max_length=2000)),
('payload', models.ForeignKey(default='', to='rocketapp.Payload')),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Rocket',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('name', models.CharField(max_length=200)),
('stages', models.IntegerField()),
('height', models.DecimalField(max_digits=8, decimal_places=2)),
('mass', models.DecimalField(max_digits=8, decimal_places=2)),
('diameter', models.DecimalField(max_digits=8, decimal_places=2)),
('cost', models.DecimalField(max_digits=15, decimal_places=2, null=True)),
('cost_year', models.IntegerField(null=True)),
('payload_to_leo', models.DecimalField(max_digits=8, decimal_places=2, null=True)),
('payload_to_gto', models.DecimalField(max_digits=8, decimal_places=2, null=True)),
('payload_to_sso', models.DecimalField(max_digits=8, decimal_places=2, null=True)),
('payload_to_gso', models.DecimalField(max_digits=8, decimal_places=2, null=True)),
('status', models.CharField(max_length=200)),
('first_flight_date', models.DateTimeField(null=True, blank=True)),
('wiki_url', models.CharField(max_length=1000)),
('rocket_function', models.CharField(max_length=1000)),
('description', models.CharField(max_length=1000)),
('country', models.CharField(max_length=10)),
('created_date', models.DateTimeField(verbose_name='date created', default=datetime.datetime(2015, 4, 29, 12, 14, 3, 442402))),
('updated_date', models.DateTimeField(null=True, verbose_name='date updated', blank=True)),
('manufacturers', models.ManyToManyField(to='rocketapp.Manufacturer')),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Subscriber',
fields=[
('id', models.AutoField(serialize=False, primary_key=True, auto_created=True, verbose_name='ID')),
('email', models.CharField(max_length=200, default='')),
('active', models.BooleanField(default=True)),
('created_date', models.DateTimeField(verbose_name='date created', default=datetime.datetime(2015, 4, 29, 12, 14, 3, 445884))),
('updated_date', models.DateTimeField(null=True, verbose_name='date updated', blank=True)),
],
options={
},
bases=(models.Model,),
),
migrations.AddField(
model_name='launch',
name='rocket',
field=models.ForeignKey(default='', to='rocketapp.Rocket'),
preserve_default=True,
),
]
| 48.993939 | 143 | 0.557768 | 771 | 8,084 | 5.687419 | 0.141375 | 0.129989 | 0.155986 | 0.207982 | 0.84447 | 0.786317 | 0.742987 | 0.72862 | 0.689396 | 0.679133 | 0 | 0.036446 | 0.294038 | 8,084 | 164 | 144 | 49.292683 | 0.731908 | 0.002598 | 0 | 0.594937 | 0 | 0 | 0.106811 | 0.005458 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.018987 | 0 | 0.037975 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
51199212a21b91b5d1defc7ba12b486f43eb9e97 | 2,896 | py | Python | Problema3Cuerpos/tresCuerpos.py | DanielFGomez/Tarea1MetodosAvanzados | 7b0153c2eb4e22addbd4c23acd09841a5a28a8ec | [
"MIT"
] | null | null | null | Problema3Cuerpos/tresCuerpos.py | DanielFGomez/Tarea1MetodosAvanzados | 7b0153c2eb4e22addbd4c23acd09841a5a28a8ec | [
"MIT"
] | null | null | null | Problema3Cuerpos/tresCuerpos.py | DanielFGomez/Tarea1MetodosAvanzados | 7b0153c2eb4e22addbd4c23acd09841a5a28a8ec | [
"MIT"
] | null | null | null | import numpy as np
import matplotlib.pylab as plt
from matplotlib.backends.backend_pdf import PdfPages
data = np.genfromtxt('tresCuerpos.dat')
dataRK = np.genfromtxt('tresCuerposRK.dat')
dataA = np.genfromtxt('tresCuerposA.dat')
E = np.genfromtxt('energia.dat')
ERK = np.genfromtxt('energiaRK.dat')
with PdfPages('tresCuerpos.pdf') as pdf:
    ############
    plt.figure()
    plt.scatter(data[:,0],data[:,1],s=0.01)
    plt.ylim([-2.5,2.5])
    plt.xlim([-3,3])
    plt.xlabel('q3')
    plt.ylabel('p3')
    plt.title('Integrador Simplectico')
    pdf.savefig()
    plt.close()
    plt.figure()
    plt.scatter(dataRK[:,0],dataRK[:,1],s=0.01)
    plt.ylim([-2.5,2.5])
    plt.xlim([-3,3])
    plt.xlabel('q3')
    plt.ylabel('p3')
    plt.title('Integrador Runge Kutta')
    pdf.savefig()
    plt.close()
    ############
    plt.figure()
    plt.scatter(data[:,0],data[:,1],s=0.1)
    plt.ylim([-0.6,0.6])
    plt.xlim([-2.4,-1.0])
    plt.title('Integrador Simplectico')
    plt.xlabel('q3')
    plt.ylabel('p3')
    pdf.savefig()
    plt.close()
    plt.figure()
    plt.scatter(dataRK[:,0],dataRK[:,1],s=0.1)
    plt.ylim([-0.6,0.6])
    plt.xlim([-2.4,-1.0])
    plt.xlabel('q3')
    plt.ylabel('p3')
    plt.title('Integrador Runge Kutta')
    pdf.savefig()
    plt.close()
    ############
    plt.figure()
    plt.scatter(data[:,0],data[:,1],s=0.1)
    plt.ylim([-0.3,0.3])
    plt.xlim([-0.2,0.7])
    plt.title('Integrador Simplectico')
    plt.xlabel('q3')
    plt.ylabel('p3')
    pdf.savefig()
    plt.close()
    plt.figure()
    plt.scatter(dataRK[:,0],dataRK[:,1],s=0.1)
    plt.ylim([-0.3,0.3])
    plt.xlim([-0.2,0.7])
    plt.xlabel('q3')
    plt.ylabel('p3')
    plt.title('Integrador Runge Kutta')
    pdf.savefig()
    plt.close()
    ############
    plt.figure()
    plt.scatter(dataA[:,0],dataA[:,1],s=0.1)
    plt.ylim([-2.5,2.5])
    plt.xlim([-2,2])
    plt.xlabel('q3')
    plt.ylabel('p3')
    plt.title('Integrador Simplectico')
    pdf.savefig()
    plt.close()
    ############
    plt.figure()
    plt.scatter(dataA[:,0],dataA[:,1],s=0.1)
    plt.ylim([-0.6,0.6])
    plt.xlim([0.2,0.7])
    plt.xlabel('q3')
    plt.ylabel('p3')
    plt.title('Integrador Simplectico')
    pdf.savefig()
    plt.close()
    ############
    plt.figure()
    plt.scatter(dataA[:,0],dataA[:,1],s=0.1)
    plt.ylim([-0.4,0.4])
    plt.xlim([-0.4,0.4])
    plt.xlabel('q3')
    plt.ylabel('p3')
    plt.title('Integrador Simplectico')
    pdf.savefig()
    plt.close()
    ###########
    plt.figure()
    plt.subplot(1,2,1)
    plt.plot(E[::5000,0],E[::5000,1])
    plt.xlabel('t')
    plt.ylabel('E')
    plt.title('Integrador Simplectico')
    plt.subplot(1,2,2)
    plt.plot(ERK[::5000,0],ERK[::5000,1])
    plt.xlabel('t')
    plt.ylabel('E')
    plt.title('Integrador Runge Kutta')
    pdf.savefig()
    plt.close()
| 22.984127 | 52 | 0.543854 | 430 | 2,896 | 3.660465 | 0.125581 | 0.062897 | 0.125794 | 0.114358 | 0.795426 | 0.780178 | 0.780178 | 0.780178 | 0.773189 | 0.7554 | 0 | 0.064572 | 0.208564 | 2,896 | 125 | 53 | 23.168 | 0.622164 | 0 | 0 | 0.815534 | 0 | 0 | 0.131177 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.029126 | 0 | 0.029126 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
511f9818ca438ef7a4a77ea86f604dccd5cea023 | 7,834 | py | Python | ai2thor/tests/test_unity.py | mitchellnw/ai2thor | eeb9ea59f36937d2a185006423c2c06ba1dbe5c8 | [
"Apache-2.0"
] | null | null | null | ai2thor/tests/test_unity.py | mitchellnw/ai2thor | eeb9ea59f36937d2a185006423c2c06ba1dbe5c8 | [
"Apache-2.0"
] | null | null | null | ai2thor/tests/test_unity.py | mitchellnw/ai2thor | eeb9ea59f36937d2a185006423c2c06ba1dbe5c8 | [
"Apache-2.0"
] | null | null | null |
#import pytest
import os
import ai2thor.controller
def releases_dir(self):
return os.path.normpath(os.path.join(os.path.abspath(__file__), "..", "..", "..", "unity", "builds"))
controller = ai2thor.controller.Controller()
controller.releases_dir = releases_dir.__get__(controller, ai2thor.controller.Controller)
print("trying to start unity")
controller.start()
print("started")
controller.reset('FloorPlan28')
controller.step(dict(action='Initialize', gridSize=0.25))
#@pytest.fixture
#def controller():
# return c
def assert_near(point1, point2):
assert point1.keys() == point2.keys()
for k in point1.keys():
assert round(point1[k], 3) == round(point2[k], 3)
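For context, `assert_near` treats two points as equal when every coordinate matches after rounding to three decimal places. A self-contained sketch with made-up coordinates:

```python
def assert_near(point1, point2):
    assert point1.keys() == point2.keys()
    for k in point1.keys():
        assert round(point1[k], 3) == round(point2[k], 3)

# coordinates that differ only beyond the third decimal place compare equal
assert_near(dict(x=0.12301, y=0.98), dict(x=0.12299, y=0.9800004))
print("ok")
```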
def test_rectangle_aspect():
controller = ai2thor.controller.Controller()
controller.releases_dir = releases_dir.__get__(controller, ai2thor.controller.Controller)
print("trying to start unity")
controller.start(player_screen_width=600, player_screen_height=300)
print("started")
controller.reset('FloorPlan28')
event = controller.step(dict(action='Initialize', gridSize=0.25))
assert event.frame.shape == (300, 600, 3)
def test_lookdown():
e = controller.step(dict(action='RotateLook', rotation=0, horizon=0))
position = controller.last_event.metadata['agent']['position']
horizon = controller.last_event.metadata['agent']['cameraHorizon']
assert horizon == 0.0
e = controller.step(dict(action='LookDown'))
assert e.metadata['agent']['position'] == position
assert round(e.metadata['agent']['cameraHorizon']) == 30
assert e.metadata['agent']['rotation'] == dict(x=0, y=0, z=0)
e = controller.step(dict(action='LookDown'))
assert round(e.metadata['agent']['cameraHorizon']) == 60
e = controller.step(dict(action='LookDown'))
assert round(e.metadata['agent']['cameraHorizon']) == 60
def test_no_leak_params():
action = dict(action='RotateLook', rotation=0, horizon=0)
e = controller.step(action)
assert 'sequenceId' not in action
def test_lookup():
e = controller.step(dict(action='RotateLook', rotation=0, horizon=0))
position = controller.last_event.metadata['agent']['position']
horizon = controller.last_event.metadata['agent']['cameraHorizon']
assert horizon == 0.0
e = controller.step(dict(action='LookUp'))
assert e.metadata['agent']['position'] == position
assert e.metadata['agent']['cameraHorizon'] == -30.0
assert e.metadata['agent']['rotation'] == dict(x=0, y=0, z=0)
e = controller.step(dict(action='LookUp'))
assert e.metadata['agent']['cameraHorizon'] == -30.0
def test_rotate_left():
e = controller.step(dict(action='RotateLook', rotation=0, horizon=0))
position = controller.last_event.metadata['agent']['position']
rotation = controller.last_event.metadata['agent']['rotation']
assert rotation == dict(x=0, y=0, z=0)
horizon = controller.last_event.metadata['agent']['cameraHorizon']
e = controller.step(dict(action='RotateLeft'))
assert e.metadata['agent']['position'] == position
assert e.metadata['agent']['cameraHorizon'] == horizon
assert e.metadata['agent']['rotation']['y'] == 270.0
assert e.metadata['agent']['rotation']['x'] == 0.0
assert e.metadata['agent']['rotation']['z'] == 0.0
def test_add_third_party_camera():
assert len(controller.last_event.metadata['thirdPartyCameras']) == 0
e = controller.step(dict(action='AddThirdPartyCamera', position=dict(x=1.2, y=2.3, z=3.4), rotation=dict(x=30, y=40,z=50)))
assert len(e.metadata['thirdPartyCameras']) == 1
assert_near(e.metadata['thirdPartyCameras'][0]['position'], dict(x=1.2, y=2.3, z=3.4))
assert_near(e.metadata['thirdPartyCameras'][0]['rotation'], dict(x=30, y=40, z=50))
assert len(e.third_party_camera_frames) == 1
assert e.third_party_camera_frames[0].shape == (300,300,3)
e = controller.step(dict(action='UpdateThirdPartyCamera', thirdPartyCameraId=0, position=dict(x=2.2, y=3.3, z=4.4), rotation=dict(x=10, y=20,z=30)))
assert_near(e.metadata['thirdPartyCameras'][0]['position'], dict(x=2.2, y=3.3, z=4.4))
assert_near(e.metadata['thirdPartyCameras'][0]['rotation'], dict(x=10, y=20, z=30))
def test_rotate_look():
e = controller.step(dict(action='RotateLook', rotation=0, horizon=0))
position = controller.last_event.metadata['agent']['position']
rotation = controller.last_event.metadata['agent']['rotation']
assert rotation == dict(x=0, y=0, z=0)
e = controller.step(dict(action='RotateLook', rotation=90, horizon=31))
assert e.metadata['agent']['position'] == position
assert int(e.metadata['agent']['cameraHorizon']) == 31
assert e.metadata['agent']['rotation']['y'] == 90.0
assert e.metadata['agent']['rotation']['x'] == 0.0
assert e.metadata['agent']['rotation']['z'] == 0.0
def test_rotate_right():
e = controller.step(dict(action='RotateLook', rotation=0, horizon=0))
position = controller.last_event.metadata['agent']['position']
rotation = controller.last_event.metadata['agent']['rotation']
assert rotation == dict(x=0, y=0, z=0)
horizon = controller.last_event.metadata['agent']['cameraHorizon']
e = controller.step(dict(action='RotateRight'))
assert e.metadata['agent']['position'] == position
assert e.metadata['agent']['cameraHorizon'] == horizon
assert e.metadata['agent']['rotation']['y'] == 90.0
assert e.metadata['agent']['rotation']['x'] == 0.0
assert e.metadata['agent']['rotation']['z'] == 0.0
def test_teleport():
controller.step(dict(action='Teleport', x=-1.5, z=-1.5, y=1.0), raise_for_failure=True)
position = controller.last_event.metadata['agent']['position']
assert position == dict(x=-1.5, z=-1.5, y=0.9799992442131042)
controller.step(dict(action='Teleport', x=-2.0, z=-2.5, y=1.0), raise_for_failure=True)
position = controller.last_event.metadata['agent']['position']
assert position == dict(x=-2.0, z=-2.5, y=0.9799990057945251)
def test_moveahead():
controller.step(dict(action='Teleport', x=-1.5, z=-1.5, y=1.0), raise_for_failure=True)
controller.step(dict(action='MoveAhead'), raise_for_failure=True)
position = controller.last_event.metadata['agent']['position']
assert position == dict(x=-1.25, z=-1.5, y=0.9799989461898804)
def test_moveback():
controller.step(dict(action='Teleport', x=-1.5, z=-1.5, y=1.0), raise_for_failure=True)
controller.step(dict(action='MoveBack'), raise_for_failure=True)
position = controller.last_event.metadata['agent']['position']
assert position == dict(x=-1.75, z=-1.5, y=0.9799989461898804)
def test_moveleft():
controller.step(dict(action='Teleport', x=-1.5, z=-1.5, y=1.0), raise_for_failure=True)
controller.step(dict(action='MoveLeft'), raise_for_failure=True)
position = controller.last_event.metadata['agent']['position']
assert_near(position, dict(x=-1.5, z=-1.25, y=0.98))
def test_moveright():
controller.step(dict(action='Teleport', x=-1.5, z=-1.5, y=1.0), raise_for_failure=True)
controller.step(dict(action='MoveRight'), raise_for_failure=True)
position = controller.last_event.metadata['agent']['position']
assert_near(position, dict(x=-1.5, z=-1.75, y=0.98))
def test_moveahead_mag():
controller.step(dict(action='Teleport', x=-1.5, z=-1.5, y=1.0), raise_for_failure=True)
controller.step(dict(action='MoveAhead', moveMagnitude=0.5), raise_for_failure=True)
position = controller.last_event.metadata['agent']['position']
assert position == dict(x=-1.0, z=-1.5, y=0.9799989461898804)
def test_moveahead_fail():
controller.step(dict(action='Teleport', x=-1.5, z=-1.5, y=1.0), raise_for_failure=True)
controller.step(dict(action='MoveAhead', moveMagnitude=5.0))
assert not controller.last_event.metadata['lastActionSuccess']
| 45.546512 | 152 | 0.688537 | 1,101 | 7,834 | 4.801998 | 0.110808 | 0.105731 | 0.105542 | 0.140723 | 0.823908 | 0.790808 | 0.773785 | 0.742009 | 0.700965 | 0.687157 | 0 | 0.050561 | 0.123947 | 7,834 | 171 | 153 | 45.812866 | 0.719802 | 0.007276 | 0 | 0.537313 | 0 | 0 | 0.152876 | 0.002831 | 0 | 0 | 0 | 0 | 0.373134 | 1 | 0.126866 | false | 0 | 0.014925 | 0.007463 | 0.149254 | 0.029851 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
514b6caf5b33f609b57b463ec654a49273360291 | 228 | py | Python | starry_students/manager/admin.py | dhruvsingh/starry_students | d43e5f2c95c076ae1b686a6b1de3d5d6a5b66520 | [
"MIT"
] | null | null | null | starry_students/manager/admin.py | dhruvsingh/starry_students | d43e5f2c95c076ae1b686a6b1de3d5d6a5b66520 | [
"MIT"
] | null | null | null | starry_students/manager/admin.py | dhruvsingh/starry_students | d43e5f2c95c076ae1b686a6b1de3d5d6a5b66520 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from django.contrib import admin
from starry_students.manager.models import TeacherStudent, Teacher, Student
admin.site.register(Teacher)
admin.site.register(Student)
admin.site.register(TeacherStudent)
| 28.5 | 75 | 0.802632 | 29 | 228 | 6.275862 | 0.586207 | 0.148352 | 0.28022 | 0.263736 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004785 | 0.083333 | 228 | 7 | 76 | 32.571429 | 0.866029 | 0.092105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
5abedb171431dca618748fe73c3e874f1a02cfbd | 9,302 | py | Python | kirchhoff.py | GeorgeWilliamStrong/ARC-Seismic | d0e810e31bea9a778f3271c1999468fa72c7b254 | [
"MIT"
] | null | null | null | kirchhoff.py | GeorgeWilliamStrong/ARC-Seismic | d0e810e31bea9a778f3271c1999468fa72c7b254 | [
"MIT"
] | null | null | null | kirchhoff.py | GeorgeWilliamStrong/ARC-Seismic | d0e810e31bea9a778f3271c1999468fa72c7b254 | [
"MIT"
] | null | null | null | # >>> Import zoeppritz.py
from zoeppritz import *
# >>> Ricker Wavelet
def ricker(f, t):
"""
    input: frequency in Hz (f), time array (t)
output: ricker wavelet (x)
"""
x = (1. - 2. * (np.pi ** 2) * (f ** 2) * (t ** 2)) * np.exp(-(np.pi ** 2) * (f ** 2) * (t ** 2))
return x
def ricker_dt(f, t):
"""
    input: frequency in Hz (f), time array (t)
    output: derivative of ricker wavelet w.r.t. time
"""
x = 2. * (np.pi ** 2) * (f ** 2) * t * np.exp(-(np.pi ** 2) * (f ** 2) * (t ** 2)) * (2. * (np.pi ** 2) *
(f ** 2) * (t ** 2) - 3.)
return x
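In the file itself `np` is assumed to arrive via the star import from `zoeppritz`; as a standalone sketch, the wavelet has unit peak amplitude at t = 0:

```python
import numpy as np

def ricker(f, t):
    # Ricker (Mexican hat) wavelet with peak frequency f, evaluated at times t
    return (1. - 2. * (np.pi ** 2) * (f ** 2) * (t ** 2)) * np.exp(-(np.pi ** 2) * (f ** 2) * (t ** 2))

t = np.linspace(-0.1, 0.1, 201)  # symmetric window including t = 0
w = ricker(25., t)
print(float(w.max()))  # peak amplitude 1.0, reached at t = 0
```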
# >>> Proximity Function
def find_nearest(array, value):
"""
input: numpy array of values (array), specified value (value)
    output: index of closest term in array to the specified value (idx)
"""
idx = (np.abs(array - value)).argmin()
return idx
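A quick illustration of the proximity function (sample values are made up): it returns the index of the nearest entry, not the entry itself:

```python
import numpy as np

def find_nearest(array, value):
    # index of the entry in `array` closest to `value`
    return (np.abs(array - value)).argmin()

times = np.array([0.00, 0.05, 0.10, 0.15])
print(find_nearest(times, 0.12))  # 2 -- 0.10 is the closest sample to 0.12
```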
# >>> Isotropic Kirchhoff Integral
def isotropic_kirchhoff_integral(vp1, vp2, vs1, vs2, p1, p2, d,
l, w, ds, tmin, tmax, f, dt=5e-4):
"""
input: upper and lower p-wave velocities (vp1, vp2), s-wave
velocities (vs1, vs2), densities (p1, p2), interface depth (d),
    receiver offset (l), width (w), surface grid-spacing (ds), minimum
time (tmin), maximum time (tmax), frequency (f), time increment (dt)
output: single synthetic seismogram (trace)
"""
width = np.arange(-w / 2., (w / 2.) + 1, ds) # width array
length = np.arange(0, l + 1, ds) # length array
time = np.arange(tmin, tmax, dt) # time window
# initialise matrix to store times and kirchhoff integral values
k_values = np.zeros((len(time), 2), dtype=complex)
for i in range(len(time)):
k_values[i][0] = time[i]
# loop over each surface gridpoint between source and reciever
for i in range(len(length)):
for j in range(len(width)):
# calculate geometry
r0 = np.sqrt(length[i] ** 2 + width[j] ** 2 + d ** 2) # source to surface
            r = np.sqrt((l - length[i]) ** 2 + width[j] ** 2 + d ** 2) # surface to receiver
theta0 = np.arctan(float(length[i]) / float(d)) # incidence angle
theta = np.arctan(float(l - length[i]) / float(d)) # reflection angle
# calculate the reflection coefficient
r_theta0 = isotropic_zoeppritz(vp1, vp2, vs1, vs2, p1, p2, np.degrees(theta0))[0][0]
# calculate the kirchhoff integral value
kirchhoff = ((r_theta0 * (cos(theta0) + cos(theta)) * (ds ** 2)) / (4. * np.pi * vp1 * r0 * r)) / dt
# find the closest time in J to the corresponding ray
index = find_nearest(k_values[:, 0], ((r0 + r) / vp1))
# assign kirchhoff integral value to closest time
if k_values[index][1] == 0.0:
k_values[index][1] = kirchhoff
else:
k_values[index][1] = k_values[index][1] + kirchhoff
# use the time derivative of the ricker wavelet as it is a reflection
wavelet = ricker_dt(f, k_values[:, 0].real)
# convolve the wavelet with the kirchhoff integral values
convolution = np.convolve(wavelet, k_values[:, 1])[::2]
# hilbert transform the imaginary components to account for phase shifts
trace = convolution.real + hilbert(convolution.imag)
return trace
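Each surface point contributes at arrival time (r0 + r) / vp1. A hedged sketch of just that geometry, with illustrative numbers (offset, depth, and velocity here are made up, not values from the source):

```python
import numpy as np

l, d, vp1 = 2000., 1000., 2000.  # receiver offset, interface depth, p-wave velocity (illustrative)
length, width = l / 2., 0.       # surface point midway between source and receiver
r0 = np.sqrt(length ** 2 + width ** 2 + d ** 2)       # source to surface point
r = np.sqrt((l - length) ** 2 + width ** 2 + d ** 2)  # surface point to receiver
t = (r0 + r) / vp1               # arrival time assigned to this contribution
print(round(float(t), 5))        # specular two-way time; equals sqrt(2) s for these numbers
```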
# >>> Anisotropic Kirchhoff Integral
def anisotropic_kirchhoff_integral(c1, c2, p1, p2, a_angle, d,
l, w, ds, tmin, tmax, f, vp1, dt=5e-4, p_white=1e-7):
"""
input: 6x6 upper and lower elastic tensors (c1, c2), upper and
lower densities (p1, p2), azimuth angle (a_angle), interface depth (d),
    receiver offset (l), width (w), surface grid-spacing (ds), minimum
time (tmin), maximum time (tmax), frequency (f), upper p-wave velocity (vp1),
time increment (dt), pre-whitening (p_white)
output: single synthetic seismogram (trace)
"""
width = np.arange(-w / 2., (w / 2.) + 1, ds) # width array
length = np.arange(0, l + 1, ds) # length array
time = np.arange(tmin, tmax, dt) # time window
# initialise matrix to store times and kirchhoff integral values
k_values = np.zeros((len(time), 2), dtype=complex)
for i in range(len(time)):
k_values[i][0] = time[i]
# loop over each surface gridpoint between source and reciever
for i in range(len(length)):
for j in range(len(width)):
# calculate geometry
r0 = np.sqrt(length[i] ** 2 + width[j] ** 2 + d ** 2) # source to surface
            r = np.sqrt((l - length[i]) ** 2 + width[j] ** 2 + d ** 2) # surface to receiver
theta0 = np.arctan(float(length[i]) / float(d)) # incidence angle
theta = np.arctan(float(l - length[i]) / float(d)) # reflection angle
# calculate the reflection coefficient
r_theta0 = anisotropic_zoeppritz(c1, c2, p1, p2,
np.degrees(theta0), a_angle, p_white)[0][0][0]
# calculate the kirchhoff integral value
kirchhoff = ((r_theta0 * (cos(theta0) + cos(theta)) * (ds ** 2)) / (4. * np.pi * vp1 * r0 * r)) / dt
# find the closest time in J to the corresponding ray
index = find_nearest(k_values[:, 0], ((r0 + r) / vp1))
# assign kirchhoff integral value to closest time
if k_values[index][1] == 0.0:
k_values[index][1] = kirchhoff
else:
k_values[index][1] = k_values[index][1] + kirchhoff
# use the time derivative of the ricker wavelet as it is a reflection
wavelet = ricker_dt(f, k_values[:, 0].real)
# convolve the wavelet with the kirchhoff integral values
convolution = np.convolve(wavelet, k_values[:, 1])[::2]
# hilbert transform the imaginary components to account for phase shifts
trace = convolution.real + hilbert(convolution.imag)
return trace
# >>> Generate Isotropic Synthetic Data
def isotropic_synthetic(rec_min, rec_max, drec,
vp1, vp2, vs1, vs2, p1, p2, d, w, f, ds=20.,
tmin=-1, tmax=5, dt=5e-4):
"""
    input: minimum receiver distance (rec_min), maximum receiver distance
    (rec_max), receiver spacing (drec), upper and lower p-wave velocities
    (vp1, vp2), s-wave velocities (vs1, vs2), densities (p1, p2),
    interface depth (d), receiver offset (l), width (w), frequency (f),
surface grid-spacing (ds), minimum time (tmin), maximum time (tmax),
time increment (dt)
output: array of traces (traces), time array (T)
"""
    rec = np.arange(rec_min, rec_max + 1, drec) # initialise receiver geometry
t = np.arange(tmin, tmax, dt) # time array
traces = np.zeros((len(rec), int((tmax - tmin) / dt))) # traces array
    # loop through receivers, calculating corresponding traces
for i in tqdm(range(len(rec))):
traces[i] = isotropic_kirchhoff_integral(vp1, vp2, vs1, vs2, p1,
p2, d, rec[i], w, ds, tmin, tmax, f, dt)
return traces, t
# >>> Generate Anisotropic Synthetic Data
def anisotropic_synthetic(rec_min, rec_max, drec,
c1, c2, p1, p2, d, w, f,
vp1, a_angle=0., ds=20., tmin=-1, tmax=5, dt=5e-4, p_white=1e-7):
"""
    input: minimum receiver distance (rec_min), maximum receiver distance
    (rec_max), receiver spacing (drec), 6x6 upper and lower elastic tensors
    (c1, c2), upper and lower densities (p1, p2), interface depth (d),
    receiver offset (l), width (w), frequency (f), upper p-wave velocity (vp1),
azimuth angle (a_angle), surface grid-spacing (ds), minimum time (tmin),
maximum time (tmax), time increment (dt), pre-whitening (p_white)
output: array of traces (traces), time array (T)
"""
    rec = np.arange(rec_min, rec_max + 1, drec) # initialise receiver geometry
t = np.arange(tmin, tmax, dt) # time array
traces = np.zeros((len(rec), int((tmax - tmin) / dt))) # traces array
    # loop through receivers, calculating corresponding traces
for i in tqdm(range(len(rec))):
traces[i] = anisotropic_kirchhoff_integral(c1, c2, p1, p2,
a_angle, d, rec[i], w, ds, tmin, tmax, f, vp1, dt, p_white)
return traces, t
# >>> Plot Synthetic Data
def plot_synthetic(traces, time, scale_fac=1, ymin=0, ymax=1.5):
"""
input: array of traces (traces), corresponding time array (time),
scale factor (scale_fac), y axis minimum (ymin) and maximum (ymax)
output: plot of the synthetic traces
"""
fig, ax1 = plt.subplots(figsize=(13, 7))
for i in range(len(traces)):
ax1.plot((scale_fac * traces[i]) + i, time, 'k', linewidth=1.)
ax1.fill_betweenx(time, i, (scale_fac * traces[i]) + i,
where=(((scale_fac * traces[i]) + i) > i), color='k')
ax1.set_xticks([])
plt.xlim((-1., len(traces)))
plt.ylim((ymin, ymax))
ax1.invert_yaxis()
plt.grid()
ax1.set_ylabel('Time (s)')
# plt.savefig('synthetic.png', dpi=1000) # uncomment to save figure as image
plt.show()
| 37.508065 | 115 | 0.580628 | 1,291 | 9,302 | 4.126259 | 0.161115 | 0.023653 | 0.018021 | 0.019523 | 0.804956 | 0.785245 | 0.772104 | 0.752957 | 0.728365 | 0.715975 | 0 | 0.030725 | 0.282735 | 9,302 | 247 | 116 | 37.659919 | 0.767686 | 0.408622 | 0 | 0.589474 | 0 | 0 | 0.001914 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.084211 | false | 0 | 0.010526 | 0 | 0.168421 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
850daa883d1bb7a11c66d4affcf6e985f84aff4d | 44 | py | Python | nipype/workflows/rsfmri/fsl/__init__.py | carlohamalainen/nipype | 0c4f587946f48277de471b1801b60bd18fdfb775 | [
"BSD-3-Clause"
] | 1 | 2018-04-18T12:13:37.000Z | 2018-04-18T12:13:37.000Z | nipype/workflows/rsfmri/fsl/__init__.py | ito-takuya/nipype | 9099a5809487b55868cdec82a719030419cbd6ba | [
"BSD-3-Clause"
] | 2 | 2017-10-05T21:08:38.000Z | 2018-10-09T23:01:23.000Z | nipype/workflows/rsfmri/fsl/__init__.py | ito-takuya/nipype | 9099a5809487b55868cdec82a719030419cbd6ba | [
"BSD-3-Clause"
] | 1 | 2020-02-19T13:47:05.000Z | 2020-02-19T13:47:05.000Z | from .resting import create_resting_preproc
| 22 | 43 | 0.886364 | 6 | 44 | 6.166667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 44 | 1 | 44 | 44 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
85210de6f73ce890c813f69cde426f1162c0c54b | 23 | py | Python | project/apps/django_backend_template/views/__init__.py | adosaa/Backend-django-app | 3a7eb746ebc703e2cdbf1e4b2ac5703b3fedcd85 | [
"MIT"
] | 2 | 2020-11-04T21:47:48.000Z | 2020-11-04T21:47:50.000Z | project/apps/django_backend_template/views/__init__.py | adosaa/Backend-Django-App | 3a7eb746ebc703e2cdbf1e4b2ac5703b3fedcd85 | [
"MIT"
] | null | null | null | project/apps/django_backend_template/views/__init__.py | adosaa/Backend-Django-App | 3a7eb746ebc703e2cdbf1e4b2ac5703b3fedcd85 | [
"MIT"
] | null | null | null | from .student import *
| 11.5 | 22 | 0.73913 | 3 | 23 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
51963c4ee7a3565d437c8f8787ebd1fed5fe74e9 | 28 | py | Python | data_processing/name_resolution/__init__.py | halabikeren/plant_pollynator_networks | 3ad2b9fad5bb3f669818421d63a451e08f70c6ef | [
"MIT"
] | null | null | null | data_processing/name_resolution/__init__.py | halabikeren/plant_pollynator_networks | 3ad2b9fad5bb3f669818421d63a451e08f70c6ef | [
"MIT"
] | null | null | null | data_processing/name_resolution/__init__.py | halabikeren/plant_pollynator_networks | 3ad2b9fad5bb3f669818421d63a451e08f70c6ef | [
"MIT"
] | null | null | null | from .name_resolver import * | 28 | 28 | 0.821429 | 4 | 28 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 28 | 1 | 28 | 28 | 0.88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
51a4c1c0bef6bb9580690bd38bae17c673dc74ea | 35 | py | Python | smart_home_hub/vui/tts/__init__.py | rsmith49/smart_home_hub | 36c8d07feecfb0166f0125585e109b1b357c4519 | [
"Apache-2.0"
] | null | null | null | smart_home_hub/vui/tts/__init__.py | rsmith49/smart_home_hub | 36c8d07feecfb0166f0125585e109b1b357c4519 | [
"Apache-2.0"
] | null | null | null | smart_home_hub/vui/tts/__init__.py | rsmith49/smart_home_hub | 36c8d07feecfb0166f0125585e109b1b357c4519 | [
"Apache-2.0"
] | null | null | null | from .base_tts import TextToSpeech
| 17.5 | 34 | 0.857143 | 5 | 35 | 5.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.935484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
51dd3a2e68fd29edd169af1749d803a7a269c585 | 6,261 | py | Python | ols_bootstrap/auxillary/bca.py | pvh95/MasterThesis | 27ed22ada5ba293870537ca0bb52e0fe2fce9db2 | [
"MIT"
] | null | null | null | ols_bootstrap/auxillary/bca.py | pvh95/MasterThesis | 27ed22ada5ba293870537ca0bb52e0fe2fce9db2 | [
"MIT"
] | null | null | null | ols_bootstrap/auxillary/bca.py | pvh95/MasterThesis | 27ed22ada5ba293870537ca0bb52e0fe2fce9db2 | [
"MIT"
] | null | null | null | import numpy as np
from scipy.stats import norm
from ols_bootstrap.auxillary.linreg import LR
class BCa:
"""
Implements percentile, BC, BCa confidence interval within the BCa class.
Percentile is a special case of BCa with acceleration factor a_hat=0 and z0 = 0
BC is a special case of BCa with acceleration factor a_hat = 0
"""
def __init__(self, Y, X, orig_params, bs_params, ci=0.95, ci_type="bc"):
self._Y = Y
self._X = X
self._orig_params = orig_params.reshape(orig_params.shape[0], 1)
self._bs_params = bs_params
self._ci = ci
self._lwb = (1 - self._ci) / 2
self._upb = self._ci + self._lwb
self._ci_type = ci_type
def _compute_z0_jknife_reps_acceleration(self):
self._a_hat = np.zeros(self._orig_params.shape[0])
num_of_bs = self._bs_params.shape[1]
        # Compute z0, the bias-correction term: the inverse normal CDF of the fraction of bootstrap estimates that fall below the original estimate
self._z0 = norm.ppf(
np.sum(self._bs_params < self._orig_params, axis=1) / num_of_bs
)
        # Compute the acceleration factor a_hat. For BCa it is computed from the jackknife estimate of the skewness of the parameter.
if self._ci_type == "bca":
jknife_reps = np.zeros_like(self._X)
for i in range(jknife_reps.shape[0]):
jknife_X_sample = np.delete(self._X, i, axis=0)
jknife_Y_sample = np.delete(self._Y, i, axis=0)
ols_model = LR(jknife_Y_sample, jknife_X_sample)
ols_model.fit()
jknife_reps[i, :] = ols_model.params
mean_jknife_params = np.mean(jknife_reps, axis=0)
self._a_hat = (1 / 6) * np.divide(
np.sum((mean_jknife_params - jknife_reps) ** 3, axis=0),
(np.sum((mean_jknife_params - jknife_reps) ** 2, axis=0) ** (3 / 2)),
)
@property
def bca_ci(self):
if self._ci_type == "percentile":
bca_ci_mtx = np.percentile(
self._bs_params, [self._lwb * 100, self._upb * 100], axis=1
).T
else:
self._compute_z0_jknife_reps_acceleration()
z_lower = norm.ppf(self._lwb)
z_upper = norm.ppf(self._upb)
numerator_in_ci_lower = self._z0 + z_lower
numerator_in_ci_upper = self._z0 + z_upper
bca_lower_ppf = norm.cdf(
self._z0
+ numerator_in_ci_lower / (1 - self._a_hat * numerator_in_ci_lower)
)
bca_upper_ppf = norm.cdf(
self._z0
+ numerator_in_ci_upper / (1 - self._a_hat * numerator_in_ci_upper)
)
bca_lwb_ci_val = np.diag(
np.percentile(self._bs_params, bca_lower_ppf * 100, axis=1)
)
bca_upb_ci_val = np.diag(
np.percentile(self._bs_params, bca_upper_ppf * 100, axis=1)
)
bca_ci_mtx = np.c_[bca_lwb_ci_val, bca_upb_ci_val]
return bca_ci_mtx
# # A slightly modified old version
# class BCa:
# """
# Implements percentile, BC, BCa confidence interval within the BCa class.
# Percentile is a special case of BCa with acceleration factor a_hat=0 and z0 = 0
# BC is a special case of BCa with acceleration factor a_hat = 0
# """
# def __init__(self, Y, X, orig_params, bs_params, ci=0.95, ci_type="bc"):
# self._Y = Y
# self._X = X
# self._orig_params = orig_params
# self._bs_params = bs_params
# self._ci = ci
# self._lwb = (1 - self._ci) / 2
# self._upb = self._ci + self._lwb
# self._ci_type = ci_type
# def _compute_z0_jknife_reps_acceleration(self):
# self._z0 = np.zeros_like(self._orig_params)
# self._a_hat = np.zeros_like(self._orig_params)
# # Compute z0, the inverse normal distribution function of median bias
# for row_ind in range(self._z0.shape[0]):
# self._z0[row_ind] = norm.ppf(
# np.sum(self._bs_params[row_ind, :] < self._orig_params[row_ind])
# / self._bs_params.shape[1]
# )
# # Compute the acceleration factor a_hat. For BCa it is computed from the jackknife estimate of the skewness of the parameter.
# if self._ci_type == "bca":
# jknife_reps = np.zeros_like(self._X)
# for i in range(jknife_reps.shape[0]):
# jknife_X_sample = np.delete(self._X, i, axis=0)
# jknife_Y_sample = np.delete(self._Y, i, axis=0)
# ols_model = LR(jknife_Y_sample, jknife_X_sample)
# ols_model.fit()
# jknife_reps[i, :] = ols_model.params
# mean_jknife_params = np.mean(jknife_reps, axis=0)
# self._a_hat = (1 / 6) * np.divide(
# np.sum((mean_jknife_params - jknife_reps) ** 3, axis=0),
# (np.sum((mean_jknife_params - jknife_reps) ** 2, axis=0) ** (3 / 2)),
# )
# @property
# def bca_ci(self):
# if self._ci_type == "percentile":
# bca_ci_mtx = np.percentile(
# self._bs_params, [self._lwb * 100, self._upb * 100], axis=1
# ).T
# else:
# self._compute_z0_jknife_reps_acceleration()
# z_lower = norm.ppf(self._lwb)
# z_upper = norm.ppf(self._upb)
# numerator_in_ci_lower = self._z0 + z_lower
# numerator_in_ci_upper = self._z0 + z_upper
# bca_lower_ppf = norm.cdf(
# self._z0
# + numerator_in_ci_lower / (1 - self._a_hat * numerator_in_ci_lower)
# )
# bca_upper_ppf = norm.cdf(
# self._z0
# + numerator_in_ci_upper / (1 - self._a_hat * numerator_in_ci_upper)
# )
# bca_ci_mtx = np.zeros((self._z0.shape[0], 2))
# for i in range(self._z0.shape[0]):
# bca_ci_mtx[i, :] = np.percentile(
# self._bs_params[i],
# [bca_lower_ppf[i] * 100, bca_upper_ppf[i] * 100],
# axis=0,
# )
# return bca_ci_mtx
| 36.401163 | 145 | 0.563648 | 860 | 6,261 | 3.751163 | 0.125581 | 0.049597 | 0.048357 | 0.040918 | 0.880657 | 0.858648 | 0.820211 | 0.805332 | 0.805332 | 0.805332 | 0 | 0.025078 | 0.331257 | 6,261 | 171 | 146 | 36.614035 | 0.745402 | 0.553905 | 0 | 0.033333 | 0 | 0 | 0.00556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.05 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
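The `BCa` class in the row above implements the bias-corrected-and-accelerated bootstrap interval for OLS parameters. As a minimal, self-contained sketch of the same logic, the hypothetical helper below (`bca_interval`, not part of the original file) applies it to a single 1-D statistic, the sample mean, so it needs no `LR` regression dependency: z0 comes from the fraction of bootstrap replications below the original estimate, and a_hat from the jackknife skewness estimate, exactly as in the class.

```python
import numpy as np
from scipy.stats import norm

def bca_interval(data, bs_stats, stat=np.mean, ci=0.95):
    """BCa confidence interval for a 1-D statistic (sketch of the class's logic).

    data     : 1-D sample the statistic was computed on
    bs_stats : bootstrap replications of the statistic
    """
    lwb = (1 - ci) / 2
    upb = ci + lwb
    orig = stat(data)
    # z0: inverse normal CDF of the fraction of bootstrap stats below the original
    z0 = norm.ppf(np.mean(bs_stats < orig))
    # a_hat: jackknife estimate of the statistic's skewness (leave-one-out)
    jack = np.array([stat(np.delete(data, i)) for i in range(data.size)])
    diffs = jack.mean() - jack
    a_hat = np.sum(diffs**3) / (6 * np.sum(diffs**2) ** 1.5)
    # Adjusted percentile levels, as in the bca_ci property above
    num_lo = z0 + norm.ppf(lwb)
    num_hi = z0 + norm.ppf(upb)
    lo = norm.cdf(z0 + num_lo / (1 - a_hat * num_lo))
    hi = norm.cdf(z0 + num_hi / (1 - a_hat * num_hi))
    return np.percentile(bs_stats, [lo * 100, hi * 100])

rng = np.random.default_rng(0)
data = rng.exponential(size=200)
bs = np.array([rng.choice(data, size=data.size).mean() for _ in range(2000)])
lo, hi = bca_interval(data, bs)
```

Setting `a_hat = 0` recovers the BC interval, and `a_hat = z0 = 0` the plain percentile interval, matching the special cases named in the class docstring.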