# File: tests/unit/utils/test_validators.py (repo: kajusK/HiddenPlaces, license: MIT)

"""Unit tests for app.validators."""
from wtforms import ValidationError
import flask
from pytest import raises

from app.utils.validators import password_rules, image_file, allowed_file


class DummyField(object):
    """Dummy field object to emulate wtforms field."""

    def __init__(self, data=None, errors=(), raw_data=None):
        self.data = data
        self.errors = list(errors)
        self.raw_data = raw_data

    def gettext(self, string):
        return string

    def ngettext(self, singular, plural, n):
        return singular


class DummyForm(dict):
    """Dummy form object to emulate wtforms form."""
    pass


class DummyFile(object):
    """Dummy file-like class to emulate an uploaded file handler."""

    def __init__(self, filename):
        self.filename = filename

    def __repr__(self):
        return self.filename


def _run_validator_check(subtests, validator, valid, invalid):
    """Runs tests against a validator with valid and invalid inputs.

    Args:
        subtests: Subtests fixture.
        validator: Validator instance to run tests against.
        valid: List of valid inputs.
        invalid: List of invalid inputs.
    """
    field = DummyField()
    for item in valid:
        field.data = item
        with subtests.test(item=item):
            validator(DummyForm(), field)

    for item in invalid:
        field.data = item
        with subtests.test(item=item):
            with raises(ValidationError):
                validator(DummyForm(), field)


def test_allowed_file(subtests, req_context):
    validator = allowed_file()
    extensions = ['exe', 'html']
    valid = ['foo.jpg', 'exe', 'foo.exe.zip', 'foo']
    invalid = ['foo.exe', 'foo.EXE', 'foo.pdf.exe', 'foo.html']
    valid = [DummyFile(x) for x in valid]
    invalid = [DummyFile(x) for x in invalid]

    flask.current_app.config['DISABLED_EXTENSIONS'] = extensions
    with flask.current_app.test_request_context():
        _run_validator_check(subtests, validator, valid, invalid)


def test_allowed_file_multiple(subtests, req_context):
    validator = allowed_file()
    extensions = ['exe', 'html']
    valid = ['foo.jpg', 'exe', 'foo.exe.zip', 'foo']
    invalid = ['foo.exe', 'foo.EXE', 'foo.pdf.exe', 'foo.html']
    valid = [[DummyFile(x) for x in valid], [DummyFile(valid[0])],
             [DummyFile(valid[0]), DummyFile(valid[1])]]
    invalid = [[DummyFile(x) for x in invalid], [DummyFile(invalid[0])],
               [DummyFile(invalid[0]), DummyFile(invalid[1])]]

    flask.current_app.config['DISABLED_EXTENSIONS'] = extensions
    with flask.current_app.test_request_context():
        _run_validator_check(subtests, validator, valid, invalid)


def test_allowed_file_message(req_context):
    validator = allowed_file(message="custom message")
    field = DummyField()
    field.data = DummyFile("blah.foo")

    flask.current_app.config['DISABLED_EXTENSIONS'] = ['foo']
    with flask.current_app.test_request_context():
        with raises(ValidationError) as e:
            validator(DummyForm(), field)
        assert str(e.value) == "custom message"


def test_image_file(subtests, req_context):
    validator = image_file()
    extensions = ['jpg', 'png', 'tiff']
    valid = ['foo.jpg', 'foo.JPG', 'bar.png', 'blah.tiff', 'a.foo.jpg']
    invalid = ['foo', 'jpg', 'foo.pdf', 'foo.jpg.pdf', '', '.jpg', 'o.gif']
    valid = [DummyFile(x) for x in valid]
    invalid = [DummyFile(x) for x in invalid]

    flask.current_app.config['IMAGE_EXTENSIONS'] = extensions
    with flask.current_app.test_request_context():
        _run_validator_check(subtests, validator, valid, invalid)


def test_image_file_multiple(subtests, req_context):
    validator = image_file()
    extensions = ['jpg', 'png', 'tiff']
    valid = ['foo.jpg', 'foo.JPG', 'bar.png', 'blah.tiff', 'a.foo.jpg']
    invalid = ['foo', 'jpg', 'foo.pdf', 'foo.jpg.pdf', '', '.jpg', 'o.gif']
    valid = [[DummyFile(x) for x in valid], [DummyFile(valid[0])],
             [DummyFile(valid[0]), DummyFile(valid[1])]]
    invalid = [[DummyFile(x) for x in invalid], [DummyFile(invalid[0])],
               [DummyFile(invalid[0]), DummyFile(invalid[1])]]

    flask.current_app.config['IMAGE_EXTENSIONS'] = extensions
    with flask.current_app.test_request_context():
        _run_validator_check(subtests, validator, valid, invalid)


def test_image_file_message(req_context):
    validator = image_file(message="custom message")
    field = DummyField()
    field.data = DummyFile("blah")

    flask.current_app.config['IMAGE_EXTENSIONS'] = ['foo']
    with flask.current_app.test_request_context():
        with raises(ValidationError) as e:
            validator(DummyForm(), field)
        assert str(e.value) == "custom message"


def test_password_rules_length(subtests):
    validator = password_rules(length=6, upper=None, lower=None, numeric=None,
                               special=None)
    valid = ["as123.21", "abcdef", "sdadadaswasasa", "1234567", "...,.,..,",
             "AAAAAAA", "AbCdEf"]
    invalid = ["abc", "123", "....", "aBcDe", "a1.V3"]
    _run_validator_check(subtests, validator, valid, invalid)


def test_password_rules_upper(subtests):
    validator = password_rules(length=6, upper=2, lower=None, numeric=None,
                               special=None)
    valid = ["abcDEf", "HellOO", "ABCDEZ", "A.b#3CZ", "ADSDSA"]
    invalid = ["abcdEf", "helloo", "A231sdsd"]
    _run_validator_check(subtests, validator, valid, invalid)


def test_password_rules_lower(subtests):
    validator = password_rules(length=6, upper=None, lower=3, numeric=None,
                               special=None)
    valid = ["abcdefg", "axzBAR", "123abcdsa", "AbCdEfGh", "..as..2ds.."]
    invalid = ["foOBAR", "123ABcdSA", "1a2b.C#"]
    _run_validator_check(subtests, validator, valid, invalid)


def test_password_rules_numeric(subtests):
    validator = password_rules(length=6, upper=None, lower=None, numeric=2,
                               special=None)
    valid = ["1bcd4A.d", "123456", "a?9#.0"]
    invalid = ["2ds.#<", "abcdef", "ABCDEF", "x2U.'Q"]
    _run_validator_check(subtests, validator, valid, invalid)


def test_password_rules_special(subtests):
    validator = password_rules(length=6, upper=None, lower=None, numeric=None,
                               special=3)
    valid = ["ab.?123!", ".#@dS9", "abcdef123><?"]
    invalid = ["abcdef", ".23134", "AbCd123,]"]
    _run_validator_check(subtests, validator, valid, invalid)


def test_password_rules_all(subtests):
    validator = password_rules(length=6, upper=2, lower=1, numeric=1,
                               special=1)
    valid = ["ABc1.2", "abcDEF123#%^", "a2B.C?"]
    invalid = ["helloo", "ABCDEF", "Ab1.?c"]
    _run_validator_check(subtests, validator, valid, invalid)


def test_password_rules_message(subtests):
    validator = password_rules(length=100, message="custom message")
    field = DummyField()
    field.data = "wrong"

    with raises(ValidationError) as e:
        validator(DummyForm(), field)
    assert str(e.value) == "custom message"
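The password tests above pin down the behaviour of `password_rules` purely through examples. For readers who want to see what a rule-counting validator of this shape looks like inside, here is a minimal, self-contained sketch. It is a hypothetical implementation consistent with the test cases, not the actual `app.utils.validators` code:

```python
# Hypothetical sketch of a rule-counting password validator factory,
# consistent with the tests above but NOT the real app.utils.validators code.
class ValidationError(Exception):
    """Stand-in for wtforms.ValidationError."""


def password_rules(length=6, upper=None, lower=None, numeric=None,
                   special=None, message=None):
    """Return a validator enforcing minimum per-category character counts.

    A count of None disables that rule; `special` counts characters that
    are neither alphanumeric nor whitespace.
    """
    def validator(form, field):
        data = field.data or ""

        def fail(default):
            raise ValidationError(message if message is not None else default)

        if len(data) < length:
            fail("password too short")
        if upper is not None and sum(c.isupper() for c in data) < upper:
            fail("not enough uppercase characters")
        if lower is not None and sum(c.islower() for c in data) < lower:
            fail("not enough lowercase characters")
        if numeric is not None and sum(c.isdigit() for c in data) < numeric:
            fail("not enough digits")
        if special is not None and sum(
                not c.isalnum() and not c.isspace() for c in data) < special:
            fail("not enough special characters")

    return validator
```

Run against the same inputs as `test_password_rules_upper`, this sketch accepts `"abcDEf"` (two uppercase letters) and rejects `"abcdEf"` (only one).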
# File: ts_eval/utils/nans.py (repo: vshulyak/ts-eval, license: MIT)

import warnings
import numpy as np


def nans_in_same_positions(*arrays):
    """
    Compares all provided arrays to see if they have NaNs in the same positions.
    """
    if len(arrays) == 0:
        return True
    for arr in arrays[1:]:
        if not (np.isnan(arrays[0]) == np.isnan(arr)).all():
            return False
    return True


def nanmeanw(arr, axis=None):
    """
    Computes nanmean without raising a warning in case of NaNs in the dataset.
    """
    with warnings.catch_warnings():
        warnings.simplefilter("ignore", category=RuntimeWarning)
        return np.nanmean(arr, axis=axis)
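Both helpers are thin wrappers over numpy: one compares NaN masks, the other averages while ignoring NaNs. The same two ideas can be sketched without numpy, which makes the logic easier to see; the list-based names below are illustrative, not part of `ts_eval`:

```python
import math


def nans_in_same_positions_list(*lists):
    # List-based analogue of nans_in_same_positions(): every sequence must
    # carry NaN at exactly the same indices as the first one.
    if len(lists) == 0:
        return True
    first_mask = [math.isnan(x) for x in lists[0]]
    return all([math.isnan(x) for x in other] == first_mask
               for other in lists[1:])


def nanmean_list(values):
    # List-based analogue of nanmeanw(): mean over the non-NaN entries only,
    # NaN when every entry is NaN (mirroring np.nanmean's warning case).
    clean = [x for x in values if not math.isnan(x)]
    return sum(clean) / len(clean) if clean else math.nan
```

For example, `nanmean_list([1.0, float("nan"), 3.0])` averages only the two real values and returns `2.0`.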
| 24.6 | 80 | 0.642276 | 84 | 615 | 4.654762 | 0.571429 | 0.046036 | 0.046036 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00655 | 0.255285 | 615 | 24 | 81 | 25.625 | 0.847162 | 0.243902 | 0 | 0.153846 | 0 | 0 | 0.013825 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.153846 | 0 | 0.615385 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
# File: tests/authorization/test_searches.py (repo: UOC/dlkit, license: MIT)

"""Unit tests of authorization searches."""
import pytest

from ..utilities.general import is_never_authz, is_no_authz, uses_cataloging, uses_filesystem_only
from dlkit.abstract_osid.osid import errors
from dlkit.primordium.id.primitives import Id
from dlkit.primordium.type.primitives import Type
from dlkit.runtime import PROXY_SESSION, proxy_example
from dlkit.runtime.managers import Runtime

REQUEST = proxy_example.SimpleRequest()
CONDITION = PROXY_SESSION.get_proxy_condition()
CONDITION.set_http_request(REQUEST)
PROXY = PROXY_SESSION.get_proxy(CONDITION)
DEFAULT_TYPE = Type(**{'identifier': 'DEFAULT', 'namespace': 'DEFAULT', 'authority': 'DEFAULT'})


@pytest.fixture(scope="class",
                params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def authorization_search_class_fixture(request):
    # From test_templates/resource.py::ResourceSearch::init_template
    request.cls.service_config = request.param
    request.cls.svc_mgr = Runtime().get_service_manager(
        'AUTHORIZATION',
        proxy=PROXY,
        implementation=request.cls.service_config)
    create_form = request.cls.svc_mgr.get_vault_form_for_create([])
    create_form.display_name = 'Test catalog'
    create_form.description = 'Test catalog description'
    request.cls.catalog = request.cls.svc_mgr.create_vault(create_form)

    def class_tear_down():
        request.cls.svc_mgr.delete_vault(request.cls.catalog.ident)

    request.addfinalizer(class_tear_down)


@pytest.fixture(scope="function")
def authorization_search_test_fixture(request):
    # From test_templates/resource.py::ResourceSearch::init_template
    request.cls.search = request.cls.catalog.get_authorization_search()


@pytest.mark.usefixtures("authorization_search_class_fixture", "authorization_search_test_fixture")
class TestAuthorizationSearch(object):
    """Tests for AuthorizationSearch"""

    @pytest.mark.skip('unimplemented test')
    def test_search_among_authorizations(self):
        """Tests search_among_authorizations"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_order_authorization_results(self):
        """Tests order_authorization_results"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_get_authorization_search_record(self):
        """Tests get_authorization_search_record"""
        pass


@pytest.mark.usefixtures("authorization_search_results_class_fixture", "authorization_search_results_test_fixture")
class TestAuthorizationSearchResults(object):
    """Tests for AuthorizationSearchResults"""

    @pytest.mark.skip('unimplemented test')
    def test_get_authorizations(self):
        """Tests get_authorizations"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_get_authorization_query_inspector(self):
        """Tests get_authorization_query_inspector"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_get_authorization_search_results_record(self):
        """Tests get_authorization_search_results_record"""
        pass


@pytest.fixture(scope="class",
                params=['TEST_SERVICE', 'TEST_SERVICE_ALWAYS_AUTHZ', 'TEST_SERVICE_NEVER_AUTHZ', 'TEST_SERVICE_CATALOGING', 'TEST_SERVICE_FILESYSTEM', 'TEST_SERVICE_MEMCACHE'])
def vault_search_class_fixture(request):
    # From test_templates/resource.py::ResourceSearch::init_template
    request.cls.service_config = request.param
    request.cls.svc_mgr = Runtime().get_service_manager(
        'AUTHORIZATION',
        proxy=PROXY,
        implementation=request.cls.service_config)
    create_form = request.cls.svc_mgr.get_vault_form_for_create([])
    create_form.display_name = 'Test catalog'
    create_form.description = 'Test catalog description'
    request.cls.catalog = request.cls.svc_mgr.create_vault(create_form)

    def class_tear_down():
        request.cls.svc_mgr.delete_vault(request.cls.catalog.ident)

    request.addfinalizer(class_tear_down)


@pytest.fixture(scope="function")
def vault_search_test_fixture(request):
    # From test_templates/resource.py::ResourceSearch::init_template
    request.cls.search = request.cls.catalog.get_vault_search()


@pytest.mark.usefixtures("vault_search_class_fixture", "vault_search_test_fixture")
class TestVaultSearch(object):
    """Tests for VaultSearch"""

    @pytest.mark.skip('unimplemented test')
    def test_search_among_vaults(self):
        """Tests search_among_vaults"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_order_vault_results(self):
        """Tests order_vault_results"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_get_vault_search_record(self):
        """Tests get_vault_search_record"""
        pass


@pytest.mark.usefixtures("vault_search_results_class_fixture", "vault_search_results_test_fixture")
class TestVaultSearchResults(object):
    """Tests for VaultSearchResults"""

    @pytest.mark.skip('unimplemented test')
    def test_get_vaults(self):
        """Tests get_vaults"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_get_vault_query_inspector(self):
        """Tests get_vault_query_inspector"""
        pass

    @pytest.mark.skip('unimplemented test')
    def test_get_vault_search_results_record(self):
        """Tests get_vault_search_results_record"""
        pass
# File: OOP_MiniQuiz/run_car_Level2.py (repo: HelloYeew/helloyeew-lab-computer-programming-i, license: MIT)

from car import *
def compare(car1, car2):
    print(car1)
    print(car2)


car1 = Car("Nissan", "Tiida", 450000)
car2 = Car("Toyota", "Vios", 400000)
car3 = Car("BMW", "X3", 3400000)

compare(car3, car1)
compare(car1, car2)
# File: beartype/vale/__init__.py (repo: posita/beartype, license: MIT)

#!/usr/bin/env python3
# --------------------( LICENSE )--------------------
# Copyright (c) 2014-2021 Beartype authors.
# See "LICENSE" for further details.
'''
**Beartype validators.**

This submodule publishes a PEP-compliant hierarchy of subscriptable (indexable)
classes enabling callers to validate the internal structure of arbitrarily
complex scalars, data structures, and third-party objects. Like annotation
objects defined by the :mod:`typing` module (e.g., :attr:`typing.Union`), these
classes dynamically generate PEP-compliant type hints when subscripted
(indexed) and are thus intended to annotate callables and variables. Unlike
annotation objects defined by the :mod:`typing` module, these classes are *not*
explicitly covered by existing PEPs and thus *not* directly usable as
annotations.

Instead, callers are expected to (in order):

#. Annotate callable parameters and returns to be validated with
   :pep:`593`-compliant :attr:`typing.Annotated` type hints.
#. Subscript those hints with (in order):

   #. The type of those parameters and returns.
   #. One or more subscriptions of classes declared by this submodule.
'''

# ....................{ IMPORTS                           }....................
#!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
# WARNING: To avoid polluting the public module namespace, external attributes
# should be locally imported at module scope *ONLY* under alternate private
# names (e.g., "from argparse import ArgumentParser as _ArgumentParser" rather
# than merely "from argparse import ArgumentParser").
#!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
from beartype.vale._is._valeis import _IsFactory
from beartype.vale._is._valeistype import (
    _IsInstanceFactory,
    _IsSubclassFactory,
)
from beartype.vale._is._valeisobj import _IsAttrFactory
from beartype.vale._is._valeisoper import _IsEqualFactory

# ....................{ SINGLETONS                        }....................
# Public factory singletons instantiating these private factory classes.
Is = _IsFactory(basename='Is')
IsAttr = _IsAttrFactory(basename='IsAttr')
IsEqual = _IsEqualFactory(basename='IsEqual')
IsInstance = _IsInstanceFactory(basename='IsInstance')
IsSubclass = _IsSubclassFactory(basename='IsSubclass')

# Delete all private factory classes imported above for safety.
del (
    _IsFactory,
    _IsAttrFactory,
    _IsEqualFactory,
    _IsInstanceFactory,
    _IsSubclassFactory,
)
# ....................{ TODO }....................
#FIXME: As intelligently requested by @Saphyel at #32, add support for
#additional classes supporting constraints resembling:
#
#* String constraints:
# * Email.
# * Uuid.
# * Choice.
# * Language.
# * Locale.
# * Country.
# * Currency.
#* Comparison constraints
# * IdenticalTo.
# * NotIdenticalTo.
# * LessThan.
# * GreaterThan.
# * Range.
# * DivisibleBy.
#FIXME: Add a new BeartypeValidator.get_cause_or_none() method with the same
#signature and docstring as the existing CauseSleuth.get_cause_or_none()
#method. This new BeartypeValidator.get_cause_or_none() method should then be
#called by the "_peperrorannotated" submodule to generate human-readable
#exception messages. Note that this implies that:
#* The BeartypeValidator.__init__() method will need to additionally accept a new
# mandatory "get_cause_or_none: Callable[[], Optional[str]]" parameter, which
# that method should then localize to "self.get_cause_or_none".
#* Each __class_getitem__() dunder method of each "_BeartypeValidatorFactoryABC" subclass will need
# to additionally define and pass that callable when creating and returning
# its "BeartypeValidator" instance.
#FIXME: *BRILLIANT IDEA.* Holyshitballstime. The idea here is that we can
#leverage all of our existing "beartype.is" infrastructure to dynamically
#synthesize PEP-compliant type hints that would then be implicitly supported by
#any runtime type checker. At present, subscriptions of "Is" (e.g.,
#"Annotated[str, Is[lambda text: bool(text)]]") are only supported by beartype
#itself. Of course, does anyone care? I mean, if you're using a runtime type
#checker, you're probably *ONLY* using beartype. Right? That said, this would
#technically improve portability by allowing users to switch between different
#checkers... except not really, since they'd still have to import beartype
#infrastructure to do so. So, this is probably actually useless.
#
#Nonetheless, the idea itself is trivial. We declare a new
#"beartype.is.Portable" singleton accessed in the same way: e.g.,
# from beartype import beartype
# from beartype.is import Portable
# NonEmptyStringTest = Is[lambda text: bool(text)]
# NonEmptyString = Portable[str, NonEmptyStringTest]
# @beartype
# def munge_it(text: NonEmptyString) -> str: ...
#
#So what's the difference between "typing.Annotated" and "beartype.is.Portable"
#then? Simple. The latter dynamically generates one new PEP 3119-compliant
#metaclass and associated class whenever subscripted. Clearly, this gets
#expensive in both space and time consumption fast -- which is why this won't
#be the default approach. For safety, this new class does *NOT* subclass the
#first subscripted class. Instead:
#* This new metaclass of this new class simply defines an __isinstancecheck__()
# dunder method. For the above example, this would be:
# class NonEmptyStringMetaclass(object):
# def __isinstancecheck__(cls, obj) -> bool:
# return isinstance(obj, str) and NonEmptyStringTest(obj)
#* This new class would then be entirely empty. For the above example, this
# would be:
# class NonEmptyStringClass(object, metaclass=NonEmptyStringMetaclass):
# pass
#
#Well, so much for brilliant. It's slow and big, so it seems doubtful anyone
#would actually do that. Nonetheless, that's food for thought for you.
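The core pattern this module relies on, a singleton made subscriptable so that `Is[predicate]` yields a composable validator object, can be sketched in a few lines of plain Python. The sketch below is an illustrative toy showing the mechanism, not beartype's actual `_IsFactory` or `BeartypeValidator` implementation:

```python
# Toy sketch of the subscriptable-factory pattern used by beartype.vale.
# Illustrative only; the real _IsFactory/BeartypeValidator are more involved.
class ToyValidator(object):
    """Wraps a predicate; supports & and | composition like beartype validators."""

    def __init__(self, is_valid):
        self.is_valid = is_valid

    def __and__(self, other):
        return ToyValidator(
            lambda obj: self.is_valid(obj) and other.is_valid(obj))

    def __or__(self, other):
        return ToyValidator(
            lambda obj: self.is_valid(obj) or other.is_valid(obj))


class ToyIsFactory(object):
    """Subscripting the singleton wraps a caller-defined predicate."""

    def __init__(self, basename):
        self.basename = basename

    def __getitem__(self, predicate):
        return ToyValidator(predicate)


Is = ToyIsFactory(basename='Is')
```

Usage then mirrors the docstring's intent: `NonEmpty = Is[lambda text: bool(text)]` produces an object whose `is_valid` can be called at runtime, and `NonEmpty & Is[str.isupper]` composes two predicates into one.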
# File: src/regrtest.py (repo: ucsd-progsys/csolve-bak, license: BSD-3-Clause)

#!/usr/bin/python
# Copyright (c) 2009 The Regents of the University of California. All rights reserved.
#
# Permission is hereby granted, without written agreement and without
# license or royalty fees, to use, copy, modify, and distribute this
# software and its documentation for any purpose, provided that the
# above copyright notice and the following two paragraphs appear in
# all copies of this software.
#
# IN NO EVENT SHALL THE UNIVERSITY OF CALIFORNIA BE LIABLE TO ANY PARTY
# FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES
# ARISING OUT OF THE USE OF THIS SOFTWARE AND ITS DOCUMENTATION, EVEN
# IF THE UNIVERSITY OF CALIFORNIA HAS BEEN ADVISED OF THE POSSIBILITY
# OF SUCH DAMAGE.
#
# THE UNIVERSITY OF CALIFORNIA SPECIFICALLY DISCLAIMS ANY WARRANTIES,
# INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY
# AND FITNESS FOR A PARTICULAR PURPOSE. THE SOFTWARE PROVIDED HEREUNDER IS
# ON AN "AS IS" BASIS, AND THE UNIVERSITY OF CALIFORNIA HAS NO OBLIGATION
# TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS, OR MODIFICATIONS.
import time, subprocess, optparse, sys, socket, os
import misc.rtest as rtest

solve = "./csolve -c".split()
null = open("/dev/null", "w")
now = (time.asctime(time.localtime(time.time()))).replace(" ", "_")
logfile = "../tests/logs/regrtest_results_%s_%s" % (socket.gethostname(), now)
argcomment = "//! run with "


def logged_sys_call(args, out=None, err=None):
    print "exec: " + " ".join(args)
    return subprocess.call(args, stdout=out, stderr=err)


def solve_quals(file, bare, time, quiet, flags):
    if quiet: out = null
    else: out = None
    if time: time = ["time"]
    else: time = []
    hygiene_flags = [("--csolveprefix=%s" % (file)), "-o", "/dev/null"]
    out = open(file + ".log", "w")
    rv = logged_sys_call(time + solve + flags + hygiene_flags + [file], out)
    out.close()
    return rv


def run_script(file, quiet):
    if quiet: out = null
    else: out = None
    return logged_sys_call(file, out)


def getfileargs(file):
    f = open(file)
    l = f.readline()
    f.close()
    if l.startswith(argcomment):
        return l[len(argcomment):].strip().split(" ")
    else:
        return []


class Config (rtest.TestConfig):
    def __init__ (self, dargs, testdirs, logfile, threadcount):
        rtest.TestConfig.__init__ (self, testdirs, logfile, threadcount)
        self.dargs = dargs
        if os.path.exists("../tests/postests/coreutils/"):
            logged_sys_call(["../tests/postests/coreutils/makeCoreUtil.sh", "init"], None)

    def run_test (self, file):
        os.environ['CSOLVEFLAGS'] = self.dargs
        if file.endswith(".c"):
            fargs = getfileargs(file)
            return solve_quals(file, True, False, True, fargs)
        elif file.endswith(".sh"):
            return run_script(file, True)

    def is_test (self, file):
        return (file.endswith(".sh") and os.access(file, os.X_OK)) \
            or (file.endswith(".c") and not file.endswith(".csolve.save.c") and not file.endswith(".ssa.c"))

#####################################################################################

#testdirs = [("../postests", 0)]
#testdirs = [("../negtests", 1)]
#testdirs = [("../slowtests", 1)]
#DEFAULT
testdirs = [("../tests/postests", 0), ("../tests/negtests", [1, 2])]
#testdirs = [("../tests/microtests", 0)]

parser = optparse.OptionParser()
parser.add_option("-t", "--threads", dest="threadcount", default=1, type=int, help="spawn n threads")
parser.add_option("-o", "--opts", dest="opts", default="", type=str, help="additional arguments to csolve")
parser.disable_interspersed_args()
options, args = parser.parse_args()

runner = rtest.TestRunner (Config (options.opts, testdirs, logfile, options.threadcount))
exit (runner.run ())
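The per-file flag mechanism in this script (`getfileargs` reading a leading `//! run with ` comment from each test's first line) is a small, reusable idea. A standalone, Python 3 sketch of the same parsing logic, with hypothetical names mirroring the script above:

```python
# Standalone sketch of regrtest.py's argument-comment parsing.
# parse_file_args is a hypothetical name; the script's own helper is
# getfileargs, which reads the line from a file instead of taking it directly.
ARGCOMMENT = "//! run with "


def parse_file_args(first_line):
    """Return extra per-test flags encoded in a test file's first line.

    Returns an empty list when the line carries no argument comment.
    """
    if first_line.startswith(ARGCOMMENT):
        return first_line[len(ARGCOMMENT):].strip().split(" ")
    return []
```

A C test file beginning with `//! run with -manual` would therefore run with `-manual` appended to the solver's flags, while files without the marker fall back to the defaults.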
167b69684843eed85973a69dafe6205fbdff9406 | 845 | py | Python | cocos2d/tools/jenkins-scripts/configs/cocos-2dx-develop-win32.py | triompha/EarthWarrior3D | d68a347902fa1ca1282df198860f5fb95f326797 | [
"MIT"
] | null | null | null | cocos2d/tools/jenkins-scripts/configs/cocos-2dx-develop-win32.py | triompha/EarthWarrior3D | d68a347902fa1ca1282df198860f5fb95f326797 | [
"MIT"
] | null | null | null | cocos2d/tools/jenkins-scripts/configs/cocos-2dx-develop-win32.py | triompha/EarthWarrior3D | d68a347902fa1ca1282df198860f5fb95f326797 | [
"MIT"
] | null | null | null | import os
import subprocess
import sys

print 'Build Config:'
print '    Host:win7 x86'
print '    Branch:develop'
print '    Target:win32'
print '    "%VS110COMNTOOLS%..\IDE\devenv.com" "build\cocos2d-win32.vc2012.sln" /Build "Debug|Win32"'

if not os.path.exists('build/cocos2d-win32.vc2012.sln'):
    node_name = os.environ['NODE_NAME']
    source_dir = '../cocos-2dx-develop-base-repo/node/' + node_name
    source_dir = source_dir.replace("/", os.sep)
    os.system("xcopy " + source_dir + " . /E /Y /H")

os.system('git pull origin develop')
os.system('git submodule update --init --force')

ret = subprocess.call('"%VS110COMNTOOLS%..\IDE\devenv.com" "build\cocos2d-win32.vc2012.sln" /Build "Debug|Win32"', shell=True)
os.system('git clean -xdf -f')

print 'build exit'
print ret
if ret == 0:
    exit(0)
else:
    exit(1)
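The final lines fold the solution's build result into a binary CI exit status. A minimal Python 3 sketch of that pattern, using a hypothetical child command in place of devenv:

```python
import subprocess
import sys

# subprocess.call returns the child's return code; the script maps it
# to 0 (success) or 1 (any failure) for the CI system.
ret = subprocess.call([sys.executable, "-c", "print('build ok')"])
print('build exit')
print(ret)
status = 0 if ret == 0 else 1
```

In the real script `status` is passed to `exit()` so Jenkins marks the job red on any non-zero build result.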
# githubdl/url_helpers.py
import re
from urllib.parse import urlparse
import logging

def check_url_is_http(repo_url):
    predicate = re.compile('^https?://.*$')
    match = predicate.search(repo_url)
    return False if match is None else True

def check_url_is_ssh(repo_url):
    predicate = re.compile(r'^git\@.*\.git$')
    match = predicate.search(repo_url)
    return False if match is None else True

def get_domain_name_from_http_url(repo_url):
    site_object = urlparse(repo_url)
    return site_object.netloc

def get_repo_name_from_http_url(repo_url):
    site_object = urlparse(repo_url)
    parsed_string = re.sub(r'\.git$', '', site_object.path)
    if parsed_string[0] == '/':
        return parsed_string[1:]
    return parsed_string

def get_repo_name_from_ssh_url(repo_url):
    predicate = re.compile(r'(?<=\:)(.*)(?=\.)')
    match = predicate.search(repo_url)
    return match.group()

def get_domain_name_from_ssh_url(repo_url):
    predicate = re.compile(r'(?<=\@)(.*)(?=\:)')
    match = predicate.search(repo_url)
    return match.group()

def validate_protocol_exists(is_ssh, is_http):
    if not is_ssh and not is_http:
        err_message = "Error: repository url provided is not http(s) or ssh"
        logging.critical(err_message)
        raise RuntimeError(err_message)

def check_url_protocol(repo_url):
    is_ssh = check_url_is_ssh(repo_url)
    is_http = check_url_is_http(repo_url)
    validate_protocol_exists(is_ssh, is_http)
    return (is_ssh, is_http)
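The helpers above reduce to two anchored regexes plus a validation step. A self-contained sketch of the same classification logic (the example URLs are hypothetical):

```python
import re

# Mirrors check_url_is_http / check_url_is_ssh / validate_protocol_exists.
HTTP_RE = re.compile(r'^https?://.*$')
SSH_RE = re.compile(r'^git@.*\.git$')

def classify(repo_url):
    """Return 'ssh' or 'http', or raise for unrecognized URLs."""
    if SSH_RE.search(repo_url):
        return 'ssh'
    if HTTP_RE.search(repo_url):
        return 'http'
    raise RuntimeError("Error: repository url provided is not http(s) or ssh")

print(classify("git@github.com:wilvk/githubdl.git"))   # ssh
print(classify("https://github.com/wilvk/githubdl"))   # http
```

Note the SSH pattern requires the trailing `.git`, so `git@host:repo` without the suffix would be rejected, just as in the original predicates.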
# train_test_val.py (arashk7/Yolo5_Dataset_Generator)
import os
import shutil

input_dir = r'E:\Dataset\zhitang\Dataset_Zhitang_Yolo5'
output_dir = r'E:\Dataset\zhitang\Dataset_Zhitang_Yolo5\ZhitangYolo5'

in_img_dir = os.path.join(input_dir, 'Images')
in_label_dir = os.path.join(input_dir, 'Labels')
out_img_dir = os.path.join(output_dir, 'images')
out_label_dir = os.path.join(output_dir, 'labels')

splits = {'train', 'test', 'valid'}

files = os.listdir(in_img_dir)
count = len(files)

for f in files:
    print(f)
    # files were listed from in_img_dir, so join against it, not input_dir
    src = os.path.join(in_img_dir, f)
    dst = os.path.join(out_img_dir, f)
    shutil.copyfile(src, dst)
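The script's name promises a train/test/valid split, but the loop only copies files. A minimal standalone sketch of such a split (the 80/10/10 ratios, seed, and file names are hypothetical, not from the original script):

```python
import random

def split_files(files, ratios=(0.8, 0.1, 0.1), seed=0):
    """Shuffle file names deterministically and partition them
    into train/test/valid buckets by the given ratios."""
    files = list(files)
    random.Random(seed).shuffle(files)
    n = len(files)
    n_train = int(n * ratios[0])
    n_test = int(n * ratios[1])
    return {
        'train': files[:n_train],
        'test': files[n_train:n_train + n_test],
        'valid': files[n_train + n_test:],
    }

parts = split_files(['img%d.jpg' % i for i in range(10)])
print({k: len(v) for k, v in parts.items()})  # {'train': 8, 'test': 1, 'valid': 1}
```

Each bucket would then get its own `images`/`labels` subfolder before copying, matching the directory layout the script sets up.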
# homeassistant/components/media_player/pjlink.py
"""
Support for controlling projector via the PJLink protocol.

For more details about this platform, please refer to the documentation at
https://home-assistant.io/components/media_player.pjlink/
"""
import logging

import voluptuous as vol

from homeassistant.components.media_player import (
    PLATFORM_SCHEMA, SUPPORT_SELECT_SOURCE, SUPPORT_TURN_OFF, SUPPORT_TURN_ON,
    SUPPORT_VOLUME_MUTE, MediaPlayerDevice)
from homeassistant.const import (
    CONF_HOST, CONF_NAME, CONF_PASSWORD, CONF_PORT, STATE_OFF, STATE_ON)
import homeassistant.helpers.config_validation as cv

REQUIREMENTS = ['pypjlink2==1.2.0']

_LOGGER = logging.getLogger(__name__)

CONF_ENCODING = 'encoding'

DEFAULT_PORT = 4352
DEFAULT_ENCODING = 'utf-8'

PLATFORM_SCHEMA = PLATFORM_SCHEMA.extend({
    vol.Required(CONF_HOST): cv.string,
    vol.Optional(CONF_PORT, default=DEFAULT_PORT): cv.port,
    vol.Optional(CONF_NAME): cv.string,
    vol.Optional(CONF_ENCODING, default=DEFAULT_ENCODING): cv.string,
    vol.Optional(CONF_PASSWORD): cv.string,
})

SUPPORT_PJLINK = SUPPORT_VOLUME_MUTE | \
    SUPPORT_TURN_ON | SUPPORT_TURN_OFF | SUPPORT_SELECT_SOURCE


def setup_platform(hass, config, add_entities, discovery_info=None):
    """Set up the PJLink platform."""
    host = config.get(CONF_HOST)
    port = config.get(CONF_PORT)
    name = config.get(CONF_NAME)
    encoding = config.get(CONF_ENCODING)
    password = config.get(CONF_PASSWORD)
    if 'pjlink' not in hass.data:
        hass.data['pjlink'] = {}
    hass_data = hass.data['pjlink']

    device_label = "{}:{}".format(host, port)
    if device_label in hass_data:
        return

    device = PjLinkDevice(host, port, name, encoding, password)
    hass_data[device_label] = device
    add_entities([device], True)


def format_input_source(input_source_name, input_source_number):
    """Format input source for display in UI."""
    return "{} {}".format(input_source_name, input_source_number)


class PjLinkDevice(MediaPlayerDevice):
    """Representation of a PJLink device."""

    def __init__(self, host, port, name, encoding, password):
        """Initialize the PJLink device."""
        self._host = host
        self._port = port
        self._name = name
        self._password = password
        self._encoding = encoding
        self._muted = False
        self._pwstate = STATE_OFF
        self._current_source = None
        with self.projector() as projector:
            if not self._name:
                self._name = projector.get_name()
            inputs = projector.get_inputs()
        self._source_name_mapping = \
            {format_input_source(*x): x for x in inputs}
        self._source_list = sorted(self._source_name_mapping.keys())

    def projector(self):
        """Create PJLink Projector instance."""
        from pypjlink import Projector
        projector = Projector.from_address(
            self._host, self._port, self._encoding)
        projector.authenticate(self._password)
        return projector

    def update(self):
        """Get the latest state from the device."""
        with self.projector() as projector:
            pwstate = projector.get_power()
            if pwstate == 'off':
                self._pwstate = STATE_OFF
            else:
                self._pwstate = STATE_ON
            self._muted = projector.get_mute()[1]
            self._current_source = \
                format_input_source(*projector.get_input())

    @property
    def name(self):
        """Return the name of the device."""
        return self._name

    @property
    def state(self):
        """Return the state of the device."""
        return self._pwstate

    @property
    def is_volume_muted(self):
        """Return boolean indicating mute status."""
        return self._muted

    @property
    def source(self):
        """Return current input source."""
        return self._current_source

    @property
    def source_list(self):
        """Return all available input sources."""
        return self._source_list

    @property
    def supported_features(self):
        """Return projector supported features."""
        return SUPPORT_PJLINK

    def turn_off(self):
        """Turn projector off."""
        with self.projector() as projector:
            projector.set_power('off')

    def turn_on(self):
        """Turn projector on."""
        with self.projector() as projector:
            projector.set_power('on')

    def mute_volume(self, mute):
        """Mute (true) or unmute (false) media player."""
        with self.projector() as projector:
            from pypjlink import MUTE_AUDIO
            projector.set_mute(MUTE_AUDIO, mute)

    def select_source(self, source):
        """Set the input source."""
        source = self._source_name_mapping[source]
        with self.projector() as projector:
            projector.set_input(*source)
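The UI source names are produced by `format_input_source` and round-tripped through `_source_name_mapping` when the user selects a source. A self-contained sketch of that mapping (the projector inputs listed are hypothetical):

```python
# Mirrors the mapping built in PjLinkDevice.__init__ and consumed
# in select_source: display string -> (name, number) tuple.
def format_input_source(input_source_name, input_source_number):
    return "{} {}".format(input_source_name, input_source_number)

inputs = [("HDMI", 1), ("HDMI", 2), ("VGA", 1)]
mapping = {format_input_source(*x): x for x in inputs}
source_list = sorted(mapping.keys())

print(source_list)       # ['HDMI 1', 'HDMI 2', 'VGA 1']
print(mapping['VGA 1'])  # ('VGA', 1)
```

Keeping the tuple as the dict value is what lets `select_source` unpack it straight into `projector.set_input(*source)`.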
# pxr/usd/usdGeom/testenv/testUsdGeomSchemata.py
#!/pxrpythonsubst
#
# Copyright 2017 Pixar
#
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
# Section 6. Trademarks. is deleted and replaced with:
#
# 6. Trademarks. This License does not grant permission to use the trade
# names, trademarks, service marks, or product names of the Licensor
# and its affiliates, except as required to comply with Section 4(c) of
# the License and to reproduce the content of the NOTICE file.
#
# You may obtain a copy of the Apache License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the Apache License with the above modification is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
# pylint: disable=map-builtin-not-iterating
import sys, unittest
from pxr import Sdf, Usd, UsdGeom, Vt, Gf, Tf
class TestUsdGeomSchemata(unittest.TestCase):
    def test_Basic(self):
        l = Sdf.Layer.CreateAnonymous()
        stage = Usd.Stage.Open(l.identifier)

        p = stage.DefinePrim("/Mesh", "Mesh")
        self.assertTrue(p)

        mesh = UsdGeom.Mesh(p)
        self.assertTrue(mesh)
        self.assertTrue(mesh.GetPrim())
        self.assertTrue(not mesh.GetPointsAttr().Get(1))
        self.assertEqual(p.GetTypeName(),
                         Usd.SchemaRegistry().GetSchemaTypeName(mesh._GetStaticTfType()))

        #
        # Make sure uniform access behaves as expected.
        #
        ori = p.GetAttribute("orientation")

        # The generic orientation attribute should be automatically defined because
        # it is a registered attribute of a well known schema.  However, it's not
        # yet authored at the current edit target.
        self.assertTrue(ori.IsDefined())
        self.assertTrue(not ori.IsAuthoredAt(ori.GetStage().GetEditTarget()))

        # Author a value, and check that it's still defined, and now is in fact
        # authored at the current edit target.
        ori.Set(UsdGeom.Tokens.leftHanded)
        self.assertTrue(ori.IsDefined())
        self.assertTrue(ori.IsAuthoredAt(ori.GetStage().GetEditTarget()))

        mesh.GetOrientationAttr().Set(UsdGeom.Tokens.rightHanded, 10)

        # "leftHanded" should have been authored at Usd.TimeCode.Default, so reading the
        # attribute at Default should return lh, not rh.
        self.assertEqual(ori.Get(), UsdGeom.Tokens.leftHanded)

        # The value "rightHanded" was set at t=10, so reading *any* time should
        # return "rightHanded"
        self.assertEqual(ori.Get(9.9), UsdGeom.Tokens.rightHanded)
        self.assertEqual(ori.Get(10), UsdGeom.Tokens.rightHanded)
        self.assertEqual(ori.Get(10.1), UsdGeom.Tokens.rightHanded)
        self.assertEqual(ori.Get(11), UsdGeom.Tokens.rightHanded)

        #
        # Attribute name sanity check.  We expect the names returned by the schema
        # to match the names returned via the generic API.
        #
        self.assertTrue(len(mesh.GetSchemaAttributeNames()) > 0)
        self.assertNotEqual(mesh.GetSchemaAttributeNames(True), mesh.GetSchemaAttributeNames(False))

        for n in mesh.GetSchemaAttributeNames():
            # apiName overrides
            if n == "primvars:displayColor":
                n = "displayColor"
            elif n == "primvars:displayOpacity":
                n = "displayOpacity"
            name = n[0].upper() + n[1:]
            self.assertTrue(("Get" + name + "Attr") in dir(mesh),
                            ("Get" + name + "Attr() not found in: " + str(dir(mesh))))
    def test_IsA(self):
        # Author Scene and Compose Stage
        l = Sdf.Layer.CreateAnonymous()
        stage = Usd.Stage.Open(l.identifier)

        # For every prim schema type in this module, validate that:
        # 1. We can define a prim of its type
        # 2. Its type and inheritance matches our expectations
        # 3. At least one of its builtin properties is available and defined

        # BasisCurves Tests
        schema = UsdGeom.BasisCurves.Define(stage, "/BasisCurves")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # BasisCurves is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # BasisCurves is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # BasisCurves is not a Cylinder
        self.assertTrue(schema.GetBasisAttr())

        # Camera Tests
        schema = UsdGeom.Camera.Define(stage, "/Camera")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Camera is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Camera is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Camera is not a Cylinder
        self.assertTrue(schema.GetFocalLengthAttr())

        # Capsule Tests
        schema = UsdGeom.Capsule.Define(stage, "/Capsule")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Capsule is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Capsule is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Capsule is not a Cylinder
        self.assertTrue(schema.GetAxisAttr())

        # Cone Tests
        schema = UsdGeom.Cone.Define(stage, "/Cone")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Cone is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Cone is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Cone is not a Cylinder
        self.assertTrue(schema.GetAxisAttr())

        # Cube Tests
        schema = UsdGeom.Cube.Define(stage, "/Cube")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Cube is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Cube is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Cube is not a Cylinder
        self.assertTrue(schema.GetSizeAttr())

        # Cylinder Tests
        schema = UsdGeom.Cylinder.Define(stage, "/Cylinder")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Cylinder is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Cylinder is a Xformable
        self.assertTrue(prim.IsA(UsdGeom.Cylinder))    # Cylinder is a Cylinder
        self.assertTrue(schema.GetAxisAttr())

        # Mesh Tests
        schema = UsdGeom.Mesh.Define(stage, "/Mesh")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertTrue(prim.IsA(UsdGeom.Mesh))        # Mesh is a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Mesh is a XFormable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Mesh is not a Cylinder
        self.assertTrue(schema.GetFaceVertexCountsAttr())

        # NurbsCurves Tests
        schema = UsdGeom.NurbsCurves.Define(stage, "/NurbsCurves")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # NurbsCurves is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # NurbsCurves is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # NurbsCurves is not a Cylinder
        self.assertTrue(schema.GetKnotsAttr())

        # NurbsPatch Tests
        schema = UsdGeom.NurbsPatch.Define(stage, "/NurbsPatch")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # NurbsPatch is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # NurbsPatch is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # NurbsPatch is not a Cylinder
        self.assertTrue(schema.GetUKnotsAttr())

        # Points Tests
        schema = UsdGeom.Points.Define(stage, "/Points")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Points is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Points is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Points is not a Cylinder
        self.assertTrue(schema.GetWidthsAttr())

        # Scope Tests
        schema = UsdGeom.Scope.Define(stage, "/Scope")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Scope is not a Mesh
        self.assertFalse(prim.IsA(UsdGeom.Xformable))  # Scope is not a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Scope is not a Cylinder
        # Scope has no builtins!

        # Sphere Tests
        schema = UsdGeom.Sphere.Define(stage, "/Sphere")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Sphere is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Sphere is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Sphere is not a Cylinder
        self.assertTrue(schema.GetRadiusAttr())

        # Xform Tests
        schema = UsdGeom.Xform.Define(stage, "/Xform")
        self.assertTrue(schema)
        prim = schema.GetPrim()
        self.assertFalse(prim.IsA(UsdGeom.Mesh))       # Xform is not a Mesh
        self.assertTrue(prim.IsA(UsdGeom.Xformable))   # Xform is a Xformable
        self.assertFalse(prim.IsA(UsdGeom.Cylinder))   # Xform is not a Cylinder
        self.assertTrue(schema.GetXformOpOrderAttr())
    def test_Fallbacks(self):
        # Author Scene and Compose Stage
        stage = Usd.Stage.CreateInMemory()

        # Xformable Tests
        identity = Gf.Matrix4d(1)
        origin = Gf.Vec3f(0, 0, 0)

        xform = UsdGeom.Xform.Define(stage, "/Xform")  # direct subclass
        xformOpOrder = xform.GetXformOpOrderAttr()
        self.assertFalse(xformOpOrder.HasAuthoredValue())
        # xformOpOrder has no fallback value
        self.assertEqual(xformOpOrder.Get(), None)
        self.assertFalse(xformOpOrder.HasFallbackValue())

        # Try authoring and reverting...
        xformOpOrderAttr = xform.GetPrim().GetAttribute(UsdGeom.Tokens.xformOpOrder)
        self.assertTrue(xformOpOrderAttr)
        self.assertEqual(xformOpOrderAttr.Get(), None)

        opOrderVal = ["xformOp:transform"]
        self.assertTrue(xformOpOrderAttr.Set(opOrderVal))
        self.assertTrue(xformOpOrderAttr.HasAuthoredValue())
        self.assertNotEqual(xformOpOrderAttr.Get(), None)

        self.assertTrue(xformOpOrderAttr.Clear())
        self.assertFalse(xformOpOrderAttr.HasAuthoredValue())
        self.assertEqual(xformOpOrderAttr.Get(), None)
        self.assertFalse(xformOpOrder.HasFallbackValue())

        mesh = UsdGeom.Mesh.Define(stage, "/Mesh")  # multiple ancestor hops

        # PointBased and Curves
        curves = UsdGeom.BasisCurves.Define(stage, "/Curves")
        self.assertEqual(curves.GetNormalsInterpolation(), UsdGeom.Tokens.vertex)
        self.assertEqual(curves.GetWidthsInterpolation(), UsdGeom.Tokens.vertex)

        # Before we go, test that CreateXXXAttr performs as we expect in various
        # scenarios

        # Number 1: Sparse and non-sparse authoring on def'd prim
        mesh.CreateDoubleSidedAttr(False, True)
        self.assertFalse(mesh.GetDoubleSidedAttr().HasAuthoredValue())
        mesh.CreateDoubleSidedAttr(False, False)
        self.assertTrue(mesh.GetDoubleSidedAttr().HasAuthoredValue())

        # Number 2: Sparse authoring demotes to dense for non-defed prim
        overMesh = UsdGeom.Mesh(stage.OverridePrim('/overMesh'))
        overMesh.CreateDoubleSidedAttr(False, True)
        self.assertTrue(overMesh.GetDoubleSidedAttr().HasAuthoredValue())
        self.assertEqual(overMesh.GetDoubleSidedAttr().Get(), False)
        overMesh.CreateDoubleSidedAttr(True, True)
        self.assertEqual(overMesh.GetDoubleSidedAttr().Get(), True)
        # make it a defined mesh, and sanity check it still evals the same
        mesh2 = UsdGeom.Mesh.Define(stage, "/overMesh")
        self.assertEqual(overMesh.GetDoubleSidedAttr().Get(), True)

        # Check querying of fallback values.
        sphere = UsdGeom.Sphere.Define(stage, "/Sphere")
        radius = sphere.GetRadiusAttr()
        self.assertTrue(radius.HasFallbackValue())

        radiusQuery = Usd.AttributeQuery(radius)
        self.assertTrue(radiusQuery.HasFallbackValue())
    def test_DefineSchema(self):
        s = Usd.Stage.CreateInMemory()
        parent = s.OverridePrim('/parent')
        self.assertTrue(parent)
        # Make a subscope.
        scope = UsdGeom.Scope.Define(s, '/parent/subscope')
        self.assertTrue(scope)
        # Assert that a simple find or create gives us the scope back.
        self.assertTrue(s.OverridePrim('/parent/subscope'))
        self.assertEqual(s.OverridePrim('/parent/subscope'), scope.GetPrim())
        # Try to make a mesh at subscope's path.  This transforms the scope into a
        # mesh, since Define() always authors typeName.
        mesh = UsdGeom.Mesh.Define(s, '/parent/subscope')
        self.assertTrue(mesh)
        self.assertTrue(not scope)
        # Make a mesh at a different path, should work.
        mesh = UsdGeom.Mesh.Define(s, '/parent/mesh')
        self.assertTrue(mesh)
    def test_BasicMetadataCases(self):
        s = Usd.Stage.CreateInMemory()
        spherePrim = UsdGeom.Sphere.Define(s, '/sphere').GetPrim()
        radius = spherePrim.GetAttribute('radius')
        self.assertTrue(radius.HasMetadata('custom'))
        self.assertTrue(radius.HasMetadata('typeName'))
        self.assertTrue(radius.HasMetadata('variability'))
        self.assertTrue(radius.IsDefined())
        self.assertTrue(not radius.IsCustom())
        self.assertEqual(radius.GetTypeName(), 'double')
        allMetadata = radius.GetAllMetadata()
        self.assertEqual(allMetadata['typeName'], 'double')
        self.assertEqual(allMetadata['variability'], Sdf.VariabilityVarying)
        self.assertEqual(allMetadata['custom'], False)

        # Author a custom property spec.
        layer = s.GetRootLayer()
        sphereSpec = layer.GetPrimAtPath('/sphere')
        radiusSpec = Sdf.AttributeSpec(
            sphereSpec, 'radius', Sdf.ValueTypeNames.Double,
            variability=Sdf.VariabilityUniform, declaresCustom=True)
        self.assertTrue(radiusSpec.custom)
        self.assertEqual(radiusSpec.variability, Sdf.VariabilityUniform)

        # Definition should win.
        self.assertTrue(not radius.IsCustom())
        self.assertEqual(radius.GetVariability(), Sdf.VariabilityVarying)
        allMetadata = radius.GetAllMetadata()
        self.assertEqual(allMetadata['typeName'], 'double')
        self.assertEqual(allMetadata['variability'], Sdf.VariabilityVarying)
        self.assertEqual(allMetadata['custom'], False)

        # List fields on 'visibility' attribute -- should include 'allowedTokens',
        # provided by the property definition.
        visibility = spherePrim.GetAttribute('visibility')
        self.assertTrue(visibility.IsDefined())
        self.assertTrue('allowedTokens' in visibility.GetAllMetadata())

        # Assert that attribute fallback values are returned for builtin attributes.
        do = spherePrim.GetAttribute('primvars:displayOpacity')
        self.assertTrue(do.IsDefined())
        self.assertTrue(do.Get() is None)
    def test_Camera(self):
        from pxr import Gf

        stage = Usd.Stage.CreateInMemory()

        camera = UsdGeom.Camera.Define(stage, "/Camera")
        self.assertTrue(camera.GetPrim().IsA(UsdGeom.Xformable))  # Camera is Xformable

        self.assertEqual(camera.GetProjectionAttr().Get(), 'perspective')
        camera.GetProjectionAttr().Set('orthographic')
        self.assertEqual(camera.GetProjectionAttr().Get(), 'orthographic')

        self.assertTrue(Gf.IsClose(camera.GetHorizontalApertureAttr().Get(),
                                   0.825 * 25.4, 1e-5))
        camera.GetHorizontalApertureAttr().Set(3.0)
        self.assertEqual(camera.GetHorizontalApertureAttr().Get(), 3.0)

        self.assertTrue(Gf.IsClose(camera.GetVerticalApertureAttr().Get(),
                                   0.602 * 25.4, 1e-5))
        camera.GetVerticalApertureAttr().Set(2.0)
        self.assertEqual(camera.GetVerticalApertureAttr().Get(), 2.0)

        self.assertEqual(camera.GetFocalLengthAttr().Get(), 50.0)
        camera.GetFocalLengthAttr().Set(35.0)
        self.assertTrue(Gf.IsClose(camera.GetFocalLengthAttr().Get(), 35.0, 1e-5))

        self.assertEqual(camera.GetClippingRangeAttr().Get(), Gf.Vec2f(1, 1000000))
        camera.GetClippingRangeAttr().Set(Gf.Vec2f(5, 10))
        self.assertTrue(Gf.IsClose(camera.GetClippingRangeAttr().Get(),
                                   Gf.Vec2f(5, 10), 1e-5))

        self.assertEqual(camera.GetClippingPlanesAttr().Get(), Vt.Vec4fArray())

        cp = Vt.Vec4fArray([(1, 2, 3, 4), (8, 7, 6, 5)])
        camera.GetClippingPlanesAttr().Set(cp)
        self.assertEqual(camera.GetClippingPlanesAttr().Get(), cp)
        cp = Vt.Vec4fArray()
        camera.GetClippingPlanesAttr().Set(cp)
        self.assertEqual(camera.GetClippingPlanesAttr().Get(), cp)

        self.assertEqual(camera.GetFStopAttr().Get(), 0.0)
        camera.GetFStopAttr().Set(2.8)
        self.assertTrue(Gf.IsClose(camera.GetFStopAttr().Get(), 2.8, 1e-5))

        self.assertEqual(camera.GetFocusDistanceAttr().Get(), 0.0)
        camera.GetFocusDistanceAttr().Set(10.0)
        self.assertEqual(camera.GetFocusDistanceAttr().Get(), 10.0)
    def test_Points(self):
        stage = Usd.Stage.CreateInMemory()

        # Points Tests
        schema = UsdGeom.Points.Define(stage, "/Points")
        self.assertTrue(schema)

        # Test that id's roundtrip properly, for big numbers, and negative numbers
        ids = [8589934592, 1099511627776, 0, -42]
        schema.CreateIdsAttr(ids)
        resolvedIds = list(schema.GetIdsAttr().Get())  # convert VtArray to list
        self.assertEqual(ids, resolvedIds)
    def test_Revert_Bug111239(self):
        # This used to test a change for Bug111239, but now tests that this
        # fix has been reverted.  We no longer allow the C++ typename be used as
        # a prim's typename.
        s = Usd.Stage.CreateInMemory()
        sphere = s.DefinePrim('/sphere', typeName='Sphere')
        tfTypeName = UsdGeom.Sphere._GetStaticTfType().typeName
        self.assertEqual(tfTypeName, 'UsdGeomSphere')
        usdGeomSphere = s.DefinePrim('/usdGeomSphere', typeName='tfTypeName')
        self.assertTrue(UsdGeom.Sphere(sphere))
        self.assertTrue('radius' in [a.GetName() for a in sphere.GetAttributes()])
        self.assertFalse(UsdGeom.Sphere(usdGeomSphere))
        self.assertFalse('radius' in [a.GetName() for a in usdGeomSphere.GetAttributes()])
    def test_ComputeExtent(self):
        from pxr import Gf

        # Create some simple test cases
        allPoints = [
            [(1, 1, 0)],                # Zero-Volume Extent Test
            [(0, 0, 0)],                # Simple Width Test
            [(-1, -1, -1), (1, 1, 1)],  # Multiple Width Test
            [(-1, -1, -1), (1, 1, 1)],  # Erroneous Widths/Points Test
            # Complex Test, Many Points/Widths
            [(3, -1, 5), (-1.5, 0, 3), (1, 3, -2), (2, 2, -4)],
        ]

        allWidths = [
            [0],            # Zero-Volume Extent Test
            [2],            # Simple Width Test
            [2, 4],         # Multiple Width Test
            [2, 4, 5],      # Erroneous Widths/Points Test
            [1, 2, 2, 1]    # Complex Test, Many Points/Widths
        ]

        pointBasedSolutions = [
            [(1, 1, 0), (1, 1, 0)],      # Zero-Volume Extent Test
            [(0, 0, 0), (0, 0, 0)],      # Simple Width Test
            [(-1, -1, -1), (1, 1, 1)],   # Multiple Width Test
            # Erroneous Widths/Points Test -> Ok For Point-Based
            [(-1, -1, -1), (1, 1, 1)],
            [(-1.5, -1, -4), (3, 3, 5)]  # Complex Test, Many Points/Widths
        ]

        pointsSolutions = [
            [(1, 1, 0), (1, 1, 0)],      # Zero-Volume Extent Test
            [(-1, -1, -1), (1, 1, 1)],   # Simple Width Test
            [(-2, -2, -2), (3, 3, 3)],   # Multiple Width Test
            # Erroneous Widths/Points Test -> Returns None
            None,
            [(-2.5, -1.5, -4.5), (3.5, 4, 5.5)]  # Complex Test, Many Points/Widths
        ]

        # Perform the correctness tests for PointBased and Points

        # Test for empty points prims
        emptyPoints = []
        extremeExtentArr = UsdGeom.PointBased.ComputeExtent(emptyPoints)

        # We need to map the contents of extremeExtentArr to floats from
        # num.float32s due to the way Gf.Vec3f is wrapped out
        # XXX: This is awful, it'd be nice to not do it
        extremeExtentRange = Gf.Range3f(Gf.Vec3f(*map(float, extremeExtentArr[0])),
                                        Gf.Vec3f(*map(float, extremeExtentArr[1])))
        self.assertTrue(extremeExtentRange.IsEmpty())

        # PointBased Test
        numDataSets = len(allPoints)
        for i in range(numDataSets):
            pointsData = allPoints[i]
            expectedExtent = pointBasedSolutions[i]
            actualExtent = UsdGeom.PointBased.ComputeExtent(pointsData)

            for a, b in zip(expectedExtent, actualExtent):
                self.assertTrue(Gf.IsClose(a, b, 1e-5))

        # Points Test
        for i in range(numDataSets):
            pointsData = allPoints[i]
            widthsData = allWidths[i]
            expectedExtent = pointsSolutions[i]
            actualExtent = UsdGeom.Points.ComputeExtent(pointsData, widthsData)

            if actualExtent is not None and expectedExtent is not None:
                for a, b in zip(expectedExtent, actualExtent):
                    self.assertTrue(Gf.IsClose(a, b, 1e-5))

            # Compute extent via generic UsdGeom.Boundable API
            s = Usd.Stage.CreateInMemory()
            pointsPrim = UsdGeom.Points.Define(s, "/Points")
            pointsPrim.CreatePointsAttr(pointsData)
            pointsPrim.CreateWidthsAttr(widthsData)
            actualExtent = UsdGeom.Boundable.ComputeExtentFromPlugins(
                pointsPrim, Usd.TimeCode.Default())

            if actualExtent is not None and expectedExtent is not None:
                for a, b in zip(expectedExtent, list(actualExtent)):
                    self.assertTrue(Gf.IsClose(a, b, 1e-5))

        # Mesh Test
        for i in range(numDataSets):
            pointsData = allPoints[i]
            expectedExtent = pointBasedSolutions[i]

            # Compute extent via generic UsdGeom.Boundable API.
            # UsdGeom.Mesh does not have its own compute extent function, so
            # it should fall back to the extent for PointBased prims.
            s = Usd.Stage.CreateInMemory()
            meshPrim = UsdGeom.Mesh.Define(s, "/Mesh")
            meshPrim.CreatePointsAttr(pointsData)
            actualExtent = UsdGeom.Boundable.ComputeExtentFromPlugins(
                meshPrim, Usd.TimeCode.Default())

            for a, b in zip(expectedExtent, actualExtent):
                self.assertTrue(Gf.IsClose(a, b, 1e-5))

        # Test UsdGeomCurves
        curvesPoints = [
            [(0,0,0), (1,1,1), (2,1,1), (3,0,0)],  # Test Curve with 1 width
            [(0,0,0), (1,1,1), (2,1,1), (3,0,0)],  # Test Curve with 2 widths
            [(0,0,0), (1,1,1), (2,1,1), (3,0,0)]   # Test Curve with no width
        ]

        curvesWidths = [
            [1],       # Test Curve with 1 width
            [.5, .1],  # Test Curve with 2 widths
            []         # Test Curve with no width
        ]

        curvesSolutions = [
            [(-.5,-.5,-.5), (3.5,1.5,1.5)],        # Test Curve with 1 width
            [(-.25,-.25,-.25), (3.25,1.25,1.25)],  # Test Curve with 2 widths (MAX)
            [(0,0,0), (3,1,1)],                    # Test Curve with no width
        ]

        # Perform the actual v. expected comparison
        numDataSets = len(curvesPoints)
        for i in range(numDataSets):
            pointsData = curvesPoints[i]
            widths = curvesWidths[i]
            expectedExtent = curvesSolutions[i]
            actualExtent = UsdGeom.Curves.ComputeExtent(pointsData, widths)

            for a, b in zip(expectedExtent, actualExtent):
                self.assertTrue(Gf.IsClose(a, b, 1e-5))

            # Compute extent via generic UsdGeom.Boundable API
            s = Usd.Stage.CreateInMemory()
            nurbsCurvesPrim = UsdGeom.NurbsCurves.Define(s, "/NurbsCurves")
            nurbsCurvesPrim.CreatePointsAttr(pointsData)
            nurbsCurvesPrim.CreateWidthsAttr(widths)
            actualExtent = UsdGeom.Boundable.ComputeExtentFromPlugins(
                nurbsCurvesPrim, Usd.TimeCode.Default())

            for a, b in zip(expectedExtent, actualExtent):
                self.assertTrue(Gf.IsClose(a, b, 1e-5))

            basisCurvesPrim = UsdGeom.BasisCurves.Define(s, "/BasisCurves")
            basisCurvesPrim.CreatePointsAttr(pointsData)
            basisCurvesPrim.CreateWidthsAttr(widths)
            actualExtent = UsdGeom.Boundable.ComputeExtentFromPlugins(
                basisCurvesPrim, Usd.TimeCode.Default())

            for a, b in zip(expectedExtent, actualExtent):
                self.assertTrue(Gf.IsClose(a, b, 1e-5))
def test_TypeUsage(self):
# Perform Type-Ness Checking for ComputeExtent
pointsAsList = [(0, 0, 0), (1, 1, 1), (2, 2, 2)]
pointsAsVec3fArr = Vt.Vec3fArray(pointsAsList)
comp = UsdGeom.PointBased.ComputeExtent
expectedExtent = comp(pointsAsVec3fArr)
actualExtent = comp(pointsAsList)
for a, b in zip(expectedExtent, actualExtent):
self.assertTrue(Gf.IsClose(a, b, 1e-5))
def test_Bug116593(self):
from pxr import Gf
s = Usd.Stage.CreateInMemory()
prim = s.DefinePrim('/sphere', typeName='Sphere')
# set with list of tuples
vec = [(1,2,2),(12,3,3)]
self.assertTrue(UsdGeom.ModelAPI(prim).SetExtentsHint(vec))
self.assertEqual(UsdGeom.ModelAPI(prim).GetExtentsHint()[0], Gf.Vec3f(1,2,2))
self.assertEqual(UsdGeom.ModelAPI(prim).GetExtentsHint()[1], Gf.Vec3f(12,3,3))
# set with Gf vecs
vec = [Gf.Vec3f(1,2,2), Gf.Vec3f(1,1,1)]
self.assertTrue(UsdGeom.ModelAPI(prim).SetExtentsHint(vec))
self.assertEqual(UsdGeom.ModelAPI(prim).GetExtentsHint()[0], Gf.Vec3f(1,2,2))
self.assertEqual(UsdGeom.ModelAPI(prim).GetExtentsHint()[1], Gf.Vec3f(1,1,1))
def test_Typed(self):
from pxr import Tf
xform = Tf.Type.FindByName("UsdGeomXform")
imageable = Tf.Type.FindByName("UsdGeomImageable")
geomModelAPI = Tf.Type.FindByName("UsdGeomModelAPI")
self.assertTrue(Usd.SchemaRegistry.IsTyped(xform))
self.assertTrue(Usd.SchemaRegistry.IsTyped(imageable))
self.assertFalse(Usd.SchemaRegistry.IsTyped(geomModelAPI))
def test_Concrete(self):
from pxr import Tf
xform = Tf.Type.FindByName("UsdGeomXform")
imageable = Tf.Type.FindByName("UsdGeomImageable")
geomModelAPI = Tf.Type.FindByName("UsdGeomModelAPI")
self.assertTrue(Usd.SchemaRegistry().IsConcrete(xform))
self.assertFalse(Usd.SchemaRegistry().IsConcrete(imageable))
self.assertFalse(Usd.SchemaRegistry().IsConcrete(geomModelAPI))
def test_Apply(self):
s = Usd.Stage.CreateInMemory('AppliedSchemas.usd')
root = s.DefinePrim('/hello')
self.assertEqual([], root.GetAppliedSchemas())
# Check duplicates
UsdGeom.MotionAPI.Apply(root)
self.assertEqual(['MotionAPI'], root.GetAppliedSchemas())
UsdGeom.MotionAPI.Apply(root)
self.assertEqual(['MotionAPI'], root.GetAppliedSchemas())
# Ensure duplicates aren't picked up
UsdGeom.ModelAPI.Apply(root)
self.assertEqual(['MotionAPI', 'GeomModelAPI'], root.GetAppliedSchemas())
# Verify that we get exceptions but don't crash when applying to the
# null prim.
with self.assertRaises(Tf.ErrorException):
self.assertFalse(UsdGeom.MotionAPI.Apply(Usd.Prim()))
with self.assertRaises(Tf.ErrorException):
self.assertFalse(UsdGeom.ModelAPI.Apply(Usd.Prim()))
def test_IsATypeless(self):
from pxr import Usd, Tf
s = Usd.Stage.CreateInMemory()
spherePrim = s.DefinePrim('/sphere', typeName='Sphere')
typelessPrim = s.DefinePrim('/regular')
types = [Tf.Type.FindByName('UsdGeomSphere'),
Tf.Type.FindByName('UsdGeomGprim'),
Tf.Type.FindByName('UsdGeomBoundable'),
Tf.Type.FindByName('UsdGeomXformable'),
Tf.Type.FindByName('UsdGeomImageable'),
Tf.Type.FindByName('UsdTyped')]
# Our sphere prim should return true on IsA queries for Sphere
# and everything it inherits from. Our plain prim should return false
# for all of them.
for t in types:
self.assertTrue(spherePrim.IsA(t))
self.assertFalse(typelessPrim.IsA(t))
def test_HasAPI(self):
from pxr import Usd, Tf
s = Usd.Stage.CreateInMemory()
prim = s.DefinePrim('/prim')
types = [Tf.Type.FindByName('UsdGeomMotionAPI'),
Tf.Type.FindByName('UsdGeomModelAPI')]
# Check that no APIs have yet been applied
for t in types:
self.assertFalse(prim.HasAPI(t))
# Apply our schemas to this prim
UsdGeom.ModelAPI.Apply(prim)
UsdGeom.MotionAPI.Apply(prim)
# Check that all our applied schemas show up
for t in types:
self.assertTrue(prim.HasAPI(t))
# Check that we get an exception for unknown and non-API types
with self.assertRaises(Tf.ErrorException):
prim.HasAPI(Tf.Type.Unknown)
with self.assertRaises(Tf.ErrorException):
prim.HasAPI(Tf.Type.FindByName('UsdGeomXform'))
with self.assertRaises(Tf.ErrorException):
prim.HasAPI(Tf.Type.FindByName('UsdGeomImageable'))
with self.assertRaises(Tf.ErrorException):
# Test with a non-applied API schema.
prim.HasAPI(Tf.Type.FindByName('UsdModelAPI'))
if __name__ == "__main__":
unittest.main()
| 42.69326 | 100 | 0.63055 | 3,425 | 31,038 | 5.706277 | 0.16292 | 0.070917 | 0.027937 | 0.028142 | 0.456048 | 0.358831 | 0.312679 | 0.27763 | 0.258033 | 0.209681 | 0 | 0.018041 | 0.257104 | 31,038 | 726 | 101 | 42.752066 | 0.82956 | 0.204266 | 0 | 0.384449 | 0 | 0 | 0.04174 | 0.002734 | 0 | 0 | 0 | 0 | 0.414687 | 1 | 0.034557 | false | 0 | 0.019438 | 0 | 0.056156 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
168fdf67ec71ebdf125bbe9b6f5c14dad854391f | 1,310 | py | Python | round_robin_generator/matchup_times.py | avadavat/round_robin_generator | 242d522386f6af26db029232fcffb51004ff4c59 | [
"MIT"
] | null | null | null | round_robin_generator/matchup_times.py | avadavat/round_robin_generator | 242d522386f6af26db029232fcffb51004ff4c59 | [
"MIT"
] | 5 | 2020-04-26T19:44:41.000Z | 2020-05-01T16:26:06.000Z | round_robin_generator/matchup_times.py | avadavat/round_robin_generator | 242d522386f6af26db029232fcffb51004ff4c59 | [
"MIT"
] | null | null | null | import pandas as pd
from datetime import timedelta
def generate_times(matchup_df: pd.DataFrame, tournament_start_time, game_duration, game_stagger):
time_df = pd.DataFrame(index=matchup_df.index, columns=matchup_df.columns)
if game_stagger == 0:
for round_num in range(time_df.shape[0]):
round_key = 'Round ' + str(round_num + 1)
match_time = tournament_start_time + timedelta(minutes=(game_duration * round_num))
time_df.loc[round_key, :] = match_time.strftime('%I:%M%p')
return time_df
else:
"""
# Given the algorithm, at worst every player can play every (game duration + stagger time)
# This is b/c your opponent begins play one stagger count after you at the latest.
"""
for round_num in range(time_df.shape[0]):
round_key = 'Round ' + str(round_num + 1)
default_spread = [tournament_start_time + timedelta(minutes=game_num * game_stagger) for game_num in
range(time_df.shape[1])]
match_times = [
(def_time + timedelta(minutes=((game_duration + game_stagger) * round_num))).strftime('%I:%M%p') for
def_time in default_spread]
time_df.loc[round_key, :] = match_times
return time_df
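The stagger rule in the comment above can be checked in isolation with the standard library. This is an illustrative sketch (the `staggered_start` helper is not part of this module): game `g` of round `r` starts at `start + g*stagger + r*(duration + stagger)` minutes.

```python
from datetime import datetime, timedelta

# Illustrative helper (not part of this module): start time of game g in
# round r under the stagger scheme, where the round period is
# (game_duration + game_stagger) minutes.
def staggered_start(start, duration, stagger, round_num, game_num):
    return start + timedelta(
        minutes=game_num * stagger + round_num * (duration + stagger))

start = datetime(2020, 1, 1, 9, 0)
print(staggered_start(start, 30, 10, 0, 0).strftime('%I:%M%p'))  # 09:00AM
print(staggered_start(start, 30, 10, 1, 1).strftime('%I:%M%p'))  # 09:50AM
```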
| 48.518519 | 116 | 0.636641 | 178 | 1,310 | 4.432584 | 0.359551 | 0.060837 | 0.072243 | 0.053232 | 0.371356 | 0.320659 | 0.139417 | 0.139417 | 0.139417 | 0.139417 | 0 | 0.006244 | 0.266412 | 1,310 | 26 | 117 | 50.384615 | 0.814776 | 0 | 0 | 0.3 | 1 | 0 | 0.023529 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.1 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1690da2be65319bb6696ac8f2ce11540524171c2 | 14,922 | py | Python | src/week2-mlflow/AutoML/XGBoost-fake-news-automl.py | xzhnshng/databricks-zero-to-mlops | f1691c6f6137ad8b938e64cea4700c7011efb800 | [
"CC0-1.0"
] | null | null | null | src/week2-mlflow/AutoML/XGBoost-fake-news-automl.py | xzhnshng/databricks-zero-to-mlops | f1691c6f6137ad8b938e64cea4700c7011efb800 | [
"CC0-1.0"
] | null | null | null | src/week2-mlflow/AutoML/XGBoost-fake-news-automl.py | xzhnshng/databricks-zero-to-mlops | f1691c6f6137ad8b938e64cea4700c7011efb800 | [
"CC0-1.0"
] | null | null | null | # Databricks notebook source
# MAGIC %md
# MAGIC # XGBoost training
# MAGIC This is an auto-generated notebook. To reproduce these results, attach this notebook to the **10-3-ML-Cluster** cluster and rerun it.
# MAGIC - Compare trials in the [MLflow experiment](#mlflow/experiments/406583024052808/s?orderByKey=metrics.%60val_f1_score%60&orderByAsc=false)
# MAGIC - Navigate to the parent notebook [here](#notebook/406583024052798) (If you launched the AutoML experiment using the Experiments UI, this link isn't very useful.)
# MAGIC - Clone this notebook into your project folder by selecting **File > Clone** in the notebook toolbar.
# MAGIC
# MAGIC Runtime Version: _10.3.x-cpu-ml-scala2.12_
# COMMAND ----------
import mlflow
import databricks.automl_runtime
# Use MLflow to track experiments
mlflow.set_experiment("/Users/noah.gift@gmail.com/databricks_automl/label_news_articles_csv-2022_03_12-15_38")
target_col = "label"
# COMMAND ----------
# MAGIC %md
# MAGIC ## Load Data
# COMMAND ----------
from mlflow.tracking import MlflowClient
import os
import uuid
import shutil
import pandas as pd
# Create temp directory to download input data from MLflow
input_temp_dir = os.path.join(os.environ["SPARK_LOCAL_DIRS"], "tmp", str(uuid.uuid4())[:8])
os.makedirs(input_temp_dir)
# Download the artifact and read it into a pandas DataFrame
input_client = MlflowClient()
input_data_path = input_client.download_artifacts("c2dfe80b419d4a8dbc88a90e3274369a", "data", input_temp_dir)
df_loaded = pd.read_parquet(os.path.join(input_data_path, "training_data"))
# Delete the temp data
shutil.rmtree(input_temp_dir)
# Preview data
df_loaded.head(5)
# COMMAND ----------
df_loaded.head(1).to_dict()
# COMMAND ----------
# MAGIC %md
# MAGIC ### Select supported columns
# MAGIC Select only the columns that are supported. This allows us to train a model that can predict on a dataset that has extra columns that are not used in training.
# MAGIC `[]` are dropped in the pipelines. See the Alerts tab of the AutoML Experiment page for details on why these columns are dropped.
# COMMAND ----------
from databricks.automl_runtime.sklearn.column_selector import ColumnSelector
supported_cols = ["text_without_stopwords", "published", "language", "main_img_url", "site_url", "hasImage", "title_without_stopwords", "text", "title", "type", "author"]
col_selector = ColumnSelector(supported_cols)
# COMMAND ----------
# MAGIC %md
# MAGIC ## Preprocessors
# COMMAND ----------
transformers = []
# COMMAND ----------
# MAGIC %md
# MAGIC ### Categorical columns
# COMMAND ----------
# MAGIC %md
# MAGIC #### Low-cardinality categoricals
# MAGIC Convert each low-cardinality categorical column into multiple binary columns through one-hot encoding.
# MAGIC For each input categorical column (string or numeric), the number of output columns is equal to the number of unique values in the input column.
# COMMAND ----------
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder
one_hot_encoder = OneHotEncoder(handle_unknown="ignore")
transformers.append(("onehot", one_hot_encoder, ["published", "language", "site_url", "hasImage", "title", "title_without_stopwords", "text_without_stopwords"]))
# COMMAND ----------
# MAGIC %md
# MAGIC #### Medium-cardinality categoricals
# MAGIC Convert each medium-cardinality categorical column into a numerical representation.
# MAGIC Each string column is hashed to 1024 float columns.
# MAGIC Each numeric column is imputed with zeros.
# COMMAND ----------
from sklearn.feature_extraction import FeatureHasher
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
for feature in ["text", "main_img_url"]:
hash_transformer = Pipeline(steps=[
("imputer", SimpleImputer(missing_values=None, strategy="constant", fill_value="")),
(f"{feature}_hasher", FeatureHasher(n_features=1024, input_type="string"))])
transformers.append((f"{feature}_hasher", hash_transformer, [feature]))
# COMMAND ----------
# MAGIC %md
# MAGIC ### Text features
# MAGIC Convert each feature to a fixed-length vector using TF-IDF vectorization. The length of the output
# MAGIC vector is equal to 1024. Each column corresponds to one of the top word n-grams
# MAGIC where n is in the range [1, 2].
# COMMAND ----------
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import FunctionTransformer
for col in {'type', 'author'}:
vectorizer = Pipeline(steps=[
("imputer", SimpleImputer(missing_values=None, strategy="constant", fill_value="")),
# Reshape to 1D since SimpleImputer changes the shape of the input to 2D
("reshape", FunctionTransformer(np.reshape, kw_args={"newshape":-1})),
("tfidf", TfidfVectorizer(decode_error="ignore", ngram_range = (1, 2), max_features=1024))])
transformers.append((f"text_{col}", vectorizer, [col]))
# COMMAND ----------
from sklearn.compose import ColumnTransformer
preprocessor = ColumnTransformer(transformers, remainder="passthrough", sparse_threshold=0)
# COMMAND ----------
# MAGIC %md
# MAGIC ### Feature standardization
# MAGIC Scale all feature columns to be centered around zero with unit variance.
# COMMAND ----------
from sklearn.preprocessing import StandardScaler
standardizer = StandardScaler()
# COMMAND ----------
# MAGIC %md
# MAGIC ## Train - Validation - Test Split
# MAGIC Split the input data into 3 sets:
# MAGIC - Train (60% of the dataset used to train the model)
# MAGIC - Validation (20% of the dataset used to tune the hyperparameters of the model)
# MAGIC - Test (20% of the dataset used to report the true performance of the model on an unseen dataset)
# COMMAND ----------
df_loaded.columns
# COMMAND ----------
from sklearn.model_selection import train_test_split
split_X = df_loaded.drop([target_col], axis=1)
split_y = df_loaded[target_col]
# Split out train data
X_train, split_X_rem, y_train, split_y_rem = train_test_split(split_X, split_y, train_size=0.6, random_state=799811440, stratify=split_y)
# Split remaining data equally for validation and test
X_val, X_test, y_val, y_test = train_test_split(split_X_rem, split_y_rem, test_size=0.5, random_state=799811440, stratify=split_y_rem)
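A quick sanity check of the proportions (illustrative arithmetic only): the second split's `test_size=0.5` takes half of the remaining 40%, which yields the 60/20/20 scheme described above.

```python
# 0.6 first split, then 0.5 of the remainder -> 60% / 20% / 20%.
n = 1000
n_train = int(n * 0.6)
n_rem = n - n_train
n_val = int(n_rem * 0.5)
n_test = n_rem - n_val
print(n_train, n_val, n_test)  # 600 200 200
```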
# COMMAND ----------
# MAGIC %md
# MAGIC ## Train classification model
# MAGIC - Log relevant metrics to MLflow to track runs
# MAGIC - All the runs are logged under [this MLflow experiment](#mlflow/experiments/406583024052808/s?orderByKey=metrics.%60val_f1_score%60&orderByAsc=false)
# MAGIC - Change the model parameters and re-run the training cell to log a different trial to the MLflow experiment
# MAGIC - To view the full list of tunable hyperparameters, check the output of the cell below
# COMMAND ----------
from xgboost import XGBClassifier
help(XGBClassifier)
# COMMAND ----------
import mlflow
import sklearn
from sklearn import set_config
from sklearn.pipeline import Pipeline
set_config(display="diagram")
xgbc_classifier = XGBClassifier(
colsample_bytree=0.7324555878929649,
learning_rate=0.007636627530856404,
max_depth=7,
min_child_weight=6,
n_estimators=106,
n_jobs=100,
subsample=0.6972187716458148,
verbosity=0,
random_state=799811440,
)
model = Pipeline([
("column_selector", col_selector),
("preprocessor", preprocessor),
("standardizer", standardizer),
("classifier", xgbc_classifier),
])
# Create a separate pipeline to transform the validation dataset. This is used for early stopping.
pipeline = Pipeline([
("column_selector", col_selector),
("preprocessor", preprocessor),
("standardizer", standardizer),
])
mlflow.sklearn.autolog(disable=True)
X_val_processed = pipeline.fit_transform(X_val, y_val)
model
# COMMAND ----------
# Enable automatic logging of input samples, metrics, parameters, and models
mlflow.sklearn.autolog(log_input_examples=True, silent=True)
with mlflow.start_run(run_name="xgboost") as mlflow_run:
model.fit(X_train, y_train, classifier__early_stopping_rounds=5, classifier__eval_set=[(X_val_processed,y_val)], classifier__verbose=False)
# Training metrics are logged by MLflow autologging
# Log metrics for the validation set
xgbc_val_metrics = mlflow.sklearn.eval_and_log_metrics(model, X_val, y_val, prefix="val_")
# Log metrics for the test set
xgbc_test_metrics = mlflow.sklearn.eval_and_log_metrics(model, X_test, y_test, prefix="test_")
# Display the logged metrics
xgbc_val_metrics = {k.replace("val_", ""): v for k, v in xgbc_val_metrics.items()}
xgbc_test_metrics = {k.replace("test_", ""): v for k, v in xgbc_test_metrics.items()}
display(pd.DataFrame([xgbc_val_metrics, xgbc_test_metrics], index=["validation", "test"]))
# COMMAND ----------
# Patch requisite packages to the model environment YAML for model serving
import os
import shutil
import uuid
import yaml
import xgboost
from mlflow.tracking import MlflowClient
xgbc_temp_dir = os.path.join(os.environ["SPARK_LOCAL_DIRS"], str(uuid.uuid4())[:8])
os.makedirs(xgbc_temp_dir)
xgbc_client = MlflowClient()
xgbc_model_env_path = xgbc_client.download_artifacts(mlflow_run.info.run_id, "model/conda.yaml", xgbc_temp_dir)
with open(xgbc_model_env_path) as xgbc_model_env_file:
    xgbc_parsed_model_env_str = yaml.load(xgbc_model_env_file, Loader=yaml.FullLoader)
xgbc_parsed_model_env_str["dependencies"][-1]["pip"].append(f"xgboost=={xgboost.__version__}")
with open(xgbc_model_env_path, "w") as f:
f.write(yaml.dump(xgbc_parsed_model_env_str))
xgbc_client.log_artifact(run_id=mlflow_run.info.run_id, local_path=xgbc_model_env_path, artifact_path="model")
shutil.rmtree(xgbc_temp_dir)
# COMMAND ----------
# MAGIC %md
# MAGIC ## Feature importance
# MAGIC
# MAGIC SHAP is a game-theoretic approach to explain machine learning models, providing a summary plot
# MAGIC of the relationship between features and model output. Features are ranked in descending order of
# MAGIC importance, and impact/color describe the correlation between the feature and the target variable.
# MAGIC - Generating SHAP feature importance is a very memory intensive operation, so to ensure that AutoML can run trials without
# MAGIC running out of memory, we disable SHAP by default.<br />
# MAGIC You can set the flag defined below to `shap_enabled = True` and re-run this notebook to see the SHAP plots.
# MAGIC - To reduce the computational overhead of each trial, a single example is sampled from the validation set to explain.<br />
# MAGIC For more thorough results, increase the sample size of explanations, or provide your own examples to explain.
# MAGIC - SHAP cannot explain models using data with nulls; if your dataset has any, both the background data and
# MAGIC examples to explain will be imputed using the mode (most frequent values). This affects the computed
# MAGIC SHAP values, as the imputed samples may not match the actual data distribution.
# MAGIC
# MAGIC For more information on how to read Shapley values, see the [SHAP documentation](https://shap.readthedocs.io/en/latest/example_notebooks/overviews/An%20introduction%20to%20explainable%20AI%20with%20Shapley%20values.html).
# COMMAND ----------
# Set this flag to True and re-run the notebook to see the SHAP plots
shap_enabled = True
# COMMAND ----------
if shap_enabled:
from shap import KernelExplainer, summary_plot
# SHAP cannot explain models using data with nulls.
# To enable SHAP to succeed, both the background data and examples to explain are imputed with the mode (most frequent values).
mode = X_train.mode().iloc[0]
# Sample background data for SHAP Explainer. Increase the sample size to reduce variance.
train_sample = X_train.sample(n=min(100, len(X_train.index))).fillna(mode)
# Sample a single example from the validation set to explain. Increase the sample size and rerun for more thorough results.
example = X_val.sample(n=1).fillna(mode)
# Use Kernel SHAP to explain feature importance on the example from the validation set.
predict = lambda x: model.predict_proba(pd.DataFrame(x, columns=X_train.columns))
explainer = KernelExplainer(predict, train_sample, link="logit")
shap_values = explainer.shap_values(example, l1_reg=False)
summary_plot(shap_values, example, class_names=model.classes_)
# COMMAND ----------
# MAGIC %md
# MAGIC ## Inference
# MAGIC [The MLflow Model Registry](https://docs.databricks.com/applications/mlflow/model-registry.html) is a collaborative hub where teams can share ML models, work together from experimentation to online testing and production, integrate with approval and governance workflows, and monitor ML deployments and their performance. The snippets below show how to add the model trained in this notebook to the model registry and to retrieve it later for inference.
# MAGIC
# MAGIC > **NOTE:** The `model_uri` for the model already trained in this notebook can be found in the cell below
# MAGIC
# MAGIC ### Register to Model Registry
# MAGIC ```
# MAGIC model_name = "Example"
# MAGIC
# MAGIC model_uri = f"runs:/{ mlflow_run.info.run_id }/model"
# MAGIC registered_model_version = mlflow.register_model(model_uri, model_name)
# MAGIC ```
# MAGIC
# MAGIC ### Load from Model Registry
# MAGIC ```
# MAGIC model_name = "Example"
# MAGIC model_version = registered_model_version.version
# MAGIC
# MAGIC model = mlflow.pyfunc.load_model(model_uri=f"models:/{model_name}/{model_version}")
# MAGIC model.predict(input_X)
# MAGIC ```
# MAGIC
# MAGIC ### Load model without registering
# MAGIC ```
# MAGIC model_uri = f"runs:/{ mlflow_run.info.run_id }/model"
# MAGIC
# MAGIC model = mlflow.pyfunc.load_model(model_uri)
# MAGIC model.predict(input_X)
# MAGIC ```
# COMMAND ----------
# model_uri for the generated model
print(f"runs:/{ mlflow_run.info.run_id }/model")
# COMMAND ----------
# MAGIC %md
# MAGIC ### Loading model to make prediction
# COMMAND ----------
model_uri = f"runs:/51c0348482e042ea8e4b7983ab6bff99/model"
model = mlflow.pyfunc.load_model(model_uri)
#model.predict(input_X)
# COMMAND ----------
import pandas as pd
data = {'author': {0: 'bigjim.com'},
'published': {0: '2016-10-27T18:05:26.351+03:00'},
'title': {0: 'aliens are coming to invade earth'},
'text': {0: 'aliens are coming to invade earth'},
'language': {0: 'english'},
'site_url': {0: 'cnn.com'},
'main_img_url': {0: 'https://2.bp.blogspot.com/-0mdp0nZiwMI/UYwYvexmW2I/AAAAAAAAVQM/7C_X5WRE_mQ/w1200-h630-p-nu/Edison-Stock-Ticker.jpg'},
'type': {0: 'bs'},
'title_without_stopwords': {0: 'aliens are coming to invade earth'},
'text_without_stopwords': {0: 'aliens are coming to invade earth'},
'hasImage': {0: 1.0}}
df = pd.DataFrame(data=data)
df.head()
# COMMAND ----------
model.predict(df)
# COMMAND ----------
| 36.753695 | 461 | 0.743399 | 2,099 | 14,922 | 5.13959 | 0.273464 | 0.013904 | 0.015573 | 0.022896 | 0.229514 | 0.170745 | 0.136911 | 0.131072 | 0.109752 | 0.062106 | 0 | 0.024601 | 0.13919 | 14,922 | 405 | 462 | 36.844444 | 0.815259 | 0.513537 | 0 | 0.2 | 1 | 0.007143 | 0.172551 | 0.053452 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.007143 | 0.242857 | 0 | 0.242857 | 0.007143 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1694a3aec6658351c14a81b2e91e92955b6cb8a7 | 341 | py | Python | lucky_guess/__init__.py | mfinzi/lucky-guess-chemist | 01898b733dc7d026f70d0cb6337309cb600502fb | [
"MIT"
] | null | null | null | lucky_guess/__init__.py | mfinzi/lucky-guess-chemist | 01898b733dc7d026f70d0cb6337309cb600502fb | [
"MIT"
] | null | null | null | lucky_guess/__init__.py | mfinzi/lucky-guess-chemist | 01898b733dc7d026f70d0cb6337309cb600502fb | [
"MIT"
] | null | null | null |
import importlib
import pkgutil
__all__ = []
for loader, module_name, is_pkg in pkgutil.walk_packages(__path__):
module = importlib.import_module('.'+module_name,package=__name__)
try:
globals().update({k: getattr(module, k) for k in module.__all__})
__all__ += module.__all__
except AttributeError: continue | 34.1 | 73 | 0.71261 | 42 | 341 | 5.095238 | 0.547619 | 0.140187 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 341 | 10 | 74 | 34.1 | 0.767025 | 0 | 0 | 0 | 0 | 0 | 0.002924 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
169ed0cf36c52beabffce88a57318686603b6c41 | 443 | py | Python | users/migrations/0002_auto_20191113_1352.py | Dragonite/djangohat | 68890703b1fc647785cf120ada281d6f3fcc4121 | [
"MIT"
] | 2 | 2019-11-15T05:07:24.000Z | 2019-11-15T10:27:48.000Z | users/migrations/0002_auto_20191113_1352.py | Dragonite/djangohat | 68890703b1fc647785cf120ada281d6f3fcc4121 | [
"MIT"
] | null | null | null | users/migrations/0002_auto_20191113_1352.py | Dragonite/djangohat | 68890703b1fc647785cf120ada281d6f3fcc4121 | [
"MIT"
] | null | null | null | # Generated by Django 2.2.2 on 2019-11-13 13:52
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('users', '0001_initial'),
]
operations = [
migrations.AlterField(
model_name='users',
name='site_key',
field=models.CharField(blank=True, default='b7265a9e874f4068b0b48d45ef97595a', max_length=32, unique=True),
),
]
| 23.315789 | 119 | 0.625282 | 47 | 443 | 5.808511 | 0.765957 | 0.014652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130699 | 0.257336 | 443 | 18 | 120 | 24.611111 | 0.699088 | 0.10158 | 0 | 0 | 1 | 0 | 0.156566 | 0.080808 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16a335de057546c0e95c5699aa9470bc30a7f928 | 334 | py | Python | src/djangoreactredux/wsgi.py | noscripter/django-react-redux-jwt-base | 078fb86005db106365df51fa11d8602fa432e3c3 | [
"MIT"
] | 4 | 2016-07-03T08:18:45.000Z | 2018-12-25T07:47:41.000Z | src/djangoreactredux/wsgi.py | noscripter/django-react-redux-jwt-base | 078fb86005db106365df51fa11d8602fa432e3c3 | [
"MIT"
] | 2 | 2021-03-20T00:02:08.000Z | 2021-06-10T23:34:26.000Z | src/djangoreactredux/wsgi.py | noscripter/django-react-redux-jwt-base | 078fb86005db106365df51fa11d8602fa432e3c3 | [
"MIT"
] | 1 | 2019-08-02T14:51:41.000Z | 2019-08-02T14:51:41.000Z | """
WSGI config for django-react-redux-jwt-base project.
"""
import os
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "djangoreactredux.settings.dev")
from django.core.wsgi import get_wsgi_application
from whitenoise.django import DjangoWhiteNoise
application = get_wsgi_application()
application = DjangoWhiteNoise(application)
| 23.857143 | 80 | 0.820359 | 40 | 334 | 6.7 | 0.575 | 0.052239 | 0.134328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086826 | 334 | 13 | 81 | 25.692308 | 0.878689 | 0.155689 | 0 | 0 | 0 | 0 | 0.186813 | 0.186813 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
16a40296272a4a2617c7e2666b6828a4cb958030 | 1,414 | py | Python | simple_settings/dynamic_settings/base.py | matthewh/simple-settings | dbddf8d5be7096ee7c4c3cc6d82824befa9b714f | [
"MIT"
] | null | null | null | simple_settings/dynamic_settings/base.py | matthewh/simple-settings | dbddf8d5be7096ee7c4c3cc6d82824befa9b714f | [
"MIT"
] | null | null | null | simple_settings/dynamic_settings/base.py | matthewh/simple-settings | dbddf8d5be7096ee7c4c3cc6d82824befa9b714f | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import re
from copy import deepcopy
import jsonpickle
class BaseReader(object):
"""
Base class for dynamic readers
"""
_default_conf = {}
def __init__(self, conf):
self.conf = deepcopy(self._default_conf)
self.conf.update(conf)
self.key_pattern = self.conf.get('pattern')
self.auto_casting = self.conf.get('auto_casting')
self.key_prefix = self.conf.get('prefix')
def get(self, key):
if not self._is_valid_key(key):
return
result = self._get(self._qualified_key(key))
if self.auto_casting and (result is not None):
result = jsonpickle.decode(result)
return result
def set(self, key, value):
if not self._is_valid_key(key):
return
if self.auto_casting:
value = jsonpickle.encode(value)
self._set(self._qualified_key(key), value)
def _is_valid_key(self, key):
if not self.key_pattern:
return True
return bool(re.match(self.key_pattern, key))
def _qualified_key(self, key):
"""
Prepends the configured prefix to the key (if applicable).
:param key: The unprefixed key.
:return: The key with any configured prefix prepended.
"""
pfx = self.key_prefix if self.key_prefix is not None else ''
return '{}{}'.format(pfx, key)
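To make the `_get`/`_set` contract concrete, here is a hypothetical in-memory subclass. The sketch inlines a trimmed copy of the class and substitutes the stdlib `json` module for `jsonpickle` so it runs standalone; `DictReader` and its configuration are illustrative, not part of the library.

```python
import json
import re
from copy import deepcopy


class _BaseReader(object):
    # Trimmed copy of BaseReader above, with jsonpickle swapped for the
    # stdlib json module so the sketch is self-contained.
    _default_conf = {}

    def __init__(self, conf):
        self.conf = deepcopy(self._default_conf)
        self.conf.update(conf)
        self.key_pattern = self.conf.get('pattern')
        self.auto_casting = self.conf.get('auto_casting')
        self.key_prefix = self.conf.get('prefix')

    def get(self, key):
        if not self._is_valid_key(key):
            return
        result = self._get(self._qualified_key(key))
        if self.auto_casting and (result is not None):
            result = json.loads(result)
        return result

    def set(self, key, value):
        if not self._is_valid_key(key):
            return
        if self.auto_casting:
            value = json.dumps(value)
        self._set(self._qualified_key(key), value)

    def _is_valid_key(self, key):
        return bool(re.match(self.key_pattern, key)) if self.key_pattern else True

    def _qualified_key(self, key):
        pfx = self.key_prefix if self.key_prefix is not None else ''
        return '{}{}'.format(pfx, key)


class DictReader(_BaseReader):
    # Hypothetical concrete reader: subclasses only supply _get/_set.
    _default_conf = {'auto_casting': True, 'prefix': 'app:'}

    def __init__(self, conf=None):
        super(DictReader, self).__init__(conf or {})
        self._store = {}

    def _get(self, key):
        return self._store.get(key)

    def _set(self, key, value):
        self._store[key] = value


reader = DictReader()
reader.set('timeout', 30)
print(reader._store)          # {'app:timeout': '30'} -- prefixed, JSON-encoded
print(reader.get('timeout'))  # 30
```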
| 28.28 | 68 | 0.609618 | 183 | 1,414 | 4.519126 | 0.311475 | 0.084643 | 0.050786 | 0.029021 | 0.095526 | 0.067715 | 0.067715 | 0.067715 | 0 | 0 | 0 | 0.000988 | 0.2843 | 1,414 | 49 | 69 | 28.857143 | 0.816206 | 0.141443 | 0 | 0.129032 | 0 | 0 | 0.025022 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.16129 | false | 0 | 0.096774 | 0 | 0.516129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
16a89cacbc82dd93659b9a841883e22a139d8576 | 447 | py | Python | main.py | 1999foxes/run-cmd-from-websocket | 0e2a080fe92b93c6cba63dfe5649ac2a3e745009 | [
"Apache-2.0"
] | null | null | null | main.py | 1999foxes/run-cmd-from-websocket | 0e2a080fe92b93c6cba63dfe5649ac2a3e745009 | [
"Apache-2.0"
] | null | null | null | main.py | 1999foxes/run-cmd-from-websocket | 0e2a080fe92b93c6cba63dfe5649ac2a3e745009 | [
"Apache-2.0"
] | null | null | null | import asyncio
import json
import logging
import websockets
logging.basicConfig()
USERS = set()

async def counter(websocket, path):
    try:
        print("connect")
        USERS.add(websocket)
        async for message in websocket:
            print(message)
    finally:
        # discard() avoids a KeyError if the handler fails before add()
        USERS.discard(websocket)
async def main():
async with websockets.serve(counter, "localhost", 5000):
await asyncio.Future() # run forever
if __name__ == "__main__":
asyncio.run(main())
| 17.88 | 60 | 0.657718 | 50 | 447 | 5.72 | 0.62 | 0.055944 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011834 | 0.243848 | 447 | 24 | 61 | 18.625 | 0.83432 | 0.024609 | 0 | 0 | 0 | 0 | 0.0553 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.235294 | 0 | 0.235294 | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16a8a652721deb01765dac84306cf8e790d8b09a | 3,998 | py | Python | 3d_Vnet/3dvnet.py | GingerSpacetail/Brain-Tumor-Segmentation-and-Survival-Prediction-using-Deep-Neural-Networks | f627ce48e44bcc7d295ee1cf4086bfdfd7705d44 | [
"MIT"
] | 100 | 2020-05-21T10:23:31.000Z | 2022-03-26T18:26:38.000Z | 3d_Vnet/3dvnet.py | GingerSpacetail/Brain-Tumor-Segmentation-and-Survival-Prediction-using-Deep-Neural-Networks | f627ce48e44bcc7d295ee1cf4086bfdfd7705d44 | [
"MIT"
] | 3 | 2020-08-19T18:14:01.000Z | 2021-01-04T09:53:07.000Z | 3d_Vnet/3dvnet.py | GingerSpacetail/Brain-Tumor-Segmentation-and-Survival-Prediction-using-Deep-Neural-Networks | f627ce48e44bcc7d295ee1cf4086bfdfd7705d44 | [
"MIT"
] | 25 | 2020-09-05T04:19:22.000Z | 2022-02-09T19:30:29.000Z | import random
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
#%matplotlib inline
import tensorflow as tf
import keras.backend as K
from keras.utils import to_categorical
from keras import metrics
from keras.models import Model, load_model
from keras.layers import Input, BatchNormalization, Activation, Dense, Dropout, Maximum
from keras.layers.core import Lambda, RepeatVector, Reshape
from keras.layers.convolutional import Conv2D, Conv2DTranspose, Conv3D, Conv3DTranspose
from keras.layers.pooling import MaxPooling2D, GlobalMaxPool2D, MaxPooling3D
from keras.layers.merge import concatenate, add
from keras.layers.advanced_activations import PReLU
from keras.callbacks import EarlyStopping, ModelCheckpoint, ReduceLROnPlateau, CSVLogger
from keras.optimizers import Adam
from keras.preprocessing.image import ImageDataGenerator, array_to_img, img_to_array, load_img
from skimage.io import imread, imshow, concatenate_images
from skimage.transform import resize
from sklearn.utils import class_weight
import os
# from medpy.io import load
#import cv2
import nibabel as nib
from PIL import Image
def conv_block(input_mat, num_filters, kernel_size, batch_norm):
    X = Conv3D(num_filters, kernel_size=(kernel_size, kernel_size, kernel_size), strides=(1, 1, 1), padding='same')(input_mat)
    if batch_norm:
        X = BatchNormalization()(X)
    X = Activation('relu')(X)
    X = Conv3D(num_filters, kernel_size=(kernel_size, kernel_size, kernel_size), strides=(1, 1, 1), padding='same')(X)
    if batch_norm:
        X = BatchNormalization()(X)
    X = Activation('relu')(X)
    X = add([input_mat, X])
    return X

def Vnet_3d(input_img, n_filters=8, dropout=0.2, batch_norm=True):
    #c1 = conv_block(input_img,n_filters,3,batch_norm)
    c1 = Conv3D(n_filters, kernel_size=(5, 5, 5), strides=(1, 1, 1), padding='same')(input_img)
    #c1 = add([c1,input_img])
    c2 = Conv3D(n_filters * 2, kernel_size=(2, 2, 2), strides=(2, 2, 2), padding='same')(c1)
    c3 = conv_block(c2, n_filters * 2, 5, True)
    p3 = Conv3D(n_filters * 4, kernel_size=(2, 2, 2), strides=(2, 2, 2), padding='same')(c3)
    p3 = Dropout(dropout)(p3)
    c4 = conv_block(p3, n_filters * 4, 5, True)
    p4 = Conv3D(n_filters * 8, kernel_size=(2, 2, 2), strides=(2, 2, 2), padding='same')(c4)
    p4 = Dropout(dropout)(p4)
    c5 = conv_block(p4, n_filters * 8, 5, True)
    p6 = Conv3D(n_filters * 16, kernel_size=(2, 2, 2), strides=(2, 2, 2), padding='same')(c5)
    p6 = Dropout(dropout)(p6)
    #c6 = conv_block(p5, n_filters*8,5,True)
    #p6 = Conv3D(n_filters*16,kernel_size = (2,2,2) , strides = (2,2,2) , padding='same')(c6)
    p7 = conv_block(p6, n_filters * 16, 5, True)
    u6 = Conv3DTranspose(n_filters * 8, (2, 2, 2), strides=(2, 2, 2), padding='same')(p7)
    u6 = concatenate([u6, c5])
    c7 = conv_block(u6, n_filters * 16, 5, True)
    c7 = Dropout(dropout)(c7)
    u7 = Conv3DTranspose(n_filters * 4, (2, 2, 2), strides=(2, 2, 2), padding='same')(c7)
    u8 = concatenate([u7, c4])
    c8 = conv_block(u8, n_filters * 8, 5, True)
    c8 = Dropout(dropout)(c8)
    u9 = Conv3DTranspose(n_filters * 2, (2, 2, 2), strides=(2, 2, 2), padding='same')(c8)
    u9 = concatenate([u9, c3])
    c9 = conv_block(u9, n_filters * 4, 5, True)
    c9 = Dropout(dropout)(c9)
    u10 = Conv3DTranspose(n_filters, (2, 2, 2), strides=(2, 2, 2), padding='same')(c9)
    u10 = concatenate([u10, c1])
    c10 = Conv3D(n_filters * 2, kernel_size=(5, 5, 5), strides=(1, 1, 1), padding='same')(u10)
    c10 = Dropout(dropout)(c10)
    c10 = add([c10, u10])
    #c9 = conv_block(u9,n_filters,3,batch_norm)
    outputs = Conv3D(4, (1, 1, 1), activation='softmax')(c10)
    model = Model(inputs=input_img, outputs=outputs)
    return model
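The encoder above downsamples with four stride-2 convolutions (`c2`, `p3`, `p4`, `p6`), so every spatial dimension of `input_img` must be divisible by 2**4 = 16 or the transposed convolutions will not line up with the skip connections. A small stdlib check of that constraint (a sketch; the example sizes 64 and 100 are assumptions, not from the original):

```python
def vnet_compatible(dim, levels=4):
    """True if `dim` survives `levels` stride-2 halvings without remainder."""
    for _ in range(levels):
        if dim % 2:
            return False
        dim //= 2
    return True

print(vnet_compatible(64))   # 64 -> 32 -> 16 -> 8 -> 4 downsamples cleanly
print(vnet_compatible(100))  # 100 -> 50 -> 25 fails at the third halving
```
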
| 34.465517 | 118 | 0.693847 | 617 | 3,998 | 4.36953 | 0.210697 | 0.027448 | 0.021142 | 0.033383 | 0.428042 | 0.35089 | 0.306751 | 0.303042 | 0.303042 | 0.267433 | 0 | 0.064865 | 0.167084 | 3,998 | 115 | 119 | 34.765217 | 0.744745 | 0.074037 | 0 | 0.162162 | 0 | 0 | 0.017603 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027027 | false | 0 | 0.405405 | 0 | 0.459459 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
16a9cd5f8c3947e5f770014cb07528f411173928 | 18,818 | py | Python | lib/networks/Resnet50_train.py | yangxue0827/TF_Deformable_Net | 00c86380fd2725ebe7ae22f41d460ffc0bca378d | [
"MIT"
] | 193 | 2017-07-19T14:29:38.000Z | 2021-10-20T07:35:42.000Z | lib/networks/Resnet50_train.py | yangxue0827/TF_Deformable_Net | 00c86380fd2725ebe7ae22f41d460ffc0bca378d | [
"MIT"
] | 29 | 2017-07-24T10:07:22.000Z | 2020-01-03T20:38:36.000Z | lib/networks/Resnet50_train.py | Zardinality/TF_Deformable_Net | 00c86380fd2725ebe7ae22f41d460ffc0bca378d | [
"MIT"
] | 67 | 2017-07-27T14:32:47.000Z | 2021-12-27T13:10:37.000Z | # --------------------------------------------------------
# TFFRCNN - Resnet50
# Copyright (c) 2016
# Licensed under The MIT License [see LICENSE for details]
# Written by miraclebiu
# --------------------------------------------------------
import tensorflow as tf
from .network import Network
from ..fast_rcnn.config import cfg
class Resnet50_train(Network):
    def __init__(self, trainable=True):
        self.inputs = []
        self.data = tf.placeholder(tf.float32, shape=[None, None, None, 3], name='data')
        self.im_info = tf.placeholder(tf.float32, shape=[None, 3], name='im_info')
        self.gt_boxes = tf.placeholder(tf.float32, shape=[None, 5], name='gt_boxes')
        self.gt_ishard = tf.placeholder(tf.int32, shape=[None], name='gt_ishard')
        self.dontcare_areas = tf.placeholder(tf.float32, shape=[None, 4], name='dontcare_areas')
        self.keep_prob = tf.placeholder(tf.float32)
        self.layers = dict({'data': self.data, 'im_info': self.im_info, 'gt_boxes': self.gt_boxes,
                            'gt_ishard': self.gt_ishard, 'dontcare_areas': self.dontcare_areas})
        self.trainable = trainable
        self.setup()

    def setup(self):
        n_classes = cfg.NCLASSES
        # anchor_scales = [8, 16, 32]
        anchor_scales = cfg.ANCHOR_SCALES
        _feat_stride = [16, ]
        (self.feed('data')
            .conv(7, 7, 64, 2, 2, relu=False, name='conv1')
            .batch_normalization(relu=True, name='bn_conv1', is_training=False)
            .max_pool(3, 3, 2, 2, padding='VALID', name='pool1')
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res2a_branch1')
            .batch_normalization(name='bn2a_branch1', is_training=False, relu=False))

        (self.feed('pool1')
            .conv(1, 1, 64, 1, 1, biased=False, relu=False, name='res2a_branch2a')
            .batch_normalization(relu=True, name='bn2a_branch2a', is_training=False)
            .conv(3, 3, 64, 1, 1, biased=False, relu=False, name='res2a_branch2b')
            .batch_normalization(relu=True, name='bn2a_branch2b', is_training=False)
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res2a_branch2c')
            .batch_normalization(name='bn2a_branch2c', is_training=False, relu=False))

        (self.feed('bn2a_branch1',
                   'bn2a_branch2c')
            .add(name='res2a')
            .relu(name='res2a_relu')
            .conv(1, 1, 64, 1, 1, biased=False, relu=False, name='res2b_branch2a')
            .batch_normalization(relu=True, name='bn2b_branch2a', is_training=False)
            .conv(3, 3, 64, 1, 1, biased=False, relu=False, name='res2b_branch2b')
            .batch_normalization(relu=True, name='bn2b_branch2b', is_training=False)
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res2b_branch2c')
            .batch_normalization(name='bn2b_branch2c', is_training=False, relu=False))

        (self.feed('res2a_relu',
                   'bn2b_branch2c')
            .add(name='res2b')
            .relu(name='res2b_relu')
            .conv(1, 1, 64, 1, 1, biased=False, relu=False, name='res2c_branch2a')
            .batch_normalization(relu=True, name='bn2c_branch2a', is_training=False)
            .conv(3, 3, 64, 1, 1, biased=False, relu=False, name='res2c_branch2b')
            .batch_normalization(relu=True, name='bn2c_branch2b', is_training=False)
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res2c_branch2c')
            .batch_normalization(name='bn2c_branch2c', is_training=False, relu=False))

        (self.feed('res2b_relu',
                   'bn2c_branch2c')
            .add(name='res2c')
            .relu(name='res2c_relu')
            .conv(1, 1, 512, 2, 2, biased=False, relu=False, name='res3a_branch1', padding='VALID')
            .batch_normalization(name='bn3a_branch1', is_training=False, relu=False))

        (self.feed('res2c_relu')
            .conv(1, 1, 128, 2, 2, biased=False, relu=False, name='res3a_branch2a', padding='VALID')
            .batch_normalization(relu=True, name='bn3a_branch2a', is_training=False)
            .conv(3, 3, 128, 1, 1, biased=False, relu=False, name='res3a_branch2b')
            .batch_normalization(relu=True, name='bn3a_branch2b', is_training=False)
            .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res3a_branch2c')
            .batch_normalization(name='bn3a_branch2c', is_training=False, relu=False))

        (self.feed('bn3a_branch1',
                   'bn3a_branch2c')
            .add(name='res3a')
            .relu(name='res3a_relu')
            .conv(1, 1, 128, 1, 1, biased=False, relu=False, name='res3b_branch2a')
            .batch_normalization(relu=True, name='bn3b_branch2a', is_training=False)
            .conv(3, 3, 128, 1, 1, biased=False, relu=False, name='res3b_branch2b')
            .batch_normalization(relu=True, name='bn3b_branch2b', is_training=False)
            .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res3b_branch2c')
            .batch_normalization(name='bn3b_branch2c', is_training=False, relu=False))

        (self.feed('res3a_relu',
                   'bn3b_branch2c')
            .add(name='res3b')
            .relu(name='res3b_relu')
            .conv(1, 1, 128, 1, 1, biased=False, relu=False, name='res3c_branch2a')
            .batch_normalization(relu=True, name='bn3c_branch2a', is_training=False)
            .conv(3, 3, 128, 1, 1, biased=False, relu=False, name='res3c_branch2b')
            .batch_normalization(relu=True, name='bn3c_branch2b', is_training=False)
            .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res3c_branch2c')
            .batch_normalization(name='bn3c_branch2c', is_training=False, relu=False))

        (self.feed('res3b_relu',
                   'bn3c_branch2c')
            .add(name='res3c')
            .relu(name='res3c_relu')
            .conv(1, 1, 128, 1, 1, biased=False, relu=False, name='res3d_branch2a')
            .batch_normalization(relu=True, name='bn3d_branch2a', is_training=False)
            .conv(3, 3, 128, 1, 1, biased=False, relu=False, name='res3d_branch2b')
            .batch_normalization(relu=True, name='bn3d_branch2b', is_training=False)
            .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res3d_branch2c')
            .batch_normalization(name='bn3d_branch2c', is_training=False, relu=False))

        (self.feed('res3c_relu',
                   'bn3d_branch2c')
            .add(name='res3d')
            .relu(name='res3d_relu')
            .conv(1, 1, 1024, 2, 2, biased=False, relu=False, name='res4a_branch1', padding='VALID')
            .batch_normalization(name='bn4a_branch1', is_training=False, relu=False))

        (self.feed('res3d_relu')
            .conv(1, 1, 256, 2, 2, biased=False, relu=False, name='res4a_branch2a', padding='VALID')
            .batch_normalization(relu=True, name='bn4a_branch2a', is_training=False)
            .conv(3, 3, 256, 1, 1, biased=False, relu=False, name='res4a_branch2b')
            .batch_normalization(relu=True, name='bn4a_branch2b', is_training=False)
            .conv(1, 1, 1024, 1, 1, biased=False, relu=False, name='res4a_branch2c')
            .batch_normalization(name='bn4a_branch2c', is_training=False, relu=False))

        (self.feed('bn4a_branch1',
                   'bn4a_branch2c')
            .add(name='res4a')
            .relu(name='res4a_relu')
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res4b_branch2a')
            .batch_normalization(relu=True, name='bn4b_branch2a', is_training=False)
            .conv(3, 3, 256, 1, 1, biased=False, relu=False, name='res4b_branch2b')
            .batch_normalization(relu=True, name='bn4b_branch2b', is_training=False)
            .conv(1, 1, 1024, 1, 1, biased=False, relu=False, name='res4b_branch2c')
            .batch_normalization(name='bn4b_branch2c', is_training=False, relu=False))

        (self.feed('res4a_relu',
                   'bn4b_branch2c')
            .add(name='res4b')
            .relu(name='res4b_relu')
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res4c_branch2a')
            .batch_normalization(relu=True, name='bn4c_branch2a', is_training=False)
            .conv(3, 3, 256, 1, 1, biased=False, relu=False, name='res4c_branch2b')
            .batch_normalization(relu=True, name='bn4c_branch2b', is_training=False)
            .conv(1, 1, 1024, 1, 1, biased=False, relu=False, name='res4c_branch2c')
            .batch_normalization(name='bn4c_branch2c', is_training=False, relu=False))

        (self.feed('res4b_relu',
                   'bn4c_branch2c')
            .add(name='res4c')
            .relu(name='res4c_relu')
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res4d_branch2a')
            .batch_normalization(relu=True, name='bn4d_branch2a', is_training=False)
            .conv(3, 3, 256, 1, 1, biased=False, relu=False, name='res4d_branch2b')
            .batch_normalization(relu=True, name='bn4d_branch2b', is_training=False)
            .conv(1, 1, 1024, 1, 1, biased=False, relu=False, name='res4d_branch2c')
            .batch_normalization(name='bn4d_branch2c', is_training=False, relu=False))

        (self.feed('res4c_relu',
                   'bn4d_branch2c')
            .add(name='res4d')
            .relu(name='res4d_relu')
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res4e_branch2a')
            .batch_normalization(relu=True, name='bn4e_branch2a', is_training=False)
            .conv(3, 3, 256, 1, 1, biased=False, relu=False, name='res4e_branch2b')
            .batch_normalization(relu=True, name='bn4e_branch2b', is_training=False)
            .conv(1, 1, 1024, 1, 1, biased=False, relu=False, name='res4e_branch2c')
            .batch_normalization(name='bn4e_branch2c', is_training=False, relu=False))

        (self.feed('res4d_relu',
                   'bn4e_branch2c')
            .add(name='res4e')
            .relu(name='res4e_relu')
            .conv(1, 1, 256, 1, 1, biased=False, relu=False, name='res4f_branch2a')
            .batch_normalization(relu=True, name='bn4f_branch2a', is_training=False)
            .conv(3, 3, 256, 1, 1, biased=False, relu=False, name='res4f_branch2b')
            .batch_normalization(relu=True, name='bn4f_branch2b', is_training=False)
            .conv(1, 1, 1024, 1, 1, biased=False, relu=False, name='res4f_branch2c')
            .batch_normalization(name='bn4f_branch2c', is_training=False, relu=False))

        (self.feed('res4e_relu',
                   'bn4f_branch2c')
            .add(name='res4f')
            .relu(name='res4f_relu'))
        #========= RPN ============
        (self.feed('res4f_relu')
            .conv(3, 3, 512, 1, 1, name='rpn_conv/3x3')
            .conv(1, 1, len(anchor_scales) * 3 * 2, 1, 1, padding='VALID', relu=False, name='rpn_cls_score'))

        (self.feed('rpn_cls_score', 'gt_boxes', 'gt_ishard', 'dontcare_areas', 'im_info')
            .anchor_target_layer(_feat_stride, anchor_scales, name='rpn-data'))

        # Loss of rpn_cls & rpn_boxes
        (self.feed('rpn_conv/3x3')
            .conv(1, 1, len(anchor_scales) * 3 * 4, 1, 1, padding='VALID', relu=False, name='rpn_bbox_pred'))

        #========= RoI Proposal ============
        (self.feed('rpn_cls_score')
            .spatial_reshape_layer(2, name='rpn_cls_score_reshape')
            .spatial_softmax(name='rpn_cls_prob'))

        (self.feed('rpn_cls_prob')
            .spatial_reshape_layer(len(anchor_scales) * 3 * 2, name='rpn_cls_prob_reshape'))

        (self.feed('rpn_cls_prob_reshape', 'rpn_bbox_pred', 'im_info')
            .proposal_layer(_feat_stride, anchor_scales, 'TRAIN', name='rpn_rois'))

        (self.feed('rpn_rois', 'gt_boxes', 'gt_ishard', 'dontcare_areas')
            .proposal_target_layer(n_classes, name='roi-data'))

        #========= RCNN ============
        (self.feed('res4f_relu')
            .conv(1, 1, 2048, 1, 1, biased=False, relu=False, name='res5a_branch1', padding='VALID')
            .batch_normalization(relu=False, name='bn5a_branch1'))

        (self.feed('res4f_relu')
            .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res5a_branch2a', padding='VALID')
            .batch_normalization(relu=False, name='bn5a_branch2a')
            .relu(name='res5a_branch2a_relu')
            .conv(3, 3, 72, 1, 1, biased=True, rate=2, relu=False, name='res5a_branch2b_offset', padding='SAME', initializer='zeros'))

        (self.feed('res5a_branch2a_relu', 'res5a_branch2b_offset')
            .deform_conv(3, 3, 512, 1, 1, biased=False, rate=2, relu=False, num_deform_group=4, name='res5a_branch2b')
            .batch_normalization(relu=False, name='bn5a_branch2b')
            .relu(name='res5a_branch2b_relu')
            .conv(1, 1, 2048, 1, 1, biased=False, relu=False, name='res5a_branch2c', padding='VALID')
            .batch_normalization(relu=False, name='bn5a_branch2c'))

        (self.feed('bn5a_branch1', 'bn5a_branch2c')
            .add(name='res5a')
            .relu(name='res5a_relu')
            .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res5b_branch2a', padding='VALID')
            .batch_normalization(relu=False, name='bn5b_branch2a')
            .relu(name='res5b_branch2a_relu')
            .conv(3, 3, 72, 1, 1, biased=True, rate=2, relu=False, name='res5b_branch2b_offset', padding='SAME', initializer='zeros'))

        (self.feed('res5b_branch2a_relu', 'res5b_branch2b_offset')
            .deform_conv(3, 3, 512, 1, 1, biased=False, rate=2, relu=False, num_deform_group=4, name='res5b_branch2b')
            .batch_normalization(relu=False, name='bn5b_branch2b')
            .relu(name='res5b_branch2b_relu')
            .conv(1, 1, 2048, 1, 1, biased=False, relu=False, name='res5b_branch2c', padding='VALID')
            .batch_normalization(relu=False, name='bn5b_branch2c'))

        (self.feed('res5a_relu', 'bn5b_branch2c')
            .add(name='res5b')
            .relu(name='res5b_relu')
            .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res5c_branch2a', padding='VALID')
            .batch_normalization(relu=False, name='bn5c_branch2a')
            .relu(name='res5c_branch2a_relu')
            .conv(3, 3, 72, 1, 1, biased=True, rate=2, relu=False, name='res5c_branch2b_offset', padding='SAME', initializer='zeros'))

        (self.feed('res5c_branch2a_relu', 'res5c_branch2b_offset')
            .deform_conv(3, 3, 512, 1, 1, biased=False, rate=2, relu=False, num_deform_group=4, name='res5c_branch2b')
            .batch_normalization(relu=False, name='bn5c_branch2b')
            .relu(name='res5c_branch2b_relu')
            .conv(1, 1, 2048, 1, 1, biased=False, relu=False, name='res5c_branch2c', padding='VALID')
            .batch_normalization(relu=False, name='bn5c_branch2c'))

        (self.feed('res5b_relu', 'bn5c_branch2c')
            .add(name='res5c')
            .relu(name='res5c_relu')
            .conv(1, 1, 256, 1, 1, relu=False, name='conv_new_1')
            .relu(name='conv_new_1_relu'))

        (self.feed('conv_new_1_relu', 'roi-data')
            .deform_psroi_pool(group_size=1, pooled_size=7, sample_per_part=4, no_trans=True, part_size=7, output_dim=256, trans_std=1e-1, spatial_scale=0.0625, name='offset_t')
            # .flatten_data(name='offset_flatten')
            .fc(num_out=7 * 7 * 2, name='offset', relu=False)
            .reshape(shape=(-1, 2, 7, 7), name='offset_reshape'))

        (self.feed('conv_new_1_relu', 'roi-data', 'offset_reshape')
            .deform_psroi_pool(group_size=1, pooled_size=7, sample_per_part=4, no_trans=False, part_size=7, output_dim=256, trans_std=1e-1, spatial_scale=0.0625, name='deformable_roi_pool')
            .fc(num_out=1024, name='fc_new_1')
            .fc(num_out=1024, name='fc_new_2'))

        (self.feed('fc_new_2')
            .fc(num_out=n_classes, name='cls_score', relu=False)
            .softmax(name='cls_prob'))

        (self.feed('fc_new_2')
            .fc(num_out=4 * n_classes, name='bbox_pred', relu=False))
# (self.feed('res4f_relu','roi-data')
# .roi_pool(7,7,1.0/16,name='res5a_branch2a_roipooling')
# .conv(1, 1, 512, 2, 2, biased=False, relu=False, name='res5a_branch2a', padding='VALID')
# .batch_normalization(relu=True, name='bn5a_branch2a',is_training=False)
# .conv(3, 3, 512, 1, 1, biased=False, relu=False, name='res5a_branch2b')
# .batch_normalization(relu=True, name='bn5a_branch2b',is_training=False)
# .conv(1, 1, 2048, 1, 1, biased=False, relu=False, name='res5a_branch2c')
# .batch_normalization(name='bn5a_branch2c',is_training=False,relu=False))
# (self.feed('res5a_branch2a_roipooling')
# .conv(1,1,2048,2,2,biased=False, relu=False, name='res5a_branch1', padding='VALID')
# .batch_normalization(name='bn5a_branch1',is_training=False,relu=False))
# (self.feed('bn5a_branch2c','bn5a_branch1')
# .add(name='res5a')
# .relu(name='res5a_relu')
# .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res5b_branch2a')
# .batch_normalization(relu=True, name='bn5b_branch2a',is_training=False)
# .conv(3, 3, 512, 1, 1, biased=False, relu=False, name='res5b_branch2b')
# .batch_normalization(relu=True, name='bn5b_branch2b',is_training=False)
# .conv(1, 1, 2048, 1, 1, biased=False, relu=False, name='res5b_branch2c')
# .batch_normalization(name='bn5b_branch2c',is_training=False,relu=False))
# #pdb.set_trace()
# (self.feed('res5a_relu',
# 'bn5b_branch2c')
# .add(name='res5b')
# .relu(name='res5b_relu')
# .conv(1, 1, 512, 1, 1, biased=False, relu=False, name='res5c_branch2a')
# .batch_normalization(relu=True, name='bn5c_branch2a',is_training=False)
# .conv(3, 3, 512, 1, 1, biased=False, relu=False, name='res5c_branch2b')
# .batch_normalization(relu=True, name='bn5c_branch2b',is_training=False)
# .conv(1, 1, 2048, 1, 1, biased=False, relu=False, name='res5c_branch2c')
# .batch_normalization(name='bn5c_branch2c',is_training=False,relu=False))
# #pdb.set_trace()
# (self.feed('res5b_relu',
# 'bn5c_branch2c')
# .add(name='res5c')
# .relu(name='res5c_relu')
# .fc(n_classes, relu=False, name='cls_score')
# .softmax(name='cls_prob'))
# (self.feed('res5c_relu')
# .fc(n_classes*4, relu=False, name='bbox_pred'))
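The `_feat_stride = [16, ]` value in `setup()` follows from the four stride-2 stages in front of the RPN input `res4f_relu` (`conv1`, `pool1`, `res3a`, `res4a`); the res5 blocks keep resolution by using dilated (`rate=2`) convolutions instead of striding. A quick arithmetic check of that value (a sketch; the stage list is read off the layer definitions above):

```python
# stride contributed by each downsampling stage up to res4f (the RPN input)
stage_strides = [2, 2, 2, 2]  # conv1, pool1, res3a_branch*, res4a_branch*

feat_stride = 1
for s in stage_strides:
    feat_stride *= s

print(feat_stride)  # 16, matching _feat_stride = [16, ]
```

The same factor appears as `spatial_scale=0.0625` (1/16) in the `deform_psroi_pool` calls, which maps RoI coordinates from image space back onto the feature map.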
| 58.080247 | 189 | 0.597619 | 2,416 | 18,818 | 4.447434 | 0.074089 | 0.020289 | 0.102932 | 0.109819 | 0.74053 | 0.709912 | 0.548813 | 0.541461 | 0.395905 | 0.384272 | 0 | 0.06968 | 0.240408 | 18,818 | 323 | 190 | 58.260062 | 0.682034 | 0.144064 | 0 | 0.021097 | 0 | 0 | 0.189634 | 0.009158 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008439 | false | 0 | 0.012658 | 0 | 0.025316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16ad65c0a3b3c48d1d5528a704a36242b69e1b30 | 590 | py | Python | python/get_links.py | quiddity-wp/mediawiki-api-demos | 98910dbd9c2cbbb13db790f3e8979419aeab34d4 | [
"MIT"
] | 63 | 2019-05-19T13:22:37.000Z | 2022-03-30T13:21:40.000Z | python/get_links.py | quiddity-wp/mediawiki-api-demos | 98910dbd9c2cbbb13db790f3e8979419aeab34d4 | [
"MIT"
] | 67 | 2019-05-03T17:17:19.000Z | 2021-06-21T11:02:10.000Z | python/get_links.py | quiddity-wp/mediawiki-api-demos | 98910dbd9c2cbbb13db790f3e8979419aeab34d4 | [
"MIT"
] | 49 | 2019-02-19T09:28:33.000Z | 2019-03-24T04:36:53.000Z | #This file is auto-generated. See modules.json and autogenerator.py for details
#!/usr/bin/python3
"""
get_links.py
MediaWiki API Demos
Demo of `Links` module: Get all links on the given page(s)
MIT License
"""
import requests
S = requests.Session()
URL = "https://en.wikipedia.org/w/api.php"
PARAMS = {
    "action": "query",
    "format": "json",
    "titles": "Albert Einstein",
    "prop": "links"
}

R = S.get(url=URL, params=PARAMS)
DATA = R.json()

PAGES = DATA["query"]["pages"]

for k, v in PAGES.items():
    for l in v["links"]:
        print(l["title"])
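The response nests links under `query.pages`, keyed by page ID; a helper that flattens that structure into one list of titles (a standalone sketch driven by a hand-made response dict, not a live API call — the sample page ID and titles are assumptions):

```python
def extract_link_titles(data):
    """Collect link titles from a MediaWiki `query.pages` response dict."""
    titles = []
    for page in data["query"]["pages"].values():
        # pages with no outgoing links omit the "links" key entirely
        for link in page.get("links", []):
            titles.append(link["title"])
    return titles

sample = {"query": {"pages": {"736": {"links": [{"title": "Annus Mirabilis papers"},
                                                {"title": "Photoelectric effect"}]}}}}
print(extract_link_titles(sample))  # ['Annus Mirabilis papers', 'Photoelectric effect']
```

Note the real API caps `links` at 500 titles per request and signals more results via a `continue` token, so a production caller would loop until that token disappears.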
| 16.857143 | 79 | 0.618644 | 87 | 590 | 4.183908 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002155 | 0.213559 | 590 | 34 | 80 | 17.352941 | 0.782328 | 0.340678 | 0 | 0 | 1 | 0 | 0.288462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.066667 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16b0eceb3e8aafd2e9b6e9e274abab88018c34aa | 495 | py | Python | subir/ingreso/migrations/0004_auto_20191003_1509.py | Brandon1625/subir | b827a30e64219fdc9de07689d2fb32e2c4bd02b7 | [
"bzip2-1.0.6"
] | null | null | null | subir/ingreso/migrations/0004_auto_20191003_1509.py | Brandon1625/subir | b827a30e64219fdc9de07689d2fb32e2c4bd02b7 | [
"bzip2-1.0.6"
] | null | null | null | subir/ingreso/migrations/0004_auto_20191003_1509.py | Brandon1625/subir | b827a30e64219fdc9de07689d2fb32e2c4bd02b7 | [
"bzip2-1.0.6"
] | null | null | null | # Generated by Django 2.2.4 on 2019-10-03 21:09
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        ('ingreso', '0003_auto_20190907_2152'),
    ]

    operations = [
        migrations.AlterField(
            model_name='detalle_ingreso',
            name='id_prod',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='producto.Producto'),
        ),
    ]
| 24.75 | 116 | 0.650505 | 58 | 495 | 5.431034 | 0.689655 | 0.07619 | 0.088889 | 0.139683 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081579 | 0.232323 | 495 | 19 | 117 | 26.052632 | 0.747368 | 0.090909 | 0 | 0 | 1 | 0 | 0.154018 | 0.051339 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16b8947aeb5e92484b74a59f50dce7a8d1075f22 | 23,601 | py | Python | dev/Tools/build/waf-1.7.13/lmbrwaflib/unit_test_lumberyard_modules.py | akulamartin/lumberyard | 2d4be458a02845179be098e40cdc0c48f28f3b5a | [
"AML"
] | 8 | 2019-10-07T16:33:47.000Z | 2020-12-07T03:59:58.000Z | dev/Tools/build/waf-1.7.13/lmbrwaflib/unit_test_lumberyard_modules.py | 29e7e280-0d1c-4bba-98fe-f7cd3ca7500a/lumberyard | 1c52b941dcb7d94341fcf21275fe71ff67173ada | [
"AML"
] | null | null | null | dev/Tools/build/waf-1.7.13/lmbrwaflib/unit_test_lumberyard_modules.py | 29e7e280-0d1c-4bba-98fe-f7cd3ca7500a/lumberyard | 1c52b941dcb7d94341fcf21275fe71ff67173ada | [
"AML"
] | 4 | 2019-08-05T07:25:46.000Z | 2020-12-07T05:12:55.000Z | #
# All or portions of this file Copyright (c) Amazon.com, Inc. or its affiliates or
# its licensors.
#
# For complete copyright and license terms please see the LICENSE at the root of this
# distribution (the "License"). All use of this software is governed by the License,
# or, if provided, by the license below or the license accompanying this file. Do not
# remove or modify any license notices. This file is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#
from waflib import Errors
import lumberyard_modules
import unittest
import pytest
import utils
class FakeContext(object):
    pass


class FakeIncludeSettings(object):
    pass


class FakePlatformSettings(object):
    def __init__(self, platform_name, aliases=set()):
        self.platform = platform_name
        self.aliases = aliases


class FakeConfigurationSettings(object):
    def __init__(self, settings_name, base_config=None):
        self.base_config = base_config
        self.name = settings_name


class FakeConfiguration(object):
    def __init__(self, settings, is_test=False, is_server=False):
        self.settings = settings
        self.is_test = is_test
        self.is_server = is_server


@pytest.fixture()
def mock_parse_json(mock_json_map):
    if not mock_json_map:
        mock_json_map = {'path': {}}

    def _mock_parse_json(path, _):
        return mock_json_map[path]

    old_parse_json_file = utils.parse_json_file
    utils.parse_json_file = _mock_parse_json
    yield
    utils.parse_json_file = old_parse_json_file


@pytest.fixture()
def fake_context():
    return FakeContext()


def test_SanitizeKWInput_SimpleKwDictionary_Success():
    kw = dict(
        libpath='mylib'
    )
    lumberyard_modules.sanitize_kw_input(kw)
    assert isinstance(kw['libpath'], list)
    assert kw['libpath'][0] == 'mylib'


def test_SanitizeKWInput_SimpleKwDictionaryInAdditionalSettings_Success():
    kw = dict(
        libpath='mylib',
        additional_settings=dict(stlibpath='mystlib')
    )
    lumberyard_modules.sanitize_kw_input(kw)
    assert isinstance(kw['libpath'], list)
    assert kw['libpath'][0] == 'mylib'
    assert isinstance(kw['additional_settings'], list)
    assert isinstance(kw['additional_settings'][0], dict)
    assert isinstance(kw['additional_settings'][0]['stlibpath'], list)
    assert kw['additional_settings'][0]['stlibpath'][0] == 'mystlib'
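The assertions above imply that `sanitize_kw_input` wraps scalar keyword values in single-element lists and recurses into `additional_settings`; a minimal stand-alone sketch of that behavior (an assumption inferred from the tests, not the actual Lumberyard implementation):

```python
def sanitize_kw_input_sketch(kw):
    """Wrap scalar kw values in lists; recurse into additional_settings dicts."""
    for key, value in list(kw.items()):
        if key == 'additional_settings':
            if isinstance(value, dict):
                value = [value]
                kw[key] = value
            for sub_kw in value:
                sanitize_kw_input_sketch(sub_kw)
        elif not isinstance(value, list):
            kw[key] = [value]

kw = {'libpath': 'mylib', 'additional_settings': {'stlibpath': 'mystlib'}}
sanitize_kw_input_sketch(kw)
print(kw['libpath'], kw['additional_settings'][0]['stlibpath'])  # ['mylib'] ['mystlib']
```
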
@pytest.mark.parametrize(
    "target, kw_key, source_section, additional_aliases, merge_dict, expected", [
        pytest.param('test_target', 'fake_key', {}, {}, {}, {}, id='MissingKeyInSourceNoChange'),
        pytest.param('test_target', 'fake_key', {'fake_key': 'fake_value'}, {}, {}, {'fake_key': 'fake_value'}, id='MissingKeyInTargetKeyAdded'),
        pytest.param('test_target', 'copyright_org', {'copyright_org': False}, {}, {'copyright_org': 'AMZN'}, type(Errors.WafError), id='InvalidStringKwInSourceError'),
        pytest.param('test_target', 'copyright_org', {'copyright_org': 'AMZN'}, {}, {'copyright_org': False}, type(Errors.WafError), id='InvalidStringKwInTargetError'),
        pytest.param('test_target', 'copyright_org', {'copyright_org': 'AMZN'}, {}, {'copyright_org': 'A2Z'}, {'copyright_org': 'AMZN'}, id='MergeStringReplaceSuccess'),
        pytest.param('test_target', 'client_only', {'client_only': 'False'}, {}, {'client_only': True}, type(Errors.WafError), id='InvalidBoolKwInSourceError'),
        pytest.param('test_target', 'client_only', {'client_only': False}, {}, {'client_only': 'True'}, type(Errors.WafError), id='InvalidBoolKwInTargetError'),
        pytest.param('test_target', 'client_only', {'client_only': False}, {}, {'client_only': True}, {'client_only': False}, id='MergeBoolReplaceKwSuccess'),
    ])
def test_ProjectSettingsFileMergeKwKey_ValidInputs(mock_parse_json, target, kw_key, source_section, additional_aliases, merge_dict, expected):
    fake_context = FakeContext()
    test_settings = lumberyard_modules.ProjectSettingsFile(fake_context, 'path', additional_aliases)

    if isinstance(expected, dict):
        test_settings.merge_kw_key(target=target,
                                   kw_key=kw_key,
                                   source_section=source_section,
                                   merge_kw=merge_dict)
        assert merge_dict == expected
    elif isinstance(expected, type(Errors.WafError)):
        with pytest.raises(Errors.WafError):
            test_settings.merge_kw_key(target=target,
                                       kw_key=kw_key,
                                       source_section=source_section,
                                       merge_kw=merge_dict)
@pytest.mark.parametrize(
    "test_dict, fake_include_settings, mock_json_map, additional_aliases, expected", [
        pytest.param({}, None, None, {}, {}, id='BasicNoAdditionalAliasNoAdditionalIncludes'),
        pytest.param({}, 'include_test',
                     {
                         'path': {
                             'includes': ['include_test']
                         },
                         'include_test': {}
                     }, {}, {'includes': ['include_test']}, id='BasicNoAdditionalAliasSingleAdditionalIncludes')
    ])
def test_ProjectSettingsFileIncludes_ValidInputs(mock_parse_json, fake_context, test_dict, fake_include_settings, mock_json_map, additional_aliases, expected):
    if fake_include_settings:
        def _mock_get_project_settings_file(include_settings_file, additional_aliases):
            assert fake_include_settings == include_settings_file
            fake_settings = FakeIncludeSettings()
            return fake_settings

        fake_context.get_project_settings_file = _mock_get_project_settings_file

    test = lumberyard_modules.ProjectSettingsFile(fake_context,
                                                  'path',
                                                  additional_aliases)
    assert test.dict == expected
@pytest.mark.parametrize(
    "mock_json_map, additional_aliases, section_key, expected", [
        pytest.param(None, {}, 'no_section', {}, id='SimpleNoChange'),
        pytest.param({
            'path': {
                "test_section": {
                    "key1": "value1"
                }
            }
        }, {}, 'test_section', {'key1': 'value1'}, id='SimpleChanges')
    ])
def test_ProjectSettingsFileMergeKwSection_ValidInputs_Success(mock_parse_json, fake_context, mock_json_map, additional_aliases, section_key, expected):
    test_settings = lumberyard_modules.ProjectSettingsFile(fake_context, 'path', additional_aliases)

    merge_dict = {}
    test_settings.merge_kw_section(section_key=section_key,
                                   target='test_target',
                                   merge_kw=merge_dict)

    assert expected == merge_dict
class ProjectSettingsTest(unittest.TestCase):
def setUp(self):
self.old_parse_json = utils.parse_json_file
utils.parse_json_file = self.mockParseJson
self.mock_json_map = {}
def tearDown(self):
utils.parse_json_file = self.old_parse_json
def mockParseJson(self, path, _):
return self.mock_json_map[path]
def createSimpleSettings(self, fake_context = FakeContext(), test_dict={}, additional_aliases={}):
self.mock_json_map = {'path': test_dict}
test_settings = lumberyard_modules.ProjectSettingsFile(fake_context, 'path', additional_aliases)
return test_settings
def test_ProjectSettingsFileMergeKwDict_RecursiveMergeAdditionalSettingsNoPlatformNoConfiguration_Success(self):
"""
Test scenario:
Setup a project settings that contains other project settings, so that it can recursively call merge_kw_dict
recursively
"""
include_settings_file = 'include_test'
test_settings_single_include = {'includes': [include_settings_file]}
test_empty_settings = {}
test_merge_kw_key = 'passed'
test_merge_kw_value = True
self.mock_json_map = {'path': test_settings_single_include,
include_settings_file: test_empty_settings}
# Prepare a mock include settings object
test_include_settings = self.createSimpleSettings()
def _mock_merge_kw_dict(target, merge_kw, platform, configuration):
merge_kw[test_merge_kw_key] = test_merge_kw_value
pass
test_include_settings.merge_kw_dict = _mock_merge_kw_dict
# Prepare a mock context
fake_context = FakeContext()
def _mock_get_project_settings_file(_a, _b):
return test_include_settings
fake_context.get_project_settings_file = _mock_get_project_settings_file
test_settings = self.createSimpleSettings(fake_context=fake_context,
test_dict=test_settings_single_include)
test_merge_kw = {}
test_settings.merge_kw_dict(target='test_target',
merge_kw=test_merge_kw,
platform=None,
configuration=None)
self.assertIn(test_merge_kw_key, test_merge_kw)
self.assertEqual(test_merge_kw[test_merge_kw_key], test_merge_kw_value)
def test_ProjectSettingsFileMergeKwDict_MergePlatformSection_Success(self):
"""
Test scenario:
Test the merge_kw_dict when only platform is set and not any configurations
"""
test_platform = 'test_platform'
test_alias = 'alias_1'
fake_context = FakeContext()
fake_platform_settings = FakePlatformSettings(platform_name='test_platform',
aliases={test_alias})
def _mock_get_platform_settings(platform):
self.assertEqual(platform, test_platform)
return fake_platform_settings
fake_context.get_platform_settings = _mock_get_platform_settings
test_dict = {}
test_settings = self.createSimpleSettings(fake_context=fake_context,
test_dict=test_dict)
sections_merged = set()
def _mock_merge_kw_section(section, target, merge_kw):
sections_merged.add(section)
test_settings.merge_kw_section = _mock_merge_kw_section
test_merge_kw = {}
test_settings.merge_kw_dict(target='test_target',
merge_kw=test_merge_kw,
platform=test_platform,
configuration=None)
# Validate all the sections passed to the merge_kw_dict
self.assertIn('{}/*'.format(test_platform), sections_merged)
self.assertIn('{}/*'.format(test_alias), sections_merged)
self.assertEqual(len(sections_merged), 2)
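# The two keys asserted above follow an assumed '<platform or alias>/<configuration
# or *>' naming scheme; a hypothetical helper (not in lumberyard_modules) that
# reproduces the platform-only case -- the platform plus each alias, with a
# wildcard configuration:
def platform_wildcard_sections(platform, aliases=()):
    return {'{}/*'.format(name) for name in [platform] + list(aliases)}

assert platform_wildcard_sections('test_platform', ['alias_1']) == {'test_platform/*', 'alias_1/*'}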
def test_ProjectSettingsFileMergeKwDict_MergePlatformConfigurationNoDerivedNoTestNoDedicatedSection_Success(self):
"""
Test scenario:
Test the merge_kw_dict when the platform + configuration is set, and the configuration is neither a test
nor a server configuration
"""
test_platform_name = 'test_platform'
test_configuration_name = 'test_configuration'
test_configuration = FakeConfiguration(settings=FakeConfigurationSettings(settings_name=test_configuration_name))
fake_context = FakeContext()
fake_platform_settings = FakePlatformSettings(platform_name=test_platform_name)
def _mock_get_platform_settings(platform):
self.assertEqual(platform, test_platform_name)
return fake_platform_settings
fake_context.get_platform_settings = _mock_get_platform_settings
test_dict = {}
test_settings = self.createSimpleSettings(fake_context=fake_context,
test_dict=test_dict)
sections_merged = set()
def _mock_merge_kw_section(section, target, merge_kw):
sections_merged.add(section)
test_settings.merge_kw_section = _mock_merge_kw_section
test_merge_kw = {}
test_settings.merge_kw_dict(target='test_target',
merge_kw=test_merge_kw,
platform=test_platform_name,
configuration=test_configuration)
# Validate all the sections passed to the merge_kw_dict
self.assertIn('{}/*'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertEqual(len(sections_merged), 2)
def test_ProjectSettingsFileMergeKwDict_MergePlatformConfigurationDerivedNoTestNoDedicatedSection_Success(self):
"""
Test scenario:
Test the merge_kw_dict when the platform + configuration is set, and the configuration is neither a test
nor a server configuration, but is derived from another configuration
"""
test_platform_name = 'test_platform'
test_configuration_name = 'test_configuration'
base_test_configuration_name = 'base_configuration'
test_configuration = FakeConfiguration(
settings=FakeConfigurationSettings(settings_name=test_configuration_name,
base_config=FakeConfiguration(FakeConfigurationSettings(settings_name=base_test_configuration_name))))
fake_context = FakeContext()
fake_platform_settings = FakePlatformSettings(platform_name=test_platform_name)
def _mock_get_platform_settings(platform):
self.assertEqual(platform, test_platform_name)
return fake_platform_settings
fake_context.get_platform_settings = _mock_get_platform_settings
test_dict = {}
test_settings = self.createSimpleSettings(fake_context=fake_context,
test_dict=test_dict)
sections_merged = set()
def _mock_merge_kw_section(section, target, merge_kw):
sections_merged.add(section)
test_settings.merge_kw_section = _mock_merge_kw_section
test_merge_kw = {}
test_settings.merge_kw_dict(target='test_target',
merge_kw=test_merge_kw,
platform=test_platform_name,
configuration=test_configuration)
# Validate all the sections passed to the merge_kw_dict
self.assertIn('{}/*'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('{}/{}'.format(test_platform_name, base_test_configuration_name), sections_merged)
self.assertEqual(len(sections_merged), 3)
def test_ProjectSettingsFileMergeKwDict_MergePlatformConfigurationNoDerivedTestDedicatedSection_Success(self):
"""
Test scenario:
Test the merge_kw_dict when the platform + configuration is set, and the configuration is a test and a
server configuration
"""
test_platform_name = 'test_platform'
test_configuration_name = 'test_configuration'
test_configuration = FakeConfiguration(settings=FakeConfigurationSettings(settings_name=test_configuration_name),
is_test=True,
is_server=True)
fake_context = FakeContext()
fake_platform_settings = FakePlatformSettings(platform_name=test_platform_name)
def _mock_get_platform_settings(platform):
self.assertEqual(platform, test_platform_name)
return fake_platform_settings
fake_context.get_platform_settings = _mock_get_platform_settings
test_dict = {}
test_settings = self.createSimpleSettings(fake_context=fake_context,
test_dict=test_dict)
sections_merged = set()
def _mock_merge_kw_section(section, target, merge_kw):
sections_merged.add(section)
test_settings.merge_kw_section = _mock_merge_kw_section
test_merge_kw = {}
test_settings.merge_kw_dict(target='test_target',
merge_kw=test_merge_kw,
platform=test_platform_name,
configuration=test_configuration)
# Validate all the sections passed to the merge_kw_dict
self.assertIn('{}/*'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/dedicated,test', sections_merged)
self.assertIn('{}/*/dedicated,test'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/dedicated,test'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/test,dedicated', sections_merged)
self.assertIn('{}/*/test,dedicated'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/test,dedicated'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertEqual(len(sections_merged), 8)
def test_ProjectSettingsFileMergeKwDict_MergePlatformConfigurationNoDerivedTestNoDedicatedSection_Success(self):
"""
Test scenario:
Test the merge_kw_dict when the platform + configuration is set, and the configuration is a test but not a
server configuration
"""
test_platform_name = 'test_platform'
test_configuration_name = 'test_configuration'
test_configuration = FakeConfiguration(
settings=FakeConfigurationSettings(settings_name=test_configuration_name),
is_test=True,
is_server=False)
fake_context = FakeContext()
fake_platform_settings = FakePlatformSettings(platform_name=test_platform_name)
def _mock_get_platform_settings(platform):
self.assertEqual(platform, test_platform_name)
return fake_platform_settings
fake_context.get_platform_settings = _mock_get_platform_settings
test_dict = {}
test_settings = self.createSimpleSettings(fake_context=fake_context,
test_dict=test_dict)
sections_merged = set()
def _mock_merge_kw_section(section, target, merge_kw):
sections_merged.add(section)
test_settings.merge_kw_section = _mock_merge_kw_section
test_merge_kw = {}
test_settings.merge_kw_dict(target='test_target',
merge_kw=test_merge_kw,
platform=test_platform_name,
configuration=test_configuration)
# Validate all the sections passed to the merge_kw_dict
self.assertIn('{}/*'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/test', sections_merged)
self.assertIn('{}/*/test'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/test'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/dedicated,test', sections_merged)
self.assertIn('{}/*/dedicated,test'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/dedicated,test'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/test,dedicated', sections_merged)
self.assertIn('{}/*/test,dedicated'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/test,dedicated'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertEqual(len(sections_merged), 11)
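# The assertions above accept the combined role suffix in both comma orders
# ('dedicated,test' and 'test,dedicated'). Assuming the implementation simply emits
# every ordering of the active roles, both strings fall out of
# itertools.permutations:
import itertools

combined = {','.join(p) for p in itertools.permutations(['dedicated', 'test'], 2)}
assert combined == {'dedicated,test', 'test,dedicated'}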
def test_ProjectSettingsFileMergeKwDict_MergePlatformConfigurationNoDerivedNoTestDedicatedSection_Success(self):
"""
Test scenario:
Test the merge_kw_dict when the platform + configuration is set, and the configuration is a server but not a
test configuration
"""
test_platform_name = 'test_platform'
test_configuration_name = 'test_configuration'
test_configuration = FakeConfiguration(
settings=FakeConfigurationSettings(settings_name=test_configuration_name),
is_test=False,
is_server=True)
fake_context = FakeContext()
fake_platform_settings = FakePlatformSettings(platform_name=test_platform_name)
def _mock_get_platform_settings(platform):
self.assertEqual(platform, test_platform_name)
return fake_platform_settings
fake_context.get_platform_settings = _mock_get_platform_settings
test_dict = {}
test_settings = self.createSimpleSettings(fake_context=fake_context,
test_dict=test_dict)
sections_merged = set()
def _mock_merge_kw_section(section, target, merge_kw):
sections_merged.add(section)
test_settings.merge_kw_section = _mock_merge_kw_section
test_merge_kw = {}
test_settings.merge_kw_dict(target='test_target',
merge_kw=test_merge_kw,
platform=test_platform_name,
configuration=test_configuration)
# Validate all the sections passed to the merge_kw_dict
self.assertIn('{}/*'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/dedicated', sections_merged)
self.assertIn('{}/*/dedicated'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/dedicated'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/dedicated,test', sections_merged)
self.assertIn('{}/*/dedicated,test'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/dedicated,test'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertIn('*/*/test,dedicated', sections_merged)
self.assertIn('{}/*/test,dedicated'.format(test_platform_name), sections_merged)
self.assertIn('{}/{}/test,dedicated'.format(test_platform_name, test_configuration_name), sections_merged)
self.assertEqual(len(sections_merged), 11)
| 43.304587 | 191 | 0.637897 | 2,345 | 23,601 | 6.03838 | 0.088699 | 0.04202 | 0.046328 | 0.055085 | 0.724929 | 0.697387 | 0.673234 | 0.662076 | 0.642514 | 0.62959 | 0 | 0.001173 | 0.277785 | 23,601 | 544 | 192 | 43.384191 | 0.829569 | 0.079403 | 0 | 0.581662 | 0 | 0 | 0.099921 | 0.014941 | 0 | 0 | 0 | 0 | 0.17765 | 1 | 0.106017 | false | 0.028653 | 0.014327 | 0.011461 | 0.17192 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16bd643a28b81f74d29d0b9a43b20d245093f663 | 12,716 | py | Python | tensorhive/config.py | roscisz/TensorHive | 4a680f47a0ee1ce366dc82ad9964e229d9749c4e | [
"Apache-2.0"
] | 129 | 2017-08-25T11:45:15.000Z | 2022-03-29T05:11:25.000Z | tensorhive/config.py | roscisz/TensorHive | 4a680f47a0ee1ce366dc82ad9964e229d9749c4e | [
"Apache-2.0"
] | 251 | 2017-07-27T10:05:58.000Z | 2022-03-02T12:46:13.000Z | tensorhive/config.py | roscisz/TensorHive | 4a680f47a0ee1ce366dc82ad9964e229d9749c4e | [
"Apache-2.0"
] | 20 | 2017-08-13T13:05:14.000Z | 2022-03-19T02:21:37.000Z | from pathlib import PosixPath
import configparser
from typing import Dict, Optional, Any, List
from inspect import cleandoc
import shutil
import tensorhive
import os
import logging
log = logging.getLogger(__name__)
class CONFIG_FILES:
# Where to copy files
# (TensorHive tries to load these by default)
config_dir = PosixPath.home() / '.config/TensorHive'
MAIN_CONFIG_PATH = str(config_dir / 'main_config.ini')
HOSTS_CONFIG_PATH = str(config_dir / 'hosts_config.ini')
MAILBOT_CONFIG_PATH = str(config_dir / 'mailbot_config.ini')
# Where to get file templates from
# (Clone file when it's not found in config directory)
tensorhive_package_dir = PosixPath(__file__).parent
MAIN_CONFIG_TEMPLATE_PATH = str(tensorhive_package_dir / 'main_config.ini')
HOSTS_CONFIG_TEMPLATE_PATH = str(tensorhive_package_dir / 'hosts_config.ini')
MAILBOT_TEMPLATE_CONFIG_PATH = str(tensorhive_package_dir / 'mailbot_config.ini')
ALEMBIC_CONFIG_PATH = str(tensorhive_package_dir / 'alembic.ini')
MIGRATIONS_CONFIG_PATH = str(tensorhive_package_dir / 'migrations')
class ConfigInitilizer:
'''Makes sure that all default config files exist'''
def __init__(self):
# 1. Check if all config files exist
all_exist = PosixPath(CONFIG_FILES.MAIN_CONFIG_PATH).exists() and \
PosixPath(CONFIG_FILES.HOSTS_CONFIG_PATH).exists() and \
PosixPath(CONFIG_FILES.MAILBOT_CONFIG_PATH).exists()
if not all_exist:
log.warning('[•] Detected missing default config file(s), recreating...')
self.recreate_default_configuration_files()
else:
log.info('[•] All configs already exist, skipping...')
def recreate_default_configuration_files(self) -> None:
try:
# 1. Create directory for storing config files
CONFIG_FILES.config_dir.mkdir(parents=True, exist_ok=True)
# 2. Clone templates safely from `tensorhive` package
self.safe_copy(src=CONFIG_FILES.MAIN_CONFIG_TEMPLATE_PATH, dst=CONFIG_FILES.MAIN_CONFIG_PATH)
self.safe_copy(src=CONFIG_FILES.HOSTS_CONFIG_TEMPLATE_PATH, dst=CONFIG_FILES.HOSTS_CONFIG_PATH)
self.safe_copy(src=CONFIG_FILES.MAILBOT_TEMPLATE_CONFIG_PATH, dst=CONFIG_FILES.MAILBOT_CONFIG_PATH)
# 3. Change config files permission
rw_owner_only = 0o600
os.chmod(CONFIG_FILES.MAIN_CONFIG_PATH, rw_owner_only)
os.chmod(CONFIG_FILES.HOSTS_CONFIG_PATH, rw_owner_only)
os.chmod(CONFIG_FILES.MAILBOT_CONFIG_PATH, rw_owner_only)
except Exception:
log.error('[✘] Unable to recreate configuration files.')
def safe_copy(self, src: str, dst: str) -> None:
'''Safe means that it won't override existing configuration'''
if PosixPath(dst).exists():
log.info('Skipping, file already exists: {}'.format(dst))
else:
shutil.copy(src, dst)
log.info('Copied {} to {}'.format(src, dst))
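# Standalone check (not part of TensorHive) of the guard safe_copy provides: a
# second run must leave a user-edited destination untouched. Sketch under the
# assumption that only the exists() check matters:
import os
import shutil
import tempfile

def _safe_copy(src, dst):
    if os.path.exists(dst):
        return False  # never override existing configuration
    shutil.copy(src, dst)
    return True

workdir = tempfile.mkdtemp()
template = os.path.join(workdir, 'template.ini')
target = os.path.join(workdir, 'main_config.ini')
with open(template, 'w') as f:
    f.write('[general]\noption = default\n')
assert _safe_copy(template, target) is True   # first run clones the template
with open(target, 'a') as f:
    f.write('user_edit = kept\n')             # user customises the copy
assert _safe_copy(template, target) is False  # second run is a no-op
with open(target) as f:
    assert 'user_edit = kept' in f.read()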
class ConfigLoader:
@staticmethod
def load(path, displayed_title=''):
config = configparser.ConfigParser(strict=False)
full_path = PosixPath(path).expanduser()
if config.read(str(full_path)):
log.info('[•] Reading {} config from {}'.format(displayed_title, full_path))
else:
log.warning('[✘] Configuration file not found ({})'.format(full_path))
log.info('Using default {} settings from config.py'.format(displayed_title))
return config
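# Every section class below leans on configparser fallbacks when the file (or an
# option) is absent; that behaviour in isolation, independent of any TensorHive
# file:
import configparser

cfg = configparser.ConfigParser(strict=False)
cfg.read_string('[api]\nport = 1234\n')
assert cfg.getint('api', 'port', fallback=1111) == 1234     # option present: file wins
assert cfg.getint('api', 'missing', fallback=1111) == 1111  # option absent: fallback
assert cfg.getboolean('no_such_section', 'x', fallback=True) is True  # missing section too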
ConfigInitilizer()
config = ConfigLoader.load(CONFIG_FILES.MAIN_CONFIG_PATH, displayed_title='main')
def display_config(cls):
'''
Displays all uppercase class atributes (class must be defined first)
Example usage: display_config(API_SERVER)
'''
print('[{class_name}]'.format(class_name=cls.__name__))
for key, value in cls.__dict__.items():
if key.isupper():
print('{} = {}'.format(key, value))
def check_env_var(name: str):
'''Makes sure that env variable is declared'''
if not os.getenv(name):
msg = cleandoc(
'''
{env} - undeclared environment variable!
Try this: `export {env}="..."`
''').format(env=name).split('\n')
log.warning(msg[0])
log.warning(msg[1])
class SSH:
section = 'ssh'
HOSTS_CONFIG_FILE = config.get(section, 'hosts_config_file', fallback=CONFIG_FILES.HOSTS_CONFIG_PATH)
TEST_ON_STARTUP = config.getboolean(section, 'test_on_startup', fallback=True)
TIMEOUT = config.getfloat(section, 'timeout', fallback=10.0)
NUM_RETRIES = config.getint(section, 'number_of_retries', fallback=1)
KEY_FILE = config.get(section, 'key_file', fallback='~/.config/TensorHive/ssh_key')
def hosts_config_to_dict(path: str) -> Dict: # type: ignore
'''Parses sections containing hostnames'''
hosts_config = ConfigLoader.load(path, displayed_title='hosts')
result = {}
for section in hosts_config.sections():
# We want to parse only sections which describe target hosts
if section == 'proxy_tunneling':
continue
hostname = section
result[hostname] = {
'user': hosts_config.get(hostname, 'user'),
'port': hosts_config.getint(hostname, 'port', fallback=22)
}
return result
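# Assumed hosts_config.ini shape consumed above; a self-contained check that the
# same parse yields user/port per host, applies the port fallback of 22, and skips
# the [proxy_tunneling] section (hostnames here are made up):
import configparser

_raw = ('[gpu-server]\nuser = alice\nport = 2222\n'
        '[cpu-server]\nuser = bob\n'
        '[proxy_tunneling]\nenabled = yes\n')
_cfg = configparser.ConfigParser(strict=False)
_cfg.read_string(_raw)
_hosts = {s: {'user': _cfg.get(s, 'user'), 'port': _cfg.getint(s, 'port', fallback=22)}
          for s in _cfg.sections() if s != 'proxy_tunneling'}
assert _hosts == {'gpu-server': {'user': 'alice', 'port': 2222},
                  'cpu-server': {'user': 'bob', 'port': 22}}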
def proxy_config_to_dict(path: str) -> Optional[Dict]: # type: ignore
'''Parses [proxy_tunneling] section'''
config = ConfigLoader.load(path, displayed_title='proxy')
section = 'proxy_tunneling'
# Check if section is present and if yes, check if tunneling is enabled
if config.has_section(section) and config.getboolean(section, 'enabled', fallback=False):
return {
'proxy_host': config.get(section, 'proxy_host'),
'proxy_user': config.get(section, 'proxy_user'),
'proxy_port': config.getint(section, 'proxy_port', fallback=22)
}
else:
return None
AVAILABLE_NODES = hosts_config_to_dict(HOSTS_CONFIG_FILE)
PROXY = proxy_config_to_dict(HOSTS_CONFIG_FILE)
class DB:
section = 'database'
default_path = '~/.config/TensorHive/database.sqlite'
def uri_for_path(path: str) -> str: # type: ignore
return 'sqlite:///{}'.format(PosixPath(path).expanduser())
SQLALCHEMY_DATABASE_URI = uri_for_path(config.get(section, 'path', fallback=default_path))
TEST_DATABASE_URI = 'sqlite://' # Use in-memory (before: sqlite:///test_database.sqlite)
class API:
section = 'api'
TITLE = config.get(section, 'title', fallback='TensorHive API')
URL_HOSTNAME = config.get(section, 'url_hostname', fallback='0.0.0.0')
URL_PREFIX = config.get(section, 'url_prefix', fallback='api')
SPEC_FILE = config.get(section, 'spec_file', fallback='api_specification.yml')
IMPL_LOCATION = config.get(section, 'impl_location', fallback='tensorhive.api.controllers')
import yaml
responses_file_path = str(PosixPath(__file__).parent / 'controllers/responses.yml')
with open(responses_file_path, 'r') as file:
RESPONSES = yaml.safe_load(file)
class APP_SERVER:
section = 'web_app.server'
BACKEND = config.get(section, 'backend', fallback='gunicorn')
HOST = config.get(section, 'host', fallback='0.0.0.0')
PORT = config.getint(section, 'port', fallback=5000)
WORKERS = config.getint(section, 'workers', fallback=4)
LOG_LEVEL = config.get(section, 'loglevel', fallback='warning')
class API_SERVER:
section = 'api.server'
BACKEND = config.get(section, 'backend', fallback='gevent')
HOST = config.get(section, 'host', fallback='0.0.0.0')
PORT = config.getint(section, 'port', fallback=1111)
DEBUG = config.getboolean(section, 'debug', fallback=False)
class MONITORING_SERVICE:
section = 'monitoring_service'
ENABLED = config.getboolean(section, 'enabled', fallback=True)
ENABLE_GPU_MONITOR = config.getboolean(section, 'enable_gpu_monitor', fallback=True)
UPDATE_INTERVAL = config.getfloat(section, 'update_interval', fallback=2.0)
class PROTECTION_SERVICE:
section = 'protection_service'
ENABLED = config.getboolean(section, 'enabled', fallback=True)
UPDATE_INTERVAL = config.getfloat(section, 'update_interval', fallback=2.0)
NOTIFY_ON_PTY = config.getboolean(section, 'notify_on_pty', fallback=True)
NOTIFY_VIA_EMAIL = config.getboolean(section, 'notify_via_email', fallback=False)
class MAILBOT:
mailbot_config = ConfigLoader.load(CONFIG_FILES.MAILBOT_CONFIG_PATH, displayed_title='mailbot')
section = 'general'
INTERVAL = mailbot_config.getfloat(section, 'interval', fallback=10.0)
MAX_EMAILS_PER_PROTECTION_INTERVAL = mailbot_config.getint(section,
'max_emails_per_protection_interval', fallback=50)
NOTIFY_INTRUDER = mailbot_config.getboolean(section, 'notify_intruder', fallback=True)
NOTIFY_ADMIN = mailbot_config.getboolean(section, 'notify_admin', fallback=False)
ADMIN_EMAIL = mailbot_config.get(section, 'admin_email', fallback=None)
section = 'smtp'
SMTP_LOGIN = mailbot_config.get(section, 'email', fallback=None)
SMTP_PASSWORD = mailbot_config.get(section, 'password', fallback=None)
SMTP_SERVER = mailbot_config.get(section, 'smtp_server', fallback=None)
SMTP_PORT = mailbot_config.getint(section, 'smtp_port', fallback=587)
section = 'template/intruder'
INTRUDER_SUBJECT = mailbot_config.get(section, 'subject')
INTRUDER_BODY_TEMPLATE = mailbot_config.get(section, 'html_body')
section = 'template/admin'
ADMIN_SUBJECT = mailbot_config.get(section, 'subject')
ADMIN_BODY_TEMPLATE = mailbot_config.get(section, 'html_body')
class USAGE_LOGGING_SERVICE:
section = 'usage_logging_service'
default_path = '~/.config/TensorHive/logs/'
def full_path(path: str) -> str: # type: ignore
return str(PosixPath(path).expanduser())
ENABLED = config.getboolean(section, 'enabled', fallback=True)
UPDATE_INTERVAL = config.getfloat(section, 'update_interval', fallback=2.0)
LOG_DIR = full_path(config.get(section, 'log_dir', fallback=default_path))
LOG_CLEANUP_ACTION = config.getint(section, 'log_cleanup_action', fallback=2)
class JOB_SCHEDULING_SERVICE:
section = 'job_scheduling_service'
ENABLED = config.getboolean(section, 'enabled', fallback=True)
UPDATE_INTERVAL = config.getfloat(section, 'update_interval', fallback=30.0)
STOP_TERMINATION_ATTEMPTS_AFTER = config.getfloat(section, 'stop_termination_attempts_after_mins', fallback=5.0)
SCHEDULE_QUEUED_JOBS_WHEN_FREE_MINS = config.getint(section, "schedule_queued_jobs_when_free_mins", fallback=30)
class AUTH:
from datetime import timedelta
section = 'auth'
def config_get_parsed(option: str, fallback: Any) -> List[str]: # type: ignore
'''
Parses value for option from string to a valid python list.
Fallback value is returned when anything goes wrong (e.g. option or value not present)
Example .ini file, function called with arguments: option='some_option', fallback=None
[some_section]
some_option = ['foo', 'bar']
Will return:
['foo', 'bar']
'''
import ast
try:
raw_arguments = config.get('auth', option)
parsed_arguments = ast.literal_eval(raw_arguments)
return parsed_arguments
except (configparser.Error, ValueError):
log.warning('Parsing [auth] config section failed for option "{}", using fallback value: {}'.format(
option, fallback))
return fallback
FLASK_JWT = {
'SECRET_KEY': config.get(section, 'secrect_key', fallback='jwt-some-secret'),
'JWT_BLACKLIST_ENABLED': config.getboolean(section, 'jwt_blacklist_enabled', fallback=True),
'JWT_BLACKLIST_TOKEN_CHECKS': config_get_parsed('jwt_blacklist_token_checks', fallback=['access', 'refresh']),
'BUNDLE_ERRORS': config.getboolean(section, 'bundle_errors', fallback=True),
'JWT_ACCESS_TOKEN_EXPIRES': timedelta(minutes=config.getint(section, 'jwt_access_token_expires_minutes',
fallback=1)),
'JWT_REFRESH_TOKEN_EXPIRES': timedelta(days=config.getint(section, 'jwt_refresh_token_expires_days',
fallback=1)),
'JWT_TOKEN_LOCATION': config_get_parsed('jwt_token_location', fallback=['headers'])
}
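# config_get_parsed above relies on ast.literal_eval to turn the raw ini string
# into a Python list; a standalone illustration of what it accepts and rejects:
import ast

assert ast.literal_eval("['access', 'refresh']") == ['access', 'refresh']
rejected = False
try:
    ast.literal_eval("__import__('os').getcwd()")  # code is rejected, never executed
except ValueError:
    rejected = True
assert rejected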
| 42.959459 | 118 | 0.681268 | 1,527 | 12,716 | 5.429601 | 0.212181 | 0.032565 | 0.048245 | 0.022193 | 0.267157 | 0.199132 | 0.138584 | 0.101435 | 0.075624 | 0.066216 | 0 | 0.006244 | 0.206511 | 12,716 | 295 | 119 | 43.105085 | 0.814965 | 0.094291 | 0 | 0.087805 | 0 | 0 | 0.172567 | 0.045692 | 0 | 0 | 0 | 0 | 0 | 1 | 0.053659 | false | 0.004878 | 0.058537 | 0.009756 | 0.609756 | 0.009756 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
16c7d2d61e641808d594577e77047ea93c4d6c86 | 8,007 | py | Python | software/Opal/spud/diamond/build/lib.linux-x86_64-2.7/diamond/dialogs.py | msc-acse/acse-9-independent-research-project-Wade003 | cfcba990d52ccf535171cf54c0a91b184db6f276 | [
"MIT"
] | 2 | 2020-05-11T02:39:46.000Z | 2020-05-11T03:08:38.000Z | software/multifluids_icferst/libspud/diamond/build/lib.linux-x86_64-2.7/diamond/dialogs.py | msc-acse/acse-9-independent-research-project-Wade003 | cfcba990d52ccf535171cf54c0a91b184db6f276 | [
"MIT"
] | null | null | null | software/multifluids_icferst/libspud/diamond/build/lib.linux-x86_64-2.7/diamond/dialogs.py | msc-acse/acse-9-independent-research-project-Wade003 | cfcba990d52ccf535171cf54c0a91b184db6f276 | [
"MIT"
] | 2 | 2020-05-21T22:50:19.000Z | 2020-10-28T17:16:31.000Z | #!/usr/bin/env python
# This file is part of Diamond.
#
# Diamond is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Diamond is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Diamond. If not, see <http://www.gnu.org/licenses/>.
import os
import sys
import traceback
import gtk
import pygtkconsole
def prompt(parent, message, type = gtk.MESSAGE_QUESTION, has_cancel = False):
"""
Display a simple Yes / No dialog. Returns one of gtk.RESPONSE_{YES,NO,CANCEL}.
"""
prompt_dialog = gtk.MessageDialog(parent, 0, type, gtk.BUTTONS_NONE, message)
prompt_dialog.add_buttons(gtk.STOCK_YES, gtk.RESPONSE_YES, gtk.STOCK_NO, gtk.RESPONSE_NO)
if has_cancel:
prompt_dialog.add_buttons(gtk.STOCK_CANCEL, gtk.RESPONSE_CANCEL)
prompt_dialog.connect("response", prompt_response)
prompt_dialog.run()
return prompt_response.response
def long_message(parent, message):
"""
Display a message prompt, with the message contained within a scrolled window.
"""
message_dialog = gtk.Dialog(parent = parent, buttons = (gtk.STOCK_OK, gtk.RESPONSE_ACCEPT))
message_dialog.set_default_size(400, 300)
message_dialog.connect("response", close_dialog)
scrolled_window = gtk.ScrolledWindow()
message_dialog.vbox.add(scrolled_window)
scrolled_window.show()
scrolled_window.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_ALWAYS)
text_view = gtk.TextView()
scrolled_window.add(text_view)
text_view.show()
text_view.get_buffer().set_text(message)
text_view.set_cursor_visible(False)
text_view.set_property("editable", False)
text_view.set_property("height-request", 180)
text_view.set_property("width-request", 240)
message_dialog.run()
return
def error(parent, message):
"""
Display an error message.
"""
error_dialog = gtk.MessageDialog(parent, 0, gtk.MESSAGE_WARNING, gtk.BUTTONS_OK, message)
error_dialog.connect("response", close_dialog)
error_dialog.run()
return
def error_tb(parent, message):
"""
Display an error message, together with the last traceback.
"""
tb = traceback.format_exception(*sys.exc_info())
tb_msg = "".join(tb)
long_message(parent, tb_msg + "\n" + message)
return
def get_filename(title, action, filter_names_and_patterns = {}, folder_uri = None):
"""
Utility function to get a filename.
"""
if action == gtk.FILE_CHOOSER_ACTION_SAVE:
buttons=(gtk.STOCK_CANCEL,gtk.RESPONSE_CANCEL,gtk.STOCK_SAVE,gtk.RESPONSE_OK)
elif action == gtk.FILE_CHOOSER_ACTION_CREATE_FOLDER:
buttons=(gtk.STOCK_CANCEL,gtk.RESPONSE_CANCEL,gtk.STOCK_NEW,gtk.RESPONSE_OK)
else:
buttons=(gtk.STOCK_CANCEL,gtk.RESPONSE_CANCEL,gtk.STOCK_OPEN,gtk.RESPONSE_OK)
filew = gtk.FileChooserDialog(title=title, action=action, buttons=buttons)
filew.set_default_response(gtk.RESPONSE_OK)
if folder_uri is not None:
filew.set_current_folder_uri("file://" + os.path.abspath(folder_uri))
for filtername in filter_names_and_patterns:
filter = gtk.FileFilter()
filter.set_name(filtername)
filter.add_pattern(filter_names_and_patterns[filtername])
filew.add_filter(filter)
allfilter = gtk.FileFilter()
allfilter.set_name("All known files")
for filtername in filter_names_and_patterns:
allfilter.add_pattern(filter_names_and_patterns[filtername])
filew.add_filter(allfilter)
filter = gtk.FileFilter()
filter.set_name("All files")
filter.add_pattern("*")
filew.add_filter(filter)
result = filew.run()
if result == gtk.RESPONSE_OK:
filename = filew.get_filename()
filtername = filew.get_filter().get_name()
filew.destroy()
return filename
else:
filew.destroy()
return None
def console(parent, locals = None):
"""
Launch a python console.
"""
console_dialog = gtk.Dialog(parent = parent, buttons = (gtk.STOCK_QUIT, gtk.RESPONSE_ACCEPT))
console_dialog.set_default_size(400, 300)
console_dialog.connect("response", close_dialog)
stdout = sys.stdout
stderr = sys.stderr
console_widget = pygtkconsole.GTKInterpreterConsole(locals)
console_dialog.vbox.add(console_widget)
console_widget.show()
console_dialog.run()
sys.stdout = stdout
sys.stderr = stderr
return
def prompt_response(dialog, response_id):
"""
Signal handler for dialog response signals. Stores the dialog response in the
function namespace, to allow response return in other functions.
"""
if response_id == gtk.RESPONSE_DELETE_EVENT:
response_id = gtk.RESPONSE_CANCEL
prompt_response.response = response_id
close_dialog(dialog, response_id)
return
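# prompt_response stashes the chosen response on the function object itself so the
# caller can read it back after dialog.run(); a plain-Python sketch of that
# attribute-on-function pattern:
def _handler(response_id):
    _handler.response = response_id  # functions are objects: attributes persist

_handler(42)
assert _handler.response == 42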
def close_dialog(dialog, response_id = None):
"""
Signal handler for dialog response or destroy signals. Closes the dialog.
"""
dialog.destroy()
return
def radio_dialog(title, message, choices, logo):
r = RadioDialog(title, message, choices, logo)
return r.data
def message_box(window, title, message):
dialog = gtk.MessageDialog(window, 0, gtk.MESSAGE_INFO, gtk.BUTTONS_OK, message)
dialog.set_title(title)
dialog.connect("response", close_dialog)
dialog.run()
class RadioDialog:
def __init__(self, title, message, choices, logo):
self.window = gtk.Window(gtk.WINDOW_TOPLEVEL)
self.window.connect("delete_event", self.cleanup)
self.window.connect("key_press_event", self.key_press)
self.window.set_title(title)
self.window.set_position(gtk.WIN_POS_CENTER)
if logo is not None:
self.window.set_icon_from_file(logo)
self.window.show()
#swindow = gtk.ScrolledWindow()
#swindow.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
#self.window.add(swindow)
#swindow.show()
main_box = gtk.VBox(False, 0)
self.window.add(main_box)
main_box.show()
if logo is not None:
image = gtk.Image()
image.set_from_file(logo)
main_box.pack_start(image, True, True, 0)
image.show()
label = gtk.Label(message)
main_box.add(label)
label.show()
radio_box = gtk.VBox(False, 10)
main_box.pack_start(radio_box, True, True, 0)
radio_box.show()
separator = gtk.HSeparator()
main_box.pack_start(separator, False, True, 0)
separator.show()
close = gtk.Button(stock=gtk.STOCK_OK)
close.connect("clicked", self.cleanup)
main_box.pack_start(close, False, False, 0)
close.show()
prev_radio = None
for choice in choices:
radio = gtk.RadioButton(prev_radio, choice)
radio.connect("toggled", self.radio_callback, choice)
radio_box.pack_start(radio, False, False, 0)
radio.show()
if prev_radio is None:
radio.set_active(True)
prev_radio = radio
self.data = choices[0]
gtk.main()
def cleanup(self, widget, data=None):
self.window.destroy()
gtk.main_quit()
def key_press(self, widget, event):
if event.keyval == gtk.keysyms.Return:
self.cleanup(None)
def radio_callback(self, widget, data):
self.data = data
class GoToDialog:
def __init__(self, parent):
self.goto_gui = gtk.glade.XML(parent.gladefile, root="GoToDialog")
self.dialog_box = self.goto_gui.get_widget("GoToDialog")
self.dialog_box.set_modal(True)
def run(self):
signals = {"goto_activate": self.on_goto_activate,
"cancel_activate": self.on_cancel_activate}
self.goto_gui.signal_autoconnect(signals)
self.dialog_box.show()
return ""
def on_goto_activate(self, widget=None):
print "goto"
def on_cancel_activate(self, widget=None):
print "cancel"
"""
Django settings for bobjiang project.

Generated by 'django-admin startproject' using Django 2.0.6.

For more information on this file, see
https://docs.djangoproject.com/en/2.0/topics/settings/

For the full list of settings and their values, see
https://docs.djangoproject.com/en/2.0/ref/settings/
"""
import os
import json

# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

with open(os.path.join(BASE_DIR, "store.json"), "r") as store_file:
    STORED = json.load(store_file)

# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/2.0/howto/deployment/checklist/

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = STORED['secret_key']

# SECURITY WARNING: don't run with debug turned on in production!
# DEBUG = True
DEBUG = False

RECORD_VISITOR = True
# RECORD_VISITOR = False

ALLOWED_HOSTS = ['*',]

APPEND_SLASH = True

# Application definition
INSTALLED_APPS = [
    'haystack',
    'blog.apps.BlogConfig',
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'main',
    'comments',
    'ckeditor',
    'ckeditor_uploader',
    'tool',
    'accounting',
    # 'xadmin',
    # 'crispy_forms',
]

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
]

ROOT_URLCONF = 'bobjiang.urls'

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [os.path.join(BASE_DIR, 'templates')],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
                'bobjiang.context_processors.device',
            ],
        },
    },
]

WSGI_APPLICATION = 'bobjiang.wsgi.application'

# Database
# https://docs.djangoproject.com/en/2.0/ref/settings/#databases
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': STORED['db_name'],
        'USER': STORED['db_user'],
        'PASSWORD': STORED['db_pw'],
        'HOST': '127.0.0.1',
        'PORT': 3306,
        'OPTIONS': {
            'autocommit': True,
        },
    }
}

# Password validation
# https://docs.djangoproject.com/en/2.0/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]

# Internationalization
# https://docs.djangoproject.com/en/2.0/topics/i18n/
# LANGUAGE_CODE = 'en-us'
LANGUAGE_CODE = 'zh-hans'

TIME_ZONE = 'Asia/Shanghai'

USE_I18N = True

USE_L10N = True

USE_TZ = False

# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.0/howto/static-files/
STATIC_URL = '/static/'
# STATIC_ROOT = os.path.join(BASE_DIR, 'static')
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, "static"),
]
# STATIC_ROOT = '/home/bob/djproject/bobjiang/blog/static/'

MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
MEDIA_URL = '/media/'

CKEDITOR_UPLOAD_PATH = 'upload/'
CKEDITOR_IMAGE_BACKEND = 'pillow'
CKEDITOR_BROWSE_SHOW_DIRS = True
CKEDITOR_RESTRICT_BY_USER = True
CKEDITOR_CONFIGS = {
    'default': {
        'toolbar': (
            ['div', 'Source', '-', 'Save', 'NewPage', 'Preview', '-', 'Templates'],
            ['Cut', 'Copy', 'Paste', 'PasteText', 'PasteFromWord', '-', 'Print', 'SpellChecker', 'Scayt'],
            ['Undo', 'Redo', '-', 'Find', 'Replace', '-', 'SelectAll', 'RemoveFormat', '-', 'Maximize', 'ShowBlocks', '-', 'CodeSnippet', 'Subscript', 'Superscript'],
            ['Form', 'Checkbox', 'Radio', 'TextField', 'Textarea', 'Select', 'Button', 'ImageButton',
             'HiddenField'],
            ['Bold', 'Italic', 'Underline', 'Strike', '-'],
            ['NumberedList', 'BulletedList', '-', 'Outdent', 'Indent', 'Blockquote'],
            ['JustifyLeft', 'JustifyCenter', 'JustifyRight', 'JustifyBlock'],
            ['Link', 'Unlink', 'Anchor'],
            ['Image', 'Flash', 'Table', 'HorizontalRule', 'Smiley', 'SpecialChar', 'PageBreak'],
            ['Styles', 'Format', 'Font', 'FontSize'],
            ['TextColor', 'BGColor'],
        ),
        'extraPlugins': 'codesnippet',
    }
}

# haystack
HAYSTACK_CONNECTIONS = {
    'default': {
        'ENGINE': 'blog.whoosh_cn_backend.WhooshEngine',
        'PATH': os.path.join(BASE_DIR, 'whoosh_index'),
    },
}
HAYSTACK_SEARCH_RESULTS_PER_PAGE = 5
HAYSTACK_SIGNAL_PROCESSOR = 'haystack.signals.RealtimeSignalProcessor'
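The settings module above reads credentials from a `store.json` file next to the project root at import time. A minimal sketch of the expected layout, with the keys inferred from the `STORED[...]` lookups in this file and all values being placeholders:

```python
import json
import os
import tempfile

# Keys inferred from the STORED[...] lookups in settings.py; values are dummies.
store = {
    "secret_key": "replace-with-a-long-random-string",
    "db_name": "blogdb",
    "db_user": "bloguser",
    "db_pw": "blogpassword",
}

base_dir = tempfile.mkdtemp()
with open(os.path.join(base_dir, "store.json"), "w") as store_file:
    json.dump(store, store_file, indent=4)

# Reading it back mirrors what settings.py does at import time.
with open(os.path.join(base_dir, "store.json"), "r") as store_file:
    STORED = json.load(store_file)

SECRET_KEY = STORED["secret_key"]
```

Keeping these values out of version control (and out of `settings.py` itself) is the point of the indirection.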
from django.db import models
from django.contrib.auth.models import User
from PIL import Image


class Profile(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    image = models.ImageField(default='default.jpg',
                              upload_to='profile_pic/')

    def __str__(self):
        return f'{self.user.username} Profile'

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        # PIL's Image is a module, not a constructor: use Image.open(),
        # and reference the model's actual field name ("image").
        img = Image.open(self.image.path)
        if img.height > 300 and img.width > 300:
            output_size = (300, 300)
            img.thumbnail(output_size)
            img.save(self.image.path)
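Pillow's `thumbnail` call used in `Profile.save` shrinks the image to fit inside the bound while preserving aspect ratio and never upscaling. The arithmetic it applies can be sketched in plain Python (a simplified model of the real call, not Pillow's implementation):

```python
def fit_within(size, bound):
    """Scale (w, h) down, preserving aspect ratio, to fit inside bound."""
    w, h = size
    bw, bh = bound
    scale = min(bw / w, bh / h, 1.0)  # never upscale
    return (round(w * scale), round(h * scale))


print(fit_within((600, 400), (300, 300)))  # → (300, 200)
print(fit_within((200, 100), (300, 300)))  # → (200, 100), already small enough
```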
# -*- coding: utf-8 -*-
#
from conans import python_requires
import conans.tools as tools
import os

base = python_requires("Eigen3ToPython/latest@multi-contact/dev")


class MCRTCDataConan(base.Eigen3ToPythonConan):
    name = "mc_rtc_data"
    version = "1.0.4"
    description = "Environments/Robots description for mc_rtc"
    topics = ("robotics", "data")
    url = "https://github.com/jrl-umi3218/mc_rtc_data"
    homepage = "https://github.com/jrl-umi3218/mc_rtc_data"
    author = "Pierre Gergondet <pierre.gergondet@gmail.com>"
    license = "BSD-2-Clause"
    exports = ["LICENSE"]
    exports_sources = ["CMakeLists.txt", "conan/CMakeLists.txt", "cmake/*", "jvrc_description/*", "mc_env_description/*", "mc_int_obj_description/*", "mc_rtc_data/*"]
    generators = "cmake"
    settings = "os", "arch"
    requires = ()

    def config_options(self):
        del self.options.python2_version
        del self.options.python3_version

    def package_id(self):
        pass

    def package(self):
        cmake = self._configure_cmake()
        cmake.install()
        tools.rmdir(os.path.join(self.package_folder, "lib", "pkgconfig"))
        for f in [".catkin", "_setup_util.py", "env.sh", "setup.bash", "local_setup.bash", "setup.sh", "local_setup.sh", "setup.zsh", "local_setup.zsh", ".rosinstall"]:
            p = os.path.join(self.package_folder, f)
            if os.path.exists(p):
                os.remove(p)
# Generated by Django 2.1.5 on 2019-02-18 22:51
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('hyperion', '0005_auto_20190212_2116'),
    ]

    operations = [
        migrations.RenameField(
            model_name='post',
            old_name='visibleTo',
            new_name='visible_to',
        ),
        migrations.AddField(
            model_name='post',
            name='content_type',
            field=models.CharField(choices=[('1', 'text/plain'), ('2', 'text/markdown'), ('3', 'image/png;base64'), ('4', 'image/jpeg;base64'), ('5', 'application/base64')], default='1', max_length=1),
        ),
        migrations.AddField(
            model_name='post',
            name='visibility',
            field=models.CharField(choices=[('1', 'PUBLIC'), ('2', 'FOAF'), ('3', 'FRIENDS'), ('4', 'PRIVATE'), ('5', 'SERVERONLY')], default='1', max_length=1),
        ),
        migrations.AlterField(
            model_name='comment',
            name='author',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='comments', to='hyperion.UserProfile'),
        ),
    ]
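The single-character codes these `choices` fields store map back to their labels through a plain dict, which is what Django's generated `get_visibility_display()` accessor does under the hood. For illustration:

```python
# Same pairs as the 'visibility' field's choices in the migration above.
VISIBILITY_CHOICES = [('1', 'PUBLIC'), ('2', 'FOAF'), ('3', 'FRIENDS'),
                      ('4', 'PRIVATE'), ('5', 'SERVERONLY')]

# The stored value is the code; the label is recovered by lookup.
labels = dict(VISIBILITY_CHOICES)
print(labels['1'])  # → 'PUBLIC'
```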
import torch
import json
import numpy as np
from torch.autograd import Variable
import gzip
import yaml
from re import split
from matplotlib import pyplot


def showImg(im):
    pyplot.imshow(im)
    pyplot.show()


def myOpen(fname, mode):
    return open(fname, mode, encoding="utf-8")


def readFile(fname):
    opener, mode = (gzip.open, 'rt') if fname[-3:] == '.gz' else (open, 'r')
    with opener(fname, mode) as f:
        return f.read()


def readLines(fname):
    return split('[\r\n]', readFile(fname))


def readJson(fname):
    with myOpen(fname, 'r') as f:
        return json.load(f)


def writeFile(fname, contents):
    with myOpen(fname, 'w') as f:
        f.write(contents)


def writeJson(fname, data):
    with myOpen(fname, 'w') as outfile:
        json.dump(data, outfile)


def readYaml(fname):
    with myOpen(fname, 'r') as fp:
        # yaml.load without an explicit Loader is deprecated and unsafe;
        # safe_load is sufficient for a plain config file.
        return yaml.safe_load(fp)


config = readYaml('./config.yaml')


class averager(object):
    """Compute average for `torch.Variable` and `torch.Tensor`. """

    def __init__(self):
        self.reset()

    def add(self, v):
        if isinstance(v, Variable):
            count = v.data.numel()
            v = v.data.sum()
        elif isinstance(v, torch.Tensor):
            count = v.numel()
            v = v.sum()
        self.n_count += count
        self.sum += v

    def reset(self):
        self.n_count = 0
        self.sum = 0

    def val(self):
        res = 0
        if self.n_count != 0:
            res = self.sum / float(self.n_count)
        return res


def loadTrainedModel(model, opt):
    """Load a pretrained model into given model"""
    print('loading pretrained model from %s' % opt.crnn)
    if opt.cuda:
        stateDict = torch.load(opt.crnn)
    else:
        stateDict = torch.load(opt.crnn, map_location={'cuda:0': 'cpu'})
    # Handle the case of some old torch versions, which save keys as module.<xyz>
    if list(stateDict.keys())[0][:7] == 'module.':
        for key in list(stateDict.keys()):
            stateDict[key[7:]] = stateDict[key]
            del stateDict[key]
    model.load_state_dict(stateDict)
    print('Completed loading pre trained model')
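The accumulate/reset pattern of the `averager` class is framework-independent; stripped of the torch-specific type handling it reduces to a running sum and element count. A minimal plain-number sketch of the same contract:

```python
class RunningAverage:
    """Same accumulate/reset/val contract as averager, for plain numbers."""

    def __init__(self):
        self.reset()

    def add(self, value, count=1):
        self.n_count += count
        self.sum += value

    def reset(self):
        self.n_count = 0
        self.sum = 0

    def val(self):
        return self.sum / float(self.n_count) if self.n_count else 0


avg = RunningAverage()
avg.add(3.0)
avg.add(5.0)
print(avg.val())  # → 4.0
```

In the torch version, `add` extracts the element count and summed value from the tensor so that batches of unequal size are weighted correctly.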
#!/usr/bin/env python
"""
Module for painting output on and obtaining input from a text-based terminal
window using the curses library.
"""
import curses
import textwrap
def display_string(screen, a_string, output_line):
# Paints a string on a text-based terminal window.
_, width = screen.getmaxyx()
try:
screen.addstr(output_line, 0, textwrap.fill(a_string, width - 1))
except curses.error:
screen.addstr(0, 0, textwrap.fill(
'Terminal window too small for output! Please resize. ', width - 1))
return output_line
def display_list_items(screen, a_list, output_line):
# Paints each item of a list on a text-based terminal window.
for item in a_list:
output_line = display_string(screen, '%s' % (item), output_line)
output_line += 1
return output_line
def display_formatted_dict(screen, dct, output_line):
# Paints each key, value pair of a dict on a text-based terminal window.
for key, value in dct.items():
if isinstance(value, int):
value = '{:,}'.format(value)
formatted_dict = '%s: %s' % (key, value)
output_line = display_string(screen, formatted_dict, output_line)
output_line += 1
return output_line
def display_string_with_prompt(screen, first_line_num, a_string, prompt):
"""Paints two strings and accepts input.
Paints two strings on a text-based terminal window. The latter of the two
strings serves as the prompt for the user to enter input.
Args:
screen: A window object that represents the text-based terminal window.
first_line_num: An integer that represents the location along the y-axis
of the terminal window where the first character of the first string
is painted.
a_string: The first string that is painted on the terminal window.
prompt: A string that serves as a prompt for the user to enter input.
Returns:
A string that the user enters in as input.
"""
screen.clear()
output_line = first_line_num
output_line = display_string(screen, a_string, output_line)
output_line += 3
output_line = display_string(screen, prompt, output_line)
screen.refresh()
return screen.getstr(output_line, len(prompt) + 1)
def display_list_items_with_prompt(screen, first_line_num, a_string, a_list,
prompt):
"""Paints a string, each item of a list, and accepts input.
Paints a string, each item of a list, and another string on a text-based
terminal window. Each item of the list is painted on its own line.
The second string serves as a prompt for the user to enter input.
Args:
screen: A window object that represents the text-based terminal window.
first_line_num: An integer that represents the location along the y-axis
of the terminal window where the first character of the first string
is painted.
a_string: The first string that is painted on the terminal window.
a_list: A list whose items are painted on each line of the terminal
window.
prompt: A string that serves as a prompt for the user to enter input.
Returns:
A string that the user enters in as input.
"""
screen.clear()
output_line = first_line_num
output_line = display_string(screen, a_string, output_line)
output_line += 2
output_line = display_list_items(screen, a_list, output_line)
output_line += 1
output_line = display_string(screen, prompt, output_line)
screen.refresh()
return screen.getstr(output_line, len(prompt) + 1)
def display_formatted_dicts_with_prompt(screen, first_line_num, a_string,
list_of_dicts, prompt):
"""Paints a string, each item of each dict in a list, and accepts input.
Paints a string, each item of each dict in a list, and another string on a
text-based terminal window. Each key, value pair of each dict is painted on
its own line with the key and value separated by a colon. The second string
serves as a prompt for the user to enter input.
Args:
screen: A window object that represents the text-based terminal window.
first_line_num: An integer that represents the location along the y-axis
of the terminal window where the first character of the first string
is painted.
a_string: The first string that is painted on the terminal window.
list_of_dicts: A list of dictionaries whose key, value pairs are painted
on their own line of the terminal window.
prompt: A string that serves as a prompt for the user to enter input.
Returns:
A string that the user enters in as input.
"""
screen.clear()
output_line = first_line_num
output_line = display_string(screen, a_string, output_line)
output_line += 2
for dct in list_of_dicts:
output_line = display_formatted_dict(screen, dct, output_line)
output_line += 1
output_line += 1
output_line = display_string(screen, prompt, output_line)
screen.refresh()
return screen.getstr(output_line, len(prompt) + 1)
def get_user_menu_selection(screen, first_line_num, a_string, menu_items,
prompt):
"""Paints a string, a menu, and accepts input.
Paints a string, a menu, and another string on a text-based terminal window.
The menu is composed of the items in a list, and each item is assigned its
own number that represents the order in which the item appears in the menu.
The second string serves as a prompt for the user to enter a number from the
menu.
Args:
screen: A window object that represents the text-based terminal window.
first_line_num: An integer that represents the location along the y-axis
of the terminal window where the first character of the first string
is painted.
a_string: The first string that is painted on the terminal window.
menu_items: A list whose items are painted on each line of the terminal
window as menu options.
prompt: A string that serves as a prompt for the user to enter a number
from the menu.
Returns:
A string representation of the item in 'menu_items' that the user
selects.
"""
# Create a dictionary that contains the items in 'menu_items'. Each item
# is added as a value with an integer key that represents the order in which
# the item will appear in the menu.
item_key = 1
selection_items = {}
for item in menu_items:
selection_items['%s' % (item_key)] = item
item_key += 1
# Display the menu and prompt the user for a selection.
while True:
screen.clear()
output_line = first_line_num
output_line = display_string(screen, a_string, output_line)
output_line += 3
for menu_num in sorted(selection_items.iterkeys()):
item_line = '%s) %s' % (menu_num, selection_items[menu_num])
output_line = display_string(screen, item_line, output_line)
output_line += 1
output_line += 1
output_line = display_string(screen, prompt, output_line)
screen.refresh()
input = screen.getstr(output_line, len(prompt) + 1)
if input not in selection_items.keys():
continue # Force the user to enter a valid selection.
else:
return selection_items[input]
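The menu-numbering scheme in `get_user_menu_selection` (each item keyed by the string of its 1-based position) can be exercised without a terminal; a minimal sketch of just that logic:

```python
def number_menu_items(menu_items):
    """Map '1', '2', ... to the items, as get_user_menu_selection does."""
    return {str(i): item for i, item in enumerate(menu_items, start=1)}


menu = number_menu_items(['county', 'state', 'nation'])
print(sorted(menu))  # → ['1', '2', '3']
print(menu['2'])     # → 'state'

# An out-of-range reply fails the membership test, causing a re-prompt:
print('9' in menu)   # → False
```

Note that because the keys are strings, lexicographic sorting misorders menus with ten or more items (`'10'` sorts before `'2'`); the module above shares this caveat.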
import json
from tkinter import *
from tkinter import ttk
from tkinter import messagebox
from tr_data import TRData, NO_DATA_MEETS_CRITERIA
from email_text import email_body_template
from helpers import send_email

RECIPIENT = "<email_address>"  # redacted in the original; set to a real address
EXCEPTION_FILE = "tr_number_exceptions.json"


class TrEmailSender:

    def __init__(self, transport_requests: TRData):
        self.transport_requests = transport_requests
        self.exceptions = self.load_exceptions()

        # WINDOW CREATION
        self.window = Tk()
        self.window.title("Send email with import requests to TST")
        self.window.config(padx=20, pady=20)

        # TITLE LABEL
        self.title_lbl = Label(
            text="Please select TRs to be included into email: ",
        )

        # BUTTONS
        self.refresh_btn = Button(text="REFRESH", command=self.refresh)
        self.exceptions_btn = Button(text="Add to exceptions", command=self.add_to_exceptions)
        self.select_all_btn = Button(text="Select All", command=self.select_all)
        self.send_btn = Button(text="SEND", command=self.send_email)

        # list of TRs
        columns_labels = {
            'tr_number': ("TR Number", 100),
            'description': ("Description", 350),
            'tkt_type': ("Ticket Type", 80),
            'ticket_num': ("Ticket Number", 80),
            'module': ("SAP Module", 80),
            'export_datetime': ("Export Timestamp", 150),
            'owner': ("Owner", 80)
        }

        # TREE VIEW for list display
        self.tr_tree_view = ttk.Treeview(columns=tuple(columns_labels.keys()), show='headings')

        # Update columns
        for column, (label, field_length) in columns_labels.items():
            self.tr_tree_view.column(column, minwidth=80, width=field_length, anchor='w', stretch=False)
            self.tr_tree_view.heading(column, text=label)

        # insert data
        self.populate_tree_view_lines()

        # LAYOUT PLACEMENT
        self.title_lbl.grid(row=0, column=0, sticky=W)
        self.tr_tree_view.grid(row=1, column=0, rowspan=4)
        self.refresh_btn.grid(row=1, column=1, sticky=N+S+E+W, padx=2, pady=2)
        self.exceptions_btn.grid(row=2, column=1, sticky=E+W+S, padx=1, pady=2)
        self.select_all_btn.grid(row=3, column=1, sticky=E+W+N, padx=1, pady=2)
        self.send_btn.grid(row=4, column=1, sticky=N+S+E+W, padx=1, pady=2)

        # DISPLAY WINDOW
        self.window.mainloop()

    def refresh(self):
        # delete all rows in tree view
        for item in self.tr_tree_view.get_children():
            self.tr_tree_view.delete(item)

        # update with new data
        self.transport_requests.refresh()
        self.exceptions = self.load_exceptions()
        self.populate_tree_view_lines()

    def populate_tree_view_lines(self):
        all_are_in_exceptions = True
        for (tr_number, export_timestamp, owner, description, ticket_number, sap_module, ticket_type) in self.transport_requests.data:
            # check if not in exception
            if not tr_number in self.exceptions:
                year = export_timestamp[:4]
                month = export_timestamp[4:6]
                day = export_timestamp[6:8]
                time = f"{export_timestamp[8:10]}:{export_timestamp[10:12]}:{export_timestamp[12:]}"
                export_date_time = f"{day}/{month}/{year} - {time}"
                line_values = (tr_number, description, ticket_type, ticket_number, sap_module, export_date_time, owner)
                self.tr_tree_view.insert('', 'end', values=line_values)
                all_are_in_exceptions = False

        # if all TRs are in exceptions, insert only pre-defined information
        if all_are_in_exceptions:
            tr_number = NO_DATA_MEETS_CRITERIA[0][0]
            description = NO_DATA_MEETS_CRITERIA[0][3]
            no_data_information = (tr_number, description, "", "", "", "", "")
            self.tr_tree_view.insert('', 'end', values=no_data_information)

    def select_all(self):
        items = self.tr_tree_view.get_children()
        self.tr_tree_view.selection_add(items)

    def get_selected_item_ids(self):
        return self.tr_tree_view.selection()

    def send_email(self):
        # get selected lines
        selected_ids = self.get_selected_item_ids()

        # get data of each id
        if not selected_ids:
            messagebox.showinfo(
                title="Status Info",
                message="There is nothing to send.\n\nPlease refresh the page."
            )
            return None

        email_details = self.prepare_email_details(selected_ids)

        # send email (the module-level helpers.send_email, not this method)
        if send_email(**email_details):
            messagebox.showinfo(
                title="Status Info", message="Email has been sent!")
            # add trs into exceptions
            return self.add_to_exceptions()
        else:
            return None

    def prepare_email_details(self, selected_ids):
        transport_data = [self.tr_tree_view.item(id_tag, 'values') for id_tag in selected_ids]

        # prepare list of transports for email body
        html_list_of_trs = ""
        ticket_numbers = set()
        for (tr_number, description, ticket_type, ticket_number, sap_module, export_timestamp, owner) in transport_data:
            html_list_of_trs += f"<li>{tr_number} - {owner} - {description}</li>"
            ticket_numbers.add(ticket_number)

        # prepare email details
        email_details = {
            'recipient': RECIPIENT,
            'subject': f"Transport requests for: {', '.join(sorted(ticket_numbers)).rstrip(', ')}",
            'html_body': email_body_template.format(html_list_of_trs)
        }
        return email_details

    def load_exceptions(self):
        try:
            with open(file=EXCEPTION_FILE, mode='r') as file:
                exception_list = set(json.load(file)['tr_numbers'])
        except FileNotFoundError:
            with open(file=EXCEPTION_FILE, mode='w') as file:
                exception_dict = {'tr_numbers': []}
                json.dump(exception_dict, file, indent=4)
            return set()
        else:
            return exception_list

    def add_to_exceptions(self):
        selected_ids = self.get_selected_item_ids()
        if not selected_ids:
            messagebox.showinfo(
                title="Status Info",
                message="Nothing has been selected.\n\nPlease refresh the page."
            )
            return None

        transport_numbers = [self.tr_tree_view.item(id_tag, 'values')[0] for id_tag in selected_ids]

        # add TR number of selected items to exception json file
        for number in transport_numbers:
            self.exceptions.add(number)
        updated_data = {'tr_numbers': list(self.exceptions)}
        with open(file=EXCEPTION_FILE, mode='w') as file:
            json.dump(updated_data, file, indent=4)
        return self.refresh()
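The exception-file handling above (load a set from JSON, add transport numbers, dump back) round-trips through a plain list. A standalone sketch of the same flow, using a temporary directory and made-up TR numbers:

```python
import json
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "tr_number_exceptions.json")

# First run: file absent, so start with an empty set (as load_exceptions does).
try:
    with open(path) as f:
        exceptions = set(json.load(f)["tr_numbers"])
except FileNotFoundError:
    exceptions = set()

# Hypothetical TR numbers, for illustration only.
exceptions.update({"DEVK900001", "DEVK900002"})

# Sets are not JSON-serializable, so persist a list (as add_to_exceptions does).
with open(path, "w") as f:
    json.dump({"tr_numbers": sorted(exceptions)}, f, indent=4)

with open(path) as f:
    reloaded = set(json.load(f)["tr_numbers"])

print(reloaded == exceptions)  # → True
```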
#coding:utf-8
# 0. Import modules and generate the simulated data set
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import opt4_8_generateds
import opt4_8_forward

STEPS = 40000
BATCH_SIZE = 30
LEARNING_RATE_BASE = 0.001
LEARNING_RATE_DECAY = 0.999
REGULARIZER = 0.01


def backward():
    x = tf.placeholder(tf.float32, shape=(None, 2))
    y_ = tf.placeholder(tf.float32, shape=(None, 1))

    X, Y_, Y_c = opt4_8_generateds.generateds()
    y = opt4_8_forward.forward(x, REGULARIZER)

    global_step = tf.Variable(0, trainable=False)

    learning_rate = tf.train.exponential_decay(
        LEARNING_RATE_BASE,
        global_step,
        300 / BATCH_SIZE,
        LEARNING_RATE_DECAY,
        staircase=True)

    # Define the loss function: MSE plus the collected regularization losses
    loss_mse = tf.reduce_mean(tf.square(y - y_))
    loss_total = loss_mse + tf.add_n(tf.get_collection('losses'))

    # Define the back-propagation method, with regularization included
    train_step = tf.train.AdamOptimizer(learning_rate).minimize(loss_total)

    with tf.Session() as sess:
        init_op = tf.global_variables_initializer()
        sess.run(init_op)
        for i in range(STEPS):
            start = (i * BATCH_SIZE) % 300
            end = start + BATCH_SIZE
            sess.run(train_step, feed_dict={x: X[start:end], y_: Y_[start:end]})
            if i % 2000 == 0:
                loss_v = sess.run(loss_total, feed_dict={x: X, y_: Y_})
                print("After %d steps, loss is: %f" % (i, loss_v))

        xx, yy = np.mgrid[-3:3:.01, -3:3:.01]
        grid = np.c_[xx.ravel(), yy.ravel()]
        probs = sess.run(y, feed_dict={x: grid})
        probs = probs.reshape(xx.shape)

    plt.scatter(X[:, 0], X[:, 1], c=np.squeeze(Y_c))
    plt.contour(xx, yy, probs, levels=[.5])
    plt.show()


if __name__ == '__main__':
    backward()
# -*- coding: utf-8 -*-
# MIT License
#
# Copyright 2018-2020 New York University Abu Dhabi
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
"""
Tests for camel_tools.transliterate.
"""
from __future__ import absolute_import
import pytest
from camel_tools.utils.charmap import CharMapper
from camel_tools.utils.transliterate import Transliterator
# A mapper that translates lower-case English characters to a lower-case x and
# upper-case English characters to an upper-case X. This makes it easy to
# predict what the transliteration should be.
TEST_MAP = {
    u'A-Z': u'X',
    u'a-z': u'x',
}
TEST_MAPPER = CharMapper(TEST_MAP, None)
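The effect of this mapper can be reproduced without camel_tools; a hedged sketch using only the standard library, assuming the `'A-Z'`/`'a-z'` keys behave like ordinary ASCII character ranges (which is what the tests below rely on):

```python
import string

# A plain-str stand-in for CharMapper({'A-Z': 'X', 'a-z': 'x'}): every ASCII
# upper-case letter maps to 'X', every lower-case one to 'x', and everything
# else passes through untouched.
TABLE = str.maketrans(
    string.ascii_uppercase + string.ascii_lowercase,
    'X' * 26 + 'x' * 26,
)

def mock_map_string(text):
    return text.translate(TABLE)
```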
class TestTransliteratorInit(object):
    """Test class for Transliterator.__init__.
    """

    def test_init_none_mapper(self):
        """Test that init raises a TypeError when given a mapper that is None.
        """
        with pytest.raises(TypeError):
            Transliterator(None)

    def test_init_invalid_type_mapper(self):
        """Test that init raises a TypeError when given a mapper that is not a
        CharMapper instance.
        """
        with pytest.raises(TypeError):
            Transliterator({})

    def test_init_valid_mapper(self):
        """Test that init doesn't raise an error when given a valid mapper.
        """
        assert Transliterator(TEST_MAPPER)

    def test_init_none_marker(self):
        """Test that init raises a TypeError when given a marker that is None.
        """
        with pytest.raises(TypeError):
            Transliterator(TEST_MAPPER, None)

    def test_init_invalid_type_marker(self):
        """Test that init raises a TypeError when given a marker that is not a
        string.
        """
        with pytest.raises(TypeError):
            Transliterator(TEST_MAPPER, [])

    def test_init_empty_marker(self):
        """Test that init raises a ValueError when given a marker that is an
        empty string.
        """
        with pytest.raises(ValueError):
            Transliterator(TEST_MAPPER, '')

    def test_init_invalid_marker1(self):
        """Test that init raises a ValueError when given an invalid marker
        (whitespace in the middle).
        """
        with pytest.raises(ValueError):
            Transliterator(TEST_MAPPER, '@@LAT @@')

    def test_init_invalid_marker2(self):
        """Test that init raises a ValueError when given an invalid marker
        (whitespace at the end).
        """
        with pytest.raises(ValueError):
            Transliterator(TEST_MAPPER, '@@LAT@@ ')

    def test_init_invalid_marker3(self):
        """Test that init raises a ValueError when given an invalid marker
        (whitespace at the beginning).
        """
        with pytest.raises(ValueError):
            Transliterator(TEST_MAPPER, ' @@LAT@@')

    def test_init_valid_marker1(self):
        """Test that init doesn't raise an error when given a valid marker.
        """
        assert Transliterator(TEST_MAPPER, '@@LAT@@')

    def test_init_valid_marker2(self):
        """Test that init doesn't raise an error when given a valid marker.
        """
        assert Transliterator(TEST_MAPPER, u'@@LAT@@')
class TestTransliteratorTranslate(object):
    """Test class for Transliterator.translate.
    """

    def test_trans_empty(self):
        """Test that transliterating an empty string returns an empty string.
        """
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(u'') == u''

    def test_trans_single_no_markers(self):
        """Test that a single word with no markers gets transliterated.
        """
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(u'Hello') == u'Xxxxx'

    def test_trans_single_with_markers(self):
        """Test that a single word with markers does not get transliterated.
        """
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(u'@@Hello') == u'@@Hello'

    def test_trans_single_strip(self):
        """Test that a single word with markers does not get transliterated
        but markers do get stripped when strip_markers is set to True.
        """
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(u'@@Hello', True) == u'Hello'

    def test_trans_single_ignore(self):
        """Test that a single word with markers gets transliterated when
        ignore_markers is set to True.
        """
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(u'@@Hello', False, True) == u'@@Xxxxx'

    def test_trans_single_ignore_strip(self):
        """Test that a single word with markers gets transliterated with
        markers stripped when both strip_markers and ignore_markers are set to
        True.
        """
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(u'@@Hello', True, True) == u'Xxxxx'

    def test_trans_sent_no_markers(self):
        """Test that a sentence with no markers gets transliterated.
        """
        sent_orig = u'Hello World, this is a sentence!'
        sent_out = u'Xxxxx Xxxxx, xxxx xx x xxxxxxxx!'
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(sent_orig) == sent_out

    def test_trans_sent_with_markers(self):
        """Test that tokens with markers in a sentence do not get
        transliterated.
        """
        sent_orig = u'Hello @@World, this is a @@sentence!'
        sent_out = u'Xxxxx @@World, xxxx xx x @@sentence!'
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(sent_orig) == sent_out

    def test_trans_sent_strip(self):
        """Test that tokens with markers in a sentence do not get
        transliterated but markers do get stripped when strip_markers is set
        to True.
        """
        sent_orig = u'Hello @@World, this is a @@sentence!'
        sent_out = u'Xxxxx World, xxxx xx x sentence!'
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(sent_orig, True) == sent_out

    def test_trans_sent_ignore(self):
        """Test that tokens with markers in a sentence get transliterated
        when ignore_markers is set to True.
        """
        sent_orig = u'Hello @@World, this is a @@sentence!'
        sent_out = u'Xxxxx @@Xxxxx, xxxx xx x @@xxxxxxxx!'
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(sent_orig, False, True) == sent_out

    def test_trans_sent_ignore_strip(self):
        """Test that tokens with markers in a sentence get transliterated with
        markers stripped when both strip_markers and ignore_markers are set to
        True.
        """
        sent_orig = u'Hello @@World, this is a @@sentence!'
        sent_out = u'Xxxxx Xxxxx, xxxx xx x xxxxxxxx!'
        trans = Transliterator(TEST_MAPPER, '@@')
        assert trans.transliterate(sent_orig, True, True) == sent_out
| 33.700422 | 79 | 0.662076 | 1,035 | 7,987 | 4.97971 | 0.192271 | 0.02988 | 0.051222 | 0.034148 | 0.689756 | 0.650175 | 0.598564 | 0.576251 | 0.520567 | 0.510477 | 0 | 0.002331 | 0.248028 | 7,987 | 236 | 80 | 33.84322 | 0.855811 | 0.428446 | 0 | 0.357143 | 0 | 0 | 0.113724 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.261905 | false | 0 | 0.047619 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16d397fdfd404f351b1fb42cfa6cff5538a49320 | 790 | py | Python | 00-Aulas/Aula007_2.py | AmandaRH07/Python_Entra21 | 4084962508f1597c0498d8b329e0f45e2ac55302 | [
"MIT"
] | null | null | null | 00-Aulas/Aula007_2.py | AmandaRH07/Python_Entra21 | 4084962508f1597c0498d8b329e0f45e2ac55302 | [
"MIT"
] | null | null | null | 00-Aulas/Aula007_2.py | AmandaRH07/Python_Entra21 | 4084962508f1597c0498d8b329e0f45e2ac55302 | [
"MIT"
] | null | null | null | # Functions

cabecalho = "SISTEMA DE CADASTRO DE FUNCIONARIO\n\n\n"
rodape = "\n\n\n Obrigada pela preferencia"


def imprimir_tela(conteudo):
    print(cabecalho)
    # print(opcao_menu)
    print(conteudo)
    print(rodape)


def ler_opcoes():
    opcao = int(input("Insira a opção: "))
    return opcao


def carregar_opcoes(opcao):
    if opcao == 1:
        imprimir_tela("A opção escolhida foi 'Cadastrar funcionário'")
    elif opcao == 2:
        imprimir_tela("A opção escolhida foi 'Listar funcionários'")
    elif opcao == 3:
        imprimir_tela("A opção escolhida foi 'Editar funcionário'")
    elif opcao == 4:
        imprimir_tela("A opção escolhida foi 'Deletar funcionário'")
    elif opcao == 5:
        imprimir_tela("A opção escolhida foi 'Sair'")
    else:
        pass
| 27.241379 | 70 | 0.655696 | 101 | 790 | 5.039604 | 0.445545 | 0.141454 | 0.127701 | 0.176817 | 0.294695 | 0.294695 | 0 | 0 | 0 | 0 | 0 | 0.008292 | 0.236709 | 790 | 29 | 71 | 27.241379 | 0.835821 | 0.03038 | 0 | 0 | 0 | 0 | 0.378272 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.136364 | false | 0.045455 | 0 | 0 | 0.181818 | 0.136364 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16d55202daea41a875b382f2393a76063d29376b | 4,865 | py | Python | lib/django-0.96/django/views/generic/list_detail.py | MiCHiLU/google_appengine_sdk | 3da9f20d7e65e26c4938d2c4054bc4f39cbc5522 | [
"Apache-2.0"
] | 790 | 2015-01-03T02:13:39.000Z | 2020-05-10T19:53:57.000Z | AppServer/lib/django-0.96/django/views/generic/list_detail.py | nlake44/appscale | 6944af660ca4cb772c9b6c2332ab28e5ef4d849f | [
"Apache-2.0"
] | 1,361 | 2015-01-08T23:09:40.000Z | 2020-04-14T00:03:04.000Z | AppServer/lib/django-0.96/django/views/generic/list_detail.py | nlake44/appscale | 6944af660ca4cb772c9b6c2332ab28e5ef4d849f | [
"Apache-2.0"
] | 155 | 2015-01-08T22:59:31.000Z | 2020-04-08T08:01:53.000Z | from django.template import loader, RequestContext
from django.http import Http404, HttpResponse
from django.core.xheaders import populate_xheaders
from django.core.paginator import ObjectPaginator, InvalidPage
from django.core.exceptions import ObjectDoesNotExist
def object_list(request, queryset, paginate_by=None, page=None,
        allow_empty=False, template_name=None, template_loader=loader,
        extra_context=None, context_processors=None, template_object_name='object',
        mimetype=None):
    """
    Generic list of objects.

    Templates: ``<app_label>/<model_name>_list.html``
    Context:
        object_list
            list of objects
        is_paginated
            are the results paginated?
        results_per_page
            number of objects per page (if paginated)
        has_next
            is there a next page?
        has_previous
            is there a prev page?
        page
            the current page
        next
            the next page
        previous
            the previous page
        pages
            number of pages, total
        hits
            number of objects, total
        last_on_page
            the result number of the last object in the
            object_list (1-indexed)
        first_on_page
            the result number of the first object in the
            object_list (1-indexed)
    """
    if extra_context is None: extra_context = {}
    queryset = queryset._clone()
    if paginate_by:
        paginator = ObjectPaginator(queryset, paginate_by)
        if not page:
            page = request.GET.get('page', 1)
        try:
            page = int(page)
            object_list = paginator.get_page(page - 1)
        except (InvalidPage, ValueError):
            if page == 1 and allow_empty:
                object_list = []
            else:
                raise Http404
        c = RequestContext(request, {
            '%s_list' % template_object_name: object_list,
            'is_paginated': paginator.pages > 1,
            'results_per_page': paginate_by,
            'has_next': paginator.has_next_page(page - 1),
            'has_previous': paginator.has_previous_page(page - 1),
            'page': page,
            'next': page + 1,
            'previous': page - 1,
            'last_on_page': paginator.last_on_page(page - 1),
            'first_on_page': paginator.first_on_page(page - 1),
            'pages': paginator.pages,
            'hits': paginator.hits,
        }, context_processors)
    else:
        c = RequestContext(request, {
            '%s_list' % template_object_name: queryset,
            'is_paginated': False
        }, context_processors)
        if not allow_empty and len(queryset) == 0:
            raise Http404
    for key, value in extra_context.items():
        if callable(value):
            c[key] = value()
        else:
            c[key] = value
    if not template_name:
        model = queryset.model
        template_name = "%s/%s_list.html" % (model._meta.app_label, model._meta.object_name.lower())
    t = template_loader.get_template(template_name)
    return HttpResponse(t.render(c), mimetype=mimetype)
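The `first_on_page`/`last_on_page` context values documented above reduce to simple arithmetic on the 1-indexed result numbers; a standalone sketch of that window computation (the helper name is hypothetical, not part of Django):

```python
def page_window(page, per_page, hits):
    """1-indexed result numbers of the first and last object shown on
    `page` (itself 1-indexed here), out of `hits` total results."""
    first = (page - 1) * per_page + 1
    last = min(page * per_page, hits)
    return first, last
```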
def object_detail(request, queryset, object_id=None, slug=None,
        slug_field=None, template_name=None, template_name_field=None,
        template_loader=loader, extra_context=None,
        context_processors=None, template_object_name='object',
        mimetype=None):
    """
    Generic detail of an object.

    Templates: ``<app_label>/<model_name>_detail.html``
    Context:
        object
            the object
    """
    if extra_context is None: extra_context = {}
    model = queryset.model
    if object_id:
        queryset = queryset.filter(pk=object_id)
    elif slug and slug_field:
        queryset = queryset.filter(**{slug_field: slug})
    else:
        raise AttributeError, "Generic detail view must be called with either an object_id or a slug/slug_field."
    try:
        obj = queryset.get()
    except ObjectDoesNotExist:
        raise Http404, "No %s found matching the query" % (model._meta.verbose_name)
    if not template_name:
        template_name = "%s/%s_detail.html" % (model._meta.app_label, model._meta.object_name.lower())
    if template_name_field:
        template_name_list = [getattr(obj, template_name_field), template_name]
        t = template_loader.select_template(template_name_list)
    else:
        t = template_loader.get_template(template_name)
    c = RequestContext(request, {
        template_object_name: obj,
    }, context_processors)
    for key, value in extra_context.items():
        if callable(value):
            c[key] = value()
        else:
            c[key] = value
    response = HttpResponse(t.render(c), mimetype=mimetype)
    populate_xheaders(request, response, model, getattr(obj, obj._meta.pk.name))
    return response
| 36.856061 | 113 | 0.623227 | 578 | 4,865 | 5.029412 | 0.200692 | 0.057792 | 0.03096 | 0.024768 | 0.328173 | 0.290334 | 0.265566 | 0.179567 | 0.148607 | 0.148607 | 0 | 0.007225 | 0.288798 | 4,865 | 131 | 114 | 37.137405 | 0.832948 | 0 | 0 | 0.382022 | 0 | 0 | 0.07343 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.05618 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16d79dca474781cfacdcca9ed1544b5e9e33234c | 2,612 | py | Python | src/richie/apps/courses/lms/edx.py | kernicPanel/richie | 803deda3e29383ce85593e1836a3cf4efc6b847e | [
"MIT"
] | null | null | null | src/richie/apps/courses/lms/edx.py | kernicPanel/richie | 803deda3e29383ce85593e1836a3cf4efc6b847e | [
"MIT"
] | null | null | null | src/richie/apps/courses/lms/edx.py | kernicPanel/richie | 803deda3e29383ce85593e1836a3cf4efc6b847e | [
"MIT"
] | null | null | null | """
Backend to connect Open edX richie with an LMS
"""
import logging
import re
import requests
from requests.auth import AuthBase
from ..serializers import SyncCourseRunSerializer
from .base import BaseLMSBackend
logger = logging.getLogger(__name__)
def split_course_key(key):
    """Split an OpenEdX course key by organization, course and course run codes.

    We first try splitting the key as a version 1 key (course-v1:org+course+run)
    and fall back to the old version (org/course/run).
    """
    if key.startswith("course-v1:"):
        organization, course, run = key[10:].split("+")
    else:
        organization, course, run = key.split("/")

    return organization, course, run
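To illustrate the two key formats handled above, here is the same branching repeated as a standalone sketch (the example keys below follow the Open edX demo-course convention and are only illustrative):

```python
def split_course_key_demo(key):
    # Same logic as split_course_key above, inlined for a standalone demo.
    if key.startswith("course-v1:"):  # len("course-v1:") == 10
        organization, course, run = key[10:].split("+")
    else:
        organization, course, run = key.split("/")
    return organization, course, run
```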
class EdXTokenAuth(AuthBase):
    """Attach HTTP token authentication to the given Request object."""

    def __init__(self, token):
        """Set-up token value in the instance."""
        self.token = token

    def __call__(self, request):
        """Modify and return the request."""
        request.headers.update(
            {"X-Edx-Api-Key": self.token, "Content-Type": "application/json"}
        )
        return request


class TokenAPIClient(requests.Session):
    """
    A :class:`requests.Session` that automatically authenticates against edX's preferred
    authentication method up to Dogwood, given a secret token.

    For more usage details, see documentation of the :class:`requests.Session` object:
    https://requests.readthedocs.io/en/master/user/advanced/#session-objects
    """

    def __init__(self, token, *args, **kwargs):
        """Extending the session object by setting the authentication token."""
        super().__init__(*args, **kwargs)
        self.auth = EdXTokenAuth(token)


class EdXLMSBackend(BaseLMSBackend):
    """LMS backend for Richie tested with Open EdX Dogwood to Hawthorn."""

    @property
    def api_client(self):
        """Instantiate and return an edx token API client."""
        return TokenAPIClient(self.configuration["API_TOKEN"])

    def extract_course_id(self, url):
        """Extract the LMS course id from the course run url."""
        return re.match(self.configuration["COURSE_REGEX"], url).group("course_id")

    def extract_course_number(self, data):
        """Extract the LMS course number from data dictionary."""
        course_id = self.extract_course_id(data.get("resource_link"))
        return split_course_key(course_id)[1]

    @staticmethod
    def get_course_run_serializer(data, partial=False):
        """Prepare data and return a bound serializer."""
        return SyncCourseRunSerializer(data=data, partial=partial)
| 32.246914 | 88 | 0.68683 | 326 | 2,612 | 5.380368 | 0.407975 | 0.041049 | 0.035918 | 0.027366 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002886 | 0.204058 | 2,612 | 80 | 89 | 32.65 | 0.840789 | 0.383231 | 0 | 0 | 0 | 0 | 0.06345 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.216216 | false | 0 | 0.162162 | 0 | 0.621622 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16d7b7c1c6e2def8cf0c9ec10f6916a0a8cf367f | 4,106 | py | Python | BitTorrent-5.2.2/BTL/brpclib.py | jpabb7/p2pScrapper | 0fd57049606864223eb45f956a58adda1231af88 | [
"MIT"
] | 4 | 2016-04-26T03:43:54.000Z | 2016-11-17T08:09:04.000Z | BitTorrent-5.2.2/BTL/brpclib.py | jpabb7/p2pScrapper | 0fd57049606864223eb45f956a58adda1231af88 | [
"MIT"
] | 17 | 2015-01-05T21:06:22.000Z | 2015-12-07T20:45:44.000Z | BitTorrent-5.2.2/BTL/brpclib.py | jpabb7/p2pScrapper | 0fd57049606864223eb45f956a58adda1231af88 | [
"MIT"
] | 7 | 2015-07-28T09:17:17.000Z | 2021-11-07T02:29:41.000Z | # by Greg Hazel
import xmlrpclib
from xmlrpclib2 import *
from BTL import brpc
old_PyCurlTransport = PyCurlTransport
class PyCurlTransport(old_PyCurlTransport):

    def set_connection_params(self, h):
        h.add_header('User-Agent', "brpclib.py/1.0")
        h.add_header('Connection', "Keep-Alive")
        h.add_header('Content-Type', "application/octet-stream")

    def _parse_response(self, response):
        # read response from input file/socket, and parse it
        return brpc.loads(response.getvalue())[0]
# --------------------------------------------------------------------
# request dispatcher
class _Method:
    # some magic to bind a B-RPC method to an RPC server.
    # supports "nested" methods (e.g. examples.getStateName)

    def __init__(self, send, name):
        self.__send = send
        self.__name = name

    def __getattr__(self, name):
        return _Method(self.__send, "%s.%s" % (self.__name, name))

    def __call__(self, *args, **kwargs):
        args = (args, kwargs)
        return self.__send(self.__name, args)

    # ARG! prevent repr(_Method()) from submitting an RPC call!
    def __repr__(self):
        return "<%s instance at 0x%08X>" % (self.__class__, id(self))

# Double underscore is BAD!
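The `_Method` chaining above is the classic xmlrpclib dispatch trick: each attribute access returns a new `_Method` with the dotted name extended, and calling it fires the accumulated name. A self-contained sketch of the pattern, using single underscores to sidestep the name mangling the comment above complains about (the recording `send` callable is made up for the demo):

```python
class DottedMethod:
    # Minimal re-creation of the _Method pattern above.
    def __init__(self, send, name):
        self._send = send
        self._name = name

    def __getattr__(self, name):
        # Each attribute access extends the dotted method name.
        return DottedMethod(self._send, "%s.%s" % (self._name, name))

    def __call__(self, *args):
        # Calling fires the accumulated name through the send callable.
        return self._send(self._name, args)

calls = []
proxy = DottedMethod(lambda name, args: calls.append((name, args)), "examples")
proxy.getStateName(41)  # records ("examples.getStateName", (41,))
```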
class BRPC_ServerProxy(xmlrpclib.ServerProxy):
    """uri [,options] -> a logical connection to a B-RPC server

    uri is the connection point on the server, given as
    scheme://host/target.

    The standard implementation always supports the "http" scheme. If
    SSL socket support is available (Python 2.0), it also supports
    "https".

    If the target part and the slash preceding it are both omitted,
    "/RPC2" is assumed.

    The following options can be given as keyword arguments:

        transport: a transport factory
        encoding: the request encoding (default is UTF-8)

    All 8-bit strings passed to the server proxy are assumed to use
    the given encoding.
    """

    def __init__(self, uri, transport=None, encoding=None, verbose=0,
                 allow_none=0):
        # establish a "logical" server connection

        # get the url
        import urllib
        type, uri = urllib.splittype(uri)
        if type not in ("http", "https"):
            raise IOError, "unsupported B-RPC protocol"
        self.__host, self.__handler = urllib.splithost(uri)
        if not self.__handler:
            self.__handler = "/RPC2"

        if transport is None:
            if type == "https":
                transport = xmlrpclib.SafeTransport()
            else:
                transport = xmlrpclib.Transport()
        self.__transport = transport

        self.__encoding = encoding
        self.__verbose = verbose
        self.__allow_none = allow_none

    def __request(self, methodname, params):
        # call a method on the remote server
        request = brpc.dumps(params, methodname, encoding=self.__encoding,
                             allow_none=self.__allow_none)

        response = self.__transport.request(
            self.__host,
            self.__handler,
            request,
            verbose=self.__verbose
            )

        if len(response) == 1:
            response = response[0]

        return response

    def __repr__(self):
        return (
            "<ServerProxy for %s%s>" %
            (self.__host, self.__handler)
            )

    __str__ = __repr__

    def __getattr__(self, name):
        # magic method dispatcher
        return _Method(self.__request, name)
def new_server_proxy(url):
    c = cache_set.get_cache(PyCURL_Cache, url)
    t = PyCurlTransport(c)
    return BRPC_ServerProxy(url, transport=t)

ServerProxy = new_server_proxy

if __name__ == '__main__':
    s = ServerProxy('https://greg.mitte.bittorrent.com:7080/')

    def ping(*a, **kw):
        (a2, kw2) = s.ping(*a, **kw)
        assert a2 == list(a), '%s list is not %s' % (a2, list(a))
        assert kw2 == dict(kw), '%s dict is not %s' % (kw2, dict(kw))

    ping(0, 1, 1, name="potato")
    ping(0, 1, 1, name="anime")
    ping("phish", 0, 1, 1)
    ping("games", 0, 1, 1)
| 30.641791 | 74 | 0.605212 | 503 | 4,106 | 4.695825 | 0.355865 | 0.016935 | 0.00508 | 0.024132 | 0.009314 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012735 | 0.273259 | 4,106 | 133 | 75 | 30.87218 | 0.77882 | 0.11057 | 0 | 0.052632 | 0 | 0 | 0.092642 | 0.008027 | 0 | 0 | 0 | 0 | 0.026316 | 0 | null | null | 0 | 0.052632 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16de03e641bb707c0257c647f4e57b0375e2b543 | 668 | py | Python | Python/fibs.py | familug/FAMILUG | ef8c11d92f4038d80f3f1a24cbab022c19791acf | [
"BSD-2-Clause"
] | 5 | 2015-10-13T04:13:04.000Z | 2020-12-23T13:47:43.000Z | Python/fibs.py | familug/FAMILUG | ef8c11d92f4038d80f3f1a24cbab022c19791acf | [
"BSD-2-Clause"
] | null | null | null | Python/fibs.py | familug/FAMILUG | ef8c11d92f4038d80f3f1a24cbab022c19791acf | [
"BSD-2-Clause"
] | 8 | 2015-07-20T15:37:38.000Z | 2021-04-14T07:18:10.000Z | def fib(n):
    if n < 2:
        return n
    else:
        return fib(n-1) + fib(n-2)


def fib_fast(n):
    from math import sqrt
    s5 = sqrt(5)
    x = (1 + s5) ** n
    y = (1 - s5) ** n
    return int((x - y) / (s5 * 2**n))


def print_fib(n):
    for i in range(n):
        print fib(i),
    print
    for i in range(n):
        print fib_fast(i),


def print_fib2(n):
    fibs = [0, 1]
    a, b = 0, 1
    if n == 0:
        print a
    elif n == 1:
        print a, b
    else:
        print 0, 1,
        for i in range(2, n):
            a, b = b, a + b
            print b,


if __name__ == "__main__":
    print_fib(10)
    print
    print_fib2(10)
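Note that `fib_fast` relies on Binet's formula in floating point, so its results drift from the true Fibonacci numbers once the rounding error exceeds 0.5 (roughly around n ≈ 70 with double precision). An exact O(log n) alternative is the fast-doubling method; a sketch, not part of the original file:

```python
def fib_doubling(n):
    """Exact F(n) via the fast-doubling identities
    F(2k)   = F(k) * (2*F(k+1) - F(k))
    F(2k+1) = F(k)**2 + F(k+1)**2
    """
    def helper(k):  # returns the pair (F(k), F(k+1))
        if k == 0:
            return (0, 1)
        a, b = helper(k // 2)
        c = a * (2 * b - a)   # F(2m) where m = k // 2
        d = a * a + b * b     # F(2m+1)
        if k % 2 == 0:
            return (c, d)
        return (d, c + d)
    return helper(n)[0]
```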
| 15.904762 | 35 | 0.438623 | 113 | 668 | 2.469027 | 0.283186 | 0.057348 | 0.064516 | 0.11828 | 0.143369 | 0.143369 | 0.143369 | 0 | 0 | 0 | 0 | 0.06701 | 0.419162 | 668 | 41 | 36 | 16.292683 | 0.652062 | 0 | 0 | 0.181818 | 0 | 0 | 0.011976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.030303 | null | null | 0.363636 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16e4dfbf8bd61eccd8ee52165a28c0666d169326 | 840 | py | Python | test_mnist.py | aidiary/chainer-siamese | 6abce9192298e14682a7c766e2a5cdd10f519193 | [
"MIT"
] | null | null | null | test_mnist.py | aidiary/chainer-siamese | 6abce9192298e14682a7c766e2a5cdd10f519193 | [
"MIT"
] | null | null | null | test_mnist.py | aidiary/chainer-siamese | 6abce9192298e14682a7c766e2a5cdd10f519193 | [
"MIT"
] | null | null | null | import os

import chainer
import chainer.links as L
from net import SiameseNetwork
import numpy as np
import matplotlib.pyplot as plt

# Load the trained model
model = SiameseNetwork()
chainer.serializers.load_npz(os.path.join('result', 'model.npz'), model)

# Load the test data
_, test = chainer.datasets.get_mnist(ndim=3)
test_data, test_label = test._datasets

# Map the test data into the learned low-dimensional (2-D) space
y = model.forward_once(test_data)
feat = y.data

# One colour per label
c = ['#ff0000', '#ffff00', '#00ff00', '#00ffff', '#0000ff',
     '#ff00ff', '#990000', '#999900', '#009900', '#009999']

# Plot each label in a different colour.
# The embedding is good if instances of the same class cluster together
# and instances of different classes end up far apart.
for i in range(10):
    f = feat[np.where(test_label == i)]
    plt.plot(f[:, 0], f[:, 1], '.', c=c[i])
plt.legend(['0', '1', '2', '3', '4', '5', '6', '7', '8', '9'])
plt.savefig(os.path.join('result', 'result.png'))
| 24.705882 | 72 | 0.667857 | 118 | 840 | 4.677966 | 0.618644 | 0.047101 | 0.036232 | 0.057971 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07978 | 0.134524 | 840 | 33 | 73 | 25.454545 | 0.679505 | 0.140476 | 0 | 0 | 0 | 0 | 0.156863 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.315789 | 0 | 0.315789 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
16e5abfcca6728651310e1b9d7d20815d0685476 | 5,535 | py | Python | TwoFeetTempoMove.py | b0nz0/TwisterTempo | fc975af4095509d8ec4fe2f84313fe152577bed2 | [
"MIT"
] | null | null | null | TwoFeetTempoMove.py | b0nz0/TwisterTempo | fc975af4095509d8ec4fe2f84313fe152577bed2 | [
"MIT"
] | null | null | null | TwoFeetTempoMove.py | b0nz0/TwisterTempo | fc975af4095509d8ec4fe2f84313fe152577bed2 | [
"MIT"
] | null | null | null | from random import randrange, random
from time import time
import logging

from TwisterTempoGUI import TwisterTempoGUI


class TwoFeetTempoMove(object):
    COLORS_ALPHA = {0: 'RED', 1: 'BLUE', 2: 'YELLOW', 3: 'GREEN'}
    COLORS_RGB = {0: (255, 0, 0), 1: (0, 0, 255), 2: (255, 255, 0), 3: (0, 255, 0)}
    FOOT_CHANGE_PERC = 0.3
    FOOT_ON_AIR_PERC = 0.08
    FEET_ON_SAME_CIRCLE_PERC = 0.05

    def __init__(self, min_delay=0, max_delay=100):
        assert min_delay >= 0
        assert max_delay > 0
        self.min_delay = min_delay
        self.max_delay = max_delay
        self._last_beat_millis = 0
        self._left_color = randrange(0, len(TwoFeetTempoMove.COLORS_ALPHA))
        self._right_color = randrange(0, len(TwoFeetTempoMove.COLORS_ALPHA))
        self._left_direction = "FW"
        self._right_direction = "FW"
        self._next_foot = 'RIGHT'
        logging.info("Starting with LEFT: %s, RIGHT: %s" %
                     (TwoFeetTempoMove.COLORS_ALPHA[self._left_color],
                      TwoFeetTempoMove.COLORS_ALPHA[self._right_color]))
        self.tt_gui = TwisterTempoGUI()
        self.tt_gui.set_left_color(TwoFeetTempoMove.COLORS_ALPHA[self._left_color])
        self.tt_gui.set_right_color(TwoFeetTempoMove.COLORS_ALPHA[self._right_color])
        self._starting_millis = time() * 1000

    def get_colors_alpha(self):
        return {'RIGHT': TwoFeetTempoMove.COLORS_ALPHA[self._right_color],
                'LEFT': TwoFeetTempoMove.COLORS_ALPHA[self._left_color]}

    def get_colors_rgb(self):
        return {'RIGHT': TwoFeetTempoMove.COLORS_RGB[self._right_color],
                'LEFT': TwoFeetTempoMove.COLORS_RGB[self._left_color]}

    def increase_speed(self):
        self.min_delay = self.min_delay - 10

    def decrease_speed(self):
        self.min_delay = self.min_delay + 10

    def tempo_found_callback(self, seconds, millis, confidence):
        act_delay = millis - self._last_beat_millis + randrange(0, self.max_delay)
        if act_delay >= self.min_delay:
            self._last_beat_millis = millis
            self.beat_found()

    def beat_found(self):
        millis = self._last_beat_millis
        logging.debug("Randomized beat found at: %d:%d.%d" %
                      (millis / 60000, millis / 1000, millis % 1000))
        act_millis = time() * 1000 - self._starting_millis
        logging.debug("\tActual: %d:%d.%d" %
                      (act_millis / 60000, act_millis / 1000, act_millis % 1000))
        # special moves
        if random() < TwoFeetTempoMove.FOOT_ON_AIR_PERC:  # randomized next foot on air move
            if self._next_foot == 'RIGHT':
                self.tt_gui.set_right_color(TwoFeetTempoMove.COLORS_ALPHA[self._right_color], on_air=True)
            else:
                self.tt_gui.set_left_color(TwoFeetTempoMove.COLORS_ALPHA[self._left_color], on_air=True)
            logging.debug("\tmove next foot On Air")
        elif random() < TwoFeetTempoMove.FEET_ON_SAME_CIRCLE_PERC:  # randomized both feet on same circle
            if self._next_foot == 'RIGHT':
                self._right_color = self._left_color
                self.tt_gui.set_large_color(TwoFeetTempoMove.COLORS_ALPHA[self._right_color])
            else:
                self._left_color = self._right_color
                self.tt_gui.set_large_color(TwoFeetTempoMove.COLORS_ALPHA[self._left_color])
            logging.debug("\tmove both feet on same circle")
        # end special moves
        else:
            if random() < TwoFeetTempoMove.FOOT_CHANGE_PERC:  # randomize at 30% the switch on foot
                if self._next_foot == 'RIGHT':
                    self._next_foot = 'LEFT'
                else:
                    self._next_foot = 'RIGHT'
            if self._next_foot == 'RIGHT':
                if self._right_direction == "FW":
                    if self._right_color == len(TwoFeetTempoMove.COLORS_ALPHA) - 1:
                        self._right_color = self._right_color - 1
                        self._right_direction = "BW"
                    else:
                        self._right_color = self._right_color + 1
                else:
                    if self._right_color == 0:
                        self._right_color = self._right_color + 1
                        self._right_direction = "FW"
                    else:
                        self._right_color = self._right_color - 1
                self.tt_gui.set_right_color(TwoFeetTempoMove.COLORS_ALPHA[self._right_color])
                logging.debug("\tmove RIGHT foot to " + TwoFeetTempoMove.COLORS_ALPHA[self._right_color])
                self._next_foot = 'LEFT'
            else:
                if self._left_direction == "FW":
                    if self._left_color == len(TwoFeetTempoMove.COLORS_ALPHA) - 1:
                        self._left_color = self._left_color - 1
                        self._left_direction = "BW"
                    else:
                        self._left_color = self._left_color + 1
                else:
                    if self._left_color == 0:
                        self._left_color = self._left_color + 1
                        self._left_direction = "FW"
                    else:
                        self._left_color = self._left_color - 1
                self.tt_gui.set_left_color(TwoFeetTempoMove.COLORS_ALPHA[self._left_color])
                logging.debug("\tmove LEFT foot to " + TwoFeetTempoMove.COLORS_ALPHA[self._left_color])
                self._next_foot = 'RIGHT'
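The FW/BW colour stepping in `beat_found` is a ping-pong walk over the four colour indices. Distilled into a standalone helper (a sketch; the function name is made up and not part of TwisterTempo):

```python
def bounce_step(index, direction, n):
    """Advance one step through range(n), reversing at either end,
    mirroring the FW/BW branches in beat_found above."""
    if direction == "FW":
        if index == n - 1:
            return index - 1, "BW"
        return index + 1, "FW"
    if index == 0:
        return index + 1, "FW"
    return index - 1, "BW"

# Walk six beats starting from colour 0, moving forward.
seq = []
idx, direction = 0, "FW"
for _ in range(6):
    idx, direction = bounce_step(idx, direction, 4)
    seq.append(idx)
```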
| 45.368852 | 106 | 0.592954 | 652 | 5,535 | 4.673313 | 0.142638 | 0.065638 | 0.089596 | 0.162783 | 0.605514 | 0.507384 | 0.409583 | 0.350837 | 0.258943 | 0.258943 | 0 | 0.026302 | 0.313098 | 5,535 | 121 | 107 | 45.743802 | 0.775118 | 0.024571 | 0 | 0.352941 | 0 | 0 | 0.051001 | 0 | 0 | 0 | 0 | 0 | 0.019608 | 1 | 0.068627 | false | 0 | 0.039216 | 0.019608 | 0.186275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
16ef740b41f41832481d4956834bb037ddc3b7b6 | 2,614 | py | Python | tests/test_nested_structures_inside_structure_values.py | Robinson04/StructNoSQL | 335c63593025582336bb67ad0b0ed39d30800b74 | [
"MIT"
] | 3 | 2020-10-30T23:31:26.000Z | 2022-03-30T21:48:40.000Z | tests/test_nested_structures_inside_structure_values.py | Robinson04/StructNoSQL | 335c63593025582336bb67ad0b0ed39d30800b74 | [
"MIT"
] | 42 | 2020-09-16T15:23:11.000Z | 2021-09-20T13:00:50.000Z | tests/test_nested_structures_inside_structure_values.py | Robinson04/StructNoSQL | 335c63593025582336bb67ad0b0ed39d30800b74 | [
"MIT"
] | 2 | 2021-01-03T21:37:22.000Z | 2021-08-12T20:28:52.000Z | import unittest
from typing import Set, Optional, Dict, List
from uuid import uuid4

from StructNoSQL import BaseField, MapModel, TableDataModel
from tests.components.playground_table_clients import PlaygroundDynamoDBBasicTable, TEST_ACCOUNT_ID


class TableModel(TableDataModel):
    accountId = BaseField(field_type=str, required=True)
    nestedDictDictStructure = BaseField(field_type=Dict[str, Dict[str, bool]], required=False, key_name='itemKey')
    # nestedDictListStructure = BaseField(field_type=Dict[str, List[str]], required=False)
    # nestedDictSetStructure = BaseField(field_type=Dict[str, Set[str]], required=False)


class TestsNestedStructuresInsideStructureValues(unittest.TestCase):
    def __init__(self, method_name: str):
        super().__init__(methodName=method_name)
        self.users_table = PlaygroundDynamoDBBasicTable(data_model=TableModel)

    def test_nested_dict_dict_structure(self):
        random_parent_key = f"parentKey_{uuid4()}"
        random_child_key = f"childKey_{uuid4()}"

        keys_fields_switch = list(self.users_table.fields_switch.keys())
        self.assertIn('nestedDictDictStructure.{{itemKey}}.{{itemKeyChild}}', keys_fields_switch)

        update_success = self.users_table.update_field(
            key_value=TEST_ACCOUNT_ID,
            field_path='nestedDictDictStructure.{{itemKey}}.{{itemKeyChild}}',
            query_kwargs={'itemKey': random_parent_key, 'itemKeyChild': random_child_key},
            value_to_set=True
        )
        self.assertTrue(update_success)

        retrieved_item = self.users_table.get_field(
            key_value=TEST_ACCOUNT_ID,
            field_path='nestedDictDictStructure.{{itemKey}}',
            query_kwargs={'itemKey': random_parent_key}
        )
        self.assertEqual(retrieved_item, {'itemKeyChild': True})

        removed_item = self.users_table.remove_field(
            key_value=TEST_ACCOUNT_ID,
            field_path='nestedDictDictStructure.{{itemKey}}',
            query_kwargs={'itemKey': random_parent_key}
        )
        self.assertEqual(removed_item, {'itemKeyChild': True})

        retrieved_expected_none_item = self.users_table.get_field(
            TEST_ACCOUNT_ID,
            field_path='nestedDictDictStructure.{{itemKey}}',
            query_kwargs={'itemKey': random_parent_key}
        )
        self.assertIsNone(retrieved_expected_none_item)

    def test_nested_dict_list_structure(self):
        # todo: implement
        pass

    def test_nested_dict_set_structure(self):
        # todo: implement
        pass


if __name__ == '__main__':
    unittest.main()
| 38.441176 | 114 | 0.704285 | 277 | 2,614 | 6.270758 | 0.306859 | 0.031088 | 0.048359 | 0.041451 | 0.342545 | 0.264824 | 0.218768 | 0.218768 | 0.218768 | 0.218768 | 0 | 0.001429 | 0.197016 | 2,614 | 67 | 115 | 39.014925 | 0.826108 | 0.076129 | 0 | 0.229167 | 0 | 0 | 0.134855 | 0.086722 | 0 | 0 | 0 | 0.014925 | 0.104167 | 1 | 0.083333 | false | 0.041667 | 0.104167 | 0 | 0.270833 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
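The test above drives StructNoSQL through templated field paths such as `nestedDictDictStructure.{{itemKey}}.{{itemKeyChild}}`, where each `{{...}}` placeholder is filled in from `query_kwargs`. A minimal standalone sketch of that substitution step (the `resolve_field_path` helper below is hypothetical, not part of StructNoSQL):

```python
def resolve_field_path(field_path, query_kwargs):
    """Replace {{placeholder}} segments of a dotted field path with concrete keys."""
    parts = []
    for part in field_path.split("."):
        if part.startswith("{{") and part.endswith("}}"):
            # look up the concrete key supplied by the caller
            part = query_kwargs[part[2:-2]]
        parts.append(part)
    return parts


path = resolve_field_path(
    "nestedDictDictStructure.{{itemKey}}.{{itemKeyChild}}",
    {"itemKey": "parentKey_1", "itemKeyChild": "childKey_1"},
)
print(path)  # ['nestedDictDictStructure', 'parentKey_1', 'childKey_1']
```

The resolved key sequence is what table operations like `update_field` and `get_field` then navigate inside the record.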
bc43defd49d4ea43585c8d3910e9622ef8bc8d38 | 1,099 | py | Python | scrapy/spider/spider/items.py | huobingli/splider | a62f0553160531a0735b249b0dc49747e9c821f9 | [
"MIT"
] | null | null | null | scrapy/spider/spider/items.py | huobingli/splider | a62f0553160531a0735b249b0dc49747e9c821f9 | [
"MIT"
] | null | null | null | scrapy/spider/spider/items.py | huobingli/splider | a62f0553160531a0735b249b0dc49747e9c821f9 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Define here the models for your scraped items
#
# See documentation in:
# https://doc.scrapy.org/en/latest/topics/items.html
import scrapy
from scrapy.loader import ItemLoader
from scrapy.loader.processors import TakeFirst
# class SpiderItem(scrapy.Item):
# # define the fields for your item here like:
# # name = scrapy.Field()
# pass
#
#
#
# class TorrentItem(scrapy.Item):
# url = scrapy.Field()
# name = scrapy.Field()
# description = scrapy.Field()
# size = scrapy.Field()
#
# import scrapy
class StockstarItemLoader(ItemLoader):
    # Custom ItemLoader used to store the fields scraped by the spider
    default_output_processor = TakeFirst()


class StockstarItem(scrapy.Item):  # define the corresponding fields
    # define the fields for your item here like:
    # name = scrapy.Field()
    code = scrapy.Field()  # stock code
    abbr = scrapy.Field()  # stock abbreviation
    last_trade = scrapy.Field()  # latest price
    chg_ratio = scrapy.Field()  # change ratio
    chg_amt = scrapy.Field()  # change amount
    chg_ratio_5min = scrapy.Field()  # 5-minute change ratio
    volumn = scrapy.Field()  # trading volume
    turn_over = scrapy.Field()  # turnover | 24.977273 | 52 | 0.66424 | 133 | 1,099 | 5.428571 | 0.503759 | 0.213296 | 0.062327 | 0.049862 | 0.135734 | 0.135734 | 0.135734 | 0.135734 | 0.135734 | 0.135734 | 0 | 0.00348 | 0.215651 | 1,099 | 44 | 53 | 24.977273 | 0.834107 | 0.503185 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.214286 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc44f25c8ff96beccbbd3fbaa05ae2dcf6790cc6 | 576 | py | Python | fopp/Chapter 12. Functions/get_num_digits.py | H2u-Hwng/EVC | c650fe7356a333011514cf9025dfd97bf71b1de3 | [
"MIT"
] | null | null | null | fopp/Chapter 12. Functions/get_num_digits.py | H2u-Hwng/EVC | c650fe7356a333011514cf9025dfd97bf71b1de3 | [
"MIT"
] | null | null | null | fopp/Chapter 12. Functions/get_num_digits.py | H2u-Hwng/EVC | c650fe7356a333011514cf9025dfd97bf71b1de3 | [
"MIT"
] | null | null | null | # Take number, and convert integer to string
# Calculate and return number of digits
def get_num_digits(num):
    # Convert int to str
    num_str = str(num)
    # Calculate number of digits
    digits = len(num_str)
    return digits


# Define main function
def main():
    # Prompt user for an integer
    number = int(input('Enter an integer: '))
    # Obtain number of digits
    num_digits = get_num_digits(number)
    # Display result
    print(f'The number of digits in number {number} is {num_digits}.')


# Call main function
main()
| 20.571429 | 70 | 0.647569 | 80 | 576 | 4.5625 | 0.4375 | 0.087671 | 0.153425 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.276042 | 576 | 27 | 71 | 21.333333 | 0.8753 | 0.402778 | 0 | 0 | 0 | 0 | 0.221557 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0 | 0 | 0.333333 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
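`get_num_digits` above counts digits via string conversion; the same count can also be computed arithmetically. A sketch of that alternative (the `get_num_digits_math` name is ours, not part of the original file):

```python
import math


def get_num_digits_math(num):
    """Count decimal digits without converting to a string."""
    if num == 0:
        return 1  # log10(0) is undefined, but 0 has one digit
    return int(math.log10(abs(num))) + 1


print(get_num_digits_math(12345))  # 5
```

For very large integers, floating-point `log10` can be off by one near exact powers of ten, so the string-length version used in the file is actually the safer general-purpose choice.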
bc450f5f688b95fda7b269a4ca568c7ecc5143ca | 4,992 | py | Python | whois/__init__.py | mzpqnxow/whois-1 | b5623ed25cfa58d9457d30dae640e69b9e530b23 | [
"MIT"
] | null | null | null | whois/__init__.py | mzpqnxow/whois-1 | b5623ed25cfa58d9457d30dae640e69b9e530b23 | [
"MIT"
] | null | null | null | whois/__init__.py | mzpqnxow/whois-1 | b5623ed25cfa58d9457d30dae640e69b9e530b23 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals
from __future__ import division
from future import standard_library
standard_library.install_aliases()
from builtins import *
import re
import sys
import os
import subprocess
import socket
from .parser import WhoisEntry
from .whois import NICClient
# thanks to https://www.regextester.com/104038
IPV4_OR_V6 = re.compile(r"((^\s*((([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5]))\s*$)|(^\s*((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?\s*$))")
def whois(url, command=False, flags=0):
    # clean domain to expose netloc
    ip_match = IPV4_OR_V6.match(url)
    if ip_match:
        domain = url
        try:
            result = socket.gethostbyaddr(url)
        except socket.herror as e:
            pass
        else:
            domain = extract_domain(result[0])
    else:
        domain = extract_domain(url)
    if command:
        # try native whois command
        r = subprocess.Popen(['whois', domain], stdout=subprocess.PIPE)
        text = r.stdout.read().decode()
    else:
        # try builtin client
        nic_client = NICClient()
        text = nic_client.whois_lookup(None, domain.encode('idna'), flags)
    return WhoisEntry.load(domain, text)


suffixes = None


def extract_domain(url):
    """Extract the domain from the given URL

    >>> print(extract_domain('http://www.google.com.au/tos.html'))
    google.com.au
    >>> print(extract_domain('abc.def.com'))
    def.com
    >>> print(extract_domain(u'www.公司.hk'))
    公司.hk
    >>> print(extract_domain('chambagri.fr'))
    chambagri.fr
    >>> print(extract_domain('www.webscraping.com'))
    webscraping.com
    >>> print(extract_domain('198.252.206.140'))
    stackoverflow.com
    >>> print(extract_domain('102.112.2O7.net'))
    2o7.net
    >>> print(extract_domain('globoesporte.globo.com'))
    globo.com
    >>> print(extract_domain('1-0-1-1-1-0-1-1-1-1-1-1-1-.0-0-0-0-0-0-0-0-0-0-0-0-0-10-0-0-0-0-0-0-0-0-0-0-0-0-0.info'))
    0-0-0-0-0-0-0-0-0-0-0-0-0-10-0-0-0-0-0-0-0-0-0-0-0-0-0.info
    >>> print(extract_domain('2607:f8b0:4006:802::200e'))
    1e100.net
    >>> print(extract_domain('172.217.3.110'))
    1e100.net
    """
    if IPV4_OR_V6.match(url):
        # this is an IP address
        return socket.gethostbyaddr(url)[0]

    # load known TLD suffixes
    global suffixes
    if not suffixes:
        # downloaded from https://publicsuffix.org/list/public_suffix_list.dat
        tlds_path = os.path.join(os.getcwd(), os.path.dirname(__file__), 'data', 'public_suffix_list.dat')
        with open(tlds_path, encoding='utf-8') as tlds_fp:
            suffixes = set(line.encode('utf-8') for line in tlds_fp.read().splitlines() if line and not line.startswith('//'))

    if not isinstance(url, str):
        url = url.decode('utf-8')
    url = re.sub('^.*://', '', url)
    url = url.split('/')[0].lower()

    # find the longest suffix match
    domain = b''
    split_url = url.split('.')
    for section in reversed(split_url):
        if domain:
            domain = b'.' + domain
        domain = section.encode('utf-8') + domain
        if domain not in suffixes:
            if not b'.' in domain and len(split_url) >= 2:
                # If this is the first section and there wasn't a match, try to
                # match the first two sections - if that works, keep going
                # See https://github.com/richardpenman/whois/issues/50
                second_order_tld = '.'.join([split_url[-2], split_url[-1]])
                if not second_order_tld.encode('utf-8') in suffixes:
                    break
            else:
                break
    return domain.decode('utf-8')


if __name__ == '__main__':
    try:
        url = sys.argv[1]
    except IndexError:
        print('Usage: %s url' % sys.argv[0])
    else:
        print(whois(url))
| 42.305085 | 1,227 | 0.55629 | 927 | 4,992 | 2.9137 | 0.211435 | 0.035542 | 0.048871 | 0.059237 | 0.197705 | 0.183636 | 0.178082 | 0.155498 | 0.138467 | 0.138467 | 0 | 0.109804 | 0.182692 | 4,992 | 117 | 1,228 | 42.666667 | 0.552206 | 0.252404 | 0 | 0.132353 | 0 | 0.014706 | 0.358366 | 0.337107 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029412 | false | 0.014706 | 0.191176 | 0 | 0.264706 | 0.044118 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
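`extract_domain` above walks the host sections right to left, growing the candidate while it still matches a known public suffix, and stopping at the first non-suffix section. A condensed sketch of that longest-suffix-match loop against a tiny hard-coded suffix set (standing in for `public_suffix_list.dat`):

```python
# toy stand-in for the public suffix list
SUFFIXES = {"au", "com.au", "com", "net", "hk"}


def extract_domain_sketch(url):
    """Return the registrable domain of url using a longest-suffix match."""
    host = url.split("://")[-1].split("/")[0].lower()
    domain = ""
    for section in reversed(host.split(".")):
        candidate = section + ("." + domain if domain else "")
        if candidate not in SUFFIXES:
            # first non-suffix section: registrable domain found
            return candidate
        domain = candidate
    return domain


print(extract_domain_sketch("http://www.google.com.au/tos.html"))  # google.com.au
```

Unlike the real function, this sketch omits the bytes handling, the reverse-DNS branch for IP addresses, and the second-order-TLD retry from issue #50.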
bc45c15aebfb0da618b90f3884eb8a545e0f2823 | 3,255 | py | Python | app/dialog/avatar_picture_dialog.py | tirinox/alphavatarbot | 5adac8c9c4534206eaf6c146f6e194ed5951d055 | [
"MIT"
] | 1 | 2021-03-18T15:35:15.000Z | 2021-03-18T15:35:15.000Z | app/dialog/avatar_picture_dialog.py | tirinox/alphavatarbot | 5adac8c9c4534206eaf6c146f6e194ed5951d055 | [
"MIT"
] | null | null | null | app/dialog/avatar_picture_dialog.py | tirinox/alphavatarbot | 5adac8c9c4534206eaf6c146f6e194ed5951d055 | [
"MIT"
] | 1 | 2021-03-18T15:35:51.000Z | 2021-03-18T15:35:51.000Z | import asyncio
from contextlib import AsyncExitStack
from aiogram.dispatcher.filters.state import StatesGroup, State
from aiogram.dispatcher.storage import FSMContextProxy
from aiogram.types import Message, PhotoSize, ReplyKeyboardRemove, ContentTypes
from aiogram.utils.helper import HelperMode
from dialog.avatar_image_work import download_tg_photo, get_userpic, combine_frame_and_photo_v2, img_to_bio
from dialog.base import BaseDialog, message_handler
from localization import BaseLocalization
from lib.depcont import DepContainer
from lib.texts import kbd
# todo: accept documents!
class AvatarStates(StatesGroup):
    mode = HelperMode.snake_case  # fixme: no state handle
    MAIN = State()


class AvatarDialog(BaseDialog):
    def __init__(self, loc: BaseLocalization, data: FSMContextProxy, d: DepContainer):
        super().__init__(loc, data, d)
        self._work_lock = asyncio.Lock()

    def menu_kbd(self):
        return kbd([
            self.loc.BUTTON_AVA_FROM_MY_USERPIC,
        ], vert=True)

    @message_handler(state=None)
    async def on_no_state(self, message: Message):
        await self.on_enter(message)

    @message_handler(state=AvatarStates.MAIN)
    async def on_enter(self, message: Message):
        if message.text == self.loc.BUTTON_AVA_FROM_MY_USERPIC:
            await self.handle_avatar_picture(message, self.loc)
        else:
            await AvatarStates.MAIN.set()
            await message.answer(self.loc.TEXT_AVA_WELCOME, reply_markup=self.menu_kbd())

    @message_handler(state=AvatarStates.MAIN, content_types=ContentTypes.PHOTO)
    async def on_picture(self, message: Message):
        await self.handle_avatar_picture(message, self.loc, explicit_picture=message.photo[0])

    async def handle_avatar_picture(self, message: Message, loc: BaseLocalization, explicit_picture: PhotoSize = None):
        async with AsyncExitStack() as stack:
            # enter_async_context is a coroutine and must be awaited to acquire the lock
            await stack.enter_async_context(self._work_lock)

            # POST A LOADING STICKER
            sticker = await message.answer_sticker(self.loc.LOADING_STICKER,
                                                   disable_notification=True,
                                                   reply_markup=ReplyKeyboardRemove())
            # CLEAN UP IN THE END
            stack.push_async_callback(sticker.delete)

            if explicit_picture is not None:
                user_pic = await download_tg_photo(explicit_picture)
            else:
                user_pic = await get_userpic(message.from_user)

            w, h = user_pic.size
            if not w or not h:
                await message.answer(loc.TEXT_AVA_ERR_INVALID, reply_markup=self.menu_kbd())
                return

            if not ((64 <= w <= 4096) and (64 <= h <= 4096)):
                await message.answer(loc.TEXT_AVA_ERR_SIZE, reply_markup=self.menu_kbd())
                return

            # pic = await combine_frame_and_photo(self.deps.cfg, user_pic)
            pic = await combine_frame_and_photo_v2(self.deps.cfg, user_pic)
            user_id = message.from_user.id
            pic = img_to_bio(pic, name=f'alpha_avatar_{user_id}.png')

            await message.answer_document(pic, caption=loc.TEXT_AVA_READY, reply_markup=self.menu_kbd()) | 39.695122 | 119 | 0.677112 | 404 | 3,255 | 5.200495 | 0.324257 | 0.023322 | 0.042837 | 0.036173 | 0.254641 | 0.150405 | 0.097097 | 0.039981 | 0 | 0 | 0 | 0.006093 | 0.243625 | 3,255 | 81 | 120 | 40.185185 | 0.847279 | 0.046083 | 0 | 0.070175 | 0 | 0 | 0.00839 | 0.00839 | 0 | 0 | 0 | 0.012346 | 0 | 1 | 0.035088 | false | 0 | 0.192982 | 0.017544 | 0.350877 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc4baf04ed5a5ebda75a1b19ad254a0f725f6190 | 2,027 | py | Python | nehebn2.py | psifertex/nehebn2 | 8b62a88a9d06624dbb62b8b74cc0566172fba970 | [
"MIT"
] | null | null | null | nehebn2.py | psifertex/nehebn2 | 8b62a88a9d06624dbb62b8b74cc0566172fba970 | [
"MIT"
] | null | null | null | nehebn2.py | psifertex/nehebn2 | 8b62a88a9d06624dbb62b8b74cc0566172fba970 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
from components import ProgramState
import binaryninja as binja
import argparse
import os.path
import curses
# TODO...impliment live-refreashing the settings.json during run (add the keybinding and check for it here in the global input loop)
# TODO...support multi-key presses? Not sure if this already works or not
# TODO...make sure to support small terminals (I think it does right now, but I should add some more checks so nothing goes out of bounds)
def main(stdscr):
    # Setup
    parser = argparse.ArgumentParser(description='Nearly Headless BinaryNinja.')
    parser.add_argument('filename', nargs='?', default="")
    args = parser.parse_args()

    program = ''
    if not args.filename == "":
        if os.path.isfile(args.filename):
            bv = binja.BinaryViewType.get_view_of_file(''.join(args.filename), False)
            bv.update_analysis()
            while not str(bv.analysis_progress) == "Idle":
                prog = bv.analysis_progress
                stdscr.erase()
                stdscr.border()
                state = ''
                if prog.state == binja.AnalysisState.DisassembleState:
                    state = "Disassembling"
                else:
                    state = "Analyzing"
                loadingText = "Loading File: "
                prog = int((prog.count/(prog.total+1))*34.0)
                stdscr.addstr(2, 4, loadingText)
                stdscr.addstr(2, 4 + len(loadingText), state)
                stdscr.addstr(4, 4, '[' + '#'*prog + ' '*(34-prog) + ']')
                stdscr.refresh()
            program = ProgramState(stdscr, bv)
        else:
            raise IOError("File does not exist.")
    else:
        program = ProgramState(stdscr)

    key = ""
    while program.is_running:
        # Input Filtering
        try:
            key = stdscr.getkey()
        except curses.error as err:
            if not str(err) == "no input":
                raise curses.error(str(err))
            else:
                key = ""  # Clear Key Buffer

        # Rendering and input
        program.parseInput(key)
        program.render()
        curses.doupdate()


if __name__ == "__main__":
    background = "2a2a2a"
    text = "e0e0e0"
    curses.wrapper(main) | 28.957143 | 138 | 0.644795 | 253 | 2,027 | 5.098814 | 0.561265 | 0.027907 | 0.027907 | 0.021705 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01225 | 0.23483 | 2,027 | 69 | 139 | 29.376812 | 0.819471 | 0.207203 | 0 | 0.12 | 0 | 0 | 0.080675 | 0 | 0 | 0 | 0 | 0.014493 | 0 | 1 | 0.02 | false | 0 | 0.1 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc4f1db83b181105ad1b030028d4a99321957de7 | 560 | py | Python | boards/migrations/0024_boardpreferences_moderators.py | oscarsiles/jotlet | 361f7ad0d32ea96d012020a67493931482207036 | [
"BSD-3-Clause"
] | null | null | null | boards/migrations/0024_boardpreferences_moderators.py | oscarsiles/jotlet | 361f7ad0d32ea96d012020a67493931482207036 | [
"BSD-3-Clause"
] | 2 | 2022-03-21T22:22:33.000Z | 2022-03-28T22:18:33.000Z | boards/migrations/0024_boardpreferences_moderators.py | oscarsiles/jotlet | 361f7ad0d32ea96d012020a67493931482207036 | [
"BSD-3-Clause"
] | null | null | null | # Generated by Django 4.0.3 on 2022-03-01 14:42
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('boards', '0023_alter_image_type'),
    ]

    operations = [
        migrations.AddField(
            model_name='boardpreferences',
            name='moderators',
            field=models.ManyToManyField(blank=True, related_name='moderated_boards', to=settings.AUTH_USER_MODEL),
        ),
    ] | 26.666667 | 115 | 0.673214 | 62 | 560 | 5.903226 | 0.709677 | 0.054645 | 0.087432 | 0.114754 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043779 | 0.225 | 560 | 20 | 116 | 28 | 0.799539 | 0.080357 | 0 | 0 | 1 | 0 | 0.134503 | 0.040936 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc511f5404ed81ec6c064f4f97b303375361769d | 774 | py | Python | leetcode/47.py | windniw/just-for-fun | 54e5c2be145f3848811bfd127f6a89545e921570 | [
"Apache-2.0"
] | 1 | 2019-08-28T23:15:25.000Z | 2019-08-28T23:15:25.000Z | leetcode/47.py | windniw/just-for-fun | 54e5c2be145f3848811bfd127f6a89545e921570 | [
"Apache-2.0"
] | null | null | null | leetcode/47.py | windniw/just-for-fun | 54e5c2be145f3848811bfd127f6a89545e921570 | [
"Apache-2.0"
] | null | null | null | """
link: https://leetcode.com/problems/permutations-ii
problem: return all unique permutations; nums may contain duplicate numbers
solution: same as problem 46, with sorting added to skip duplicate choices
"""
from typing import List


class Solution:
    def permuteUnique(self, nums: List[int]) -> List[List[int]]:
        if len(nums) == 1:
            return [nums]
        new_nums = nums.copy()
        new_nums.sort()
        res = []
        for i in range(0, len(new_nums)):
            if i + 1 < len(new_nums) and new_nums[i] == new_nums[i + 1]:
                continue
            new_nums[i], new_nums[0] = new_nums[0], new_nums[i]
            sub_result = self.permuteUnique(new_nums[1:])
            for r in sub_result:
                res.append([new_nums[0]] + r.copy())
            new_nums[i], new_nums[0] = new_nums[0], new_nums[i]
        return res | 28.666667 | 73 | 0.529716 | 106 | 774 | 3.698113 | 0.367925 | 0.285714 | 0.122449 | 0.112245 | 0.201531 | 0.163265 | 0.163265 | 0.163265 | 0.163265 | 0.163265 | 0 | 0.023166 | 0.330749 | 774 | 26 | 74 | 29.769231 | 0.733591 | 0.127907 | 0 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc53c586e44516a506fdeff0f180d92c8730dd5b | 908 | py | Python | courses/migrations/0003_alter_content_options_alter_module_options_and_more.py | antonnifo/E-Soma | 93d49b27dedbff58d19f8245a79693762fc819d5 | [
"MIT"
] | 1 | 2022-02-09T06:28:04.000Z | 2022-02-09T06:28:04.000Z | courses/migrations/0003_alter_content_options_alter_module_options_and_more.py | antonnifo/E-Soma | 93d49b27dedbff58d19f8245a79693762fc819d5 | [
"MIT"
] | null | null | null | courses/migrations/0003_alter_content_options_alter_module_options_and_more.py | antonnifo/E-Soma | 93d49b27dedbff58d19f8245a79693762fc819d5 | [
"MIT"
] | 1 | 2022-02-09T06:29:11.000Z | 2022-02-09T06:29:11.000Z | # Generated by Django 4.0.1 on 2022-01-20 13:10
import courses.fields
from django.db import migrations
class Migration(migrations.Migration):
    dependencies = [
        ('courses', '0002_video_text_image_file_content'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='content',
            options={'ordering': ['order']},
        ),
        migrations.AlterModelOptions(
            name='module',
            options={'ordering': ['order']},
        ),
        migrations.AddField(
            model_name='content',
            name='order',
            field=courses.fields.OrderField(blank=True, default=0),
            preserve_default=False,
        ),
        migrations.AddField(
            model_name='module',
            name='order',
            field=courses.fields.OrderField(blank=True, default=0),
            preserve_default=False,
        ),
    ] | 25.942857 | 67 | 0.562775 | 83 | 908 | 6.048193 | 0.518072 | 0.077689 | 0.123506 | 0.119522 | 0.294821 | 0.294821 | 0.294821 | 0.294821 | 0.294821 | 0.294821 | 0 | 0.033816 | 0.316079 | 908 | 34 | 68 | 26.705882 | 0.774557 | 0.049559 | 0 | 0.571429 | 1 | 0 | 0.119628 | 0.039489 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.178571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc56aa53a834b83d16f942b242b5f67998363eda | 22,135 | py | Python | mscreen/autodocktools_prepare_py3k/mglutil/web/services/AppService_services.py | e-mayo/mscreen | a50f0b2f7104007c730baa51b4ec65c891008c47 | [
"MIT"
] | 9 | 2021-03-06T04:24:28.000Z | 2022-01-03T09:53:07.000Z | mglutil/web/services/AppService_services.py | e-mayo/autodocktools-prepare-py3k | 2dd2316837bcb7c19384294443b2855e5ccd3e01 | [
"BSD-3-Clause"
] | 3 | 2021-03-07T05:37:16.000Z | 2021-09-19T15:06:54.000Z | mglutil/web/services/AppService_services.py | e-mayo/autodocktools-prepare-py3k | 2dd2316837bcb7c19384294443b2855e5ccd3e01 | [
"BSD-3-Clause"
] | 4 | 2019-08-28T23:11:39.000Z | 2021-11-27T08:43:36.000Z | ##################################################
# ./AppService_services.py
# generated by ZSI.wsdl2python
#
#
##################################################
from .AppService_services_types import *
from .AppService_services_types import \
nbcr_sdsc_edu_opal_types as ns1
import urllib.parse, types
from ZSI.TCcompound import Struct
from ZSI import client
import ZSI
class AppServiceInterface:
    def getAppServicePortType(self, portAddress=None, **kw):
        raise NotImplementedError("method not implemented")
class AppServiceLocator(AppServiceInterface):
    AppServicePortType_address = "https://rocks-106.sdsc.edu:8443/axis/services/AutogridServicePort"

    def getAppServicePortTypeAddress(self):
        return AppServiceLocator.AppServicePortType_address

    def getAppServicePortType(self, portAddress=None, **kw):
        return AppServicePortSoapBindingSOAP(portAddress or AppServiceLocator.AppServicePortType_address, **kw)


class AppServicePortSoapBindingSOAP:
    def __init__(self, addr, **kw):
        netloc = (urllib.parse.urlparse(addr)[1]).split(":") + [80, ]
        if "host" not in kw:
            kw["host"] = netloc[0]
        if "port" not in kw:
            kw["port"] = int(netloc[1])
        if "url" not in kw:
            kw["url"] = urllib.parse.urlparse(addr)[2]
        self.binding = client.Binding(**kw)
    def destroy(self, request):
        """
        @param: request is str
        @return: response from destroyResponse::
            _destroyOutput: ns1.StatusOutputType_Def
                _baseURL: str
                _code: int
                _message: str
        """
        if not isinstance(request, str):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {'requestclass': destroyRequestWrapper}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/destroy", **kw)
        response = self.binding.Receive(destroyResponseWrapper())
        if not isinstance(response, destroyResponse) and \
                not issubclass(destroyResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
    def getAppConfig(self, request):
        """
        @param: request to getAppConfigRequest
        @return: response from getAppConfigResponse::
            _getAppConfigOutput: ns1.AppConfigType_Def
                _binaryLocation: str
                _defaultArgs: str, optional
                _metadata: ns1.AppMetadataType_Def
                    _info: str, optional
                    _types: ns1.ArgumentsType_Def, optional
                        _flags: ns1.FlagsArrayType_Def, optional
                            _flag: ns1.FlagsType_Def, optional
                                _id: str
                                _tag: str
                                _textDesc: str, optional
                        _implicitParams: ns1.ImplicitParamsArrayType_Def, optional
                            _param: ns1.ImplicitParamsType_Def, optional
                                _extension: str, optional
                                _id: str
                                _ioType: ns1.IOType_Def
                                    _IOType: str, optional
                                _max: int, optional
                                _min: int, optional
                                _name: str, optional
                                _required: boolean, optional
                                _semanticType: str, optional
                                _textDesc: str, optional
                        _taggedParams: ns1.ParamsArrayType_Def, optional
                            _param: ns1.ParamsType_Def, optional
                                _id: str
                                _ioType: ns1.IOType_Def, optional
                                _paramType: ns1.ParamType_Def
                                    _ParamType: str, optional
                                _required: boolean, optional
                                _semanticType: str, optional
                                _tag: str, optional
                                _textDesc: str, optional
                                _value: str, optional
                            _separator: str, optional
                        _untaggedParams: ns1.ParamsArrayType_Def, optional
                    _usage: str
                _parallel: boolean
        """
        if not isinstance(request, getAppConfigRequest) and \
                not issubclass(getAppConfigRequest, request.__class__):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/getAppConfig", **kw)
        response = self.binding.Receive(getAppConfigResponseWrapper())
        if not isinstance(response, getAppConfigResponse) and \
                not issubclass(getAppConfigResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
    def getAppMetadata(self, request):
        """
        @param: request to getAppMetadataRequest
        @return: response from getAppMetadataResponse::
            _getAppMetadataOutput: ns1.AppMetadataType_Def
                _info: str, optional
                _types: ns1.ArgumentsType_Def, optional
                    _flags: ns1.FlagsArrayType_Def, optional
                        _flag: ns1.FlagsType_Def, optional
                            _id: str
                            _tag: str
                            _textDesc: str, optional
                    _implicitParams: ns1.ImplicitParamsArrayType_Def, optional
                        _param: ns1.ImplicitParamsType_Def, optional
                            _extension: str, optional
                            _id: str
                            _ioType: ns1.IOType_Def
                                _IOType: str, optional
                            _max: int, optional
                            _min: int, optional
                            _name: str, optional
                            _required: boolean, optional
                            _semanticType: str, optional
                            _textDesc: str, optional
                    _taggedParams: ns1.ParamsArrayType_Def, optional
                        _param: ns1.ParamsType_Def, optional
                            _id: str
                            _ioType: ns1.IOType_Def, optional
                            _paramType: ns1.ParamType_Def
                                _ParamType: str, optional
                            _required: boolean, optional
                            _semanticType: str, optional
                            _tag: str, optional
                            _textDesc: str, optional
                            _value: str, optional
                        _separator: str, optional
                    _untaggedParams: ns1.ParamsArrayType_Def, optional
                _usage: str
        """
        if not isinstance(request, getAppMetadataRequest) and \
                not issubclass(getAppMetadataRequest, request.__class__):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/getAppMetadata", **kw)
        response = self.binding.Receive(getAppMetadataResponseWrapper())
        if not isinstance(response, getAppMetadataResponse) and \
                not issubclass(getAppMetadataResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
    def getOutputAsBase64ByName(self, request):
        """
        @param: request to getOutputAsBase64ByNameRequest::
            _getOutputAsBase64ByNameInput: ns1.OutputsByNameInputType_Def
                _fileName: str
                _jobID: str
        @return: response from getOutputAsBase64ByNameResponse::
            _item: str, optional
        """
        if not isinstance(request, getOutputAsBase64ByNameRequest) and \
                not issubclass(getOutputAsBase64ByNameRequest, request.__class__):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/getOutputAsBase64ByName", **kw)
        response = self.binding.Receive(getOutputAsBase64ByNameResponseWrapper())
        if not isinstance(response, getOutputAsBase64ByNameResponse) and \
                not issubclass(getOutputAsBase64ByNameResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
    def getOutputs(self, request):
        """
        @param: request is str
        @return: response from getOutputsResponse::
            _getOutputsOutput: ns1.JobOutputType_Def
                _outputFile: ns1.OutputFileType_Def, optional
                    _name: str
                    _url: str
                _stdErr: str, optional
                _stdOut: str, optional
        """
        if not isinstance(request, str):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {'requestclass': getOutputsRequestWrapper}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/getOutputs", **kw)
        response = self.binding.Receive(getOutputsResponseWrapper())
        if not isinstance(response, getOutputsResponse) and \
                not issubclass(getOutputsResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
    def launchJob(self, request):
        """
        @param: request to launchJobRequest::
            _launchJobInput: ns1.JobInputType_Def
                _argList: str, optional
                _inputFile: ns1.InputFileType_Def, optional
                    _contents: str
                    _name: str
                _numProcs: int, optional
        @return: response from launchJobResponse::
            _launchJobOutput: ns1.JobSubOutputType_Def
                _jobID: str
                _status: ns1.StatusOutputType_Def
                    _baseURL: str
                    _code: int
                    _message: str
        """
        if not isinstance(request, launchJobRequest) and \
                not issubclass(launchJobRequest, request.__class__):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/launchJob", **kw)
        response = self.binding.Receive(launchJobResponseWrapper())
        if not isinstance(response, launchJobResponse) and \
                not issubclass(launchJobResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
    def launchJobBlocking(self, request):
        """
        @param: request to launchJobBlockingRequest::
            _launchJobBlockingInput: ns1.JobInputType_Def
                _argList: str, optional
                _inputFile: ns1.InputFileType_Def, optional
                    _contents: str
                    _name: str
                _numProcs: int, optional
        @return: response from launchJobBlockingResponse::
            _launchJobBlockingOutput: ns1.BlockingOutputType_Def
                _jobOut: ns1.JobOutputType_Def
                    _outputFile: ns1.OutputFileType_Def, optional
                        _name: str
                        _url: str
                    _stdErr: str, optional
                    _stdOut: str, optional
                _status: ns1.StatusOutputType_Def
                    _baseURL: str
                    _code: int
                    _message: str
        """
        if not isinstance(request, launchJobBlockingRequest) and \
                not issubclass(launchJobBlockingRequest, request.__class__):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/launchJobBlocking", **kw)
        response = self.binding.Receive(launchJobBlockingResponseWrapper())
        if not isinstance(response, launchJobBlockingResponse) and \
                not issubclass(launchJobBlockingResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
    def queryStatus(self, request):
        """
        @param: request is str
        @return: response from queryStatusResponse::
            _queryStatusOutput: ns1.StatusOutputType_Def
                _baseURL: str
                _code: int
                _message: str
        """
        if not isinstance(request, str):
            raise TypeError("%s incorrect request type" % (request.__class__))
        kw = {'requestclass': queryStatusRequestWrapper}
        response = self.binding.Send(None, None, request, soapaction="http://nbcr.sdsc.edu/opal/queryStatus", **kw)
        response = self.binding.Receive(queryStatusResponseWrapper())
        if not isinstance(response, queryStatusResponse) and \
                not issubclass(queryStatusResponse, response.__class__):
            raise TypeError("%s incorrect response type" % (response.__class__))
        return response
class destroyRequest(ns1.destroyInput_Dec):
    if not hasattr(ns1.destroyInput_Dec(), "typecode"):
        typecode = ns1.destroyInput_Dec()

    def __init__(self, name=None, ns=None):
        ns1.destroyInput_Dec.__init__(self, name=None, ns=None)


class destroyRequestWrapper(destroyRequest):
    """wrapper for document:literal message"""
    typecode = destroyRequest(name=None, ns=None).typecode

    def __init__(self, name=None, ns=None, **kw):
        destroyRequest.__init__(self, name=None, ns=None)


class destroyResponse(ns1.destroyOutput_Dec):
    if not hasattr(ns1.destroyOutput_Dec(), "typecode"):
        typecode = ns1.destroyOutput_Dec()

    def __init__(self, name=None, ns=None):
        ns1.destroyOutput_Dec.__init__(self, name=None, ns=None)


class destroyResponseWrapper(destroyResponse):
    """wrapper for document:literal message"""
    typecode = destroyResponse(name=None, ns=None).typecode

    def __init__(self, name=None, ns=None, **kw):
        destroyResponse.__init__(self, name=None, ns=None)


class getAppConfigRequest:
    def __init__(self, name=None, ns=None):
        getAppConfigRequest.typecode = Struct(getAppConfigRequest, [], pname=name, aname="%s" % name, oname="%s xmlns=\"\"" % name)


class getAppConfigRequestWrapper(getAppConfigRequest):
    """wrapper for document:literal message"""
    typecode = getAppConfigRequest(name=None, ns=None).typecode

    def __init__(self, name=None, ns=None, **kw):
        getAppConfigRequest.__init__(self, name=None, ns=None)


class getAppConfigResponse(ns1.getAppConfigOutput_Dec):
    if not hasattr(ns1.getAppConfigOutput_Dec(), "typecode"):
        typecode = ns1.getAppConfigOutput_Dec()

    def __init__(self, name=None, ns=None):
        ns1.getAppConfigOutput_Dec.__init__(self, name=None, ns=None)


class getAppConfigResponseWrapper(getAppConfigResponse):
    """wrapper for document:literal message"""
    typecode = getAppConfigResponse(name=None, ns=None).typecode

    def __init__(self, name=None, ns=None, **kw):
        getAppConfigResponse.__init__(self, name=None, ns=None)


class getAppMetadataRequest:
    def __init__(self, name=None, ns=None):
        getAppMetadataRequest.typecode = Struct(getAppMetadataRequest, [], pname=name, aname="%s" % name, oname="%s xmlns=\"\"" % name)
class getAppMetadataRequestWrapper(getAppMetadataRequest):
"""wrapper for document:literal message"""
typecode = getAppMetadataRequest( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
getAppMetadataRequest.__init__( self, name=None, ns=None )
class getAppMetadataResponse(ns1.getAppMetadataOutput_Dec):
if not hasattr( ns1.getAppMetadataOutput_Dec(), "typecode" ):
typecode = ns1.getAppMetadataOutput_Dec()
def __init__(self, name=None, ns=None):
ns1.getAppMetadataOutput_Dec.__init__(self, name=None, ns=None)
class getAppMetadataResponseWrapper(getAppMetadataResponse):
"""wrapper for document:literal message"""
typecode = getAppMetadataResponse( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
getAppMetadataResponse.__init__( self, name=None, ns=None )
class getOutputAsBase64ByNameRequest(ns1.getOutputAsBase64ByNameInput_Dec):
if not hasattr( ns1.getOutputAsBase64ByNameInput_Dec(), "typecode" ):
typecode = ns1.getOutputAsBase64ByNameInput_Dec()
def __init__(self, name=None, ns=None):
ns1.getOutputAsBase64ByNameInput_Dec.__init__(self, name=None, ns=None)
class getOutputAsBase64ByNameRequestWrapper(getOutputAsBase64ByNameRequest):
"""wrapper for document:literal message"""
typecode = getOutputAsBase64ByNameRequest( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
getOutputAsBase64ByNameRequest.__init__( self, name=None, ns=None )
class getOutputAsBase64ByNameResponse(ns1.getOutputAsBase64ByNameOutput_Dec):
if not hasattr( ns1.getOutputAsBase64ByNameOutput_Dec(), "typecode" ):
typecode = ns1.getOutputAsBase64ByNameOutput_Dec()
def __init__(self, name=None, ns=None):
ns1.getOutputAsBase64ByNameOutput_Dec.__init__(self, name=None, ns=None)
class getOutputAsBase64ByNameResponseWrapper(getOutputAsBase64ByNameResponse):
"""wrapper for document:literal message"""
typecode = getOutputAsBase64ByNameResponse( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
getOutputAsBase64ByNameResponse.__init__( self, name=None, ns=None )
class getOutputsRequest(ns1.getOutputsInput_Dec):
if not hasattr( ns1.getOutputsInput_Dec(), "typecode" ):
typecode = ns1.getOutputsInput_Dec()
def __init__(self, name=None, ns=None):
ns1.getOutputsInput_Dec.__init__(self, name=None, ns=None)
class getOutputsRequestWrapper(getOutputsRequest):
"""wrapper for document:literal message"""
typecode = getOutputsRequest( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
getOutputsRequest.__init__( self, name=None, ns=None )
class getOutputsResponse(ns1.getOutputsOutput_Dec):
if not hasattr( ns1.getOutputsOutput_Dec(), "typecode" ):
typecode = ns1.getOutputsOutput_Dec()
def __init__(self, name=None, ns=None):
ns1.getOutputsOutput_Dec.__init__(self, name=None, ns=None)
class getOutputsResponseWrapper(getOutputsResponse):
"""wrapper for document:literal message"""
typecode = getOutputsResponse( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
getOutputsResponse.__init__( self, name=None, ns=None )
class launchJobBlockingRequest(ns1.launchJobBlockingInput_Dec):
if not hasattr( ns1.launchJobBlockingInput_Dec(), "typecode" ):
typecode = ns1.launchJobBlockingInput_Dec()
def __init__(self, name=None, ns=None):
ns1.launchJobBlockingInput_Dec.__init__(self, name=None, ns=None)
class launchJobBlockingRequestWrapper(launchJobBlockingRequest):
"""wrapper for document:literal message"""
typecode = launchJobBlockingRequest( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
launchJobBlockingRequest.__init__( self, name=None, ns=None )
class launchJobBlockingResponse(ns1.launchJobBlockingOutput_Dec):
if not hasattr( ns1.launchJobBlockingOutput_Dec(), "typecode" ):
typecode = ns1.launchJobBlockingOutput_Dec()
def __init__(self, name=None, ns=None):
ns1.launchJobBlockingOutput_Dec.__init__(self, name=None, ns=None)
class launchJobBlockingResponseWrapper(launchJobBlockingResponse):
"""wrapper for document:literal message"""
typecode = launchJobBlockingResponse( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
launchJobBlockingResponse.__init__( self, name=None, ns=None )
class launchJobRequest(ns1.launchJobInput_Dec):
if not hasattr( ns1.launchJobInput_Dec(), "typecode" ):
typecode = ns1.launchJobInput_Dec()
def __init__(self, name=None, ns=None):
ns1.launchJobInput_Dec.__init__(self, name=None, ns=None)
class launchJobRequestWrapper(launchJobRequest):
"""wrapper for document:literal message"""
typecode = launchJobRequest( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
launchJobRequest.__init__( self, name=None, ns=None )
class launchJobResponse(ns1.launchJobOutput_Dec):
if not hasattr( ns1.launchJobOutput_Dec(), "typecode" ):
typecode = ns1.launchJobOutput_Dec()
def __init__(self, name=None, ns=None):
ns1.launchJobOutput_Dec.__init__(self, name=None, ns=None)
class launchJobResponseWrapper(launchJobResponse):
"""wrapper for document:literal message"""
typecode = launchJobResponse( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
launchJobResponse.__init__( self, name=None, ns=None )
class queryStatusRequest(ns1.queryStatusInput_Dec):
if not hasattr( ns1.queryStatusInput_Dec(), "typecode" ):
typecode = ns1.queryStatusInput_Dec()
def __init__(self, name=None, ns=None):
ns1.queryStatusInput_Dec.__init__(self, name=None, ns=None)
class queryStatusRequestWrapper(queryStatusRequest):
"""wrapper for document:literal message"""
typecode = queryStatusRequest( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
queryStatusRequest.__init__( self, name=None, ns=None )
class queryStatusResponse(ns1.queryStatusOutput_Dec):
if not hasattr( ns1.queryStatusOutput_Dec(), "typecode" ):
typecode = ns1.queryStatusOutput_Dec()
def __init__(self, name=None, ns=None):
ns1.queryStatusOutput_Dec.__init__(self, name=None, ns=None)
class queryStatusResponseWrapper(queryStatusResponse):
"""wrapper for document:literal message"""
typecode = queryStatusResponse( name=None, ns=None ).typecode
def __init__( self, name=None, ns=None, **kw ):
queryStatusResponse.__init__( self, name=None, ns=None )
# File: constellation_forms/migrations/0001_initial.py
# -*- coding: utf-8 -*-
# Generated by Django 1.10.6 on 2017-03-15 00:56
from __future__ import unicode_literals
from django.conf import settings
import django.contrib.postgres.fields.jsonb
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='Form',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('form_id', models.IntegerField()),
('version', models.IntegerField()),
('name', models.TextField()),
('description', models.TextField(blank=True)),
('elements', django.contrib.postgres.fields.jsonb.JSONField()),
],
options={
'ordering': ('-version',),
'db_table': 'form',
},
),
migrations.CreateModel(
name='FormSubmission',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('state', models.IntegerField(choices=[(0, 'draft'), (1, 'submitted'), (2, 'approved'), (3, 'denied')])),
('modified', models.DateField()),
('submission', django.contrib.postgres.fields.jsonb.JSONField()),
('form', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='constellation_forms.Form')),
('owner', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
options={
'db_table': 'form_submission',
},
),
migrations.CreateModel(
name='Validator',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.TextField()),
('regex', models.TextField()),
],
options={
'db_table': 'validators',
},
),
migrations.AlterUniqueTogether(
name='form',
unique_together=set([('form_id', 'version')]),
),
]
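# The state column on FormSubmission stores the workflow stage as a small
# integer. A minimal sketch of the code/label mapping declared above
# (illustrative, runs outside Django):

```python
# The choices declared on FormSubmission.state in the migration above.
STATE_CHOICES = [(0, 'draft'), (1, 'submitted'), (2, 'approved'), (3, 'denied')]

code_to_label = dict(STATE_CHOICES)
label_to_code = dict((label, code) for code, label in STATE_CHOICES)

assert code_to_label[1] == 'submitted'
assert label_to_code['denied'] == 3
```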
# File: src/webpy1/src/borough/dbsqli.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import sqlite3 as sqlite
import os.path as osp
import sys
class Sqli(object):
conn = ''
cursor = ''
def __init__(self, dbname):
try:
self.conn = sqlite.connect(osp.abspath(dbname))
except Exception, what:
print what
sys.exit()
self.conn.row_factory = sqlite.Row
self.cursor = self.conn.cursor()
def createTable(self):
self.cursor.execute('''
CREATE TABLE IF NOT EXISTS [website](
[id] INTEGER PRIMARY KEY,
[siteName] TEXT,
[loginUrl] TEXT,
[loginQuery] TEXT,
[postUrl] TEXT,
[postQuery] TEXT,
UNIQUE([siteName]));
''')
print "create table website "
self.cursor.execute('''
CREATE INDEX IF NOT EXISTS [website_idx_siteName] ON [website]([siteName]);
''')
print 'create website index'
self.conn.commit()
def createTable_com(self):
self.cursor.execute('''
CREATE TABLE IF NOT EXISTS [com](
[id] INTEGER PRIMARY KEY,
[title] TEXT,
[city] TEXT,
[url] TEXT,
UNIQUE([url]));
''')
print "create table com "
self.cursor.execute('''
CREATE INDEX IF NOT EXISTS [website_idx_url] ON [com]([url]);
''')
print 'create map index'
self.conn.commit()
def createTable_58(self):
self.cursor.execute('''
CREATE TABLE IF NOT EXISTS [com](
[id] INTEGER PRIMARY KEY,
[title] TEXT,
[city] TEXT,
[url] TEXT,
UNIQUE([url]));
''')
print "create table com "
self.cursor.execute('''
CREATE INDEX IF NOT EXISTS [website_idx_url] ON [com]([url]);
''')
print 'create map index'
self.conn.commit()
def query(self, sql):
try:
self.cursor.execute(sql)
self.conn.commit()
except Exception, what:
print what
def show(self):
r = self.cursor.fetchall()
return r
def showone(self):
return self.cursor.fetchone()
def __del__(self):
self.cursor.close()
self.conn.close()
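# The Sqli class above installs sqlite3.Row as the row factory so results can
# be read by column name. A self-contained sketch of that pattern against an
# in-memory database (table layout copied from createTable_com):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row        # same row factory Sqli installs
cursor = conn.cursor()

cursor.execute("CREATE TABLE IF NOT EXISTS com ("
               "id INTEGER PRIMARY KEY, title TEXT, city TEXT, url TEXT, UNIQUE(url))")
cursor.execute("INSERT INTO com (title, city, url) VALUES (?, ?, ?)",
               ("listing", "sd", "http://example.com/1"))
conn.commit()

cursor.execute("SELECT * FROM com")
row = cursor.fetchone()
assert row["title"] == "listing"      # access by column name via sqlite3.Row
assert row["url"] == "http://example.com/1"
conn.close()
```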
# File: gate/mate_ksx3267v2.py
#!/usr/bin/env python
#
# -*- coding: utf-8 -*-
#
# Copyright (c) 2018 JiNong, Inc.
# All right reserved.
#
import struct
import time
import socket
import select
import traceback
import hashlib
import json
from enum import IntEnum
from threading import Thread, Lock
from mate import Mate, ThreadMate, DevType
from mblock import MBlock, BlkType, StatCode, ResCode, CmdCode, Observation, Request, Response, NotiCode, Notice
from pymodbus.client.sync import ModbusSerialClient
from pymodbus.client.sync import ModbusTcpClient
class NodeType(IntEnum):
SENNODE = 1
ACTNODE = 2
INTNODE = 3
NUTNODE = 4
class ProtoVer(IntEnum):
KS_X_3267_2020 = 10
KS_X_3267_2018 = 101
TTA_1 = 201
class KSX3267MateV2(ThreadMate):
_SLEEP = 0.5
_VERSION = "KSX3267_0.1"
_KEYWORDS = {"value" : (2, "float"), "status" : (1, "status"),
"opid" : (1, "short"), "state-hold-time" : (2, "int"), "ratio": (1, "short"),
"position" : (1, "short"), "remain-time" : (2, "int"),
"control": (1, "control"), "area" : (1, "short"), "alert" : (1, "alert"),
"hold-time" : (2, "int"), "operation" : (1, "operation"),
"time" : (2, "int"), "opentime" : (1, "short"), "closetime" : (1, "short"),
"EC": (2, "float"), "pH": (2, "float"), "on-sec" : (1, "short"),
"start-area" : (1, "short"), "stop-area": (1, "short"),
"epoch" : (2, "int"), "vfloat": (2, "float"), "vint" : (2, "int")}
_DEVINFOREG = 2
_DEVCODEREG = 101
def __init__(self, option, devinfo, coupleid, logger):
super(KSX3267MateV2, self).__init__(option, devinfo, coupleid, logger)
self._timeout = 3 if "timeout" not in option else option["timeout"]
self._conn = {}
self._tempthd = []
self._isdetecting = False
self._detection = {"port": [], "saddr":0, "eaddr":0, "opid":0}
#self._nodes = self._devinfo.getgw()["children"]
self._lock = Lock()
self._logger.info("KSX3267MateV2 Started.")
def detect_node(self, conn, unit, registers):
print "detect_node", unit, registers
compcode = registers[0]
nodecode = registers[2]
size = registers[4]
while True:
res = self.readregister(conn, KSX3267MateV2._DEVCODEREG, size, unit)
if res is None or res.isError():
self._logger.warn("Fail to get devices from " + str(unit) + " " + str(res))
return None
if len(res.registers) != size:
self._logger.info("retry to get data since size of data is not matched. " + str(size) + " " + str(len(res.registers)))
continue
return {"compcode" : compcode, "nodecode" : nodecode, "devcodes": res.registers}
def getdk(self, dev, idx):
dk = json.loads(dev["dk"])
return dk[idx]
def setdetection(self, flag, opid=0):
self._isdetecting = flag
self._detection["opid"] = opid
def startdetection(self, params, opid):
if self._detection["opid"] != 0:
self._logger.info("detection is processing.... so this command would be ignored.")
return ResCode.FAIL
self.setdetection(True, opid)
if params:
self._detection["saddr"] = params['saddr']
self._detection["eaddr"] = params['eaddr']
self._detection["port"] = params['port']
else:
self._detection["saddr"] = 1
self._detection["eaddr"] = 12
self._detection["port"] = None
return ResCode.OK
def readregister(self, conn, addr, count, unit):
print "....... before lock for read"
with self._lock:
time.sleep(KSX3267MateV2._SLEEP)
#mrchoi87
self._logger.info("read_holding_registers: " + str(unit) + " " + str(addr) + " " + str(count))
print "read register", unit, addr, count
try:
return conn.read_holding_registers(addr, count, unit=unit)
except Exception as ex:
self._logger.warn("fail to read holding registers. : " + str(ex))
return None
def detect(self):
detected = {}
for port, conn in self._conn.iteritems():
if self._isdetecting == False or self.isexecuting() == False:
self._logger.info("Total detection is canceled.")
break
info = self.detectone(port, conn)
detected[port] = info
self._logger.info ("finished to detect devices : " + str(detected))
noti = Notice(None, NotiCode.DETECT_FINISHED) # Detection Started
if noti:
noti.setkeyvalue("opid", self._detection["opid"])
for port, info in detected.iteritems():
noti.setcontent(port, info)
self.writecb(noti)
self.setdetection(False)
def detectone(self, port, conn):
detected = {}
if self._detection["port"] is not None and port not in self._detection["port"]:
return detected
#mrchoi87
#for unit in range(self._detection["saddr"], 12):
for unit in range(self._detection["saddr"], self._detection["eaddr"]):
if self._isdetecting == False or self.isexecuting() == False:
self._logger.info("A port " + str(port) + " detection is canceled.")
break
tempid = port + "-" + str(unit)
noti = Notice(None, NotiCode.DETECT_NODE_STARTED, devid=tempid) # Detection Started
if noti:
noti.setkeyvalue("opid", self._detection["opid"])
self.writecb(noti)
noti = None
info = None
res = None
for _ in range(3):
res = self.readregister(conn, KSX3267MateV2._DEVINFOREG, 6, unit)
if res is None or res.isError():
continue
if len(res.registers) != 6:
self._logger.info("retry to get data since size of data is not matched. 6 " + str(len(res.registers)))
continue
break
if res is None or res.isError():
noti = Notice(None, NotiCode.DETECT_NO_NODE, devid=tempid) # Detection Started
self._logger.info ("Fail to get information from a node : " + str(unit) + " " + str(res))
elif res.registers[1] in (NodeType.SENNODE, NodeType.ACTNODE, NodeType.INTNODE): # device type
if res.registers[3] == ProtoVer.KS_X_3267_2020 or res.registers[3] == ProtoVer.KS_X_3267_2018:
info = self.detect_node(conn, unit, res.registers)
self._logger.info ("Found a node : " + str(unit) + " " + str(info))
else:
noti = Notice(None, NotiCode.DETECT_UNKNOWN_PROTOCOL_VER, devid=tempid) # unknown protocol version
elif res.registers[1] == NodeType.NUTNODE:
if res.registers[3] == ProtoVer.TTA_1:
info = self.detect_node(conn, unit, res.registers)
self._logger.info ("Found a nutrient system : " + str(unit) + " " + str(info))
else:
noti = Notice(None, NotiCode.DETECT_UNKNOWN_PROTOCOL_VER, devid=tempid) # unknown protocol version
else:
noti = Notice(unit, NotiCode.DETECT_UNKNOWN_NODE, devid=tempid) # unknown device
if noti is None:
if info is None:
noti = Notice(None, NotiCode.DETECT_WRONG_DEVICE, devid=tempid) # fail to find a node
else:
noti = Notice(None, NotiCode.DETECT_NODE_DETECTED, devid=port, content={unit : info}) # found a node
detected[unit] = info
noti.setkeyvalue("opid", self._detection["opid"])
print "noti", noti.stringify()
self.writecb(noti)
time.sleep(0.1)
return detected
def canceldetection(self, params):
time.sleep(self._timeout)
noti = Notice(None, NotiCode.DETECT_CANCELED) # detection is canceled
noti.setkeyvalue("opid", self._detection["opid"])
self.writecb(noti)
self.setdetection(False)
return ResCode.OK
def _listen(self, opt):
try:
servsoc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
servsoc.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
servsoc.bind((opt['host'], opt['port']))
servsoc.listen(1)
self._logger.info("listen : " + str(opt))
executing = True
while executing:
self._logger.info("waiting a client~")
rsoc, wsoc, esoc = select.select([servsoc], [], [], 10)
for sock in rsoc:
if sock == servsoc:
clisoc, address = servsoc.accept()
self._logger.info("client connected from " + str(address))
for tmp in self._tempthd:
if tmp["port"] == opt["port"]:
conn = ModbusTcpClient(timeout=self._timeout)
conn.socket = clisoc
self._conn[opt["port"]] = conn
tmp["status"] = 10 # connected
executing = False
except Exception as ex:
servsoc.close()
for tmp in self._tempthd:
if tmp["port"] == opt["port"]:
self._logger.warn(" port [" + str(opt["port"]) + "] exception : " + str(ex))
tmp["status"] = 5 # error
    def listen(self, opt):
        # args must be a tuple; args=(opt) would unpack the dict's keys as positional arguments
        tmp = {"thd" : Thread(target=self._listen, args=(opt,)), "status": 0, "port":opt['port']}
        self._tempthd.append(tmp)
        tmp["thd"].start()
        return True
def checktempthreads(self):
for tmp in self._tempthd:
if tmp["status"] > 2:
                tmp["thd"].join()  # threading.Thread has no stop(); _listen returns on its own once status is set
def connectone(self, opt):
ret = False
conn = None
if opt['method'] == 'rtu':
conn = ModbusSerialClient(method='rtu', port=opt['port'],
timeout=self._timeout, baudrate=opt['baudrate'])
ret = conn.connect()
msg = "failed to connect with rtu"
code = NotiCode.RTU_CONNECTED if ret else NotiCode.RTU_FAILED_CONNECTION
elif opt['method'] == 'tcpc':
conn = ModbusTcpClient(opt['host'], port=opt['port'], timeout=self._timeout)
ret = conn.connect()
msg = "failed to connect with tcp"
code = NotiCode.TCP_CONNECTED if ret else NotiCode.RTU_FAILED_CONNECTION
elif opt['method'] == 'tcpcs':
self._logger.info("It would wait for a while to connect a client.")
ret = self.listen(opt)
msg = "failed to connect with tcp"
code = NotiCode.TCP_WAITING if ret else NotiCode.RTU_FAILED_CONNECTION
conn = None
else:
msg = "It's a wrong connection method. " + str(opt['method'])
if ret == False:
self._logger.warn(msg)
noti = Notice(None, NotiCode.RTU_FAILED_CONNECTION) # detection is canceled
else:
noti = Notice(None, NotiCode.RTU_CONNECTED) # detection is canceled
self.writecb(noti)
return conn
def connect(self):
ret = False
for opt in self._option['conn']:
conn = self.connectone(opt)
            if conn:
                self._conn[opt["port"][8:]] = conn
                ret = True
super(KSX3267MateV2, self).connect()
return ret
def closeone(self, port):
self._conn[port].close()
def close(self):
for port in self._conn.keys():
self.closeone(port)
super(KSX3267MateV2, self).close()
def readmsg(self):
self._msgq = []
for gw in self._devinfo:
for nd in gw["children"]:
self._msgq.append(self.readsensornodeinfo(nd))
if self._isdetecting:
self.detect()
self.checktempthreads()
def processrequest(self, dev, request, node):
gw = self._devinfo.findgateway(request.getnodeid())
unit = self.getdk(dev, 0)
operation = request.getcommand()
params = request.getparams()
params["operation"] = operation # need to convert by case
params["opid"] = request.getopid() # need to convert by case
properparams = CmdCode.getparams(operation) + ["operation", "opid"]
registers = []
for key in self.getdk(dev, 4):
if key not in properparams:
# key param is not used for this operation
# However, the register should be filled.
val = 0
elif key in params:
val = params[key]
else:
self._logger.warn("Wrong Keyword : " + str(key))
return ResCode.FAIL_WRONG_KEYWORD
if KSX3267MateV2._KEYWORDS[key][0] == 1:
registers.append(val)
elif KSX3267MateV2._KEYWORDS[key][1] == "int":
registers.extend(struct.unpack('HH', struct.pack('i', val)))
elif KSX3267MateV2._KEYWORDS[key][1] == "float":
registers.extend(struct.unpack('HH', struct.pack('f', val)))
#else:
# self._logger.warn("This param is needed for this operation. " + str(params['operation']) + ", " + str(key))
# return ResCode.FAIL_WRONG_KEYWORD
        print "....... before lock for write"
with self._lock:
time.sleep(KSX3267MateV2._SLEEP)
print "....... lock for write", self.getdk(dev, 3), registers
res = self._conn[gw["dk"]].write_registers(self.getdk(dev, 3), registers, unit=unit)
if res.isError():
self._logger.warn("Fail to write a request to dev." + str(dev) + "," + str(res) + ":" + str(request))
return ResCode.FAIL_TO_WRITE
msg = self.readactinfo(node, dev)
if msg is None:
self._logger.warn("Fail to read dev status.")
else:
self.sendnoticeforactuatorstatus(msg)
return ResCode.OK
def writeblk(self, blk):
print "received message", blk.getdevid(), self._coupleid
if BlkType.isrequest(blk.gettype()) is False:
self._logger.warn("The message is not request. " + str(blk.gettype()))
return False
response = Response(blk)
cmd = blk.getcommand()
nd = self._devinfo.finddevbyid(blk.getnodeid())
dev = self._devinfo.finddevbyid(blk.getdevid())
if blk.getdevid() == self._coupleid:
params = blk.getparams()
if cmd == CmdCode.DETECT_DEVICE:
print "detect device"
code = self.startdetection(params, blk.getopid())
elif cmd == CmdCode.CANCEL_DETECT:
print "cancel to detect device"
code = self.canceldetection(params)
else:
self._logger.warn("Unknown Error. " + str(blk) + ", " + str(dev))
code = ResCode.FAIL
elif dev is None:
self._logger.warn("There is no device. " + str(blk.getdevid()))
code = ResCode.FAIL_NO_DEVICE
elif DevType.ispropercommand(dev['dt'], cmd) is False:
self._logger.warn("The request is not proper. " + str(cmd) + " " + str(dev['dt']))
code = ResCode.FAIL_NOT_PROPER_COMMAND
elif DevType.isactuator(dev['dt']) or DevType.isnode(dev['dt']):
# modbus
code = self.processrequest(dev, blk, nd)
self._logger.info("Actuator processed : " + str(code))
elif DevType.isgateway(dev['dt']):
self._logger.info("Gateway does not receive a request")
code = ResCode.FAIL
else:
self._logger.warn("Unknown Error. " + str(blk) + ", " + str(dev))
code = ResCode.FAIL
response.setresult(code)
self._logger.info("write response: " + str(response))
self.writecb(response)
return True #if code == ResCode.OK else False
def parseregisters(self, names, values):
idx = 0
ret = {}
for nm in names:
(size, vtype) = KSX3267MateV2._KEYWORDS[nm]
if vtype == "float":
val = struct.unpack('f', struct.pack('HH', values[idx], values[idx+1]))[0]
elif vtype == "int":
val = struct.unpack('i', struct.pack('HH', values[idx], values[idx+1]))[0]
else:
val = values[idx]
ret[nm] = val
idx = idx + size
print "parsed", ret
return ret
def readinfofromdev(self, conn, dev):
size = self.getsize(self.getdk(dev, 2))
#for _ in range(3):
res = self.readregister(conn, self.getdk(dev, 1), size, self.getdk(dev, 0))
if res is None:
self._logger.warn("fail to get status from " + str(dev['dk']))
# break
elif res.isError():
self._logger.info("retry to get status from " + str(dev['dk']) + " " + str(res))
# continue
else:
if len(res.registers) == size:
return self.parseregisters(self.getdk(dev, 2), res.registers)
else:
self._logger.info("retry to get data since size of data is not matched. " + str(size) + " " + str(len(res.registers)))
return None
def readnodeinfo(self, node):
ret = {"id" : node["id"], "sen" : {}, "act" : {}, "nd" : {"status":StatCode.ERROR.value}}
gw = self._devinfo.findgateway(node["id"])
conn = self._conn[gw["dk"]]
ret["conn"] = conn
info = self.readinfofromdev(conn, node)
if info:
ret["nd"] = info
else:
self._logger.warn("fail to read node info : " + str(node))
return ret
def readsensornodeinfo(self, node):
ret = self.readnodeinfo(node)
for dev in node['children']:
if DevType.issensor(dev["dt"]):
info = self.readinfofromdev(ret["conn"], dev)
if info:
ret["sen"][dev["id"]] = info
#else:
# self._logger.warn("fail to read sensor info : " + str(dev) + " however continue to read other device")
return ret
def readactnodeinfo(self, node):
ret = self.readnodeinfo(node)
for dev in node['children']:
if DevType.issensor(dev["dt"]) == False:
info = self.readinfofromdev(ret["conn"], dev)
if info:
ret["act"][dev["id"]] = info
else:
self._logger.warn("fail to read actuator info : " + str(dev) + " however continue to read other device")
return ret
def readactinfo(self, node, act):
ret = self.readnodeinfo(node)
info = self.readinfofromdev(ret["conn"], act)
if info:
ret["act"][act["id"]] = info
else:
self._logger.warn("fail to read actuator info : " + str(act) + " however continue to read other device")
return ret
def sendobs(self):
for msg in self._msgq:
if msg is None:
continue
self.sendobservation(msg)
def sendnoti(self):
for gw in self._devinfo:
for node in gw["children"]:
ret = self.readnodeinfo(node)
i = 1
for dev in node['children']:
if DevType.issensor(dev["dt"]) == False:
info = self.readinfofromdev(ret["conn"], dev)
if info:
ret["act"][dev["id"]] = info
i = i + 1
if i % 3 == 0:
self.sendnoticeforactuatorstatus(ret)
ret["act"] = {}
self.sendnoticeforactuatorstatus(ret)
def sendobservation(self, ndinfo):
if StatCode.has_value(ndinfo["nd"]["status"]) == False:
ndinfo["nd"]["status"] = StatCode.ERROR.value
obsblk = Observation(ndinfo["id"])
obsblk.setobservation(ndinfo["id"], 0, StatCode(ndinfo["nd"]["status"]))
for devid, info in ndinfo["sen"].iteritems():
if StatCode.has_value(info["status"]) == False:
info["status"] = StatCode.ERROR.value
obsblk.setobservation(devid, info["value"], StatCode(info["status"]))
# do not send observation for actuator
#for devid, info in ndinfo["act"].iteritems():
# if StatCode.has_value(info["status"]) == False:
# info["status"] = StatCode.ERROR.value
# obsblk.setobservation(devid, 0, StatCode(info["status"]))
self.writecb(obsblk)
def sendnoticeforactuatorstatus(self, ndinfo):
blk = Notice(ndinfo["id"], NotiCode.ACTUATOR_STATUS, ndinfo["id"], ndinfo["nd"])
for devid, info in ndinfo["act"].iteritems():
blk.setcontent(devid, info)
self.writecb(blk)
def start(self, writecb):
super(KSX3267MateV2, self).start(writecb)
return True
def stop(self):
super(KSX3267MateV2, self).stop()
return True
def getsize(self, lst):
        size = 0
for k in lst:
if k in KSX3267MateV2._KEYWORDS:
size = size + KSX3267MateV2._KEYWORDS[k][0]
else:
self._logger.warn("wrong keyword : " + str(k))
return -1
return size
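# processrequest and parseregisters above shuttle multi-word values through
# 16-bit Modbus holding registers with struct. A self-contained sketch of that
# register codec plus the getsize bookkeeping (keyword map trimmed to three
# entries from _KEYWORDS):

```python
import struct

# A few entries from the _KEYWORDS map: (register count, wire type).
KEYWORDS = {"status": (1, "status"), "value": (2, "float"), "epoch": (2, "int")}

def pack_value(vtype, val):
    # One 16-bit register for 1-word values; two registers for float/int.
    if vtype == "float":
        return list(struct.unpack('HH', struct.pack('f', val)))
    if vtype == "int":
        return list(struct.unpack('HH', struct.pack('i', val)))
    return [val]

def parse_registers(names, values):
    idx, ret = 0, {}
    for nm in names:
        size, vtype = KEYWORDS[nm]
        if vtype == "float":
            ret[nm] = struct.unpack('f', struct.pack('HH', values[idx], values[idx + 1]))[0]
        elif vtype == "int":
            ret[nm] = struct.unpack('i', struct.pack('HH', values[idx], values[idx + 1]))[0]
        else:
            ret[nm] = values[idx]
        idx += size
    return ret

regs = pack_value("status", 1) + pack_value("float", 23.5) + pack_value("int", 1600000000)
parsed = parse_registers(["status", "value", "epoch"], regs)
assert parsed["status"] == 1
assert abs(parsed["value"] - 23.5) < 1e-6
assert parsed["epoch"] == 1600000000
# getsize's logic: summed register counts match the payload length.
assert sum(KEYWORDS[k][0] for k in ["status", "value", "epoch"]) == len(regs)
```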
if __name__ == "__main__":
isnutri = False
opt = {
'conn' : [{
'method': 'rtu',
'port' : '/dev/ttyJND2',
'baudrate' : 9600,
'timeout': 5
}]
}
nutriinfo = [{
"id" : "1", "dk" : "", "dt": "gw", "children" : [{
"id" : "101", "dk" : '[1,40201,["status"],45001,["operation","opid"]]', "dt": "nd", "children" : [
{"id" : "102", "dk" : '[1,40211,["control","status","area","alert","opid"],45001,["operation", "opid", "control","EC","pH", "start-area", "stop-area", "on-sec"]]', "dt": "nutrient-supply/level1"},
{"id" : "103", "dk" : '[1,40221,["value","status"]]', "dt": "sen"},
{"id" : "104", "dk" : '[1,40231,["value","status"]]', "dt": "sen"},
{"id" : "105", "dk" : '[1,40241,["value","status"]]', "dt": "sen"},
{"id" : "106", "dk" : '[1,40251,["value","status"]]', "dt": "sen"},
{"id" : "107", "dk" : '[1,40261,["value","status"]]', "dt": "sen"},
{"id" : "109", "dk" : '[1,40271,["value","status"]]', "dt": "sen"},
{"id" : "110", "dk" : '[1,40281,["value","status"]]', "dt": "sen"},
{"id" : "111", "dk" : '[1,40291,["value","status"]]', "dt": "sen"},
{"id" : "112", "dk" : '[1,40301,["value","status"]]', "dt": "sen"},
{"id" : "113", "dk" : '[1,40311,["value","status"]]', "dt": "sen"}
]}
]}
]
devinfo = [{
"id" : "1", "dk" : "JND2", "dt": "gw", "children" : [
# {
# "id" : "101", "dk" : '[1,201,["status"],301,["operation","opid"]]', "dt": "nd", "children" : [
#{"id" : "102", "dk" : '[1,210,["value","status"]]', "dt": "sen"},
#{"id" : "103", "dk" : '[1,220,["value","status"]]', "dt": "sen"}
# "id" : "101", "dk" : '[1,40201,["status"],45001,["operation","opid"]]', "dt": "nd", "children" : [
#{"id" : "102", "dk" : '[1,41010,["value","status"]]', "dt": "sen"},
#{"id" : "103", "dk" : '[1,41020,["value","status"]]', "dt": "sen"}
# {"id" : "102", "dk" : '[1,40202,["value","status"]]', "dt": "sen"},
# {"id" : "103", "dk" : '[1,40205,["value","status"]]', "dt": "sen"},
#{"id" : "104", "dk" : '[1,40208,["value","status"]]', "dt": "sen"},
# {"id" : "105", "dk" : '[1,40211,["value","status"]]', "dt": "sen"},
#{"id" : "106", "dk" : '[1,40251,["value","status"]]', "dt": "sen"},
#{"id" : "107", "dk" : '[1,40261,["value","status"]]', "dt": "sen"},
#{"id" : "108", "dk" : '[1,40271,["value","status"]]', "dt": "sen"},
#{"id" : "109", "dk" : '[1,40281,["value","status"]]', "dt": "sen"},
#{"id" : "110", "dk" : '[1,40291,["value","status"]]', "dt": "sen"}
# ]
# }
]
}]
"""
}, {
"id" : "201", "dk" : '[2,40201,["status"],45001,["operation","opid"]]', "dt": "nd", "children" : [
{"id" : "202", "dk" : '[2,40202,["opid","status","state-hold-time","remain-time"],40206,["operation","opid","time"]]', "dt": "act/retractable/level1"},
{"id" : "202", "dk" : '[2,40209,["opid","status","state-hold-time","remain-time"],40213,["operation","opid","time"]]', "dt": "act/retractable/level1"},
{"id" : "203", "dk" : '[2,40216,["value","status"]]', "dt": "sen"},
{"id" : "204", "dk" : '[2,40219,["value","status"]]', "dt": "sen"},
#{"id" : "203", "dk" : (2,40221,["opid","status"],45021,["operation","opid"]), "dt": "act/switch/level0"},
#{"id" : "204", "dk" : (2,40231,["opid","status"],45031,["operation","opid"]), "dt": "act/switch/level0"},
#{"id" : "205", "dk" : (2,40241,["opid","status"],45041,["operation","opid"]), "dt": "act/switch/level0"},
#{"id" : "206", "dk" : (2,40251,["opid","status"],45051,["operation","opid"]), "dt": "act/switch/level0"},
#{"id" : "207", "dk" : (2,40261,["opid","status"],45061,["operation","opid"]), "dt": "act/switch/level0"},
#{"id" : "208", "dk" : (2,40271,["opid","status"],45071,["operation","opid"]), "dt": "act/switch/level0"},
#{"id" : "209", "dk" : (2,40281,["opid","status"],45081,["operation","opid"]), "dt": "act/switch/level0"}
]
}, {
"id" : "301", "dk" : (3,40201,["opid","status"],45001,["operation","opid"]), "dt": "nd", "children" : [
{"id" : "302", "dk" : (3,40211,["opid","status"],45011,["operation","opid"]), "dt": "act/retractable/level0"},
{"id" : "303", "dk" : (3,40221,["opid","status"],45021,["operation","opid"]), "dt": "act/retractable/level0"},
{"id" : "304", "dk" : (3,40231,["opid","status"],45031,["operation","opid"]), "dt": "act/retractable/level0"},
{"id" : "305", "dk" : (3,40241,["opid","status"],45041,["operation","opid"]), "dt": "act/retractable/level0"}
]
}]
}]
"""
if isnutri:
kdmate = KSX3267MateV2(opt, nutriinfo, "1", None)
else:
kdmate = KSX3267MateV2(opt, devinfo, "1", None)
    mate = Mate({}, [], "1", None)
    kdmate.start(mate.writeblk)
print "mate started"
time.sleep(10)
req = Request(None)
req.setcommand("1", CmdCode.DETECT_DEVICE, None)
print "=======================================#1"
kdmate.writeblk(req)
print "=======================================#1"
"""
time.sleep(1)
req = Request(None)
req.setcommand("1", CmdCode.CANCEL_DETECT, {})
print "=======================================#2"
kdmate.writeblk(req)
print "=======================================#2"
time.sleep(1)
req = Request(None)
req.setcommand("1", CmdCode.DETECT_DEVICE, None)
print "=======================================#3"
kdmate.writeblk(req)
print "=======================================#3"
time.sleep(1)
req = Request(None)
req.setcommand("1", CmdCode.CANCEL_DETECT, {})
print "=======================================#4"
kdmate.writeblk(req)
print "=======================================#4"
time.sleep(10)
req = Request(201)
req.setcommand(202, CmdCode.OPEN, {})
kdmate.writeblk(req)
time.sleep(5)
req = Request(201)
req.setcommand(202, CmdCode.OFF, {})
kdmate.writeblk(req)
time.sleep(10)
req = Request(201)
req.setcommand(202, CmdCode.TIMED_OPEN, {"time":10})
kdmate.writeblk(req)
time.sleep(15)
req = Request(201)
req.setcommand(202, CmdCode.TIMED_CLOSE, {"time":10})
kdmate.writeblk(req)
time.sleep(5)
req = Request(201)
req.setcommand(202, CmdCode.OFF, {})
kdmate.writeblk(req)
"""
time.sleep(30)
kdmate.stop()
print "mate stopped"
# ---- file: evsim/assessor.py (repo: cbchoi/nppsim, license: MIT) ----
from evsim.system_simulator import SystemSimulator
from evsim.behavior_model_executor import BehaviorModelExecutor
from evsim.system_message import SysMessage
from evsim.definition import *
import os
import subprocess as sp
class Assessor(BehaviorModelExecutor):
def __init__(self, instance_time, destruct_time, name, engine_name):
BehaviorModelExecutor.__init__(self, instance_time, destruct_time, name, engine_name)
# Open CSV
self.init_state("IDLE")
self.insert_state("IDLE", Infinite)
self.insert_state("MOVE", 1)
self.insert_input_port("assess")
self.insert_output_port("done")
def ext_trans(self,port, msg):
data = msg.retrieve()
#print("Assessor")
#print(str(datetime.datetime.now()) + " " + str(data[0]))
#temp = "[%f] %s" % (SystemSimulator().get_engine(self.engine_name).get_global_time(), str(data[0]))
#print(temp)
def output(self):
#temp = "[%f] %s" % (SystemSimulator().get_engine(self.engine_name).get_global_time(), "Human Receiver Object: Move")
#print(temp)
return None
def int_trans(self):
        self._cur_state = "MOVE"

# ---- file: mvp/migrations/0004_auto_20201127_0649.py (repo: Wastecoinng/mvp_beta, license: MIT) ----
# Generated by Django 2.2.13 on 2020-11-27 05:49
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('mvp', '0003_hublocation'),
]
operations = [
migrations.RemoveField(
model_name='hublocation',
name='longitude',
),
migrations.AddField(
model_name='hublocation',
name='longi',
field=models.TextField(default=654433, max_length=90, unique=True, verbose_name='Longitude'),
preserve_default=False,
),
]
# ---- file: adminapp/migrations/0012_auto_20210714_1155.py (repo: mofresh27/MuseumExperience-Group2-Python-BE-1, license: MIT) ----
# Generated by Django 3.2.4 on 2021-07-14 11:55
from django.db import migrations, models
import uuid
class Migration(migrations.Migration):
dependencies = [
('adminapp', '0011_faq'),
]
operations = [
migrations.AddField(
model_name='faq',
name='uuid',
field=models.UUIDField(blank=True, default=uuid.uuid4, null=True),
),
migrations.AlterField(
model_name='faq',
name='answer',
field=models.TextField(blank=True, default=None, null=True),
),
migrations.AlterField(
model_name='faq',
name='question',
field=models.TextField(blank=True, default=None, null=True),
),
]
# ---- file: tests/gen_test.py (repo: tinylambda/tornadio2, license: Apache-2.0) ----
# -*- coding: utf-8 -*-
"""
tornadio2.tests.gen
~~~~~~~~~~~~~~~~~~~
:copyright: (c) 2011 by the Serge S. Koval, see AUTHORS for more details.
:license: Apache, see LICENSE for more details.
"""
from collections import deque
from nose.tools import eq_
from tornadio2 import gen
_queue = None
def init_environment():
global _queue
_queue = deque()
def run_sync(test, callback):
callback(test)
def queue_async(test, callback):
global _queue
_queue.append((callback, test))
def step_async():
callback = _queue.popleft()
callback[0](callback[1])
def run_async():
global _queue
while True:
try:
step_async()
except IndexError:
break
def run_async_oor():
global _queue
while True:
try:
callback = _queue.pop()
callback[0](callback[1])
except IndexError:
break
class Dummy():
def __init__(self, queue_type):
self.v = None
self.queue_type = queue_type
@gen.sync_engine
def test(self, value):
self.v = yield gen.Task(self.queue_type, value)
class DummyList():
def __init__(self, queue_type):
self.v = []
self.queue_type = queue_type
@gen.sync_engine
def test(self, value):
self.v.append((yield gen.Task(self.queue_type, value)))
class DummyListOutOfOrder():
def __init__(self, queue_type):
self.v = []
self.queue_type = queue_type
@gen.engine
def test(self, value):
self.v.append((yield gen.Task(self.queue_type, value)))
class DummyLoop():
def __init__(self, queue_type):
self.v = 0
self.queue_type = queue_type
@gen.sync_engine
def test(self, value):
for n in range(2):
self.v += (yield gen.Task(self.queue_type, value))
def test():
init_environment()
dummy = Dummy(run_sync)
dummy.test('test')
eq_(dummy.v, 'test')
def test_async():
init_environment()
dummy = Dummy(queue_async)
dummy.test('test')
run_async()
# Verify value
eq_(dummy.v, 'test')
def test_sync_queue():
init_environment()
dummy = DummyList(queue_async)
dummy.test('1')
dummy.test('2')
dummy.test('3')
run_async()
# Verify value
eq_(dummy.v, ['1', '2', '3'])
def test_sync_queue_oor():
init_environment()
dummy = DummyList(queue_async)
dummy.test('1')
dummy.test('2')
dummy.test('3')
run_async_oor()
# Verify value
eq_(dummy.v, ['1', '2', '3'])
def test_async_queue_oor():
init_environment()
dummy = DummyListOutOfOrder(queue_async)
dummy.test('1')
dummy.test('2')
dummy.test('3')
run_async_oor()
# Verify value
eq_(dummy.v, ['3', '2', '1'])
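The tests above exercise tornadio2's generator-based `gen.Task` / `gen.engine` machinery: a generator yields a task, the engine runs it with a callback, and the callback's value becomes the result of the `yield`. A minimal, dependency-free sketch of that trampoline idea — this is an illustration of the pattern, not tornadio2's actual implementation:

```python
def engine(func):
    """Drive a generator that yields (async_fn, arg) pairs.

    Each yielded async_fn is invoked as async_fn(arg, callback); whatever value
    the callback receives is sent back into the generator as the yield result.
    """
    def wrapper(*args, **kwargs):
        gen_obj = func(*args, **kwargs)

        def step(value=None):
            try:
                # next(g) is equivalent to g.send(None), so this resumes
                # correctly both on the first call and after callbacks.
                task = gen_obj.send(value) if value is not None else next(gen_obj)
            except StopIteration:
                return
            async_fn, arg = task
            async_fn(arg, step)  # resume the generator when the callback fires
        step()
    return wrapper

results = []

def run_sync(value, callback):
    callback(value)  # an "async" operation that happens to complete immediately

@engine
def job(x):
    doubled = yield (run_sync, x * 2)
    results.append(doubled)

job(21)  # results becomes [42]
```

Deferring the callback (as `queue_async` does above) instead of calling it inline is what produces the out-of-order completions that `DummyListOutOfOrder` checks for.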
# ---- file: unittests/tools/test_intsights_parser.py (repo: M-Rod101/django-DefectDojo, license: BSD-3-Clause) ----
from ..dojo_test_case import DojoTestCase
from dojo.models import Test
from dojo.tools.intsights.parser import IntSightsParser
class TestIntSightsParser(DojoTestCase):
def test_intsights_parser_with_one_critical_vuln_has_one_findings_json(
self):
testfile = open("unittests/scans/intsights/intsights_one_vul.json")
parser = IntSightsParser()
findings = parser.get_findings(testfile, Test())
testfile.close()
self.assertEqual(1, len(findings))
finding = list(findings)[0]
self.assertEqual(
'5c80dbf83b4a3900078b6be6',
finding.unique_id_from_tool)
self.assertEqual(
'HTTP headers weakness in initech.com web server',
finding.title)
        self.assertEqual('Critical', finding.severity)
        self.assertEqual(
            "https://dashboard.intsights.com/#/threat-command/alerts?search=5c80dbf83b4a3900078b6be6",
            finding.references)
def test_intsights_parser_with_one_critical_vuln_has_one_findings_csv(
self):
testfile = open("unittests/scans/intsights/intsights_one_vuln.csv")
parser = IntSightsParser()
findings = parser.get_findings(testfile, Test())
testfile.close()
self.assertEqual(1, len(findings))
finding = list(findings)[0]
self.assertEqual(
"mn7xy83finmmth4ja363rci9",
finding.unique_id_from_tool)
self.assertEqual(
"HTTP headers weakness in company-domain.com web server",
finding.title)
def test_intsights_parser_with_many_vuln_has_many_findings_json(self):
testfile = open("unittests/scans/intsights/intsights_many_vul.json")
parser = IntSightsParser()
findings = parser.get_findings(testfile, Test())
testfile.close()
self.assertEqual(3, len(findings))
def test_intsights_parser_with_many_vuln_has_many_findings_csv(self):
testfile = open("unittests/scans/intsights/intsights_many_vuln.csv")
parser = IntSightsParser()
findings = parser.get_findings(testfile, Test())
testfile.close()
self.assertEqual(9, len(findings))
def test_intsights_parser_invalid_text_with_error_csv(self):
with self.assertRaises(ValueError):
testfile = open(
"unittests/scans/intsights/intsights_invalid_file.txt")
parser = IntSightsParser()
findings = parser.get_findings(testfile, Test())
# ---- file: gullveig/web/__init__.py (repo: Addvilz/gullveig, license: Apache-2.0) ----
import logging
from gullveig import bootstrap_default_logger
# Configure default logging
def _configure_default_web_logger():
logger = logging.getLogger('gullveig-web')
bootstrap_default_logger(logger)
api_logger = logging.getLogger('gullveig-api')
bootstrap_default_logger(api_logger)
aio_logger = logging.getLogger('aiohttp.server')
bootstrap_default_logger(aio_logger)
_configure_default_web_logger()
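The bootstrap above fetches three named loggers and hands each to `bootstrap_default_logger` from the `gullveig` package. That helper's implementation is not shown here; a typical bootstrap of this kind attaches one stream handler with a uniform format and disables propagation, sketched below (the handler, format string, and logger name are illustrative assumptions, not gullveig's actual configuration):

```python
import logging
import sys

def bootstrap_default_logger(logger, level=logging.INFO):
    """Attach a single stream handler with a uniform format (illustrative only)."""
    handler = logging.StreamHandler(sys.stderr)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(level)
    logger.propagate = False  # avoid duplicate lines via the root logger

demo_logger = logging.getLogger("demo-web")
bootstrap_default_logger(demo_logger)
```

Setting `propagate = False` matters when bootstrapping several named loggers (as above for `gullveig-web`, `gullveig-api`, and `aiohttp.server`): otherwise each record would also bubble up to the root logger's handlers and print twice.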
# ---- file: scalability/tests/test_misc.py (repo: ggreif/ic, license: Apache-2.0) ----
import unittest
from unittest import TestCase
from misc import verify
class TestVerify(TestCase):
"""Tests misc.py verifies function."""
def test_verify__with_zero_threshold_and_expected_succeeds(self):
"""Test passes when expected rate, actual rate and threshold are all zero."""
result = verify(metric="Query failure rate", actual=0.0, expected=0.0, threshold=0.0)
self.assertEqual(result, 0)
def test_verify__fails_when_positive_delta_is_larger_than_postive_threshold(self):
"""Test fails when positive delta between actual rate and expected rate exceeds positive threshold."""
result = verify(metric="Update latency", actual=200, expected=100, threshold=0.1)
self.assertEqual(result, 1)
def test_verify__fails_when_negative_delta_is_smaller_than_negative_threshold(self):
"""Test fails when negative delta between actual rate and expected rate exceeds negative threshold."""
result = verify(metric="Update latency", actual=50, expected=100, threshold=-0.01)
self.assertEqual(result, 1)
def test_verify__fails_when_negative_delta_and_positive_threshold(self):
"""Test fails when delta between actual rate and expected rate exceeds threshold."""
result = verify(metric="Update latency", actual=50, expected=100, threshold=0.01)
self.assertEqual(result, 0)
if __name__ == "__main__":
unittest.main()
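Read together, these tests pin down the contract of `misc.verify`: it returns 0 on success and 1 on failure; a non-negative threshold means "fail when the relative delta rises above it", and a negative threshold means "fail when the relative delta falls below it". One implementation consistent with all four tests — a sketch inferred from the test cases, not necessarily the real `misc.verify`:

```python
def verify(metric, actual, expected, threshold):
    """Return 0 if actual is within threshold of expected, else 1 (sketch)."""
    if expected == 0.0:
        delta = actual  # avoid dividing by zero; deviation counts directly
    else:
        delta = (actual - expected) / float(expected)
    if threshold >= 0:
        failed = delta > threshold   # only penalize regressions upward
    else:
        failed = delta < threshold   # only penalize regressions downward
    return 1 if failed else 0
```

For example, `verify("Update latency", 50, 100, 0.01)` passes (a latency drop is an improvement, and the positive threshold only guards the upward direction), matching the last test above.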
# ---- file: var/spack/repos/builtin/packages/aspell/package.py (repo: jeanbez/spack, licenses: ECL-2.0, Apache-2.0, MIT-0, MIT) ----
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
# See also: AspellDictPackage
class Aspell(AutotoolsPackage, GNUMirrorPackage):
"""GNU Aspell is a Free and Open Source spell checker designed to
eventually replace Ispell."""
homepage = "http://aspell.net/"
gnu_mirror_path = "aspell/aspell-0.60.6.1.tar.gz"
extendable = True # support activating dictionaries
version('0.60.6.1', sha256='f52583a83a63633701c5f71db3dc40aab87b7f76b29723aeb27941eff42df6e1')
patch('fix_cpp.patch')
patch('issue-519.patch', when='@:0.60.6.1')
# ---- file: web/app/forms.py (repo: Devidence7/Break, license: MIT) ----
from flask_wtf import FlaskForm
from wtforms import Form, StringField, PasswordField, BooleanField, SubmitField, IntegerField, validators, FileField, \
MultipleFileField, SelectField, RadioField, HiddenField, DecimalField, TextAreaField
from wtforms.fields.html5 import DateField
from wtforms.validators import DataRequired
# Structure of the Login form
class LoginForm(Form):
email = StringField('Email', [
validators.DataRequired(message='Es necesario introducir un email')])
password = PasswordField('Contraseña', [
validators.DataRequired(message='Es necesario introducir una contraseña')])
remember_me = BooleanField('Recuerdame')
submit = SubmitField('Iniciar Sesión')
# Structure of the Register form
class RegisterForm(Form):
name = StringField('Nombre', [
validators.DataRequired(message='Es necesario introducir un nombre'),
validators.Length(min=4, max=50, message='El tamaño máximo del nombre son 50 carácteres')])
lastname = StringField('Apellidos', [
validators.DataRequired(message='Es necesario introducir apellidos'),
validators.Length(min=4, max=50, message='El tamaño máximo del nombre son 50 carácteres')])
# username = StringField('Username', [
# validators.Length(min=4, max=25, message='El nombre de usuario debe tener entre 4 y 25 carácteres')])
email = StringField('Email', [
validators.DataRequired(message='Es necesario introducir un email'),
validators.Length(min=1, max=50, message='El email no puede contener más de 50 carácteres')])
password = PasswordField('Contraseña', [
validators.DataRequired(message='Es necesario una contraseña'),
validators.Length(min=8, message='La contraseña debe tener al menos 8 caracteres')
])
confirm = PasswordField('Confirmar Contraseña', [
validators.EqualTo('password', message='Las contraseñas no coinciden')
])
# Structure of the Login form
class RestorePasswordForm(Form):
email = StringField('Email', [
validators.DataRequired(message='Es necesario introducir un email')])
submit = SubmitField("Correo de Recuperación")
class EditProfile(FlaskForm):
name = StringField('Nombre', [
validators.DataRequired(message='Es necesario introducir un nombre'),
validators.Length(min=4, max=50, message='El tamaño máximo del nombre son 50 carácteres')])
lastname = StringField('Apellidos', [
validators.DataRequired(message='Es necesario introducir apellidos'),
validators.Length(min=4, max=50, message='El tamaño máximo del nombre son 50 carácteres')])
gender = RadioField('Género', choices = [('hombre','Hombre'),('mujer','Mujer')])
submit = SubmitField('Guardar cambios')
class EditLocation(FlaskForm):
lat = HiddenField('Latitud', [
validators.DataRequired(message='No se ha podido obtener la nueva localización')
])
lng = HiddenField('Longitud', [
validators.DataRequired(message='No se ha podido obtener la nueva localización')
])
submit = SubmitField('Establecer ubicación')
class EditPassword(FlaskForm):
old = PasswordField('Contraseña Anterior', [
validators.DataRequired(message='Es necesario introducir una contraseña')
])
password = PasswordField('Eliga una contraseña', [
validators.DataRequired(message='Es necesario introducir una contraseña'),
validators.Length(min=8, message='La contraseña debe tener al menos 8 caracteres')
])
confirm = PasswordField('Confirme la contraseña', [
validators.EqualTo('password', message='Las contraseñas no coinciden')
])
submit = SubmitField('Cambiar contraseña')
class EditEmail(FlaskForm):
email = StringField('Correo electrónico', [
validators.DataRequired(message='Es necesario introducir una dirección de correo'),
validators.Length(min=1, max=50, message='El correo no puede contener más de 50 carácteres')])
confirm = StringField('Confirmar correo electrónico', [
validators.EqualTo('email', message='Los correos no coinciden')
])
submit = SubmitField('Cambiar correo')
class EditPicture(FlaskForm):
picture = FileField('Imagen de perfil')
submit = SubmitField('Establecer imagen')
delete = SubmitField('Eliminar imagen')
class DeleteAccount(FlaskForm):
delete = SubmitField("Eliminar cuenta")
# Structure of the Subir Anuncio form
class SubirAnuncioForm(FlaskForm):
# pictures = HiddenField("Imágenes")
# mimes = HiddenField("Formatos de imagen")
name = StringField('Nombre del producto', [
validators.DataRequired(message='Es necesario introducir un nombre de producto'),
validators.Length(min=1, max=50, message='El tamaño máximo del nombre del producto son 50 carácteres')])
price = DecimalField('Precio (€)', [
validators.DataRequired(message='Es necesario introducir un precio'),
validators.NumberRange(min=0, max=1000000, message='El precio intoducido no es válido (de 0 € a 999.999,99 €)')])
category = SelectField('Categoría',
choices = [
('Automoción', 'Automoción'),
('Informática', 'Informática'),
('Moda', 'Moda'),
('Deporte y ocio', 'Deporte y ocio'),
('Videojuegos', 'Videojuegos'),
('Libros y música', 'Libros y música'),
('Hogar y jardín', 'Hogar y jardín'),
('Foto y audio', 'Foto y audio')
], validators = [
validators.DataRequired(message='Es necesario seleccionar una categoría') ])
description = TextAreaField('Descripción', [
validators.DataRequired(message='Es necesario escribir una descripción')])
lat = HiddenField('Latitud')
lng = HiddenField('Longitud')
enddate = DateField('End', format = '%Y-%m-%d', description = 'Time that the event will occur',
validators= [validators.Optional()] )
submit = SubmitField('Publicar')
class ProductSearch(Form):
categories = ['Automoción', 'Informática', 'Moda', 'Deporte y ocio', 'Videojuegos', 'Libros y música', 'Hogar y jardín', 'Foto y audio']
category = SelectField('Categoría',
choices = [
('Automoción', 'Automoción'),
('Informática', 'Informática'),
('Moda', 'Moda'),
('Deporte y ocio', 'Deporte y ocio'),
('Videojuegos', 'Videojuegos'),
('Libros y música', 'Libros y música'),
('Hogar y jardín', 'Hogar y jardín'),
('Foto y audio', 'Foto y audio')
])
estados = [('en venta', 'En Venta'), ('vendido', 'Vendido')]
resultadosporpag = ['15', '30', '45', '60', '75', '90']
ordenacionlist = [('published ASC', 'Fecha (Más viejos primero)'), ('published DESC', 'Fecha (Más nuevos primero)'), ('distance DESC', 'Distancia Descendente'), ('distance ASC', 'Distancia Ascendente'), ('price ASC', 'Precio Ascendente'), ('price DESC', 'Precio Descendente'), ('views DESC', 'Popularidad descendente')]
status = SelectField('Estado',
choices = [
('en venta','En Venta'),
('vendido','Vendido')
])
keywords = StringField('Palabras Clave')
minprice = StringField('Precio Mínimo')
maxprice = StringField('Precio Máximo')
minpublished = DateField('Start', format = '%Y-%m-%d', description = 'Time that the event will occur')
maxpublished = DateField('Start', format = '%Y-%m-%d', description = 'Time that the event will occur')
resultados = SelectField('Resultados Por Página',
choices = [
('15', '15'),
('30', '30'),
('45', '45'),
('60', '60'),
('75', '75'),
('90', '90')
])
ordenacion = SelectField('Ordenación de Resultados',
choices = [
('published ASC', 'Fecha (Más viejos primero)'),
('published DESC', 'Fecha (Más nuevos primero)'),
('distance DESC', 'Distancia Descendente'),
('distance ASC', 'Distancia Ascendente'),
('price ASC', 'Precio Ascendente'),
('price DESC', 'Precio Descendente'),
('views DESC', 'Popularidad descendente')
])
distancia = StringField('Distancia')
submit = SubmitField('Buscar')
class Review(FlaskForm):
stars = IntegerField('Puntuación', [
validators.DataRequired(message='Es necesario introducir una puntuación entre 1 y 5'),
validators.NumberRange(min=1, max=5, message='La puntuación debe ser de 1 a 5 estrellas')])
comment = TextAreaField('Comentario', [
validators.DataRequired(message='Es necesario escribir un comentario')])
submit = SubmitField('Publicar Valoración')
class bidPlacementForm(FlaskForm):
amount = StringField('Cantidad')
submit = SubmitField('Realizar Puja')
class reportForm(Form):
category = SelectField('Categoría',
choices = [
('Sospecha de fraude', 'Sospecha de fraude'),
('No acudió a la cita', 'No acudió a la cita'),
('Mal comportamiento', 'Mal comportamiento'),
('Artículo defectuoso', 'Artículo defectuoso'),
('Otros', 'Otros')])
description = TextAreaField('Descripción del informe', [
validators.DataRequired(message='Es necesario escribir una descripción')])
submit = SubmitField('Publicar Informe')
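All the forms in this module follow the same WTForms pattern: each field carries an ordered list of validators, and the first validator that fails produces the field's error message. A dependency-free sketch of that chain-of-validators idea in plain Python — this is an illustration of the pattern, not the WTForms implementation:

```python
class ValidationError(Exception):
    pass

def data_required(message):
    def check(value):
        if value in (None, ""):
            raise ValidationError(message)
    return check

def length(min_len=0, max_len=None, message="bad length"):
    def check(value):
        if len(value) < min_len or (max_len is not None and len(value) > max_len):
            raise ValidationError(message)
    return check

def run_validators(value, validators):
    """Apply each validator in order; return the first error message, or None."""
    try:
        for v in validators:
            v(value)
    except ValidationError as exc:
        return str(exc)
    return None

# Mirrors the email field of RegisterForm above.
email_validators = [
    data_required("Es necesario introducir un email"),
    length(min_len=1, max_len=50,
           message="El email no puede contener más de 50 carácteres"),
]
```

For instance, `run_validators("", email_validators)` returns the "required" message, while a well-formed address under 50 characters passes every check and returns `None`.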
# ---- file: trips/migrations/0004_invoice.py (repo: chorna/taxi24, license: MIT) ----
# Generated by Django 3.2.5 on 2021-07-11 23:51
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('trips', '0003_alter_trip_state'),
    ]

    operations = [
        migrations.CreateModel(
            name='Invoice',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('serie', models.CharField(max_length=4)),
                ('number', models.CharField(max_length=8)),
                ('tax_amount', models.FloatField(default=0.0)),
                ('base_amount', models.FloatField()),
                ('trip_id', models.ForeignKey(db_column='trip_id', on_delete=django.db.models.deletion.PROTECT, to='trips.trip')),
            ],
        ),
    ]
#!/usr/bin/env python3
# contrib/micronet/scripts/file2buf.py (pmalhaire/WireHub, Apache-2.0)
import os
import sys
MAX = 8
fpath = sys.argv[1]
name = sys.argv[2]
with open(fpath, "rb") as fh:
    sys.stdout.write("char %s[] = {" % (name,))
    i = 0
    while True:
        if i > 0:
            sys.stdout.write(", ")
        if i % MAX == 0:
            sys.stdout.write("\n\t")
        c = fh.read(1)
        if not c:
            sys.stdout.write("\n")
            break
        sys.stdout.write("0x%.2x" % (ord(c),))
        i = i + 1
    print("};")
    print("")
print("unsigned int %s_sz = %s;" % (name, i))
print("")
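The loop above streams a file to stdout as a C `char` array, `MAX` (8) values per line. A rough in-memory equivalent (the helper `bytes_to_c_array` is hypothetical and simplified: it builds the string instead of streaming, and drops the trailing comma the original emits):

```python
def bytes_to_c_array(data: bytes, name: str, per_line: int = 8) -> str:
    """Render bytes as a C char-array initialiser, per_line values per row."""
    hexes = ["0x%.2x" % b for b in data]
    rows = [", ".join(hexes[i:i + per_line]) for i in range(0, len(hexes), per_line)]
    body = ",\n\t".join(rows)
    return "char %s[] = {\n\t%s\n};\nunsigned int %s_sz = %d;" % (name, body, name, len(data))
```

For example, `bytes_to_c_array(b"WireHub", "banner")` yields a `char banner[]` initialiser of seven hex bytes followed by `unsigned int banner_sz = 7;`.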
# shudder/__main__.py (fitpay/shudder, Apache-2.0)
# Copyright 2014 Scopely, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Start polling of SQS and metadata."""
import shudder.queue as queue
import shudder.metadata as metadata
from shudder.config import CONFIG
import time
import os
import requests
import signal
import subprocess
import sys
if __name__ == '__main__':
    sqs_connection, sqs_queue = queue.create_queue()
    sns_connection, subscription_arn = queue.subscribe_sns(sqs_queue)

    def receive_signal(signum, stack):
        if signum in [1, 2, 3, 15]:
            print('Caught signal %s, exiting.' % str(signum))
            queue.clean_up_sns(sns_connection, subscription_arn, sqs_queue)
            sys.exit()
        else:
            print('Caught signal %s, ignoring.' % str(signum))

    uncatchable = ['SIG_DFL', 'SIGSTOP', 'SIGKILL']
    for i in [x for x in dir(signal) if x.startswith("SIG")]:
        if i not in uncatchable:
            signum = getattr(signal, i)
            signal.signal(signum, receive_signal)

    while True:
        message = queue.poll_queue(sqs_connection, sqs_queue)
        if message or metadata.poll_instance_metadata():
            queue.clean_up_sns(sns_connection, subscription_arn, sqs_queue)
            if 'endpoint' in CONFIG:
                requests.get(CONFIG["endpoint"])
            if 'endpoints' in CONFIG:
                for endpoint in CONFIG["endpoints"]:
                    requests.get(endpoint)
            if 'commands' in CONFIG:
                for command in CONFIG["commands"]:
                    print('Running command: %s' % command)
                    process = subprocess.Popen(command)
                    while process.poll() is None:
                        time.sleep(30)
                        # Send a heartbeat to AWS while the command runs
                        queue.record_lifecycle_action_heartbeat(message)
            # Send a complete lifecycle action
            queue.complete_lifecycle_action(message)
            sys.exit(0)
        time.sleep(5)
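The signal-wiring loop above registers `receive_signal` for every `SIG*` attribute of the `signal` module except an explicit uncatchable list. The same name filter can be checked in isolation (the function name `catchable_signal_names` is illustrative; the exact set returned is platform-dependent):

```python
import signal

UNCATCHABLE = ['SIG_DFL', 'SIGSTOP', 'SIGKILL']


def catchable_signal_names():
    """Names the registration loop in __main__ would try to hook."""
    return [n for n in dir(signal) if n.startswith("SIG") and n not in UNCATCHABLE]
```

Note that `dir(signal)` also yields non-signal constants such as `SIG_IGN` and (on some platforms) `SIG_BLOCK`, which start with `SIG` and therefore also pass the original filter; a stricter filter would check `isinstance(getattr(signal, n), signal.Signals)`.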
# game/ball.py (geoncic/PyBlock, MIT)
import pygame
import pygame.gfxdraw
from constants import Constants
class Balls(object):
    def __init__(self, all_sprites, all_balls):
        self.all_sprites = all_sprites
        self.all_balls = all_balls

    def spawn_ball(self, pos, vel, team):
        # Todo: Figure out how to spawn multiple balls with some sort of delay
        ball = Ball(pos, vel, team)
        self.all_sprites.add(ball)
        self.all_balls.add(ball)

    def ball_test(self):
        print("This is a Ball Test!")
        print(self)

    def update(self):
        print(self.__dict__)
        print(type(self))


class Ball(pygame.sprite.Sprite):
    def __init__(self, pos, vel, team):
        super().__init__()
        self.color = team
        self.file = Constants.BALL_TEAMS[self.color]
        self.rad = int(Constants.BALL_SIZE / 2)
        self.image = pygame.Surface([Constants.BALL_SIZE, Constants.BALL_SIZE], pygame.SRCALPHA)
        pygame.draw.circle(self.image, self.file, (self.rad, self.rad), self.rad)
        self.x_pos = pos[0]
        self.y_pos = pos[1]
        self.rect = self.image.get_rect(center=(self.x_pos, self.y_pos))
        self.dx = vel[0]
        self.dy = vel[1]

    def update(self):
        self.check_boundary()
        self.x_pos += self.dx
        self.y_pos += self.dy
        self.rect.center = [self.x_pos, self.y_pos]
        # self.rect.center = pygame.mouse.get_pos()  # has sprite follow the mouse

    def check_boundary(self):
        if not Constants.PLAYER_WIDTH <= self.x_pos <= (Constants.PLAYER_WIDTH + Constants.BOARD_WIDTH):
            self.dx = -1 * self.dx
        if not 0 <= self.y_pos <= Constants.SCREEN_HEIGHT:
            self.dy = -1 * self.dy
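`check_boundary` negates a velocity component whenever the matching position leaves the playing field. The reflection rule in isolation (a pure-function sketch; `reflect` and its bounds are illustrative, not part of the game):

```python
def reflect(pos, vel, lo, hi):
    """Return vel, negated if pos has left the closed interval [lo, hi]."""
    return -vel if not lo <= pos <= hi else vel


# A ball inside the field keeps its velocity; outside, it bounces.
assert reflect(50, 3, 0, 100) == 3
assert reflect(120, 3, 0, 100) == -3
```

Note the original flips velocity while the ball is outside the bounds, so a ball that starts outside would oscillate in place; clamping `pos` back into range on reflection would avoid that.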
# program/eggUI.py (otills/embryocv, MIT)
from pyqtgraph.Qt import QtCore, QtGui
import numpy as np
from scipy.spatial import distance as dist
import glob
import re
import os
from PyQt5 import QtGui
from PyQt5.QtCore import *
from PyQt5.QtGui import *
import sys
import cv2
import pandas as pd
from PyQt5.Qt import *
import pyqtgraph as pg
#from PyQt4.Qt import *
#%%
class eggUI(QDialog):
    '''
    createOpenCVEggROI : take eggID defined ROIs and visualise
    '''
    sliderUpdate = QtCore.pyqtSignal()
    embryoUpdate = QtCore.pyqtSignal()
    keyPressed = QtCore.pyqtSignal()

    def __init__(self, parent=None):
        super(eggUI, self).__init__(parent)
        # Make QDialog
        self.diag = QtGui.QDialog()
        global parentPath, vidTime
        self.diag.setWindowTitle('Identify eggs')
        self.diag.imv = pg.ImageView()
        self.btn_save = QPushButton('Save', self)

    #==============================================================================
    #
    #==============================================================================
    def showUI(self, ims, eggRotBBox, eggBoxPoints, embryoLabels, eggInt):
        self.eggInt = eggInt
        self.embryoLabels = embryoLabels
        self.diag.setWindowTitle('Identify eggs')
        # Make ImageView
        self.diag.imv = pg.ImageView()
        self.diag.resize(1000, 600)
        # Make ROI
        self.importOpenCVROIs(eggRotBBox, eggBoxPoints)
        if (eggRotBBox[0][0][0] != 'nan'):
            self.createOpenCVEggROI()
            self.diag.imv.addItem(self.roi)
        # Remove buttons from ImageView widget
        self.diag.imv.ui.roiBtn.hide()
        self.diag.imv.ui.menuBtn.hide()
        # Make tableview
        self.diag.table = QtGui.QTableWidget()
        self.diag.table.setShowGrid(True)
        self.diag.table.setHorizontalHeaderLabels(['Embryo', 'Sorted'])
        # Sets different alignment data just on the first column
        self.diag.table.setRowCount(int(len(self.embryoLabels)))
        self.diag.table.setColumnCount(2)
        # Highlight first row
        self.diag.table.selectRow(0)
        # Make layout
        checkLayout = QGridLayout()
        # Deal with stretching for appropriate formatting.
        checkLayout.setColumnStretch(0, 3)
        checkLayout.setColumnStretch(1, 1)
        checkLayout.setRowStretch(0, 1)
        checkLayout.setRowStretch(1, 3)
        # Add to layout
        checkLayout.addWidget(self.diag.imv, 0, 0, 2, 2)
        checkLayout.addWidget(self.diag.table, 1, 5)
        # Apply layout
        self.diag.setLayout(checkLayout)
        # Make buttons
        self.cpROI_btn = QtGui.QPushButton('&Copy ROI')
        self.cpROI_btn.setMinimumHeight(40)
        self.useCpROI_btn = QtGui.QPushButton('&Use Copied ROI')
        self.useCpROI_btn.setMinimumHeight(40)
        self.noEgg_btn = QtGui.QPushButton('&No Egg')
        self.noEgg_btn.setMinimumHeight(40)
        self.approveROI_btn = QtGui.QPushButton('&Approve ROIs')
        self.approveROI_btn.setMinimumHeight(40)
        self.exit_btn = QtGui.QPushButton('Exit')
        self.exit_btn.setMinimumHeight(40)
        # Make button layout
        self.btnLayout = QGridLayout()
        self.btnLayout.addWidget(self.cpROI_btn, 0, 0)
        self.btnLayout.addWidget(self.useCpROI_btn, 0, 1)
        self.btnLayout.addWidget(self.noEgg_btn, 1, 1)
        self.btnLayout.addWidget(self.approveROI_btn, 1, 0)
        # Exit button not implemented, just use window x (top right).
        # self.btnLayout.addWidget(self.exit_btn, 2, 1)
        # Add button layout to GridLayout.
        checkLayout.addLayout(self.btnLayout, 0, 5)
        # Format images for pyqtgraph and put in ImageView
        # self.formatSequence(ims)
        self.imImport()
        self.diag.imv.setImage(self.compSeq)
        # Add the ROI to ImageItem
        self.diag.show()
        # Call function to add data
        self.dataForTable()
        # Function for modifying the table when ROI is approved.
        self.approveROI_btn.clicked.connect(self.updateTable)
        # Copy current ROI
        self.cpROI_btn.clicked.connect(self.cpROI)
        # Apply copied ROI
        self.useCpROI_btn.clicked.connect(self.applyCopiedROI)
        # Assign nan to frames not containing egg
        self.noEgg_btn.clicked.connect(self.recordNoEgg)
        # Exit - prompt user to confirm
        # self.exit_btn.clicked.connect(self.closeEvent)
        # Connect changes in timeline so correct ROI is created and displayed.
        self.diag.imv.timeLine.sigPositionChanged.connect(self.updateOpenCVEggROICurrEmbryo)
        # self.diag.keyPressEvent(self.keyPressEvent)
    #==============================================================================
    # Generate data for populating the embryo/approveROI table.
    #==============================================================================
    def dataForTable(self):
        self.tableData = {'Embryo': list(self.embryoLabels),
                          'ROI approved': ['No'] * len(list(self.embryoLabels))}
        self.tableCols = [QtGui.QColor(0, 0, 100, 120)] * len(list(self.embryoLabels))
        # Enter data onto Table
        horHeaders = []
        for n, key in enumerate(sorted(self.tableData.keys())):
            horHeaders.append(key)
            for m, item in enumerate(self.tableData[key]):
                newitem = QtGui.QTableWidgetItem(item)
                newitem.setBackground(QtGui.QColor(0, 0, 100, 120))
                self.diag.table.setItem(m, n, newitem)
        # Add Header
        self.diag.table.setHorizontalHeaderLabels(horHeaders)
        # Adjust size of Table
        self.diag.table.resizeRowsToContents()
        # self.diag.table.resizeColumnsToContents()

    #==============================================================================
    # Update table when approve ROI button clicked.
    #==============================================================================
    def updateTable(self):
        self.tableData['ROI approved'][self.diag.table.currentRow()] = 'Approved'
        self.tableCols[self.diag.table.currentRow()] = QtGui.QColor(0, 100, 0, 120)
        horHeaders = []
        for n, key in enumerate(sorted(self.tableData.keys())):
            horHeaders.append(key)
            for m, item in enumerate(self.tableData[key]):
                newitem = QtGui.QTableWidgetItem(item)
                self.diag.table.setItem(m, n, newitem)
                newitem.setBackground(self.tableCols[m])
        # Add Header
        self.diag.table.setHorizontalHeaderLabels(horHeaders)
        # Adjust size of Table
        self.diag.table.resizeRowsToContents()

    #==============================================================================
    # Update the user interface
    #==============================================================================
    def updateUI(self, ims, eggRotBBox, eggBoxPoints):
        self.imImport()
        self.diag.imv.setImage(self.compSeq)
        self.importOpenCVROIs(eggRotBBox, eggBoxPoints)
        self.getSeqValsAndCurrROI()
        self.updateOpenCVEggROINewEmbryo()
        # Add the ROI to ImageItem
        # self.diag.imv.addItem(self.roi)

    #==============================================================================
    # Deal with data from the dataHandling class
    #==============================================================================
    def formatSequence(self, ims):
        # Format seq appropriately for pyqtgraph ROIs
        self.tSeqd = np.zeros_like(ims)
        for l in range(len(self.tSeqd)):
            self.tSeqd[l] = ims[l].T

    #==============================================================================
    # Get folders for a particular embryo
    #==============================================================================
    def getEmbryoFolders(self, parentPath, embryo):
        self.parentPath = parentPath
        self.embryo = embryo
        self.embryoFolders = glob.glob(parentPath + "*/" + embryo + "/")
        self.embryoFolders.sort(key=os.path.getctime)

    #==============================================================================
    # Get image
    #==============================================================================
    def imImport(self):
        for f in range(len(self.eggUIimPaths)):
            im = cv2.imread(self.eggUIimPaths[f], cv2.IMREAD_ANYDEPTH)
            ran = (im.max() - im.min()) / 255.
            out = (im / ran)
            out = out - out.min()
            self.compSeq[int(f)] = out.astype(np.uint8)
            self.compSeq[f] = self.compSeq[f].T
#==============================================================================
# Update image iteratively when slider moved
#==============================================================================
#==============================================================================
# def updateImage(self):
# self.getSeqValsAndCurrROI()
# #self.UI.compSeq[e*len(self.eggIDIms):(e*len(self.eggIDIms)+len(self.eggIDIms))] = self.seq
# #self.UI.comp(self.imImport(self.diag.imv.currentIndex()))
# im = cv2.imread(self.eggUIimPaths[self.diag.imv.currentIndex],cv2.IMREAD_ANYDEPTH)
# ran = (im.max()-im.min())/255.
# out = (im/ran)
# out = out-out.min()
# self.compSeq[self.diag.imv.currentIndex] = out.astype(np.uint8)
# self.diag.imv.setImage(self.compSeq.T)
# self.diag.imv.show()
# #========
    #==============================================================================
    #==============================================================================
    # ROI functions
    #==============================================================================
    #==============================================================================
    # Import OpenCV determined ROIs from dataHandling instance. Called from showUI and updateUI.
    #==============================================================================
    def importOpenCVROIs(self, eggRotBBox, eggBoxPoints):
        self.eggRotBBox = eggRotBBox
        self.eggBoxPoints = eggBoxPoints
        self.originalEggRotBBox = eggRotBBox.copy()
        self.originalEggBoxPoints = eggBoxPoints.copy()

    #==============================================================================
    # Get index values for ROI data.
    #==============================================================================
    def getSeqValsAndCurrROI(self):
        # Calculate the indices for current frame
        if self.eggInt != 1234:
            self.divVal = self.diag.imv.currentIndex / float(len(self.eggRotBBox[1]))
            self.intDivVal = int(self.divVal)
            self.withinSeqVal = int((self.divVal - self.intDivVal) * len(self.eggRotBBox[self.intDivVal]))
            self.currROI_eggRotBBox = self.eggRotBBox[self.intDivVal, self.withinSeqVal]
            self.currROI_eggBoxPoints = self.eggBoxPoints[self.intDivVal, self.withinSeqVal]
        else:
            self.divVal = self.diag.imv.currentIndex
            self.intDivVal = int(self.divVal)
            self.currROI_eggRotBBox = self.eggRotBBox[0, self.intDivVal]
            self.currROI_eggBoxPoints = self.eggBoxPoints[0, self.intDivVal]
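For equal-length sequences, the arithmetic in `getSeqValsAndCurrROI` maps the viewer's flat frame index onto a (sequence, frame-within-sequence) pair, which is just `divmod`. A sketch of that mapping (the helper `locate_frame` is illustrative and assumes every sequence holds the same number of frames):

```python
def locate_frame(flat_index, frames_per_sequence):
    """Return (sequence index, frame index within that sequence)."""
    return divmod(flat_index, frames_per_sequence)
```

For example, with 5 frames per sequence, flat index 7 is frame 2 of sequence 1. The float-division form above gives the same answer in that case, but `divmod` avoids any floating-point rounding at sequence boundaries.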
    #==============================================================================
    # Generate a pyqtgraph ROI, using data from OpenCV.
    #==============================================================================
    def createOpenCVEggROI(self):
        # Get relevant sequence position and ROI.
        self.getSeqValsAndCurrROI()
        if (self.currROI_eggRotBBox[0] != 'nan'):
            # Order the rotated bounding-box corners: sort by y, split into the
            # bottom-most and top-most pairs, then use the bottom-left corner as
            # an anchor and take the farthest top point (largest Euclidean
            # distance) first.
            ySorted = self.currROI_eggBoxPoints[np.argsort(self.currROI_eggBoxPoints[:, 1]), :]
            bottomMost = ySorted[:2, :]
            topMost = ySorted[2:, :]
            bottomMost = bottomMost[np.argsort(bottomMost[:, 1]), :]
            (bl, br) = bottomMost
            D = dist.cdist(bl[np.newaxis], topMost, "euclidean")[0]
            (tl, tr) = topMost[np.argsort(D)[::-1], :]
            # Make ROI - note non 0 or 90 degree angles require negation of the X size.
            # A rectangular ROI is used to make the corner handles easier to track.
            if self.currROI_eggRotBBox[4] in (-180.0, -90.0, -0.0, 0.0):
                # Axis-aligned ROIs: 0/90/180 degree angles are buggy in
                # pyqtgraph, so no rotation is applied for them.
                self.roi = pg.ROI([bl[0], bl[1]], [self.currROI_eggRotBBox[2], self.currROI_eggRotBBox[3]])
            else:
                # Arbitrary-angle ROIs
                self.roi = pg.ROI([bottomMost[0][0], bottomMost[0][1]], [-self.currROI_eggRotBBox[2], self.currROI_eggRotBBox[3]])
                self.roi.setAngle(self.currROI_eggRotBBox[4], update=True)
            # Add handles
            self.roi.addRotateHandle([1, 0], [0.5, 0.5])
            self.roi.addRotateHandle([0, 1], [0.5, 0.5])
            self.roi.addScaleHandle([1, 1], [0, 0])
            self.roi.addScaleHandle([0, 0], [1, 1])
            self.roi.setPen('y', width=3)
            self.roi.removable
            self.roi.invertible = 'True'
            # Make var for dealing with modifications to roi
            self.updatedEggROI = []
            self.roi.sigRegionChangeFinished.connect(self.updateROI)
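The corner-ordering recipe used above (sort by y, split into bottom and top pairs, then rank the top pair by distance from the bottom-left anchor) can be sketched without NumPy/SciPy. `order_corners` is an illustrative pure-Python mirror of that logic, not the project's code; note the original re-sorts the bottom pair by y rather than x, so for a level box the bl/br order is effectively arbitrary:

```python
import math


def order_corners(pts):
    """Order 4 box corners as (bl, br, tl, tr) the way createOpenCVEggROI does.

    pts: four (x, y) tuples of a rotated box. The two lowest-y points form the
    "bottom" pair; of the remaining pair, the point farther from bl is labelled
    tl, matching topMost[np.argsort(D)[::-1]] above.
    """
    ysorted = sorted(pts, key=lambda p: p[1])
    bl, br = ysorted[:2]
    top = sorted(ysorted[2:],
                 key=lambda p: math.hypot(p[0] - bl[0], p[1] - bl[1]),
                 reverse=True)
    tl, tr = top
    return bl, br, tl, tr
```

For the unit square scaled to 2, the corner diagonally opposite bl comes out as tl, exactly as in the SciPy `cdist` version.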
    #==============================================================================
    # Update the ROI for current embryo.
    #==============================================================================
    def updateOpenCVEggROICurrEmbryo(self):
        # Remove previous
        if (hasattr(self, 'roi')):
            self.diag.imv.removeItem(self.roi)
        # Get relevant video position and ROI.
        self.getSeqValsAndCurrROI()
        # Order the rotated bounding-box corners (same recipe as createOpenCVEggROI).
        ySorted = self.currROI_eggBoxPoints[np.argsort(self.currROI_eggBoxPoints[:, 1]), :]
        bottomMost = ySorted[:2, :]
        topMost = ySorted[2:, :]
        bottomMost = bottomMost[np.argsort(bottomMost[:, 1]), :]
        (bl, br) = bottomMost
        D = dist.cdist(bl[np.newaxis], topMost, "euclidean")[0]
        (tl, tr) = topMost[np.argsort(D)[::-1], :]
        # Make ROI - note non 0 or 90 degree angles require negation of the X size.
        # A rectangular ROI is used to make the corner handles easier to track.
        if self.currROI_eggRotBBox[4] in (-180.0, -90.0, -0.0, 0.0):
            # Axis-aligned ROIs: 0/90/180 degree angles are buggy in pyqtgraph,
            # so no rotation is applied for them.
            self.roi = pg.ROI([bl[0], bl[1]], [self.currROI_eggRotBBox[2], self.currROI_eggRotBBox[3]])
        else:
            # Arbitrary-angle ROIs
            self.roi = pg.ROI([bottomMost[0][0], bottomMost[0][1]], [-self.currROI_eggRotBBox[2], self.currROI_eggRotBBox[3]])
            self.roi.setAngle(self.currROI_eggRotBBox[4], update=True)
        # Add handles
        self.roi.addRotateHandle([1, 0], [0.5, 0.5])
        self.roi.addRotateHandle([0, 1], [0.5, 0.5])
        self.roi.addScaleHandle([1, 1], [0, 0])
        self.roi.addScaleHandle([0, 0], [1, 1])
        self.roi.setPen('y', width=3)
        self.roi.removable
        self.roi.invertible = 'True'
        # Make var for dealing with modifications to roi
        self.updatedEggROI = []
        ### Still to do...
        self.diag.imv.addItem(self.roi)
        self.roi.sigRegionChangeFinished.connect(self.updateROI)
    #==============================================================================
    # Update ROI for new embryo.
    #==============================================================================
    def updateOpenCVEggROINewEmbryo(self):
        # Remove old ROI
        if (hasattr(self, 'roi')):
            self.diag.imv.removeItem(self.roi)
        # Get relevant video position and ROI
        self.getSeqValsAndCurrROI()
        # Order the rotated bounding-box corners (same recipe as createOpenCVEggROI).
        ySorted = self.currROI_eggBoxPoints[np.argsort(self.currROI_eggBoxPoints[:, 1]), :]
        bottomMost = ySorted[:2, :]
        topMost = ySorted[2:, :]
        bottomMost = bottomMost[np.argsort(bottomMost[:, 1]), :]
        (bl, br) = bottomMost
        D = dist.cdist(bl[np.newaxis], topMost, "euclidean")[0]
        (tl, tr) = topMost[np.argsort(D)[::-1], :]
        # Make ROI - note non 0 or 90 degree angles require negation of the X size.
        # A rectangular ROI is used to make the corner handles easier to track.
        if self.currROI_eggRotBBox[4] in (-180.0, -90.0, -0.0, 0.0):
            # Axis-aligned ROIs: 0/90/180 degree angles are buggy in pyqtgraph,
            # so no rotation is applied for them.
            self.roi = pg.ROI([bl[0], bl[1]], [self.currROI_eggRotBBox[2], self.currROI_eggRotBBox[3]])
        else:
            # Arbitrary-angle ROIs
            self.roi = pg.ROI([bottomMost[0][0], bottomMost[0][1]], [-self.currROI_eggRotBBox[2], self.currROI_eggRotBBox[3]])
            self.roi.setAngle(self.currROI_eggRotBBox[4], update=True)
        # Add handles
        self.roi.addRotateHandle([1, 0], [0.5, 0.5])
        self.roi.addRotateHandle([0, 1], [0.5, 0.5])
        self.roi.addScaleHandle([1, 1], [0, 0])
        self.roi.addScaleHandle([0, 0], [1, 1])
        self.roi.setPen('y', width=3)
        self.roi.removable
        self.roi.invertible = 'True'
        # Make var for dealing with modifications to roi
        self.updatedEggROI = []
        ### Still to do...
        self.diag.imv.addItem(self.roi)
        self.roi.sigRegionChangeFinished.connect(self.updateROI)
#==============================================================================
# Update ROI.
#==============================================================================
def updateROI(self):
#global vidTime, xyPosHandles, ellipse, changeAngle, roiChanges,updatedEggROI, changeX, changeY, changeScaleX, changeScaleY, changeAngle
# Get changes to ROI scale, angle and position
roiChanges = self.roi.getGlobalTransform()
changeX = -roiChanges.getTranslation()[0]
changeY = roiChanges.getTranslation()[1]
changeScaleX = roiChanges.getScale()[0]
changeScaleY = roiChanges.getScale()[1]
changeAngle = roiChanges.getAngle()
# Update ROI, either updating the previously updated or taking the unaltered ROI from OpenCV as a starting point.
#if len(self.updatedEggROI) == 0:
self.updatedEggROI = (
    ((self.currROI_eggRotBBox[0] - changeX), (self.currROI_eggRotBBox[1] + changeY)),
    (max(self.currROI_eggRotBBox[3] * changeScaleX, self.currROI_eggRotBBox[2] * changeScaleY),
     min(self.currROI_eggRotBBox[3] * changeScaleX, self.currROI_eggRotBBox[2] * changeScaleY)),
    self.currROI_eggRotBBox[4] + changeAngle)
#else:
#self.updatedEggROI = (((self.updatedEggROI[0][0]-changeX),(self.updatedEggROI[0][1]+changeY)),((max((self.updatedEggROI[1][0]*changeScaleX),(self.updatedEggROI[1][1]*changeScaleY))),(min((self.updatedEggROI[1][0]*changeScaleX),(self.updatedEggROI[1][1]*changeScaleY)))),self.updatedEggROI[2]+changeAngle)
hh = self.roi.getHandles()
hh = [self.roi.mapToItem(self.diag.imv.getImageItem(), h.pos()) for h in hh]
# Handle on each corner. Get handle positions
self.xyPosHandles =[]
for h in hh:
self.xyPosHandles.append([h.x(),h.y()])
(eggBBX, eggBBY), (eggBBW, eggBBH), eggBBAng = cv2.minAreaRect(np.array(self.xyPosHandles, dtype=np.int32) )
# Nudge exact -0/-90/-180 degree angles slightly so they are not treated as axis-aligned special cases
if eggBBAng == -90:
eggBBAng = -89
elif eggBBAng == -180:
eggBBAng = -179
elif eggBBAng == -0:
eggBBAng = -1
# Save updated
# If more than one egg-ID frame per sequence..
if self.eggInt != 1234:
self.eggRotBBox[self.intDivVal,self.withinSeqVal] = [eggBBX, eggBBY, eggBBW, eggBBH, eggBBAng]
self.eggBoxPoints[self.intDivVal,self.withinSeqVal] = cv2.boxPoints(((eggBBX, eggBBY), (eggBBW, eggBBH), eggBBAng))
# Otherwise just save simply
else:
self.eggRotBBox[0,self.intDivVal] = [eggBBX, eggBBY, eggBBW, eggBBH, eggBBAng]
self.eggBoxPoints[0,self.intDivVal] = cv2.boxPoints(((eggBBX, eggBBY), (eggBBW, eggBBH), eggBBAng))
#==============================================================================
# Copy ROI on button click.
#==============================================================================
def cpROI(self):
self.originalEggRotBBox = self.currROI_eggRotBBox
self.originalEggBoxPoints = self.currROI_eggBoxPoints
#==============================================================================
# Assign nan to current ROI if 'No Egg' button clicked
#==============================================================================
def recordNoEgg(self):
# Remove ROI
self.diag.imv.removeItem(self.roi)
# Store nans in place of ROI
if self.eggInt != 1234:
self.eggRotBBox[self.intDivVal,self.withinSeqVal] = [np.nan, np.nan, np.nan, np.nan, np.nan]
self.eggBoxPoints[self.intDivVal,self.withinSeqVal] = [np.nan,np.nan,np.nan,np.nan]
else:
self.eggBoxPoints[0,self.intDivVal] = [np.nan,np.nan,np.nan,np.nan]
self.eggRotBBox[0,self.intDivVal] = [np.nan, np.nan, np.nan, np.nan, np.nan]
#==============================================================================
# Apply copied ROI on button click.
#==============================================================================
def applyCopiedROI(self):
self.getSeqValsAndCurrROI()
# Store copied ROI to embryo sequence ROIs
if self.eggInt != 1234:
self.divVal = self.diag.imv.currentIndex/float(len(self.eggRotBBox[1]))
self.intDivVal = int(self.divVal)
self.withinSeqVal = int((self.divVal - self.intDivVal)*len(self.eggRotBBox[self.intDivVal]))
self.eggRotBBox[self.intDivVal,self.withinSeqVal] = self.originalEggRotBBox
self.eggBoxPoints[self.intDivVal,self.withinSeqVal] = self.originalEggBoxPoints
else:
self.divVal = self.diag.imv.currentIndex
self.intDivVal = int(self.divVal)
self.eggRotBBox[0,self.intDivVal] = self.originalEggRotBBox
self.eggBoxPoints[0,self.intDivVal] = self.originalEggBoxPoints
self.updateOpenCVEggROICurrEmbryo()
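The `divVal`/`intDivVal`/`withinSeqVal` arithmetic above maps the viewer's flat frame index to a (sequence, frame-within-sequence) pair. A stand-alone sketch of the same mapping (the sequence length is a made-up example value):

```python
def seq_indices(current_index, frames_per_seq):
    # Fractional position of the flat index measured in whole sequences.
    div_val = current_index / float(frames_per_seq)
    seq = int(div_val)                              # which sequence
    within = int((div_val - seq) * frames_per_seq)  # frame inside that sequence
    return seq, within

# e.g. with 8 frames per sequence, flat frame 19 is frame 3 of sequence 2
seq, within = seq_indices(19, 8)
```

Note the float round-trip can lose a frame for awkward sequence lengths; `divmod(current_index, frames_per_seq)` is the exact integer equivalent.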
#==============================================================================
#
#==============================================================================
#==============================================================================
# Close button - not implemented (hidden)
#==============================================================================
#==============================================================================
# def closeEvent(self, event):
#
# quit_msg = "Are you sure you want to exit the program?"
# reply = QtGui.QMessageBox.question(self, 'Message',
# quit_msg, QtGui.QMessageBox.Yes, QtGui.QMessageBox.No)
#
# if reply == QtGui.QMessageBox.Yes:
# #event.accept()
# app.quit()
# else:
# event.ignore()
#
#==============================================================================
#==============================================================================
# #self.originalEggRotBBox = eggRotBBox.copy()
# #self.originalEggBoxPoints = eggBoxPoints.copy()
# #self.currROI_eggRotBBox = self.eggRotBBox[self.intDivVal,self.withinSeqVal]
# #self.currROI_eggBoxPoints = self.eggBoxPoints[self.intDivVal,self.withinSeqVal]
#
# # Modified version of updateOpenCVEggROICurrEmbryo
# # Remove previous
# self.diag.imv.removeItem(self.roi)
# # Get relevant video position and ROI.
# self.getSeqValsAndCurrROI()
# # Get rotated bounding box points
# ySorted = self.originalEggBoxPoints[np.argsort(self.originalEggBoxPoints[:, 1]), :]
# # Get bottom most, and top most sorted corner points
# bottomMost = ySorted[:2, :]
# topMost = ySorted[2:, :]
# # Get bottom most
# bottomMost = bottomMost[np.argsort(bottomMost[:, 1]), :]
# (bl, br) = bottomMost
# # Use bottom-left coordinate as anchor to calculate the Euclidean distance between the
# # The point with the largest distance will be our bottom-right point
# D = dist.cdist(bl[np.newaxis], topMost, "euclidean")[0]
# (tl, tr) = topMost[np.argsort(D)[::-1], :]
# # Make ROI - note that angles other than 0 or 90 degrees require the X size to be negated
# # Rectangular ROI used so the corner handles are easier to manage when tracking user changes.
# if (self.originalEggRotBBox[4] == -90.0) | (self.originalEggRotBBox[4] == -0.0)| (self.originalEggRotBBox[4] == 0.0):
# self.roi = pg.ROI([bottomMost[0][0], bottomMost[0][1]], [self.originalEggRotBBox[2], self.originalEggRotBBox[3]])
# # roi = pg.EllipseROI([bottomMost[0][0], bottomMost[0][1]], [eggRotBBox[vidTime][2], eggRotBBox[vidTime][3]])
# else:
# # Random angle ROIs
# self.roi = pg.ROI([bottomMost[0][0], bottomMost[0][1]], [-self.originalEggRotBBox[2], self.originalEggRotBBox[3]])
# self.roi.setAngle(self.originalEggRotBBox[4], update=True)
# # roi = pg.EllipseROI([bottomMost[0][0], bottomMost[0][1]], [-eggRotBBox[vidTime][2], eggRotBBox[vidTime][3]])
# # Add handles
# self.roi.addRotateHandle([1, 0],[0.5,0.5])
# self.roi.addRotateHandle([0, 1], [0.5,0.5])
# self.roi.addScaleHandle([1, 1], [0, 0])
# self.roi.addScaleHandle([0, 0], [1, 1])
# self.roi.setPen('y',width=3)
# self.roi.removable
# self.roi.invertible = 'True'
# # Make var for dealing with modifications to roi
# self.updatedEggROI=[]
# ### Still to do...
# self.diag.imv.addItem(self.roi)
# self.roi.sigRegionChangeFinished.connect(self.updateROI)
#==============================================================================
#===============

# File: tests/unittest/options/pricing/test_binomial_trees.py | repo: yiluzhu/quant | license: MIT
from unittest import TestCase
from options.pricing.binomial_trees import BinomialTreePricer
from options.option import OptionType, Option
class BinomialTreeTestCase(TestCase):
def test_basic(self):
"""European option, spot price 50, strike price 52, risk free interest rate 5%
expiry 2 years, volatility 30%
"""
pricer = BinomialTreePricer(steps=100)
option = Option(OptionType.PUT, 50, 52, 0.05, 2, 0.3)
result = pricer.price_option(option)
self.assertEqual(6.7781, result)
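`BinomialTreePricer` is this repo's own implementation, but assuming it follows the standard Cox-Ross-Rubinstein tree, the expected value can be sketched independently for a European put (an illustration, not the repo's code):

```python
from math import exp, sqrt, comb

def crr_european_put(spot, strike, rate, expiry, vol, steps):
    dt = expiry / steps
    u = exp(vol * sqrt(dt))             # up move factor
    d = 1.0 / u                         # down move factor
    p = (exp(rate * dt) - d) / (u - d)  # risk-neutral up probability
    # Discounted expectation of the terminal payoff over all up/down paths.
    payoff = sum(
        comb(steps, k) * p**k * (1.0 - p)**(steps - k)
        * max(strike - spot * u**k * d**(steps - k), 0.0)
        for k in range(steps + 1)
    )
    return exp(-rate * expiry) * payoff

price = crr_european_put(50, 52, 0.05, 2, 0.3, 100)  # close to the 6.7781 above
```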
| 30 | 86 | 0.698148 | 67 | 540 | 5.58209 | 0.656716 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061466 | 0.216667 | 540 | 17 | 87 | 31.764706 | 0.822695 | 0.196296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
# File: kafka/structs.py | repo: informatique-cdc/kafka-python | license: Apache-2.0

""" Other useful structs """
from __future__ import absolute_import
from collections import namedtuple
"""A topic and partition tuple
Keyword Arguments:
topic (str): A topic name
partition (int): A partition id
"""
TopicPartition = namedtuple("TopicPartition",
["topic", "partition"])
"""A Kafka broker metadata used by admin tools.
Keyword Arguments:
nodeID (int): The Kafka broker id.
host (str): The Kafka broker hostname.
port (int): The Kafka broker port.
rack (str): The rack of the broker, which is used to in rack aware
partition assignment for fault tolerance.
Examples: `RACK1`, `us-east-1d`. Default: None
"""
BrokerMetadata = namedtuple("BrokerMetadata",
["nodeId", "host", "port", "rack"])
"""A topic partition metadata describing the state in the MetadataResponse.
Keyword Arguments:
topic (str): The topic name of the partition this metadata relates to.
partition (int): The id of the partition this metadata relates to.
leader (int): The id of the broker that is the leader for the partition.
replicas (List[int]): The ids of all brokers that contain replicas of the
partition.
isr (List[int]): The ids of all brokers that contain in-sync replicas of
the partition.
error (KafkaError): A KafkaError object associated with the request for
this partition metadata.
"""
PartitionMetadata = namedtuple("PartitionMetadata",
["topic", "partition", "leader", "replicas", "isr", "error"])
"""The Kafka offset commit API
The Kafka offset commit API allows users to provide additional metadata
(in the form of a string) when an offset is committed. This can be useful
(for example) to store information about which node made the commit,
what time the commit was made, etc.
Keyword Arguments:
offset (int): The offset to be committed
metadata (str): Non-null metadata
"""
OffsetAndMetadata = namedtuple("OffsetAndMetadata",
# TODO add leaderEpoch: OffsetAndMetadata(offset, leaderEpoch, metadata)
["offset", "metadata"])
"""An offset and timestamp tuple
Keyword Arguments:
offset (int): An offset
timestamp (int): The timestamp associated to the offset
"""
OffsetAndTimestamp = namedtuple("OffsetAndTimestamp",
["offset", "timestamp"])
MemberInformation = namedtuple("MemberInformation",
["member_id", "client_id", "client_host", "member_metadata", "member_assignment"])
GroupInformation = namedtuple("GroupInformation",
["error_code", "group", "state", "protocol_type", "protocol", "members", "authorized_operations"])
"""Define retry policy for async producer
Keyword Arguments:
Limit (int): Number of retries. limit >= 0, 0 means no retries
backoff_ms (int): Milliseconds to backoff.
retry_on_timeouts:
"""
RetryOptions = namedtuple("RetryOptions",
["limit", "backoff_ms", "retry_on_timeouts"])
| 33.261364 | 102 | 0.702767 | 358 | 2,927 | 5.692737 | 0.374302 | 0.023553 | 0.027478 | 0.023553 | 0.102552 | 0.069676 | 0.069676 | 0.035329 | 0.035329 | 0 | 0 | 0.001697 | 0.194739 | 2,927 | 87 | 103 | 33.643678 | 0.862961 | 0.031773 | 0 | 0 | 0 | 0 | 0.415135 | 0.022703 | 0 | 0 | 0 | 0.011494 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# File: app/resources/magic_castle_api.py | repo: ComputeCanada/mc-hub | license: BSD-3-Clause

from flask import request
from resources.api_view import ApiView
from exceptions.invalid_usage_exception import InvalidUsageException
from models.user.user import User
from models.user.authenticated_user import AuthenticatedUser
class MagicCastleAPI(ApiView):
def get(self, user: User, hostname):
if hostname:
magic_castle = user.get_magic_castle_by_hostname(hostname)
return magic_castle.dump_configuration()
else:
if type(user) == AuthenticatedUser:
return [
{
**magic_castle.dump_configuration(planned_only=True),
"hostname": magic_castle.get_hostname(),
"status": magic_castle.get_status().value,
"freeipa_passwd": magic_castle.get_freeipa_passwd(),
"owner": magic_castle.get_owner_username(),
}
for magic_castle in user.get_all_magic_castles()
]
else:
return [
{
**magic_castle.dump_configuration(planned_only=True),
"hostname": magic_castle.get_hostname(),
"status": magic_castle.get_status().value,
"freeipa_passwd": magic_castle.get_freeipa_passwd(),
}
for magic_castle in user.get_all_magic_castles()
]
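Each list entry above is built by unpacking the base configuration dict and layering extra keys on top with `{**base, ...}`; later keys win on collision. A stand-alone sketch of that merge pattern with dummy values:

```python
base = {"cluster_name": "phoenix", "image": "CentOS-7"}

entry = {
    **base,  # copy every key from the base configuration
    "hostname": "phoenix.example.org",
    "status": "provisioning",
}
```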
def post(self, user: User, hostname, apply=False):
if apply:
magic_castle = user.get_magic_castle_by_hostname(hostname)
magic_castle.apply()
return {}
else:
magic_castle = user.create_empty_magic_castle()
json_data = request.get_json()
if not json_data:
raise InvalidUsageException("No json data was provided")
magic_castle.set_configuration(json_data)
magic_castle.plan_creation()
return {}
def put(self, user: User, hostname):
magic_castle = user.get_magic_castle_by_hostname(hostname)
json_data = request.get_json()
if not json_data:
raise InvalidUsageException("No json data was provided")
magic_castle.set_configuration(json_data)
magic_castle.plan_modification()
return {}
def delete(self, user: User, hostname):
magic_castle = user.get_magic_castle_by_hostname(hostname)
magic_castle.plan_destruction()
return {}
| 39.71875 | 77 | 0.5893 | 257 | 2,542 | 5.509728 | 0.249027 | 0.217514 | 0.093927 | 0.056497 | 0.658192 | 0.634181 | 0.634181 | 0.634181 | 0.634181 | 0.580508 | 0 | 0 | 0.335563 | 2,542 | 63 | 78 | 40.349206 | 0.838366 | 0 | 0 | 0.54386 | 0 | 0 | 0.043666 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.070175 | false | 0.035088 | 0.087719 | 0 | 0.298246 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# File: src/Gismo_XY To Location.py | repo: AntonelloDN/gismo | license: Apache-2.0

# xy to location
#
# Gismo is a plugin for GIS environmental analysis (GPL) started by Djordje Spasic.
#
# This file is part of Gismo.
#
# Copyright (c) 2019, Djordje Spasic <djordjedspasic@gmail.com>
# Gismo is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
#
# Gismo is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.
#
# The GPL-3.0+ license <http://spdx.org/licenses/GPL-3.0+>
"""
Use this component to calculate latitude and longitude coordinates of the _point in Rhino scene.
For example: you created some building shapes with the Gismo "OSM Shapes" component, and now you would like to check the latitude and longitude coordinates of a particular part of the building.
-
Provided by Gismo 0.0.3
input:
_point: A point for which we would like to calculate its latitude and longitude coordinates
anchorLocation_: Represents latitude,longitude coordinates which correspond to anchorOrigin_ in Rhino scene.
-
If nothing added to this input, anchorLocation_ with both latitude and longitude set to "0" will be used as a default.
anchorOrigin_: A point in Rhino scene which corresponds to anchorLocation_.
-
If nothing added to this input, anchorOrigin will be set to: 0,0,0.
output:
readMe!: ...
location: Location (latitude, longitude coordinates) of the _point input.
"""
ghenv.Component.Name = "Gismo_XY To Location"
ghenv.Component.NickName = "XYtoLocation"
ghenv.Component.Message = "VER 0.0.3\nJAN_29_2019"
ghenv.Component.IconDisplayMode = ghenv.Component.IconDisplayMode.application
ghenv.Component.Category = "Gismo"
ghenv.Component.SubCategory = "1 | Gismo"
#compatibleGismoVersion = VER 0.0.3\nJAN_29_2019
try: ghenv.Component.AdditionalHelpFromDocStrings = "2"
except: pass
import scriptcontext as sc
import Grasshopper
import Rhino
def main(requiredPoint, anchorLocation, anchorOrigin):
# check inputs
if (requiredPoint == None):
required_location = None
validInputData = False
printMsg = "Please add a point to this component's \"_point\" input."
return required_location, validInputData, printMsg
if (anchorLocation == None):
locationName = "unknown location"
anchor_locationLatitudeD = 0
anchor_locationLongitudeD = 0
timeZone = 0
elevation = 0
else:
locationName, anchor_locationLatitudeD, anchor_locationLongitudeD, timeZone, elevation, validLocationData, printMsg = gismo_preparation.checkLocationData(anchorLocation)
if (anchorOrigin == None):
anchorOrigin = Rhino.Geometry.Point3d(0,0,0)
unitConversionFactor, unitSystemLabel = gismo_preparation.checkUnits()
anchorOrigin_meters = Rhino.Geometry.Point3d(anchorOrigin.X*unitConversionFactor, anchorOrigin.Y*unitConversionFactor, anchorOrigin.Z*unitConversionFactor)
requiredPoint_meters = Rhino.Geometry.Point3d(requiredPoint.X*unitConversionFactor, requiredPoint.Y*unitConversionFactor, requiredPoint.Z*unitConversionFactor)
# inputCRS
EPSGcode = 4326 # WGS 84
inputCRS_dummy = gismo_gis.CRS_from_EPSGcode(EPSGcode)
# outputCRS
outputCRS_dummy = gismo_gis.UTM_CRS_from_latitude(anchor_locationLatitudeD, anchor_locationLongitudeD)
anchor_originProjected_meters = gismo_gis.convertBetweenTwoCRS(inputCRS_dummy, outputCRS_dummy, anchor_locationLongitudeD, anchor_locationLatitudeD) # in meters
# inputCRS
# based on assumption that both anchorLocation_ input and required_location belong to the same UTM zone
inputCRS = gismo_gis.UTM_CRS_from_latitude(anchor_locationLatitudeD, anchor_locationLongitudeD, anchor_locationLatitudeD, anchor_locationLongitudeD)
# outputCRS
EPSGcode = 4326
outputCRS = gismo_gis.CRS_from_EPSGcode(EPSGcode)
latitudeLongitudePt = gismo_gis.convertBetweenTwoCRS(inputCRS, outputCRS, (anchor_originProjected_meters.X - anchorOrigin_meters.X) + requiredPoint_meters.X, (anchor_originProjected_meters.Y - anchorOrigin_meters.Y) + requiredPoint_meters.Y)
required_location = gismo_preparation.constructLocation(locationName, latitudeLongitudePt.Y, latitudeLongitudePt.X, timeZone, elevation)
validInputData = True
printMsg = "ok"
return required_location, validInputData, printMsg
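`main` projects the anchor location into UTM, applies the Rhino-scene offset in metres, and converts back to latitude/longitude. For small offsets the same idea can be approximated without any GIS library using an equirectangular model — an approximation for illustration only, not what Gismo actually does:

```python
from math import cos, radians, degrees

EARTH_RADIUS_M = 6371008.8  # mean Earth radius in metres

def offset_to_latlon(anchor_lat, anchor_lon, dx_m, dy_m):
    # Small-offset equirectangular approximation: metres per degree of
    # longitude shrink with the cosine of the latitude.
    dlat = degrees(dy_m / EARTH_RADIUS_M)
    dlon = degrees(dx_m / (EARTH_RADIUS_M * cos(radians(anchor_lat))))
    return anchor_lat + dlat, anchor_lon + dlon

# 1000 m due north of (45 N, 73 W) is roughly 0.009 degrees of latitude away
lat, lon = offset_to_latlon(45.0, -73.0, 0.0, 1000.0)
```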
level = Grasshopper.Kernel.GH_RuntimeMessageLevel.Warning
if sc.sticky.has_key("gismoGismo_released"):
validVersionDate, printMsg = sc.sticky["gismo_check"].versionDate(ghenv.Component)
if validVersionDate:
gismo_preparation = sc.sticky["gismo_Preparation"]()
gismo_gis = sc.sticky["gismo_GIS"]()
location, validInputData, printMsg = main(_point, anchorLocation_, anchorOrigin_)
if not validInputData:
print printMsg
ghenv.Component.AddRuntimeMessage(level, printMsg)
else:
print printMsg
ghenv.Component.AddRuntimeMessage(level, printMsg)
else:
printMsg = "First please run the Gismo Gismo component."
print printMsg
ghenv.Component.AddRuntimeMessage(level, printMsg)
| 47.360656 | 246 | 0.72776 | 653 | 5,778 | 6.320061 | 0.338438 | 0.040708 | 0.028108 | 0.045554 | 0.17228 | 0.130119 | 0.089411 | 0.067846 | 0.038284 | 0.038284 | 0 | 0.012208 | 0.206127 | 5,778 | 121 | 247 | 47.752066 | 0.887508 | 0.18207 | 0 | 0.220339 | 0 | 0 | 0.065931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.016949 | 0.050847 | null | null | 0.237288 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# File: musicLrc.py | repo: xiangxing98/Rhythm-Enlightment | license: MIT

import time
musicLrc = """
[00:03.50]传奇
[00:19.10]作词:刘兵 作曲:李健
[00:20.60]演唱:王菲
[00:26.60]
[04:40.75][02:39.90][00:36.25]只是因为在人群中多看了你一眼
[04:49.00]
[02:47.44][00:43.69]再也没能忘掉你容颜
[02:54.83][00:51.24]梦想着偶然能有一天再相见
[03:02.32][00:58.75]从此我开始孤单思念
[03:08.15][01:04.30]
[03:09.35][01:05.50]想你时你在天边
[03:16.90][01:13.13]想你时你在眼前
[03:24.42][01:20.92]想你时你在脑海
[03:31.85][01:28.44]想你时你在心田
[03:38.67][01:35.05]
[04:09.96][03:39.87][01:36.25]宁愿相信我们前世有约
[04:16.37][03:46.38][01:42.47]今生的爱情故事 不会再改变
[04:24.82][03:54.83][01:51.18]宁愿用这一生等你发现
[04:31.38][04:01.40][01:57.43]我一直在你身旁 从未走远
[04:39.55][04:09.00][02:07.85]
"""
lrcDict = {}
musicLrcList = musicLrc.splitlines()
#print(musicLrcList)
for lrcLine in musicLrcList:
#[04:40.75][02:39.90][00:36.25]只是因为在人群中多看了你一眼
#[04:40.75 [02:39.90 [00:36.25 只是因为在人群中多看了你一眼
#[00:20.60]演唱:王菲
lrcLineList = lrcLine.split("]")
for index in range(len(lrcLineList) - 1):
timeStr = lrcLineList[index][1:]
#print(timeStr)
#00:03.50
timeList = timeStr.split(":")
timelrc = float(timeList[0]) * 60 + float(timeList[1])
#print(time)
lrcDict[timelrc] = lrcLineList[-1]
print(lrcDict)
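The parsing loop above converts each `[mm:ss.xx]` tag to seconds by splitting on `]` and then `:`. The same split-based logic, factored into a reusable function:

```python
def lrc_time_to_seconds(tag):
    # tag looks like "[04:40.75" once the line has been split on "]"
    minutes, seconds = tag.lstrip("[").split(":")
    return float(minutes) * 60 + float(seconds)
```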
allTimeList = []
for t in lrcDict:
allTimeList.append(t)
allTimeList.sort()
#print(allTimeList)
'''
while 1:
getTime = float(input("请输入一个时间"))
for n in range(len(allTimeList)):
tempTime = allTimeList[n]
if getTime < tempTime:
break
if n == 0:
print("时间太小")
else:
print(lrcDict[allTimeList[n - 1]])
'''
getTime = 0
while 1:
for n in range(len(allTimeList)):
tempTime = allTimeList[n]
if getTime < tempTime:
break
lrc = lrcDict.get(allTimeList[n - 1])
if lrc is not None:
print(lrc)
time.sleep(1)
getTime += 1 | 22.292683 | 62 | 0.605033 | 297 | 1,828 | 3.723906 | 0.380471 | 0.014467 | 0.016275 | 0.0217 | 0.227848 | 0.209765 | 0.209765 | 0.209765 | 0.209765 | 0.209765 | 0 | 0.212225 | 0.203501 | 1,828 | 82 | 63 | 22.292683 | 0.54739 | 0.098468 | 0 | 0 | 0 | 0.04 | 0.413392 | 0.310044 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.02 | 0.02 | 0 | 0.02 | 0.04 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
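The playback loop above scans the sorted timestamp list linearly once per second. With the standard-library `bisect` module the current lyric can instead be found in O(log n); a sketch over the same sorted-list-plus-dict layout (the sample lines are dummies):

```python
import bisect

def current_lyric(lrc_dict, sorted_times, t):
    # Index of the last timestamp <= t, or None before the first line.
    i = bisect.bisect_right(sorted_times, t) - 1
    if i < 0:
        return None
    return lrc_dict[sorted_times[i]]

times = [3.5, 19.1, 36.25]
lines = {3.5: "line one", 19.1: "line two", 36.25: "line three"}
```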
# File: src/plugins/sjsy.py | repo: 2443391447/nonebot2 | license: MIT

from nonebot import on_keyword, on_command
from nonebot.typing import T_State
from nonebot.adapters.cqhttp import Message, Bot, Event # these two are unused but do not delete them
from nonebot.adapters.cqhttp.message import MessageSegment
import requests
from nonebot.permission import *
from nonebot.rule import to_me
from aiocqhttp.exceptions import Error as CQHttpError
sheying = on_keyword({'随机摄影'})
@sheying.handle()
async def main(bot: Bot, event: Event, state: T_State):
msg = await downloads()
try:
await sheying.send(message=Message(msg))
except CQHttpError:
pass
async def downloads():
url = "https://yanghanwen.xyz/tu/ren.php"
resp = requests.get(url).json()
url_ing = resp['data']
xians = f"[CQ:image,file={url_ing}]"
return xians
# await xians.send("正在爬取图片,请稍后……")
# await xians.send(MessageSegment.at(id) + xians + "精选摄影") | 28.870968 | 68 | 0.689385 | 119 | 895 | 5.168067 | 0.529412 | 0.107317 | 0.061789 | 0.081301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 895 | 31 | 69 | 28.870968 | 0.850559 | 0.109497 | 0 | 0 | 0 | 0 | 0.086387 | 0.032723 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.045455 | 0.363636 | 0 | 0.409091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
# File: tests/test_api.py | repo: ines/spacy-js | license: MIT

# coding: utf8
from __future__ import unicode_literals
import pytest
import spacy
import json
from api.server import parse, doc2json, load_model
@pytest.fixture(scope="session")
def model():
return "en_core_web_sm"
@pytest.fixture(scope="session")
def text():
return "This is a sentence about Facebook. This is another one."
@pytest.fixture(scope="session")
def nlp(model):
return spacy.load(model)
@pytest.fixture(scope="session")
def doc(nlp, text):
return nlp(text)
def test_server_parse(model, text, doc):
load_model(model)
json_doc = parse(model, text)
direct_json_doc = doc2json(doc, model)
assert json.dumps(json_doc, sort_keys=True) == json.dumps(
direct_json_doc, sort_keys=True
)
def test_doc2json_doc_tokens(doc, model):
data = doc2json(doc, model)
assert data["model"] == model
assert data["doc"]["text"] == doc.text
assert data["doc"]["text_with_ws"] == doc.text_with_ws
assert data["doc"]["is_tagged"]
assert data["doc"]["is_parsed"]
assert data["doc"]["is_sentenced"]
assert len(data["tokens"]) == len(doc)
assert data["tokens"][0]["text"] == doc[0].text
assert data["tokens"][0]["head"] == doc[0].head.i
def test_doc2json_doc_ents(doc, model):
data = doc2json(doc, model)
ents = list(doc.ents)
assert "ents" in data
assert len(data["ents"]) == len(ents)
assert len(data["ents"]) >= 1
assert data["ents"][0]["start"] == ents[0].start
assert data["ents"][0]["end"] == ents[0].end
assert data["ents"][0]["label"] == ents[0].label_
def test_doc2json_doc_sents(doc, model):
data = doc2json(doc, model)
sents = list(doc.sents)
assert "sents" in data
assert len(data["sents"]) == len(sents)
assert len(data["sents"]) >= 1
assert data["sents"][0]["start"] == sents[0].start
assert data["sents"][0]["end"] == sents[0].end
def test_doc2json_doc_noun_chunks(doc, model):
data = doc2json(doc, model)
chunks = list(doc.noun_chunks)
assert "noun_chunks" in data
assert len(data["noun_chunks"]) == len(chunks)
assert len(data["noun_chunks"]) >= 1
assert data["noun_chunks"][0]["start"] == chunks[0].start
assert data["noun_chunks"][0]["end"] == chunks[0].end
| 27.463415 | 68 | 0.654085 | 330 | 2,252 | 4.321212 | 0.193939 | 0.105189 | 0.063815 | 0.070126 | 0.28892 | 0.130435 | 0.051893 | 0 | 0 | 0 | 0 | 0.017214 | 0.174512 | 2,252 | 81 | 69 | 27.802469 | 0.749866 | 0.005329 | 0 | 0.133333 | 0 | 0 | 0.143878 | 0 | 0 | 0 | 0 | 0 | 0.433333 | 1 | 0.15 | false | 0 | 0.083333 | 0.066667 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# File: HealthNet/prescriptions/views.py | repo: jimga150/HealthNet | license: MIT

from django.shortcuts import redirect
from .forms import PrescriptionForm
from core.views import is_doctor, is_nurse, is_admin, is_patient
from core.models import *
from .models import Prescription
from django.contrib.auth.decorators import login_required, user_passes_test
from django.utils import timezone
from django.shortcuts import render
from django.core.urlresolvers import reverse
def not_admin(user):
"""
:param user: The User in question
:return: True if the user is anything but an Admin
"""
return not is_admin(user)
def is_doctor_or_nurse(user):
"""
:param user: The User in question
:return: True if the user is a Doctor or Nurse
"""
return is_doctor(user) or is_nurse(user)
@login_required
@user_passes_test(is_doctor)
def new_prescription(request):
    """
    Page for the form a doctor fills out to prescribe a drug
    :param request: the request with possible form submission
    :return: Prescription form or redirect to listing page (below)
    """
    if request.method == 'POST':
        prescription_form = PrescriptionForm(data=request.POST)
        validity = prescription_form.is_valid()
        if validity:
            prescription = prescription_form.save(commit=False)
            prescription.date_prescribed = timezone.now()
            prescription.doctor = Doctor.objects.all().get(user=request.user)
            prescription.save()
            log = Log.objects.create_Log(request.user, request.user.username, timezone.now(),
                                         "Prescription filled out")
            log.save()
        else:
            print("Error")
            print(prescription_form.errors)
        if 'submit_singular' in request.POST and validity:
            return redirect('prescriptions')
        elif 'submit_another' in request.POST:
            prescription_form = PrescriptionForm()
    else:
        prescription_form = PrescriptionForm()
    context = {"prescription_form": prescription_form}
    return render(request, 'prescriptions/makenew.html', context)
def get_prescription_list_for(cpatient):
    """
    Generic getter for a specific patient's prescription list
    :param cpatient: Patient to fetch list for
    :return: context of Prescription list
    """
    Prescriptions = Prescription.objects.all().filter(patient=cpatient)
    per = []
    for p in Prescriptions.iterator():
        per.append(str(dict(p.TIME_CHOICES)[p.Time_units]))
    p_list = zip(Prescriptions, per)
    return {"Labels": ["Doctor", "Drug", "Dosage", "Rate"], "Name": str(cpatient), "Prescriptions": p_list}
@login_required
@user_passes_test(not_admin)
def prescriptions(request):
    """
    Lists either all patients in the hospital with links to their prescription lists, or the prescriptions applied to a
    single defined patient.
    :param request: The request sent in, not used here
    :return: List page rendering
    """
    context = {}
    if is_doctor(request.user) or is_nurse(request.user):
        context["Labels"] = ["Name", "Prescriptions"]
        patients = Patient.objects.all()
        prescription_nums = []
        for pat in patients.iterator():
            prescription_nums.append(Prescription.objects.filter(patient=pat).count())
        context["Patients"] = zip(patients, prescription_nums)
    elif is_patient(request.user):
        cpatient = Patient.objects.get(user=request.user)
        context = get_prescription_list_for(cpatient)
    context["is_doctor"] = is_doctor(request.user)
    return render(request, 'prescriptions/list.html', context)
@login_required
@user_passes_test(is_doctor_or_nurse)
def prescriptions_list(request, patient_id):
    """
    Page that doctors and nurses are sent to when accessing a single patient's prescription list.
    :param request: The request sent in, not used here
    :param patient_id: ID of the patient who's being listed
    :return: List page rendering
    """
    cpatient = Patient.objects.get(pk=patient_id)
    context = get_prescription_list_for(cpatient)
    context["is_doctor"] = is_doctor(request.user)
    return render(request, 'prescriptions/list.html', context)
@login_required
@user_passes_test(is_doctor)
def delete_prescription(request, prescription_id):
    """
    Page for confirming/deleting a single prescription
    :param request: The request sent in, not used here
    :param prescription_id: ID number of the prescription in question
    :return: Redirect or confirmation page
    """
    prescription = Prescription.objects.get(pk=prescription_id)
    patient_id = prescription.patient.id
    if request.method == 'POST':
        prescription.delete()
        return redirect(reverse('list prescriptions for patient', kwargs={'patient_id': patient_id}))
    context = {"Prescription": prescription, 'patient_id': patient_id}
    return render(request, 'prescriptions/delete.html', context)
| 33.193333 | 119 | 0.69753 | 612 | 4,979 | 5.537582 | 0.236928 | 0.030688 | 0.025081 | 0.033933 | 0.238713 | 0.179994 | 0.179994 | 0.179994 | 0.167896 | 0.156388 | 0 | 0 | 0.207672 | 4,979 | 149 | 120 | 33.416107 | 0.859062 | 0.228962 | 0 | 0.240506 | 0 | 0 | 0.096651 | 0.026409 | 0 | 0 | 0 | 0 | 0 | 1 | 0.088608 | false | 0.063291 | 0.113924 | 0 | 0.316456 | 0.025316 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
bcd3c580510f803674768f898ad9016345f92071 | 3,027 | py | Python | scripts/test_cache_size_vs_code_balance.py | tareqmalas/girih | 0c126788937d189147be47115703b752235e585c | [
"BSD-3-Clause"
] | 7 | 2015-07-14T08:29:14.000Z | 2021-07-30T14:53:13.000Z | scripts/test_cache_size_vs_code_balance.py | tareqmalas/girih | 0c126788937d189147be47115703b752235e585c | [
"BSD-3-Clause"
] | null | null | null | scripts/test_cache_size_vs_code_balance.py | tareqmalas/girih | 0c126788937d189147be47115703b752235e585c | [
"BSD-3-Clause"
] | 3 | 2016-08-30T01:25:40.000Z | 2017-06-22T05:50:05.000Z | #!/usr/bin/env python
def igs_test(target_dir, exp_name, th, group='', dry_run=0):
    from scripts.conf.conf import machine_conf, machine_info
    from scripts.utils import run_test
    import itertools

    cs = 8192
    th = th

    # Test using reasonable time
    # T = scale * size / perf
    # scale = T*perf/size
    desired_time = 20
    if(machine_info['hostname']=='Haswell_18core'):
        k_perf_order = {0:150, 1:500, 4:40, 5:200, 6:20}
    elif(machine_info['hostname']=='IVB_10core'):
        k_perf_order = {0:120, 1:300, 4:35, 5:150, 6:20}
    k_time_scale = {n: desired_time*k_perf_order[n] for n in k_perf_order.keys()}

    #exp = is_dp, ts, k, N, bs_z, tb_l
    exp_l = []

    # spatial blocking
    exp_l = exp_l + \
        [(0, 0, 0, 960, 0, [-1])
        ,(1, 0, 0, 960, 0, [-1])
        ,(1, 0, 1, 960, 0, [-1])
        ,(1, 0, 4, 480, 0, [-1])
        ,(1, 0, 5, 680, 0, [-1])
        ]

    # 1WD
    exp_l = exp_l + \
        [(0, 2, 0, 960, 1, [1, 3, 5])
        ,(1, 2, 0, 960, 1, [1, 3, 5])
        ,(1, 2, 1, 960, 1, [1, 3, 5, 7, 9, 11, 15, 19, 23, 29])
        ,(1, 2, 4, 480, 1, [1, 3, 5])
        ,(1, 2, 5, 680, 1, [1, 3, 9, 19])
        ]

    # Solar kernel
    exp_l = exp_l + \
        [(1, 2, 6, 480, 1, [1, 3, 5, 7])
        ,(1, 2, 6, 480, 2, [1, 3, 5, 7])
        ,(1, 2, 6, 480, 3, [1, 3, 5, 7])
        ,(1, 2, 6, 480, 6, [1, 3, 5, 7])
        ,(1, 2, 6, 480, 9, [1, 3, 5, 7])]

    mwdt = 1
    tgs, thx, thy, thz = (1, 1, 1, 1)
    count = 0
    for is_dp, ts, kernel, N, bs_z, tb_l in exp_l:
        for tb in tb_l:
            outfile = ('kernel%d_isdp%d_ts%d_bsz%d_tb%d_N%d_%s_%s.txt' % (kernel, is_dp, ts, bs_z, tb, N, group, exp_name[-13:]))
            nt = max(int(k_time_scale[kernel]/(N**3/1e6)), 30)
            # print outfile, ts, kernel, tb, N
            run_test(ntests=1, dry_run=dry_run, is_dp=is_dp, th=th, tgs=tgs, thx=thx, thy=thy, thz=thz, kernel=kernel, ts=ts, nx=N, ny=N, nz=N, nt=nt, outfile=outfile, target_dir=target_dir, cs=cs, mwdt=mwdt, tb=tb, nwf=bs_z)
            count = count + 1
    return count
def main():
    from scripts.utils import create_project_tarball, get_stencil_num, parse_results
    from scripts.conf.conf import machine_conf, machine_info
    import os, sys
    import time, datetime

    # user params
    dry_run = 1 if len(sys.argv) < 2 else int(sys.argv[1])  # dry run

    time_stamp = datetime.datetime.fromtimestamp(time.time()).strftime('%Y%m%d_%H_%M')
    exp_name = "cache_size_vs_code_balance_at_%s_%s" % (machine_info['hostname'], time_stamp)
    tarball_dir = 'results/' + exp_name
    if(dry_run == 0): create_project_tarball(tarball_dir, "project_" + exp_name)

    target_dir = 'results/' + exp_name

    th = 1
    pin_str = "S0:0-%d " % (th - 1)
    count = 0
    group = 'MEM'
    if( (machine_info['hostname']=='IVB_10core') and (group=='TLB_DATA') ): group = 'TLB'
    machine_conf['pinning_args'] = "-m -g " + group + " -C " + pin_str + ' -s 0x03 --'
    count = count + igs_test(target_dir, exp_name, th=th, group=group, dry_run=dry_run)

    print "experiments count =" + str(count)


if __name__ == "__main__":
    main()
| 31.863158 | 218 | 0.573835 | 546 | 3,027 | 2.981685 | 0.272894 | 0.015971 | 0.016585 | 0.014742 | 0.207617 | 0.140049 | 0.136364 | 0.095209 | 0.070639 | 0 | 0 | 0.102475 | 0.239181 | 3,027 | 94 | 219 | 32.202128 | 0.604429 | 0.072019 | 0 | 0.109375 | 0 | 0 | 0.094353 | 0.028592 | 0 | 0 | 0.00143 | 0 | 0 | 0 | null | null | 0 | 0.109375 | null | null | 0.015625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bcd52639c509cc2628a1148eef258524825f4528 | 8,408 | py | Python | pyripple/protocol/orderbook.py | gip/pyripple | d0c696bed7c6ad4c2309733484f9915074f9acdd | [
"Apache-2.0"
] | null | null | null | pyripple/protocol/orderbook.py | gip/pyripple | d0c696bed7c6ad4c2309733484f9915074f9acdd | [
"Apache-2.0"
] | null | null | null | pyripple/protocol/orderbook.py | gip/pyripple | d0c696bed7c6ad4c2309733484f9915074f9acdd | [
"Apache-2.0"
] | null | null | null | # PyRipple
#
# Copyright 2015 Gilles Pirio
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
.. moduleauthor:: Gilles Pirio <gilles.xrp@gmail.com>
"""
import numpy as np
import pandas as pd
import mpmath as mp
from mpmath import mpf
import matplotlib
import matplotlib.pyplot as plt
import json
def _weigtedAverage(book, target):
    rs = 0
    ws = 0
    t = target
    for order in book:
        if t <= order['limit']:
            rs += t
            ws += t*order['rate']
            return ws / rs
        else:
            rs += order['limit']
            ws += order['limit']*order['rate']
            t -= order['limit']


def _currencyStr((c, i)):
    return 'XRP' if c=='XRP' else '%s@%s' % (c, i)
def _foldBook(accumulator, orderbook):
    if accumulator is None:
        accumulator = { 'bids': { }, 'asks': { }, 'ledgers': [ ] }
    ldg = orderbook.ledger.index
    accumulator['ledgers'].append(ldg)
    for offer in orderbook.offersA:
        uid = (offer['account'], offer['sequence'])
        if uid in accumulator['asks']:
            accumulator['asks'][uid]['end'] = ldg
        else:
            accumulator['asks'][uid] = { 'start': ldg, 'end': ldg, 'offer': offer }
    for offer in orderbook.offersB:
        uid = (offer['account'], offer['sequence'])
        if uid in accumulator['bids']:
            accumulator['bids'][uid]['end'] = ldg
        else:
            accumulator['bids'][uid] = { 'start': ldg, 'end': ldg, 'offer': offer }
    return accumulator


def _foldBooks(orderbooks):
    acc = None
    for orderbook in orderbooks:
        acc = _foldBook(acc, orderbook)
    return acc
class Orderbook:
    def __init__(self, (c0, i0), (c1, i1), ldgA, offersA, ldbB, offersB, through=[]):
        self.c0 = c0
        self.i0 = i0
        self.c1 = c1
        self.i1 = i1
        self.offersA = offersA
        self.offersB = offersB
        self.spread = offersA[0]['rate'] - offersB[0]['rate']
        self.spread_pct = self.spread*100 / offersA[0]['rate']
        self.ledger = ldgA
        self.through = through

    def weigtedAverageA(self, v):
        return _weigtedAverage(self.offersA, v)

    def weigtedAverageB(self, v):
        return _weigtedAverage(self.offersB, v)
    def info(self):
        return {
            'currency': _currencyStr((self.c0, self.i0)),
            'counter_currency': _currencyStr((self.c1, self.i1)),
            'spread': self.spread,
            'spread_pct': self.spread_pct,
            'best_ask': self.offersA[0]['rate'],
            'n_asks': len(self.offersA),
            'n_bids': len(self.offersB),
            'best_bid': self.offersB[0]['rate'],
            'through': self.through
        }
    def showInfo(self):
        print('Orderbook %s%s in ledger %i' % (self.c0, self.c1, self.ledger.index))
        print(' Close date: %s' % self.ledger.date_human)
        print(' Currency: XRP' if self.c0=='XRP' else ' Currency: %s@%s' % (self.c0, self.i0))
        print(' Counter currency: XRP' if self.c1=='XRP' else ' Counter currency: %s@%s' % (self.c1, self.i1))
        print(' Spread: %f (%f %%)' % (self.spread, self.spread_pct))
        print(' Best ask/bid: %f / %f' % (self.offersA[0]['rate'], self.offersB[0]['rate']))
        print ' Through: ', self.through
    def __mul__(self, other):
        assert self.c1 == other.c0 and self.i1 == other.i0, "Invalid trade"

        # Let's compute the new orderbook!
        def prudctOffers(o0, o1):
            offers = []
            i0 = 0
            i1 = 0
            xlim = 0
            o0limit = 0
            o1limit = 0
            while i1 < len(o1) and i0 < len(o0):
                if o0limit == 0:
                    o0rate = o0[i0]['rate']
                    o0limit = o0[i0]['limit']
                    i0 += 1
                if o1limit == 0:
                    o1rate = o1[i1]['rate']
                    o1limit = o1[i1]['limit']
                    i1 += 1
                delta = o0limit*o0rate - o1limit
                if delta < 0:
                    amt = o0limit*o1rate
                    o0limit = 0
                    o1limit -= amt
                    xlim += amt
                    offers.append({ 'rate': o0rate*o1rate, 'limit': amt, 'xlimit': xlim })
                elif delta > 0:
                    amt = o1limit
                    o1limit = 0
                    o0limit -= amt
                    xlim += amt
                    offers.append({ 'rate': o0rate*o1rate, 'limit': amt, 'xlimit': xlim })
                else:
                    # both sides exhausted at once: record the matched amount
                    # before zeroing the limits (the original zeroed first and
                    # so always appended a 0-sized entry here)
                    amt = o1limit
                    o0limit = 0
                    o1limit = 0
                    xlim += amt
                    offers.append({ 'rate': o0rate*o1rate, 'limit': amt, 'xlimit': xlim })
            return offers

        through = list(self.through)
        through.append((self.c1, self.i1))
        return Orderbook((self.c0, self.i0),
                         (other.c1, other.i1),
                         self.ledger, prudctOffers(self.offersA, other.offersA), other.ledger, prudctOffers(self.offersB, other.offersB), through)
    def plot(self, *args, **kwargs):
        fA = pd.DataFrame(self.offersA)
        fB = pd.DataFrame(self.offersB)
        newfig = kwargs.get('newfig', True)
        if newfig:
            plt.figure(num=None, figsize=(16, 12), dpi=80, facecolor='w', edgecolor='k')
            axes = plt.gca()
            plt.title('Order book for %s / %s at ledger %i' % (_currencyStr((self.c0, self.i0)), _currencyStr((self.c1, self.i1)), self.ledger.index))
            plt.xlabel(_currencyStr((self.c1, self.i1)))
            plt.ylabel('%s%s' % (self.c0, self.c1))
            plt.gca().xaxis.set_major_formatter(matplotlib.ticker.FuncFormatter(lambda x, p: format(int(x), ',')))
        if kwargs.get('orders', True):
            plt.hlines(fA['rate'], 0, fA['limit'], color='b', label='Asks')
            plt.plot(fA['limit'], fA['rate'], 'b^')
            plt.hlines(fB['rate'], 0, fB['limit'], color='r', label='Bids')
            plt.plot(fB['limit'], fB['rate'], 'r^')

        def supplyDemand(xlimits):
            x = []
            y = []
            limit = 0
            for (r, l) in xlimits:
                x.append(r)
                x.append(r)
                y.append(limit)
                limit = l
                y.append(limit)
            return (x, y)

        if kwargs.get('supplydemand', True):
            (x, y) = supplyDemand(zip(fA['rate'], fA['xlimit']))
            plt.plot(y, x, 'b--', label='Supply')
            (x, y) = supplyDemand(zip(fB['rate'], fB['xlimit']))
            plt.plot(y, x, 'r--', label='Demand')
        if newfig:
            plt.legend()
    def plotWeighted(self, limit, *args, **kwargs):
        newfig = kwargs.get('newfig', True)
        if newfig:
            plt.figure(num=None, figsize=(16, 12), dpi=80, facecolor='w', edgecolor='k')
            plt.xlabel('%s@%s' % (self.c1, self.i1))
            plt.title('Rate (weighted average) for %s / %s ledger %i' % (_currencyStr((self.c0, self.i0)), _currencyStr((self.c1, self.i1)), self.ledger.index))
            plt.gca().xaxis.set_major_formatter(matplotlib.ticker.FuncFormatter(lambda x, p: format(int(x), ',')))
        x = np.arange(1, limit, limit / 1000 if limit > 1000 else 1)
        cask = kwargs.get('styleask', 'b')
        cbid = kwargs.get('stylebid', 'r')
        label = kwargs.get('label', 'Weighted avg')
        plt.plot(x, map(self.weigtedAverageA, x), cask, label=label + ' (ask)')
        plt.plot(x, map(self.weigtedAverageB, x), cbid, label=label + ' (bid)')
        if newfig:
            plt.legend()
    @staticmethod
    def plotTimeResolvedBook(orderbooks):
        ob0 = orderbooks[0]
        fold = _foldBooks(orderbooks)
        plt.figure(num=None, figsize=(16, 12), dpi=80, facecolor='w', edgecolor='k')
        plt.hlines(map(lambda x: x['offer']['rate'], fold['asks'].values()),
                   map(lambda x: x['start'], fold['asks'].values()),
                   map(lambda x: x['end'], fold['asks'].values()), color='b', label='asks')
        plt.hlines(map(lambda x: x['offer']['rate'], fold['bids'].values()),
                   map(lambda x: x['start'], fold['bids'].values()),
                   map(lambda x: x['end'], fold['bids'].values()), color='r', label='bids')
        x = map(lambda ob: ob.ledger.index, orderbooks)
        plt.plot(x, map(lambda x: x.offersA[0]['rate'], orderbooks), 'b--')
        plt.plot(x, map(lambda x: x.offersB[0]['rate'], orderbooks), 'r--')
        axes = plt.gca()
        axes.get_xaxis().set_major_formatter(matplotlib.ticker.FuncFormatter(lambda x, p: format(int(x))))
        axes.set_xlabel('Ripple ledger #')
        axes.set_ylabel('%s%s' % (ob0.c0, ob0.c1))
        plt.title('Order books for %s / %s' % (_currencyStr((ob0.c0, ob0.i0)), _currencyStr((ob0.c1, ob0.i1))))
        plt.legend()
| 37.20354 | 154 | 0.593958 | 1,142 | 8,408 | 4.335377 | 0.21366 | 0.014543 | 0.012927 | 0.017774 | 0.286811 | 0.226621 | 0.2143 | 0.172288 | 0.172288 | 0.140982 | 0 | 0.027131 | 0.228473 | 8,408 | 225 | 155 | 37.368889 | 0.736088 | 0.070171 | 0 | 0.177083 | 0 | 0 | 0.11526 | 0 | 0 | 0 | 0 | 0 | 0.005208 | 0 | null | null | 0 | 0.036458 | null | null | 0.036458 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bcd88cb9aee8377371dcb96cf615ef4e2ec10580 | 4,113 | py | Python | exercises/level_0/stringing.py | eliranM98/python_course | d9431dd6c0f27fca8ca052cc2a821ed0b883136c | [
"MIT"
] | 6 | 2019-03-29T06:14:53.000Z | 2021-10-15T23:42:36.000Z | exercises/level_0/stringing.py | eliranM98/python_course | d9431dd6c0f27fca8ca052cc2a821ed0b883136c | [
"MIT"
] | 4 | 2019-09-06T10:03:40.000Z | 2022-03-11T23:30:55.000Z | exercises/level_0/stringing.py | eliranM98/python_course | d9431dd6c0f27fca8ca052cc2a821ed0b883136c | [
"MIT"
] | 12 | 2019-06-20T19:34:52.000Z | 2021-10-15T23:42:39.000Z | text = '''
Victor Hugo's ({}) tale of injustice, heroism and love follows the fortunes of Jean Valjean, an escaped convict determined to put his criminal past behind him. But his attempts to become a respected member of the community are constantly put under threat: by his own conscience, when, owing to a case of mistaken identity, another man is arrested in his place; and by the relentless investigations of the dogged Inspector Javert. It is not simply for himself that Valjean must stay free, however, for he has sworn to protect the baby daughter of Fantine, driven to prostitution by poverty.
Norman Denny's ({}) lively English translation is accompanied by an introduction discussing Hugo's political and artistic aims in writing Les Miserables.
Victor Hugo (1802-85) wrote volumes of criticism, dramas, satirical verse and political journalism but is best remembered for his novels, especially Notre-Dame de Paris (also known as The Hunchback of Notre-Dame) and Les Miserables, which was adapted into one of the most successful musicals of all time.
'All human life is here'
Cameron Mackintosh, producer of the musical Les Miserables
'One of the half-dozen greatest novels of the world'
Upton Sinclair
'A great writer - inventive, witty, sly, innovatory'
A. S. Byatt, author of Possession
'''
name = 'Victor'
word1 = 'writer'
word2 = 'witty'
numbers = "0123456789"
small_letters = 'abcdefghijklmnopqrstuvwxyz'
big_letters = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
name_index = text.find(name)
name_plus3 = text[name_index: name_index+len(name)+3]
word1_index = text.find(word1, 0, 100)
word2_index = text.find(word2, int(len(text)/2), len(text))
count_characters = text.count('of')
is_text_starts_with_name = text.startswith(name)
is_text_ends_with_name = text.endswith(name)
text = text.format('1822-95', '1807-63')
words = text.split(' ')
text1 = ''.join(words)
text2 = ','.join(words)
text3 = '_'.join(words)
text4 = ' '.join(words)
text5 = text.replace('of', '@🐔')
text6 = text.capitalize()
text7 = text.replace('a', '')
text8 = text.strip()
upper_name = name.upper()
lower_name = name.lower()
is_name_upper = name.isupper()
is_name_lower = name.islower()
is_big_letters_upper = big_letters.isupper()
is_small_letters_lower = small_letters.islower()
stringed_integer = '90'.isnumeric()
stringed_float = '90.5'.isnumeric()
converted_int = int('90')
converted_float = float('90.5')
converted_string = str(183)
is_digit = converted_string[1].isdigit()
edges = small_letters[0] + big_letters[-1]
body = numbers[1:-1]
evens = numbers[::2]
odds = numbers[1::2]
print('name', name)
print('word1', word1)
print('word2', word2)
print('numbers', numbers)
print('small_letters', small_letters)
print('big_letters', big_letters)
print('name_index', name_index)
print('name_plus3', name_plus3)
print('word1_index', word1_index)
print('word2_index', word2_index)
print('count_characters -> \'of\' in the text', count_characters)
print('is_text_starts_with_name', is_text_starts_with_name)
print('is_text_ends_with_name', is_text_ends_with_name)
print('\n\n\n\n\n', 'text', text, '\n\n\n\n\n')
print('\n\n\n\n\n', 'words', words, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text1', text1, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text2', text2, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text3', text3, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text4', text4, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text5', text5, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text6', text6, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text7', text7, '\n\n\n\n\n')
print('\n\n\n\n\n', 'text8', text8, '\n\n\n\n\n')
print('upper_name', upper_name)
print('lower_name', lower_name)
print('is_name_upper', is_name_upper)
print('is_name_lower', is_name_lower)
print('is_big_letters_upper', is_big_letters_upper)
print('is_small_letters_lower', is_small_letters_lower)
print('stringed_integer', stringed_integer)
print('stringed_float', stringed_float)
print('converted_int', converted_int)
print('converted_float', converted_float)
print('converted_string', converted_string)
print('is_digit', is_digit)
print('edges', edges)
print('body', body)
print('evens', evens)
print('odds', odds)
| 41.545455 | 590 | 0.735959 | 680 | 4,113 | 4.295588 | 0.310294 | 0.054776 | 0.061623 | 0.054776 | 0.093461 | 0.068127 | 0.046217 | 0.046217 | 0.046217 | 0.046217 | 0 | 0.026237 | 0.110382 | 4,113 | 98 | 591 | 41.969388 | 0.771796 | 0 | 0 | 0 | 0 | 0.033708 | 0.490396 | 0.029176 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.438202 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
bcddbefe85e0c400583bdfd288157408fcf8f518 | 11,271 | py | Python | rpython/translator/platform/posix.py | wdv4758h/mu-client-pypy | d2fcc01f0b4fe3ffa232762124e3e6d38ed3a0cf | [
"Apache-2.0",
"OpenSSL"
] | null | null | null | rpython/translator/platform/posix.py | wdv4758h/mu-client-pypy | d2fcc01f0b4fe3ffa232762124e3e6d38ed3a0cf | [
"Apache-2.0",
"OpenSSL"
] | null | null | null | rpython/translator/platform/posix.py | wdv4758h/mu-client-pypy | d2fcc01f0b4fe3ffa232762124e3e6d38ed3a0cf | [
"Apache-2.0",
"OpenSSL"
] | null | null | null | """Base support for POSIX-like platforms."""
import py, os, sys
from rpython.translator.platform import Platform, log, _run_subprocess
import rpython
rpydir = str(py.path.local(rpython.__file__).join('..'))
class BasePosix(Platform):
    exe_ext = ''
    make_cmd = 'make'

    relevant_environ = ('CPATH', 'LIBRARY_PATH', 'C_INCLUDE_PATH')

    DEFAULT_CC = 'gcc'
    rpath_flags = ['-Wl,-rpath=\'$$ORIGIN/\'']

    def __init__(self, cc=None):
        self.cc = cc or os.environ.get('CC', self.DEFAULT_CC)
    def _libs(self, libraries):
        return ['-l%s' % lib for lib in libraries]

    def _libdirs(self, library_dirs):
        assert '' not in library_dirs
        return ['-L%s' % ldir for ldir in library_dirs]

    def _includedirs(self, include_dirs):
        assert '' not in include_dirs
        return ['-I%s' % idir for idir in include_dirs]

    def _linkfiles(self, link_files):
        return list(link_files)

    def _compile_c_file(self, cc, cfile, compile_args):
        oname = self._make_o_file(cfile, ext='o')
        args = ['-c'] + compile_args + [str(cfile), '-o', str(oname)]
        self._execute_c_compiler(cc, args, oname,
                                 cwd=str(cfile.dirpath()))
        return oname

    def _link_args_from_eci(self, eci, standalone):
        return Platform._link_args_from_eci(self, eci, standalone)
    def _exportsymbols_link_flags(self):
        if (self.cc == 'mingw32' or (self.cc == 'gcc' and os.name == 'nt')
                or sys.platform == 'cygwin'):
            return ["-Wl,--export-all-symbols"]
        return ["-Wl,--export-dynamic"]

    def _link(self, cc, ofiles, link_args, standalone, exe_name):
        args = [str(ofile) for ofile in ofiles] + link_args
        args += ['-o', str(exe_name)]
        if not standalone:
            args = self._args_for_shared(args)
        self._execute_c_compiler(cc, args, exe_name,
                                 cwd=str(exe_name.dirpath()))
        return exe_name
    def _pkg_config(self, lib, opt, default, check_result_dir=False):
        try:
            ret, out, err = _run_subprocess("pkg-config", [lib, opt])
        except OSError, e:
            err = str(e)
            ret = 1
        if ret:
            result = default
        else:
            # strip compiler flags
            result = [entry[2:] for entry in out.split()]
        #
        if not result:
            pass   # if pkg-config explicitly returned nothing, then
                   # we assume it means no options are needed
        elif check_result_dir:
            # check that at least one of the results is a valid dir
            for check in result:
                if os.path.isdir(check):
                    break
            else:
                if ret:
                    msg = ("running 'pkg-config %s %s' failed:\n%s\n"
                           "and the default %r is not a valid directory" % (
                               lib, opt, err.rstrip(), default))
                else:
                    msg = ("'pkg-config %s %s' returned no valid directory:\n"
                           "%s\n%s" % (lib, opt, out.rstrip(), err.rstrip()))
                raise ValueError(msg)
        return result
    def get_rpath_flags(self, rel_libdirs):
        # needed for cross-compilation i.e. ARM
        return self.rpath_flags + ['-Wl,-rpath-link=\'%s\'' % ldir
                                   for ldir in rel_libdirs]

    def get_shared_only_compile_flags(self):
        return tuple(self.shared_only) + ('-fvisibility=hidden',)
    def gen_makefile(self, cfiles, eci, exe_name=None, path=None,
                     shared=False, headers_to_precompile=[],
                     no_precompile_cfiles=[], icon=None):
        cfiles = self._all_cfiles(cfiles, eci)

        if path is None:
            path = cfiles[0].dirpath()

        rpypath = py.path.local(rpydir)

        if exe_name is None:
            exe_name = cfiles[0].new(ext=self.exe_ext)
        else:
            exe_name = exe_name.new(ext=self.exe_ext)

        linkflags = list(self.link_flags)
        if shared:
            linkflags = self._args_for_shared(linkflags)

        linkflags += self._exportsymbols_link_flags()

        if shared:
            libname = exe_name.new(ext='').basename
            target_name = 'lib' + exe_name.new(ext=self.so_ext).basename
        else:
            target_name = exe_name.basename

        if shared:
            cflags = tuple(self.cflags) + self.get_shared_only_compile_flags()
        else:
            cflags = tuple(self.cflags) + tuple(self.standalone_only)

        m = GnuMakefile(path)
        m.exe_name = path.join(exe_name.basename)
        m.eci = eci

        def rpyrel(fpath):
            lpath = py.path.local(fpath)
            rel = lpath.relto(rpypath)
            if rel:
                return os.path.join('$(RPYDIR)', rel)
            # Hack: also relativize from the path '$RPYDIR/..'.
            # Otherwise, when translating pypy, we get the paths in
            # pypy/module/* that are kept as absolute, which makes the
            # whole purpose of $RPYDIR rather pointless.
            rel = lpath.relto(rpypath.join('..'))
            if rel:
                return os.path.join('$(RPYDIR)', '..', rel)
            m_dir = m.makefile_dir
            if m_dir == lpath:
                return '.'
            if m_dir.dirpath() == lpath:
                return '..'
            return fpath

        rel_cfiles = [m.pathrel(cfile) for cfile in cfiles]
        rel_ofiles = [rel_cfile[:rel_cfile.rfind('.')]+'.o' for rel_cfile in rel_cfiles]
        m.cfiles = rel_cfiles

        rel_includedirs = [rpyrel(incldir) for incldir in
                           self.preprocess_include_dirs(eci.include_dirs)]
        rel_libdirs = [rpyrel(libdir) for libdir in
                       self.preprocess_library_dirs(eci.library_dirs)]

        m.comment('automatically generated makefile')
        definitions = [
            ('RPYDIR', '"%s"' % rpydir),
            ('TARGET', target_name),
            ('DEFAULT_TARGET', exe_name.basename),
            ('SOURCES', rel_cfiles),
            ('OBJECTS', rel_ofiles),
            ('LIBS', self._libs(eci.libraries) + list(self.extra_libs)),
            ('LIBDIRS', self._libdirs(rel_libdirs)),
            ('INCLUDEDIRS', self._includedirs(rel_includedirs)),
            ('CFLAGS', cflags),
            ('CFLAGSEXTRA', list(eci.compile_extra)),
            ('LDFLAGS', linkflags),
            ('LDFLAGS_LINK', list(self.link_flags)),
            ('LDFLAGSEXTRA', list(eci.link_extra)),
            ('CC', self.cc),
            ('CC_LINK', eci.use_cpp_linker and 'g++' or '$(CC)'),
            ('LINKFILES', eci.link_files),
            ('RPATH_FLAGS', self.get_rpath_flags(rel_libdirs)),
            ]
        for args in definitions:
            m.definition(*args)

        rules = [
            ('all', '$(DEFAULT_TARGET)', []),
            ('$(TARGET)', '$(OBJECTS)', '$(CC_LINK) $(LDFLAGSEXTRA) -o $@ $(OBJECTS) $(LIBDIRS) $(LIBS) $(LINKFILES) $(LDFLAGS)'),
            ('%.o', '%.c', '$(CC) $(CFLAGS) $(CFLAGSEXTRA) -o $@ -c $< $(INCLUDEDIRS)'),
            ('%.o', '%.s', '$(CC) $(CFLAGS) $(CFLAGSEXTRA) -o $@ -c $< $(INCLUDEDIRS)'),
            ('%.o', '%.cxx', '$(CXX) $(CFLAGS) $(CFLAGSEXTRA) -o $@ -c $< $(INCLUDEDIRS)'),
            ]
        for rule in rules:
            m.rule(*rule)

        if shared:
            m.definition('SHARED_IMPORT_LIB', libname)
            m.definition('PYPY_MAIN_FUNCTION', "pypy_main_startup")
            m.rule('main.c', '',
                   'echo "'
                   'int $(PYPY_MAIN_FUNCTION)(int, char*[]); '
                   'int main(int argc, char* argv[]) '
                   '{ return $(PYPY_MAIN_FUNCTION)(argc, argv); }" > $@')
            m.rule('$(DEFAULT_TARGET)', ['$(TARGET)', 'main.o'],
                   '$(CC_LINK) $(LDFLAGS_LINK) main.o -L. -l$(SHARED_IMPORT_LIB) -o $@ $(RPATH_FLAGS)')

        return m
    def execute_makefile(self, path_to_makefile, extra_opts=[]):
        if isinstance(path_to_makefile, GnuMakefile):
            path = path_to_makefile.makefile_dir
        else:
            path = path_to_makefile
        log.execute('make %s in %s' % (" ".join(extra_opts), path))
        returncode, stdout, stderr = _run_subprocess(
            self.make_cmd, ['-C', str(path)] + extra_opts)
        self._handle_error(returncode, stdout, stderr, path.join('make'))
class Definition(object):
    def __init__(self, name, value):
        self.name = name
        self.value = value

    def write(self, f):
        def write_list(prefix, lst):
            lst = lst or ['']
            for i, fn in enumerate(lst):
                fn = fn.replace('\\', '\\\\')
                print >> f, prefix, fn,
                if i < len(lst)-1:
                    print >> f, '\\'
                else:
                    print >> f
                prefix = ' ' * len(prefix)
        name, value = self.name, self.value
        if isinstance(value, str):
            f.write('%s = %s\n' % (name, value.replace('\\', '\\\\')))
        else:
            write_list('%s =' % (name,), value)
            f.write('\n')
class Rule(object):
    def __init__(self, target, deps, body):
        self.target = target
        self.deps = deps
        self.body = body

    def write(self, f):
        target, deps, body = self.target, self.deps, self.body
        if isinstance(deps, str):
            dep_s = deps
        else:
            dep_s = ' '.join(deps)
        f.write('%s: %s\n' % (target, dep_s))
        if isinstance(body, str):
            f.write('\t%s\n' % body)
        elif body:
            f.write('\t%s\n' % '\n\t'.join(body))
        f.write('\n')
class Comment(object):
    def __init__(self, body):
        self.body = body

    def write(self, f):
        f.write('# %s\n' % (self.body,))
class GnuMakefile(object):
    def __init__(self, path=None):
        self.defs = {}
        self.lines = []
        self.makefile_dir = py.path.local(path)

    def pathrel(self, fpath):
        if fpath.dirpath() == self.makefile_dir:
            return fpath.basename
        elif fpath.dirpath().dirpath() == self.makefile_dir.dirpath():
            assert fpath.relto(self.makefile_dir.dirpath()), (
                "%r should be relative to %r" % (
                    fpath, self.makefile_dir.dirpath()))
            path = '../' + fpath.relto(self.makefile_dir.dirpath())
            return path.replace('\\', '/')
        else:
            return str(fpath)

    def definition(self, name, value):
        defs = self.defs
        defn = Definition(name, value)
        if name in defs:
            self.lines[defs[name]] = defn
        else:
            defs[name] = len(self.lines)
            self.lines.append(defn)

    def rule(self, target, deps, body):
        self.lines.append(Rule(target, deps, body))

    def comment(self, body):
        self.lines.append(Comment(body))

    def write(self, out=None):
        if out is None:
            f = self.makefile_dir.join('Makefile').open('w')
        else:
            f = out
        for line in self.lines:
            line.write(f)
        f.flush()
        if out is None:
            f.close()
| 36.009585 | 130 | 0.53021 | 1,322 | 11,271 | 4.350227 | 0.196672 | 0.019475 | 0.018258 | 0.011824 | 0.113719 | 0.061902 | 0.041732 | 0.010433 | 0 | 0 | 0 | 0.000927 | 0.329784 | 11,271 | 312 | 131 | 36.125 | 0.760392 | 0.035933 | 0 | 0.11811 | 0 | 0.007874 | 0.11812 | 0.009157 | 0 | 0 | 0 | 0 | 0.011811 | 0 | null | null | 0.003937 | 0.019685 | null | null | 0.011811 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bce0bfd9222f594d713d4743ed32c26bb4279c4c | 1,483 | py | Python | check_perm.py | codecakes/random_games | 1e670021ec97a196726e937e658878dc63ba9d34 | [
"MIT"
] | null | null | null | check_perm.py | codecakes/random_games | 1e670021ec97a196726e937e658878dc63ba9d34 | [
"MIT"
] | null | null | null | check_perm.py | codecakes/random_games | 1e670021ec97a196726e937e658878dc63ba9d34 | [
"MIT"
] | null | null | null | """
PermCheck
Check whether array A is a permutation.
https://codility.com/demo/results/demoANZ7M2-GFU/
Task description
A non-empty zero-indexed array A consisting of N integers is given.
A permutation is a sequence containing each element from 1 to N once, and only once.
For example, array A such that:
A[0] = 4
A[1] = 1
A[2] = 3
A[3] = 2
is a permutation, but array A such that:
A[0] = 4
A[1] = 1
A[2] = 3
is not a permutation, because value 2 is missing.
The goal is to check whether array A is a permutation.
Write a function:
def solution(A)
that, given a zero-indexed array A, returns 1 if array A is a permutation and 0 if it is not.
For example, given array A such that:
A[0] = 4
A[1] = 1
A[2] = 3
A[3] = 2
the function should return 1.
Given array A such that:
A[0] = 4
A[1] = 1
A[2] = 3
the function should return 0.
Assume that:
N is an integer within the range [1..100,000];
each element of array A is an integer within the range [1..1,000,000,000].
Complexity:
expected worst-case time complexity is O(N);
expected worst-case space complexity is O(N), beyond input storage (not counting the storage required for input arguments).
Elements of input arrays can be modified.
"""
def solution(A):
# write your code in Python 2.7
s = set(A)
N_set = len(s) #O(n)
N = len(A)
if N != N_set: return 0
sum_N = N*(N+1)/2 #O(1)
sum_A = sum(A) #O(n)
return 1 if sum_N == sum_A else 0 | 27.981132 | 123 | 0.662171 | 285 | 1,483 | 3.424561 | 0.329825 | 0.061475 | 0.032787 | 0.057377 | 0.25 | 0.229508 | 0.229508 | 0.110656 | 0.110656 | 0.110656 | 0 | 0.054916 | 0.238705 | 1,483 | 53 | 124 | 27.981132 | 0.809566 | 0.864464 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
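The set-size plus triangular-sum check in `check_perm.py` above can be restated as a Python 3 sketch (integer division `//` replaces the Python 2 `/`; this is an illustrative port, not the graded submission). It relies on the fact that N distinct positive integers reach the minimal sum N(N+1)/2 only for the exact set {1..N}.

```python
def solution(A):
    """Return 1 if A is a permutation of 1..len(A), else 0."""
    n = len(A)
    if len(set(A)) != n:          # duplicates -> not a permutation
        return 0
    # n distinct positive integers sum to n*(n+1)//2 only for {1..n}
    return 1 if sum(A) == n * (n + 1) // 2 else 0

print(solution([4, 1, 3, 2]))  # -> 1 (permutation of 1..4)
print(solution([4, 1, 3]))     # -> 0 (value 2 missing)
```

Both checks are O(n) time and O(n) extra space, matching the stated complexity targets.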
bce207bdb62870e146b1e56a1b168c691c0515ac | 4,273 | py | Python | utils/scene_bounding_box.py | davidemarelli/sfm_flow | 7a96d8309cc01b8499347ba0cae882923d82bbcc | [
"MIT"
] | 8 | 2020-10-27T12:52:17.000Z | 2022-03-30T04:15:37.000Z | utils/scene_bounding_box.py | ElsevierSoftwareX/SOFTX_2020_51 | b240a113c91405fac60444a6e56e87e3cf17a27b | [
"MIT"
] | 1 | 2020-11-09T01:56:04.000Z | 2020-11-24T15:58:26.000Z | utils/scene_bounding_box.py | davidemarelli/sfm_flow | 7a96d8309cc01b8499347ba0cae882923d82bbcc | [
"MIT"
] | 2 | 2021-12-02T10:04:39.000Z | 2022-03-28T07:54:07.000Z |
import logging
from typing import Tuple
import bpy
from mathutils import Vector
from .object import get_objs
logger = logging.getLogger(__name__)
class SceneBoundingBox():
"""Scene bounding box: builds a bounding box that includes all objects except the excluded ones."""
################################################################################################
# Properties
#
# ==============================================================================================
@property
def width(self):
"""Scene's bounding box width."""
return self.x_max - self.x_min
# ==============================================================================================
@property
def depth(self):
"""Scene's bounding box depth."""
return self.y_max - self.y_min
# ==============================================================================================
@property
def height(self):
"""Scene's bounding box height."""
return self.z_max - self.z_min
# ==============================================================================================
@property
def floor_center(self):
"""Scene's bounding box center on the lower bbox plane."""
return Vector((self.center[0], self.center[1], self.z_min))
################################################################################################
# Constructor
#
# ==============================================================================================
def __init__(self, scene: bpy.types.Scene,
exclude_collections: Tuple[str] = ("SfM_Environment", "SfM_Reconstructions")):
self.scene = scene
self.exclude_collections = exclude_collections
#
self.center = Vector() # type: Vector
self.x_min = float("inf") # type: float
self.x_max = float("-inf") # type: float
self.y_min = float("inf") # type: float
self.y_max = float("-inf") # type: float
self.z_min = float("inf") # type: float
self.z_max = float("-inf") # type: float
#
self.compute()
################################################################################################
# Methods
#
# ==============================================================================================
def compute(self):
"""Compute the scene bounding box values."""
objs = get_objs(self.scene, exclude_collections=self.exclude_collections, mesh_only=True)
logger.debug("Found %i objects in scene %s", len(objs), self.scene.name)
for obj in objs:
obb = obj.bound_box
for i in range(8):
p = obj.matrix_world @ Vector(obb[i])
self.x_min = min(self.x_min, p[0])
self.x_max = max(self.x_max, p[0])
self.y_min = min(self.y_min, p[1])
self.y_max = max(self.y_max, p[1])
self.z_min = min(self.z_min, p[2])
self.z_max = max(self.z_max, p[2])
if objs:
self.center = Vector(((self.x_max + self.x_min) / 2,
(self.y_max + self.y_min) / 2,
(self.z_max + self.z_min) / 2))
logger.debug(str(self))
# ==============================================================================================
def get_min_vector(self):
"""Get minimum axis."""
return Vector((self.x_min, self.y_min, self.z_min))
# ==============================================================================================
def get_max_vector(self):
"""Get maximum axis."""
return Vector((self.x_max, self.y_max, self.z_max))
################################################################################################
# Builtin methods
#
# ==============================================================================================
def __str__(self):
return "Scene bbox values: X=({:.3f}, {:.3f}), Y=({:.3f}, {:.3f}), Z=({:.3f}, {:.3f}), Center={}".format(
self.x_min, self.x_max, self.y_min, self.y_max, self.z_min, self.z_max, self.center)
| 39.934579 | 113 | 0.388018 | 402 | 4,273 | 3.945274 | 0.226368 | 0.047289 | 0.040353 | 0.064313 | 0.250315 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0.00545 | 0.227007 | 4,273 | 106 | 114 | 40.311321 | 0.47472 | 0.300725 | 0 | 0.071429 | 0 | 0.017857 | 0.067402 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.160714 | false | 0 | 0.089286 | 0.017857 | 0.392857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
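The corner-accumulation loop in `compute()` above is independent of Blender: for every world-space corner it tightens per-axis minima and maxima, then averages them for the center. A minimal standalone sketch of the same pattern, using plain tuples in place of `mathutils.Vector` (the function name is hypothetical):

```python
def bbox_of_points(points):
    """Accumulate axis-aligned bounds over an iterable of (x, y, z) points."""
    lo = [float("inf")] * 3
    hi = [float("-inf")] * 3
    for p in points:
        for axis in range(3):
            lo[axis] = min(lo[axis], p[axis])
            hi[axis] = max(hi[axis], p[axis])
    center = tuple((a + b) / 2 for a, b in zip(lo, hi))
    return tuple(lo), tuple(hi), center

lo, hi, center = bbox_of_points([(0, 0, 0), (2, 4, 6), (-2, 1, 3)])
print(lo, hi, center)  # (-2, 0, 0) (2, 4, 6) (0.0, 2.0, 3.0)
```

In the real class the eight corners come from `obj.matrix_world @ Vector(obb[i])`, i.e. each object's local bound box transformed into world space.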
bce8b5c80fccdda525f4313d1b8dac7df83862d2 | 3,765 | py | Python | scraper-code/myanimelist/base.py | XueAlfred/MALAnalysis | 630d578b30f7540769774e1e4ee072d9775bf4bf | [
"MIT"
] | 15 | 2015-01-24T10:52:42.000Z | 2021-08-09T10:23:58.000Z | scraper-code/myanimelist/base.py | XueAlfred/MALAnalysis | 630d578b30f7540769774e1e4ee072d9775bf4bf | [
"MIT"
] | 10 | 2015-01-24T10:51:18.000Z | 2018-09-05T00:17:03.000Z | scraper-code/myanimelist/base.py | XueAlfred/MALAnalysis | 630d578b30f7540769774e1e4ee072d9775bf4bf | [
"MIT"
] | 18 | 2015-01-24T11:29:38.000Z | 2021-12-04T02:41:09.000Z | #!/usr/bin/python
# -*- coding: utf-8 -*-
import abc
import bs4
import functools
import utilities
class Error(Exception):
"""Base exception class that takes a message to display upon raising.
"""
def __init__(self, message=None):
"""Creates an instance of Error.
:type message: str
:param message: A message to display when raising the exception.
"""
super(Error, self).__init__()
self.message = message
def __str__(self):
return unicode(self.message) if self.message is not None else u""
class MalformedPageError(Error):
"""Indicates that a page on MAL has broken markup in some way.
"""
def __init__(self, id, html, message=None):
super(MalformedPageError, self).__init__(message=message)
if isinstance(id, unicode):
self.id = id
else:
self.id = str(id).decode(u'utf-8')
if isinstance(html, unicode):
self.html = html
else:
self.html = str(html).decode(u'utf-8')
def __str__(self):
return "\n".join([
super(MalformedPageError, self).__str__(),
"ID: " + self.id,
"HTML: " + self.html
]).encode(u'utf-8')
class InvalidBaseError(Error):
"""Indicates that the particular resource instance requested does not exist on MAL.
"""
def __init__(self, id, message=None):
super(InvalidBaseError, self).__init__(message=message)
self.id = id
def __str__(self):
return "\n".join([
super(InvalidBaseError, self).__str__(),
"ID: " + unicode(self.id)
])
def loadable(func_name):
"""Decorator for getters that require a load() upon first access.
:type func_name: str
:param func_name: name of the loader method to call when the class's cached _attribute value is None
:rtype: function
:return: the decorated class method.
"""
def inner(func):
cached_name = '_' + func.__name__
@functools.wraps(func)
def _decorator(self, *args, **kwargs):
if getattr(self, cached_name) is None:
getattr(self, func_name)()
return func(self, *args, **kwargs)
return _decorator
return inner
class Base(object):
"""Abstract base class for MAL resources. Provides autoloading, auto-setting functionality for other MAL objects.
"""
__metaclass__ = abc.ABCMeta
"""Attribute name for primary reference key to this object.
When an attribute by the name given by _id_attribute is passed into set(), set() doesn't prepend an underscore for load()ing.
"""
_id_attribute = "id"
def __repr__(self):
return u"".join([
"<",
self.__class__.__name__,
" ",
self._id_attribute,
": ",
unicode(getattr(self, self._id_attribute)),
">"
])
def __hash__(self):
return hash('-'.join([self.__class__.__name__, unicode(getattr(self, self._id_attribute))]))
def __eq__(self, other):
return isinstance(other, self.__class__) and getattr(self, self._id_attribute) == getattr(other, other._id_attribute)
def __ne__(self, other):
return not self.__eq__(other)
def __init__(self, session):
"""Create an instance of Base.
:type session: :class:`myanimelist.session.Session`
:param session: A valid MAL session.
"""
self.session = session
@abc.abstractmethod
def load(self):
"""A callback to run before any @loadable attributes are returned.
"""
pass
def set(self, attr_dict):
"""Sets attributes of this user object.
:type attr_dict: dict
:param attr_dict: Parameters to set, with attribute keys.
:rtype: :class:`.Base`
:return: The current object.
"""
for key in attr_dict:
if key == self._id_attribute:
setattr(self, self._id_attribute, attr_dict[key])
else:
setattr(self, u"_" + key, attr_dict[key])
return self | 27.683824 | 127 | 0.662151 | 493 | 3,765 | 4.791075 | 0.298174 | 0.033023 | 0.038103 | 0.032176 | 0.063506 | 0.052498 | 0.052498 | 0 | 0 | 0 | 0 | 0.001691 | 0.214608 | 3,765 | 136 | 128 | 27.683824 | 0.797092 | 0.290571 | 0 | 0.155844 | 0 | 0 | 0.018143 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.207792 | false | 0.012987 | 0.051948 | 0.090909 | 0.480519 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
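The `loadable` decorator above is the central pattern in `base.py`: the first access to a decorated getter calls a named loader method, which populates the `_`-prefixed caches that `set()` fills. A self-contained Python 3 sketch of the same idea (the `Anime` class and its fields are invented for illustration):

```python
import functools

def loadable(loader_name):
    """Call self.<loader_name>() before the getter if its cache is empty."""
    def inner(func):
        cached_name = "_" + func.__name__
        @functools.wraps(func)
        def _decorator(self, *args, **kwargs):
            if getattr(self, cached_name) is None:
                getattr(self, loader_name)()   # populate caches on demand
            return func(self, *args, **kwargs)
        return _decorator
    return inner

class Anime:
    def __init__(self):
        self._title = None
        self.load_calls = 0
    def load(self):
        self.load_calls += 1
        self._title = "Cowboy Bebop"
    @loadable("load")
    def title(self):
        return self._title

a = Anime()
print(a.title(), a.title(), a.load_calls)  # Cowboy Bebop Cowboy Bebop 1
```

Note the loader runs exactly once: the second `title()` call finds `_title` already populated and skips `load()`.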
bcefbd3c4fdc15d991e7f75b480521ed8994a120 | 2,234 | py | Python | mandelbruh/util.py | pereradrian/mandelbruh | fb68c5f2af84d51097e73f3a248e3a1b95fbbf47 | [
"MIT"
] | null | null | null | mandelbruh/util.py | pereradrian/mandelbruh | fb68c5f2af84d51097e73f3a248e3a1b95fbbf47 | [
"MIT"
] | null | null | null | mandelbruh/util.py | pereradrian/mandelbruh | fb68c5f2af84d51097e73f3a248e3a1b95fbbf47 | [
"MIT"
] | null | null | null | import numpy as np
def normalize(x):
return x / np.linalg.norm(x)
def norm_sq(v):
return np.dot(v,v)
def norm(v):
return np.linalg.norm(v)
def get_sub_keys(v):
if type(v) is not tuple and type(v) is not list:
return []
return [k for k in v if type(k) is str]
def to_vec3(v):
if isinstance(v, (float, int)):
return np.array([v, v, v], dtype=np.float32)
elif len(get_sub_keys(v)) > 0:
return v
else:
return np.array([v[0], v[1], v[2]], dtype=np.float32)
def to_str(x):
if type(x) is bool:
return "1" if x else "0"
elif isinstance(x, (list, tuple)):
return vec3_str(x)
else:
return str(x)
def float_str(x):
if type(x) is str:
return '_' + x
else:
return str(x)
def vec3_str(v):
if type(v) is str:
return '_' + v
elif isinstance(v, (float, int)):
return 'vec3(' + str(v) + ')'
else:
return 'vec3(' + float_str(v[0]) + ',' + float_str(v[1]) + ',' + float_str(v[2]) + ')'
def vec3_eq(v, val):
if type(v) is str:
return False
for i in range(3):
if v[i] != val[i]:
return False
return True
def smin(a, b, k):
h = min(max(0.5 + 0.5*(b - a)/k, 0.0), 1.0)
return b*(1 - h) + a*h - k*h*(1.0 - h)
def get_global(k):
if type(k) is str:
return _mandelbruh_GLOBAL_VARS[k]
elif type(k) is tuple or type(k) is list:
return np.array([get_global(i) for i in k], dtype=np.float32)
else:
return k
def set_global_float(k):
if type(k) is str:
_mandelbruh_GLOBAL_VARS[k] = 0.0
return k
def set_global_vec3(k):
if type(k) is str:
_mandelbruh_GLOBAL_VARS[k] = to_vec3((0,0,0))
return k
elif isinstance(k, (float, int)):
return to_vec3(k)
else:
sk = get_sub_keys(k)
for i in sk:
_mandelbruh_GLOBAL_VARS[i] = 0.0
return to_vec3(k)
def cond_offset(p):
if type(p) is str or np.count_nonzero(p) > 0:
return ' - vec4(' + vec3_str(p) + ', 0)'
return ''
def cond_subtract(p):
if type(p) is str or p > 0:
return ' - ' + float_str(p)
return ''
def make_color(geo):
if type(geo.color) is tuple or type(geo.color) is np.ndarray:
return 'vec4(' + vec3_str(geo.color) + ', ' + geo.glsl() + ')'
elif geo.color == 'orbit' or geo.color == 'o':
return 'vec4(orbit, ' + geo.glsl() + ')'
else:
raise Exception("Invalid coloring type")
_mandelbruh_GLOBAL_VARS = {}
| 21.68932 | 88 | 0.625783 | 427 | 2,234 | 3.159251 | 0.187354 | 0.053373 | 0.031134 | 0.026686 | 0.237213 | 0.15493 | 0.072646 | 0.050408 | 0.050408 | 0.050408 | 0 | 0.029067 | 0.199194 | 2,234 | 102 | 89 | 21.901961 | 0.724986 | 0 | 0 | 0.270588 | 0 | 0 | 0.036258 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.188235 | false | 0 | 0.011765 | 0.035294 | 0.588235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
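The `smin` function above is the polynomial smooth minimum familiar from distance-field rendering: it equals `min(a, b)` once the inputs differ by at least `k`, and blends smoothly (dipping slightly below the true minimum) inside that band. A quick standalone check, with illustrative values:

```python
def smin(a, b, k):
    """Polynomial smooth minimum; equals min(a, b) when |a - b| >= k."""
    h = min(max(0.5 + 0.5 * (b - a) / k, 0.0), 1.0)
    return b * (1 - h) + a * h - k * h * (1.0 - h)

print(smin(1.0, 5.0, 0.5))   # far apart -> exactly min: 1.0
print(smin(1.0, 1.0, 0.5))   # equal inputs dip below min: 0.875
```

At `a == b` the blend factor `h` is 0.5, so the result is `a - k/4`, which is the deepest the smoothed value falls below the hard minimum.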
bcf633a886ab43d9c3c7c35185345d3f776c81e3 | 4,899 | py | Python | src/promnesia/sources/telegram.py | halhenke/promnesia | 03f46b7e0740790ef091e6f48d0ac2e6bf05bcb7 | [
"MIT"
] | 1,327 | 2019-11-02T20:10:38.000Z | 2022-03-29T16:58:36.000Z | src/promnesia/sources/telegram.py | halhenke/promnesia | 03f46b7e0740790ef091e6f48d0ac2e6bf05bcb7 | [
"MIT"
] | 157 | 2019-09-06T11:16:40.000Z | 2022-03-27T20:01:52.000Z | src/promnesia/sources/telegram.py | halhenke/promnesia | 03f46b7e0740790ef091e6f48d0ac2e6bf05bcb7 | [
"MIT"
] | 60 | 2020-06-08T22:12:24.000Z | 2022-03-22T16:57:22.000Z | '''
Uses [[https://github.com/fabianonline/telegram_backup#readme][telegram_backup]] database for messages data
'''
from pathlib import Path
from textwrap import dedent
from typing import Optional, Union, TypeVar
from urllib.parse import unquote # TODO mm, make it easier to remember to use...
from ..common import PathIsh, Visit, get_logger, Loc, extract_urls, from_epoch, Results, echain
# TODO potentially, belongs to my. package
# TODO kython?
T = TypeVar("T")
def unwrap(res: Union[T, Exception]) -> T:
if isinstance(res, Exception):
raise res
else:
return res
# TODO move to common?
def dataset_readonly(db: Path):
import dataset # type: ignore
# see https://github.com/pudo/dataset/issues/136#issuecomment-128693122
import sqlite3
creator = lambda: sqlite3.connect(f'file:{db}?immutable=1', uri=True)
return dataset.connect('sqlite:///' , engine_kwargs={'creator': creator})
def index(database: PathIsh, *, http_only: bool=None) -> Results:
"""
:param database:
the path of the sqlite generated by the _telegram_backup_ java program
:param http_only:
when true, do not collect IP-addresses and `python.py` strings
"""
logger = get_logger()
path = Path(database)
assert path.is_file(), path # TODO could check is_file inside `dataset_readonly()`
def make_query(text_query: str):
extra_criteria = "AND (M.has_media == 1 OR text LIKE '%http%')" if http_only else ""
return dedent(
f"""
WITH entities AS (
SELECT 'dialog' as type
, id
, coalesce(username, id) as handle
, coalesce(first_name || " " || last_name
, username
, id
) as display_name FROM users
UNION
SELECT 'group' as type
, id
, id as handle
, coalesce(name, id) as display_name FROM chats
)
SELECT src.display_name AS chatname
, src.handle AS chat
, snd.display_name AS sender
, M.time AS time
, {text_query} AS text
, M.id AS mid
FROM messages AS M
/* chat types are 'dialog' (1-1), 'group' and 'supergroup' */
/* this is a bit of a hacky way to handle all groups in one go */
LEFT JOIN entities AS src ON M.source_id = src.id AND src.type = (CASE M.source_type WHEN 'supergroup' THEN 'group' ELSE M.source_type END)
LEFT JOIN entities AS snd ON M.sender_id = snd.id AND snd.type = 'dialog'
WHERE
M.message_type NOT IN ('service_message', 'empty_message')
{extra_criteria}
ORDER BY time;
""")
# TODO context manager?
with dataset_readonly(path) as db:
# TODO yield error if chatname or chat or smth else is null?
for row in db.query(make_query('M.text')):
try:
yield from _handle_row(row)
except Exception as ex:
yield echain(RuntimeError(f'While handling {row}'), ex)
# , None, sys.exc_info()[2]
# TODO hmm. traceback isn't preserved; wonder if that's because it's too heavy to attach to every single exception object..
# old (also 'stable') version doesn't have 'json' column yet...
if 'json' in db['messages'].columns:
for row in db.query(make_query("json_extract(json, '$.media.webpage.description')")):
try:
yield from _handle_row(row)
except Exception as ex:
yield echain(RuntimeError(f'While handling {row}'), ex)
def _handle_row(row) -> Results:
text = row['text']
if text is None:
return
urls = extract_urls(text)
if len(urls) == 0:
return
dt = from_epoch(row['time'])
mid: str = unwrap(row['mid'])
# TODO perhaps we could be defensive with null sender/chat etc and still emit the Visit
sender: str = unwrap(row['sender'])
chatname: str = unwrap(row['chatname'])
chat: str = unwrap(row['chat'])
in_context = f'https://t.me/{chat}/{mid}'
for u in urls:
# https://www.reddit.com/r/Telegram/comments/6ufwi3/link_to_a_specific_message_in_a_channel_possible/
# hmm, only seems to work on mobile app, but better than nothing...
yield Visit(
url=unquote(u),
dt=dt,
context=f"{sender}: {text}",
locator=Loc.make(
title=f"chat with {chatname}",
href=in_context,
),
)
| 37.684615 | 154 | 0.560931 | 606 | 4,899 | 4.437294 | 0.40429 | 0.007438 | 0.017851 | 0.013388 | 0.095203 | 0.081071 | 0.081071 | 0.063221 | 0.063221 | 0.063221 | 0 | 0.006813 | 0.340886 | 4,899 | 129 | 155 | 37.976744 | 0.825952 | 0.221678 | 0 | 0.146067 | 0 | 0.022472 | 0.464523 | 0.013553 | 0 | 0 | 0 | 0.015504 | 0.011236 | 1 | 0.05618 | false | 0 | 0.078652 | 0 | 0.191011 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
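`dataset_readonly()` above opens the backup database through a SQLite URI with `immutable=1`, so indexing never locks or mutates the file. The same trick works with the standard-library `sqlite3` module directly; the database built here is a throwaway stand-in for the telegram_backup file:

```python
import os
import sqlite3
import tempfile

# Build a small database to read from (stand-in for the real backup).
path = os.path.join(tempfile.mkdtemp(), "backup.db")
db = sqlite3.connect(path)
db.execute("CREATE TABLE messages (id INTEGER, text TEXT)")
db.execute("INSERT INTO messages VALUES (1, 'hello http://example.com')")
db.commit()
db.close()

# Read-only, immutable connection: no locks taken, writes rejected.
ro = sqlite3.connect(f"file:{path}?immutable=1", uri=True)
rows = ro.execute("SELECT text FROM messages").fetchall()
print(rows)  # [('hello http://example.com',)]
try:
    ro.execute("INSERT INTO messages VALUES (2, 'nope')")
except sqlite3.OperationalError as e:
    print("write refused:", e)
ro.close()
```

The `immutable=1` flag additionally tells SQLite the file cannot change while open, which is a safe assumption for an archived backup but would be wrong for a live database.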
bcf9e42ce187d88ea3d733bded3e343188bcd463 | 10,196 | py | Python | daproli/transformer.py | ermshaua/daproli | c1f7aeec431d9c60ae06eeac23455c1a03bc82cf | [
"BSD-3-Clause"
] | null | null | null | daproli/transformer.py | ermshaua/daproli | c1f7aeec431d9c60ae06eeac23455c1a03bc82cf | [
"BSD-3-Clause"
] | null | null | null | daproli/transformer.py | ermshaua/daproli | c1f7aeec431d9c60ae06eeac23455c1a03bc82cf | [
"BSD-3-Clause"
] | null | null | null | from joblib import Parallel, delayed
from tqdm import tqdm
from .processing import map, filter, split, expand, combine, join
from .manipulation import windowed, flatten
class BaseTransformer:
'''
The BaseTransformer defines a generic data transformation pattern that
can be implemented with a number of data processing concepts.
'''
def transform(self, data, *args, **kwargs):
raise NotImplementedError()
class Mapper(BaseTransformer):
def __init__(self, func, ret_type=None, expand_args=True, n_jobs=1, verbose=0, **kwargs):
'''
dp.Mapper is the respective transformer for dp.map.
Parameters
-----------
:param func: the mapping function
:param ret_type: if provided the used return type, otherwise ret_type(data)
:param expand_args: true if args should be expanded, false otherwise
:param n_jobs: amount of used threads/processes
:param verbose: verbosity level for tqdm / joblib
:param kwargs: additional arguments for joblib.Parallel, e.g. backend='loky'
'''
self.func = func
self.ret_type = ret_type
self.expand_args = expand_args
self.n_jobs = n_jobs
self.verbose = verbose
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
return map(self.func, data, self.ret_type, expand_args=self.expand_args, n_jobs=self.n_jobs,
verbose=self.verbose, **self.kwargs)
class Filter(BaseTransformer):
def __init__(self, pred, ret_type=None, expand_args=True, n_jobs=1, verbose=0, **kwargs):
'''
dp.Filter is the respective transformer for dp.filter.
Parameters
-----------
:param pred: the filter predicate
:param ret_type: if provided the used return type, otherwise ret_type(data)
:param expand_args: true if args should be expanded, false otherwise
:param n_jobs: amount of used threads/processes
:param verbose: verbosity level for tqdm / joblib
:param kwargs: additional arguments for joblib.Parallel, e.g. backend='loky'
'''
self.pred = pred
self.ret_type = ret_type
self.expand_args = expand_args
self.n_jobs = n_jobs
self.verbose = verbose
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
return filter(self.pred, data, ret_type=self.ret_type, expand_args=self.expand_args, n_jobs=self.n_jobs,
verbose=self.verbose, **self.kwargs)
class Splitter(BaseTransformer):
def __init__(self, func, ret_type=None, return_labels=False, expand_args=True, n_jobs=1, verbose=0, **kwargs):
'''
dp.Splitter is the respective transformer for dp.split.
Parameters
-----------
:param func: the discriminator function
:param ret_type: if provided the used return type, otherwise ret_type(data)
:param return_labels: true if the associated labels should be returned, false otherwise
:param expand_args: true if args should be expanded, false otherwise
:param n_jobs: amount of used threads/processes
:param verbose: verbosity level for tqdm / joblib
:param kwargs: additional arguments for joblib.Parallel, e.g. backend='loky'
'''
self.func = func
self.ret_type = ret_type
self.return_labels = return_labels
self.expand_args = expand_args
self.n_jobs = n_jobs
self.verbose = verbose
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
return split(self.func, data, ret_type=self.ret_type, return_labels=self.return_labels,
expand_args=self.expand_args, n_jobs=self.n_jobs, verbose=self.verbose, **self.kwargs)
class Expander(BaseTransformer):
def __init__(self, func, ret_type=None, expand_args=True, n_jobs=1, verbose=0, **kwargs):
'''
dp.Expander is the respective transformer for dp.expand.
Parameters
-----------
:param func: the expansion function
:param ret_type: if provided the used return type, otherwise ret_type(data)
:param expand_args: true if args should be expanded, false otherwise
:param n_jobs: amount of used threads/processes
:param verbose: verbosity level for tqdm / joblib
:param kwargs: additional arguments for joblib.Parallel, e.g. backend='loky'
'''
self.func = func
self.ret_type = ret_type
self.expand_args = expand_args
self.n_jobs = n_jobs
self.verbose = verbose
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
return expand(self.func, data, ret_type=self.ret_type, expand_args=self.expand_args, n_jobs=self.n_jobs,
verbose=self.verbose, **self.kwargs)
class Combiner(BaseTransformer):
def __init__(self, func, expand_args=True, n_jobs=1, verbose=0, **kwargs):
'''
dp.Combiner is the respective transformer for dp.combine.
Parameters
-----------
:param func: the combination function
:param expand_args: true if args should be expanded, false otherwise
:param n_jobs: amount of used threads/processes
:param verbose: verbosity level for tqdm / joblib
:param kwargs: additional arguments for joblib.Parallel, e.g. backend='loky'
'''
self.func = func
self.expand_args = expand_args
self.n_jobs = n_jobs
self.verbose = verbose
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
return combine(self.func, *data, expand_args=self.expand_args, n_jobs=self.n_jobs, verbose=self.verbose, **self.kwargs)
class Joiner(BaseTransformer):
def __init__(self, func, expand_args=True, n_jobs=1, verbose=0, **kwargs):
'''
dp.Joiner is the respective transformer for dp.join.
Parameters
-----------
:param func: the join function
:param expand_args: true if args should be expanded, false otherwise
:param n_jobs: amount of used threads/processes
:param verbose: verbosity level for tqdm / joblib
:param kwargs: additional arguments for joblib.Parallel, e.g. backend='loky'
'''
self.func = func
self.expand_args = expand_args
self.n_jobs = n_jobs
self.verbose = verbose
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
return join(self.func, *data, expand_args=self.expand_args, n_jobs=self.n_jobs, verbose=self.verbose, **self.kwargs)
class Manipulator(BaseTransformer):
def __init__(self, func, void=False, *args, **kwargs):
'''
dp.Manipulator is a transformer to manipulate the entire collection of data items.
Parameters
-----------
:param func: the manipulation function
:param void: if true the result is not returned
:param args: additional args for func
:param kwargs: additional kwargs for func
'''
self.func = func
self.void = void
self.args = args
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
res = self.func(data, *self.args, **self.kwargs)
return res if self.void is False else data
class Window(BaseTransformer):
def __init__(self, size, step=1, ret_type=None):
'''
dp.Window is the respective transformer for dp.windowed.
Parameters
-----------
:param size: the window size
:param step: the window step
:param ret_type: if provided the used return type, otherwise ret_type(data)
'''
self.size = size
self.step = step
self.ret_type = ret_type
def transform(self, data, *args, **kwargs):
return windowed(data, self.size, step=self.step, ret_type=self.ret_type)
class Flat(BaseTransformer):
def __init__(self, ret_type=None):
'''
dp.Flat is the respective transformer for dp.flatten.
Parameters
-----------
:param ret_type: if provided the used return type, otherwise ret_type(data)
'''
self.ret_type = ret_type
def transform(self, data, *args, **kwargs):
return flatten(data, ret_type=self.ret_type)
class Union(BaseTransformer):
def __init__(self, *transformers, n_jobs=1, verbose=0, **kwargs):
'''
dp.Union is a construct to manipulate multi-collections of data items.
Parameters
-----------
:param transformers: the transformers for the respective collections of data items
:param n_jobs: amount of used threads/processes
:param verbose: verbosity level for tqdm / joblib
:param kwargs: additional arguments for joblib.Parallel, e.g. backend='loky'
'''
self.transformers = transformers
self.n_jobs = n_jobs
self.verbose = verbose
self.kwargs = kwargs
def transform(self, data, *args, **kwargs):
if self.n_jobs == 1:
return [transformer.transform(items, *args, **kwargs)
for transformer, items in tqdm(zip(self.transformers, data), disable=self.verbose < 1)]
return Parallel(n_jobs=self.n_jobs, verbose=self.verbose, **self.kwargs)(delayed(transformer.transform)
(items, *args, **kwargs) for transformer, items in zip(self.transformers, data))
class Pipeline(BaseTransformer):
def __init__(self, *transformers, verbose=0):
'''
dp.Pipeline is a construct to pipe a collection of transformers.
Parameters
-----------
:param transformers: the transformer sequence to apply
:param verbose: verbosity level for tqdm
'''
self.transformers = list(transformers)
self.verbose = verbose
def transform(self, data, *args, **kwargs):
res = data
for transformer in tqdm(self.transformers, disable=self.verbose < 1):
res = transformer.transform(res, *args, **kwargs)
return res
| 36.028269 | 127 | 0.640643 | 1,277 | 10,196 | 4.981989 | 0.097103 | 0.033008 | 0.02122 | 0.037724 | 0.682804 | 0.666143 | 0.6094 | 0.600283 | 0.587865 | 0.556429 | 0 | 0.002518 | 0.260004 | 10,196 | 282 | 128 | 36.156028 | 0.840689 | 0.382601 | 0 | 0.535714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205357 | false | 0 | 0.035714 | 0.071429 | 0.455357 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
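All transformers above share a single contract, `transform(data, *args, **kwargs)`, which is what lets `dp.Pipeline` thread one stage's output into the next. A dependency-free sketch of that composition pattern (stripped of joblib/tqdm parallelism; the class bodies are illustrative, not daproli's API):

```python
class Mapper:
    def __init__(self, func):
        self.func = func
    def transform(self, data):
        return [self.func(x) for x in data]

class Filter:
    def __init__(self, pred):
        self.pred = pred
    def transform(self, data):
        return [x for x in data if self.pred(x)]

class Pipeline:
    def __init__(self, *transformers):
        self.transformers = transformers
    def transform(self, data):
        for t in self.transformers:      # feed each stage the last result
            data = t.transform(data)
        return data

pipe = Pipeline(Mapper(lambda x: x * x), Filter(lambda x: x % 2 == 0))
print(pipe.transform(range(6)))  # [0, 4, 16]
```

Because `Pipeline.transform` itself satisfies the same contract, pipelines nest inside other pipelines or inside a `Union` without special casing.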
bcfb7330e40f9b79f2ab184f143d401951828548 | 2,513 | py | Python | tacker/sol_refactored/common/vnf_instance_utils.py | h1r0mu/tacker | 8c69dda51fcfe215c4878a86b82018d2b96e5561 | [
"Apache-2.0"
] | 116 | 2015-10-18T02:57:08.000Z | 2022-03-15T04:09:18.000Z | tacker/sol_refactored/common/vnf_instance_utils.py | h1r0mu/tacker | 8c69dda51fcfe215c4878a86b82018d2b96e5561 | [
"Apache-2.0"
] | 6 | 2016-11-07T22:15:54.000Z | 2021-05-09T06:13:08.000Z | tacker/sol_refactored/common/vnf_instance_utils.py | h1r0mu/tacker | 8c69dda51fcfe215c4878a86b82018d2b96e5561 | [
"Apache-2.0"
] | 166 | 2015-10-20T15:31:52.000Z | 2021-11-12T08:39:49.000Z | # Copyright (C) 2021 Nippon Telegraph and Telephone Corporation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from oslo_log import log as logging
from tacker.sol_refactored.common import exceptions as sol_ex
from tacker.sol_refactored import objects
LOG = logging.getLogger(__name__) # not used at the moment
def get_inst(context, inst_id):
inst = objects.VnfInstanceV2.get_by_id(context, inst_id)
if inst is None:
raise sol_ex.VnfInstanceNotFound(inst_id=inst_id)
return inst
def get_inst_all(context):
return objects.VnfInstanceV2.get_all(context)
def inst_href(inst_id, endpoint):
return "{}/v2/vnflcm/vnf_instances/{}".format(endpoint, inst_id)
def make_inst_links(inst, endpoint):
links = objects.VnfInstanceV2_Links()
self_href = inst_href(inst.id, endpoint)
links.self = objects.Link(href=self_href)
if inst.instantiationState == 'NOT_INSTANTIATED':
links.instantiate = objects.Link(href=self_href + "/instantiate")
else: # 'INSTANTIATED'
links.terminate = objects.Link(href=self_href + "/terminate")
# TODO(oda-g): add when the operation supported
# links.scale = objects.Link(href = self_href + "/scale")
# etc.
return links
# see IETF RFC 7396
def json_merge_patch(target, patch):
if isinstance(patch, dict):
if not isinstance(target, dict):
target = {}
for key, value in patch.items():
if value is None:
if key in target:
del target[key]
else:
target[key] = json_merge_patch(target.get(key), value)
return target
else:
return patch
def select_vim_info(vim_connection_info):
# NOTE: It is assumed that vimConnectionInfo has only one item
# at the moment. If there are multiple items, it is uncertain
# which item is selected.
for vim_info in vim_connection_info.values():
return vim_info
| 32.217949 | 78 | 0.68842 | 345 | 2,513 | 4.884058 | 0.446377 | 0.024926 | 0.035608 | 0.045104 | 0.080712 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008264 | 0.229606 | 2,513 | 77 | 79 | 32.636364 | 0.862087 | 0.374055 | 0 | 0.078947 | 0 | 0 | 0.043254 | 0.018722 | 0 | 0 | 0 | 0.012987 | 0 | 1 | 0.157895 | false | 0 | 0.078947 | 0.052632 | 0.421053 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
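`json_merge_patch()` above implements RFC 7396: a dict patch merges key by key, a `None` value (JSON null) deletes the key, and any non-dict patch replaces the target wholesale. A compact restatement exercising all three rules (sample data invented; `dict.pop` stands in for the original's guarded `del`):

```python
def json_merge_patch(target, patch):
    if isinstance(patch, dict):
        if not isinstance(target, dict):
            target = {}
        for key, value in patch.items():
            if value is None:
                target.pop(key, None)    # null deletes the key
            else:
                target[key] = json_merge_patch(target.get(key), value)
        return target
    return patch                         # non-dict patch replaces outright

doc = {"a": 1, "b": {"x": 1, "y": 2}}
patch = {"a": None, "b": {"y": 3}, "c": [1, 2]}
print(json_merge_patch(doc, patch))  # {'b': {'x': 1, 'y': 3}, 'c': [1, 2]}
```

Note that lists are not merged element-wise: per the RFC, an array in the patch replaces the target value entirely, which the `"c"` key demonstrates.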
bcfe9ac7b39f229df7d1b4504478244ea3835c1b | 483 | py | Python | tests/migrations/0010_modeltest_datetime_field1.py | intellineers/django-bridger | ed097984a99df7da40a4d01bd00c56e3c6083056 | [
"BSD-3-Clause"
] | 2 | 2020-03-17T00:53:23.000Z | 2020-07-16T07:00:33.000Z | tests/migrations/0010_modeltest_datetime_field1.py | intellineers/django-bridger | ed097984a99df7da40a4d01bd00c56e3c6083056 | [
"BSD-3-Clause"
] | 76 | 2019-12-05T01:15:57.000Z | 2021-09-07T16:47:27.000Z | tests/migrations/0010_modeltest_datetime_field1.py | intellineers/django-bridger | ed097984a99df7da40a4d01bd00c56e3c6083056 | [
"BSD-3-Clause"
] | 1 | 2020-02-05T15:09:47.000Z | 2020-02-05T15:09:47.000Z | # Generated by Django 2.2.9 on 2020-01-28 14:50
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("tests", "0009_auto_20200113_1239"),
]
operations = [
migrations.AddField(
model_name="modeltest",
name="datetime_field1",
field=models.DateTimeField(default=django.utils.timezone.now),
preserve_default=False,
),
]
| 23 | 74 | 0.63354 | 52 | 483 | 5.769231 | 0.75 | 0.073333 | 0.126667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089636 | 0.26087 | 483 | 20 | 75 | 24.15 | 0.7507 | 0.093168 | 0 | 0 | 1 | 0 | 0.119266 | 0.052752 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4c09b0745eee677f40540659a3c584b6e7535d0a | 968 | py | Python | js/matrixjs/matrix_compile.py | kennytilton/ConnectJS | a16121052839b6f447718dccb008761d92094885 | [
"MIT"
] | 7 | 2017-07-31T20:28:33.000Z | 2020-11-23T13:18:20.000Z | js/matrixjs/matrix_compile.py | kennytilton/ConnectJS | a16121052839b6f447718dccb008761d92094885 | [
"MIT"
] | null | null | null | js/matrixjs/matrix_compile.py | kennytilton/ConnectJS | a16121052839b6f447718dccb008761d92094885 | [
"MIT"
] | 1 | 2020-02-26T06:09:33.000Z | 2020-02-26T06:09:33.000Z | #!/usr/bin/python2.4
import httplib, urllib, sys
# Define the parameters for the POST request and encode them in
# a URL-safe format.
params = urllib.urlencode([
#('js_code', sys.argv[1]),
('code_url', 'https://raw.githubusercontent.com/kennytilton/MatrixJS/master/js/matrixjs/js/Matrix/Cells.js'),
('code_url', 'https://raw.githubusercontent.com/kennytilton/MatrixJS/master/js/matrixjs/js/Matrix/Model.js'),
('code_url', 'https://raw.githubusercontent.com/kennytilton/MatrixJS/master/js/matrixjs/js/Tag.js'),
('compilation_level', 'ADVANCED_OPTIMIZATIONS'),
('output_format', 'text'),
('output_info', 'warnings'),
])
# Always use the following value for the Content-type header.
headers = { "Content-type": "application/x-www-form-urlencoded" }
conn = httplib.HTTPConnection('closure-compiler.appspot.com')
conn.request('POST', '/compile', params, headers)
response = conn.getresponse()
data = response.read()
print data
conn.close() | 38.72 | 113 | 0.72314 | 126 | 968 | 5.492063 | 0.571429 | 0.026012 | 0.052023 | 0.065029 | 0.33526 | 0.33526 | 0.33526 | 0.33526 | 0.33526 | 0.33526 | 0 | 0.003472 | 0.107438 | 968 | 25 | 114 | 38.72 | 0.797454 | 0.191116 | 0 | 0 | 0 | 0.1875 | 0.578947 | 0.106547 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.0625 | null | null | 0.0625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
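The matrix_compile.py script above is Python 2 (httplib/urllib, print statement). A minimal Python 3 sketch of the same request construction — the endpoint form fields are taken from the script, the URLs and function name are illustrative:

```python
from urllib.parse import urlencode

def build_compile_request(code_urls):
    """Build the form body and headers for a Closure Compiler API POST."""
    params = [("code_url", u) for u in code_urls]
    params += [
        ("compilation_level", "ADVANCED_OPTIMIZATIONS"),
        ("output_format", "text"),
        ("output_info", "warnings"),
    ]
    # The API expects a classic form-encoded body.
    headers = {"Content-type": "application/x-www-form-urlencoded"}
    return urlencode(params), headers

body, headers = build_compile_request(
    ["https://example.com/Cells.js", "https://example.com/Model.js"])
```

Sending it would then use `http.client.HTTPConnection('closure-compiler.appspot.com')` and `conn.request('POST', '/compile', body, headers)`, mirroring the original.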
4c0acd1ef9075a9d08118479182c8461e04d6e01 | 3,844 | py | Python | texts.py | ProtKsen/pgame | c4455c6c07eaf275f9fcfa661cd6933ee7b1ff92 | [
"MIT"
] | 2 | 2021-04-14T09:49:27.000Z | 2022-03-08T17:26:49.000Z | texts.py | ProtKsen/pgame | c4455c6c07eaf275f9fcfa661cd6933ee7b1ff92 | [
"MIT"
] | null | null | null | texts.py | ProtKsen/pgame | c4455c6c07eaf275f9fcfa661cd6933ee7b1ff92 | [
"MIT"
] | 2 | 2021-01-11T12:09:26.000Z | 2021-04-14T09:49:45.000Z | """Text parts."""
SEPARATOR = '----------------------------------'
CONT_GAME = 'enter для продолжения игры'
GREETING = 'Добро пожаловать в игру "Сундук сокровищ"!\n' \
'Попробуй себя в роли капитана корабля, собери ' \
'команду и достань все сокровища!'
NAME_QUESTION = 'Как тебя зовут?'
CHOOSE_LEVEL = 'Выбери уровень сложности, он влияет на стоимость ' \
'сокровищ на островах. \n' \
'1 - легко \n' \
'2 - средне \n' \
'3 - тяжело'
INTRODUCTION = 'В наследство от дядюшки тебе достался корабль, \n' \
'несколько золотых монет и карта, на которой \n' \
'отмечены 10 островов. На каждом из островов \n' \
'зарыт клад. Но для того, чтобы достать его, \n' \
'необходимо обезвредить ловушку. Чем больше \n' \
'порядковый номер острова, тем ценнее хранящееся \n' \
'на нем сокровище и тем труднее его получить. \n\n' \
'Цель игры - добыть все сокровища и скопить как можно больше монет. \n\n' \
'Команда твоего корабля сможет обезвредить ловушку, \n' \
'только если будет иметь нужное количество очков \n' \
'логики, силы и ловкости. \n\n' \
'!!! Сумма всех требуемых очков равна номеру острова,\n' \
'но точная комбинация тебе неизвестна. !!!'
ORACLE_QUESTION = 'Здесь неподалеку живет известный оракул. За определенную\n' \
'плату он сможет предсказать с какой ловушкой\n' \
'ты столкнешься на острове. Пойдешь ли ты к нему?\n' \
'----------------------------------\n'\
'1 - да, пойду\n' \
'2 - нет, сам разберусь'
ORACLE_QUESTION_1 = 'Что ты хочешь узнать у оракула? \n' \
'----------------------------------\n'\
'1 - я передумал, буду сам себе оракул! \n'\
'2 - сколько очков логики должно быть у команды? (1 монета) \n'\
'3 - сколько очков силы должно быть у команды? (1 монета) \n'\
'4 - сколько очков ловкости должно быть у команды? (1 монета) \n'\
'5 - узнать все требуемые характеристики (3 монеты)'
ORACLE_QUESTION_2 = 'Что ты хочешь узнать у оракула? \n' \
'----------------------------------\n'\
'1 - я передумал, буду сам себе оракул! \n'\
'2 - сколько очков логики должно быть у команды? (1 монета) \n'\
'3 - сколько очков силы должно быть у команды? (1 монета) \n'\
'4 - сколько очков ловкости должно быть у команды? (1 монета)'
GO_TAVERN_TEXT = 'Отлично! Для похода на остров тебе понадобится \n' \
'команда, а нанять ее ты сможешь в таверне.'
EXIT_QUESTION = 'Продолжить игру?\n' \
'----------------------------------\n'\
'1 - да\n' \
'2 - нет'
SUCCESS_STEP = 'Поздравляю! Ты смог достать спрятанное сокровище! \n' \
'Самое время готовиться к следующему походу.'
FAILURE_STEP = 'К сожалению, ты не смог достать сокровище. \n' \
'Если у тебя еще остались монеты, то можешь \n' \
'попробовать организовать поход заново. Удачи!'
WINNING = 'Поздравляю! Ты собрал сокровища со всех окрестных \n' \
'островов, можешь выкинуть ненужную теперь карту) \n' \
'Конец игры.'
LOSING = 'Сожалею, ты потратил все деньги. Карьера пиратского \n' \
'капитана подошла к концу. А дядюшка в тебя верил! \n' \
'Конец игры.'
NAMES = ['Боб', 'Ричард', 'Алан', 'Степан', 'Грозный Глаз', 'Гарри',
'Моррис', 'Джек', 'Алекс', 'Сэм', 'Том', 'Янис', 'Геральт',
'Ринсвинд', 'Купер', 'Борис', 'Джон', 'Рон']
| 48.05 | 90 | 0.533299 | 434 | 3,844 | 4.693548 | 0.532258 | 0.006873 | 0.032401 | 0.053019 | 0.18704 | 0.18704 | 0.18704 | 0.186549 | 0.186549 | 0.186549 | 0 | 0.010349 | 0.32128 | 3,844 | 79 | 91 | 48.658228 | 0.77041 | 0.002862 | 0 | 0.190476 | 0 | 0 | 0.655344 | 0.046512 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4c0b4ff66b7c9992658d93e432fdd2bd5452694f | 3,587 | py | Python | api/migrations/0001_initial.py | alerin345/Instagram-React | 25dfbcbff2a2d050e4f2804a74cd7c901cd2cb66 | [
"MIT"
] | null | null | null | api/migrations/0001_initial.py | alerin345/Instagram-React | 25dfbcbff2a2d050e4f2804a74cd7c901cd2cb66 | [
"MIT"
] | null | null | null | api/migrations/0001_initial.py | alerin345/Instagram-React | 25dfbcbff2a2d050e4f2804a74cd7c901cd2cb66 | [
"MIT"
] | null | null | null | # Generated by Django 3.1.3 on 2021-01-07 00:42
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='Image',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('picture', models.ImageField(null=True, upload_to='')),
('description', models.TextField(blank=True, default='')),
('likes', models.IntegerField(default=0)),
('comments', models.IntegerField(default=0)),
('date', models.DateTimeField(default=django.utils.timezone.now)),
('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Subscription',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='user', to=settings.AUTH_USER_MODEL)),
('userSubscribed', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='userSubscribed', to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Profile',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('picture', models.ImageField(blank=True, default='default.png', null=True, upload_to='')),
('description', models.TextField(blank=True, default='')),
('user', models.OneToOneField(null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Like',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('image', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='api.image')),
('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.CreateModel(
name='Comment',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('value', models.TextField()),
('date', models.DateTimeField(default=django.utils.timezone.now)),
('image', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='api.image')),
('user', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
migrations.AddConstraint(
model_name='subscription',
constraint=models.UniqueConstraint(fields=('user', 'userSubscribed'), name='unique subscribes'),
),
migrations.AddConstraint(
model_name='like',
constraint=models.UniqueConstraint(fields=('image', 'user'), name='unique likes'),
),
]
| 48.472973 | 170 | 0.611096 | 368 | 3,587 | 5.836957 | 0.214674 | 0.037244 | 0.058659 | 0.092179 | 0.674581 | 0.663873 | 0.663873 | 0.663873 | 0.59311 | 0.59311 | 0 | 0.006287 | 0.246167 | 3,587 | 73 | 171 | 49.136986 | 0.788092 | 0.012545 | 0 | 0.575758 | 1 | 0 | 0.079096 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.060606 | 0 | 0.121212 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4c10bad25f060a9091150e60b188f22ceaae17b0 | 14,801 | py | Python | PythonServer/UnitTestCasesForWebSocket.py | Cyberlightning/2D-3DCapture | e5fdcec4f25358fc1964068180e4e774f45daa8a | [
"Apache-2.0"
] | 2 | 2015-11-04T10:21:48.000Z | 2016-03-07T15:14:35.000Z | 2D-3D-Capture/PythonServer/UnitTestCasesForWebSocket.py | Cyberlightning/Cyber-WeX | 11dc560b7a30eb31c1dfa18196f6a0760648f9a7 | [
"Apache-2.0"
] | null | null | null | 2D-3D-Capture/PythonServer/UnitTestCasesForWebSocket.py | Cyberlightning/Cyber-WeX | 11dc560b7a30eb31c1dfa18196f6a0760648f9a7 | [
"Apache-2.0"
] | null | null | null | '''
Created on Mar 6, 2014
@author: tharanga
'''
import unittest
from time import sleep
import EventService as es
from EventService import WebSocketServer as ws
from EventService import EventManager as em
import socket
from base64 import b64encode
import struct
import MySQLdb
import json
import EventService
import flaskr
import tempfile
def encodeMessage( message):
message = b64encode(message)
b1 =0x80 | 0x1 & 0x0f
b2 = 0
header=""
payload_len = len(message)
if payload_len < 126 :
header = struct.pack('>BB', b1, payload_len)
message= header +message
elif (payload_len < ((2 ** 16) - 1)):
b2 |= 126
header += chr(b1)
header += chr(b2)
l = struct.pack(">H", payload_len)
header += l
message = header +message
else:
b2 |= 127
header += chr(b1)
header += chr(b2)
l = struct.pack(">Q", payload_len)
header += l
message = header +message
return message
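encodeMessage above writes a text-frame header (FIN + opcode 0x1, i.e. first byte 0x81) followed by the base64 payload. For reference, a Python 3 sketch of parsing that length encoding back out — mask handling is omitted, and note that RFC 6455 actually requires client-to-server frames to be masked, which this toy client does not do:

```python
import struct

def parse_frame_length(frame: bytes):
    """Return (payload_len, header_len) for a frame built like encodeMessage."""
    b2 = frame[1] & 0x7F                # low 7 bits: 0-125, 126 or 127
    if b2 < 126:
        return b2, 2
    if b2 == 126:                       # next 2 bytes carry a 16-bit length
        return struct.unpack(">H", frame[2:4])[0], 4
    return struct.unpack(">Q", frame[2:10])[0], 10  # 64-bit length
```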
class TestWebSockets(unittest.TestCase):
def setUp(self):
self.wsServer = ws('',12345,'127.0.0.1')
self.wsServer.setRunning(True);
sleep(1)
self.testsocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.testsocket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1) # Create a socket object
host = 'localhost' # Get local machine name
port = 12345
self.testsocket.connect((host, port))
def tearDown(self):
self.wsServer.closeConnection();
self.testsocket.close()
sleep(1)
def test_webSocketServerOBject(self):
self.assertEqual(self.wsServer.SERVER, '', "Server set to the desired value")
self.assertEqual(self.wsServer.PORT, 12345, "Server port is set correctly")
self.assertEqual(self.wsServer.LOCALHOST, "127.0.0.1", "Localhost set to 127.0.0.1")
def test_invalid_Request(self):
message= "Test Message"
self.testsocket.send(message)
data = repr(self.testsocket.recv(1024))
#print 'Response to invalid message<TestMessage> %s'%(data)
self.assertEqual(data, '\'CONNECTION_REJECTED\'', "Invalid Message rejected")
def test_valid_WS_Request(self):
message = "GET /mychat HTTP/1.1\nHost: server.example.com\nUpgrade: websocket\nConnection: Upgrade\nSec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\nSec-WebSocket-Protocol: chat\nSec-WebSocket-Version: 13\nOrigin: localhost\n\n"
# message = "Test message"
self.testsocket.sendall(message)
wsresponse = repr(self.testsocket.recv(1024))
#print 'Response to valid ws request %s'%wsresponse
self.assertNotEqual(wsresponse, '\'CONNECTION_REJECTED\'', "Connection is not rejected")
self.assertIsNotNone(wsresponse, "Connection Response is not Empty")
self.testsocket.sendall(("Test Message"))
data = repr(self.testsocket.recv(1024))
#print 'Response to un encoded Request %s'%(data)
        self.assertEqual(data, "\'Un expected opcode\'", "Invalid message rejected")
def test_invalid_Messge(self):
message = "GET /mychat HTTP/1.1\nHost: server.example.com\nUpgrade: websocket\nConnection: Upgrade\nSec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\nSec-WebSocket-Protocol: chat\nSec-WebSocket-Version: 13\nOrigin: localhost\n\n"
self.testsocket.sendall(message)
wsresponse = repr(self.testsocket.recv(1024))
sleep(1)
self.testsocket.sendall("Test Message")
data = repr(self.testsocket.recv(1024))
        self.assertEqual(data, "\'Un expected opcode\'", "Invalid message rejected")
def test_malformed_Message(self):
message = "GET /mychat HTTP/1.1\nHost: server.example.com\nUpgrade: websocket\nConnection: Upgrade\nSec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\nSec-WebSocket-Protocol: chat\nSec-WebSocket-Version: 13\nOrigin: localhost\n\n"
self.testsocket.sendall(message)
wsresponse = repr(self.testsocket.recv(1024))
# print wsresponse
self.testsocket.send(encodeMessage("Test Message"))#This line seems to get stuck at times. Solution is to use sendAll, use \n at the end
data = repr(self.testsocket.recv(1024))
self.assertEqual(data, "\'MISFORMATED MESSAGE\'", "Messages with out a type is rejected")
def test_wellformed_Message_for_Text(self):
message = "GET /mychat HTTP/1.1\nHost: server.example.com\nUpgrade: websocket\nConnection: Upgrade\nSec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\nSec-WebSocket-Protocol: chat\nSec-WebSocket-Version: 13\nOrigin: localhost\n\n"
self.testsocket.sendall(message)
wsresponse = repr(self.testsocket.recv(1024))
# print wsresponse
self.testsocket.send(encodeMessage("1<---->Test Message"))#This line seems to get stuck at times. Solution is to use sendAll, use \n at the end
data = repr(self.testsocket.recv(1024))
print data
self.assertEqual(data, "\'Text received\'", "Text Messages is identified and accepted")
def test_wellformed_Message_for_Json(self):
message = "GET /mychat HTTP/1.1\nHost: server.example.com\nUpgrade: websocket\nConnection: Upgrade\nSec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\nSec-WebSocket-Protocol: chat\nSec-WebSocket-Version: 13\nOrigin: localhost\n\n"
self.testsocket.sendall(message)
wsresponse = repr(self.testsocket.recv(1024))
self.testsocket.send(encodeMessage("2<---->Test Message"))#This line seems to get stuck at times. Solution is to use sendAll, use \n at the end
data = repr(self.testsocket.recv(1024))
# print data
self.assertEqual(data, "\'json is received\'", "json Messages is identified and accepted")
## TO RUN THE FOLLOWING UNIT TESTS IT IS EXPECTED TO HAVE THE DATABASE
## CREATED. A DATABASE SCRIPT IS PROVIDED TO CREATE THE NECESSARY DATABASES AND TABLES.
## ASSOCIATED DATA IS NOT PROVIDED.
class TestDatabase(unittest.TestCase):
def setUp(self):
self.connection = es.dbConnect()
def tearDown(self):
self.connection.close()
def test_data_insert_data_Read(self):
self.assertIsInstance(self.connection, MySQLdb.connection, "Database connection accurately set")
jsondata ={"type":"image", "time":"2014.3.4_14.40.30", "ext":"png", "deviceType":"Mobile", "deviceOS":"Badda", "browsertype":"Firefox", "position":{"lon":25.4583105, "lat":65.0600797, "alt":-1000, "acc":48.38800048828125}, "device":{"ax":0, "ay":0, "az":0, "gx":0, "gy":0, "gz":0, "ra":210.5637, "rb":47.5657, "rg":6.9698, "orientation":"potrait"}, "vwidth":480, "vheight":800}
alt = str(jsondata["position"]["alt"]);
if alt=="None":
alt = '0'
heading = '0'
speed = '0'
width = jsondata["vwidth"]
height =jsondata["vheight"]
        if width > height:
            screenorientation = 1.00  # landscape
        else:
            screenorientation = 0.00  # portrait
filename = jsondata["type"]+"_"+jsondata["time"]+"."+jsondata["ext"]
sqlstring1 = "INSERT INTO Imagedata values (\'"+filename+"\',GeomFromText ('POINT("+ str(jsondata["position"]["lat"])+" "+str(jsondata["position"]["lon"])+")'),"+str(jsondata["position"]["alt"])+","+str(jsondata["position"]["acc"])
sqlstring2 =","+str(jsondata["device"]["gx"])+","+str(jsondata["device"]["gy"])+","+str(jsondata["device"]["gz"])
sqlstring3 = ","+str(jsondata["device"]["ra"])+","+str(jsondata["device"]["rb"])+","+str(jsondata["device"]["rg"])+","+str(screenorientation)+",\'"+jsondata["device"]["orientation"]+"\',now(),\'"+str(jsondata["deviceOS"])+"\',\'"+str(jsondata["browsertype"])+"\',\'"+str(jsondata["deviceType"])+"\');"
sqlstring = sqlstring1 + sqlstring2+ sqlstring3
#print(sqlstring)
es.dbInsert(sqlstring)
sqlreadsting = 'select imagename, Browser,devicetype,X(location) as latitude, Y(location) as longitude from Imagedata where time=\'2014.3.4_14.40.31\''
result = es.dbRead(sqlreadsting)
self.assertIsNotNone(result, "Inserted data is retrieved and it is not null")
for row in result:
self.assertEqual(row[0], "image_2014.3.4_14.40.30.png", "Image name is correctly set and saved")
            self.assertEqual(row[3], 65.0600797, "Latitudes are saved")
            self.assertEqual(row[4], 25.4583105, "Longitudes are saved")
HOST = '127.0.0.1' # The remote host
PORT = 17322
class RestServerTestCase(unittest.TestCase):
def setUp(self):
self.db_fd, flaskr.app.config['DATABASE'] = tempfile.mkstemp()
EventService.app.config['TESTING'] = True
self.app = EventService.app.test_client()
flaskr.init_db()
#self.socketServer = self.app.WebSocketServer('',wsport,'127.0.0.1')
def test_rootpath(self):
rv = self.app.get('/')
assert 'This is a REST Service for 2D3DCapture Server.' in rv.data
def test_post_image(self):
rv = self.app.post('/postImage')
assert 'READY' in rv.data
def test_clossing_websocket(self):
rv =self.app.post('/closewebsocketserver')
        assert 'CLOSED' in rv.data or 'ALREADY_CLOSSED' in rv.data
def test_start_websocket(self):
rv =self.app.get('/startwebsocketserver')
# print rv.data
assert 'READY' in rv.data
def test_post_binary_image(self):
rv =self.app.post('/postBinaryImage')
        assert 'READY' in rv.data or '415 Unsupported Media Type' in rv.data
def test_get_All_Image_Data(self):
rv =self.app.get('/getAllImageData')
jsonmsg = json.loads(rv.data)
self.assertIsNotNone(jsonmsg['imageList'] , "getImageData returns a non None list")
def test_get_location_Image_Data(self):
rv =self.app.get('/getLocationImageData?lat=65.0600797&lon=25.4583105')
jsonmsg = json.loads(rv.data)
self.assertIsNotNone(jsonmsg['imageList'] , "getLocationImageData returns a non None list.This is a feature test for location based image data")
def test_closest_Image_retrieval(self):
jsondata1 ={"type":"image", "time":"2014.3.4_14.40.31", "ext":"png", "deviceType":"Mobile", "deviceOS":"Badda", "browsertype":"Firefox", "position":{"lon":25.4583105, "lat":65.0600797, "alt":-1000, "acc":48.38800048828125}, "device":{"ax":0, "ay":0, "az":0, "gx":0, "gy":0, "gz":0, "ra":210.5637, "rb":47.5657, "rg":6.9698, "orientation":"potrait"}, "vwidth":480, "vheight":800}
jsondata2 ={"type":"image", "time":"2014.3.4_14.40.32", "ext":"png", "deviceType":"Mobile", "deviceOS":"Badda", "browsertype":"Firefox", "position":{"lon":25.4582115, "lat":65.0600797, "alt":-1000, "acc":48.38800048828125}, "device":{"ax":0, "ay":0, "az":0, "gx":0, "gy":0, "gz":0, "ra":210.5637, "rb":47.5657, "rg":6.9698, "orientation":"potrait"}, "vwidth":480, "vheight":800}
jsondata3 ={"type":"image", "time":"2014.3.4_14.40.33", "ext":"png", "deviceType":"Mobile", "deviceOS":"Badda", "browsertype":"Firefox", "position":{"lon":25.4584104, "lat":65.0600797, "alt":-1000, "acc":48.38800048828125}, "device":{"ax":0, "ay":0, "az":0, "gx":0, "gy":0, "gz":0, "ra":210.5637, "rb":47.5657, "rg":6.9698, "orientation":"potrait"}, "vwidth":480, "vheight":800}
jsondata4 ={"type":"image", "time":"2014.3.4_14.40.34", "ext":"png", "deviceType":"Mobile", "deviceOS":"Badda", "browsertype":"Firefox", "position":{"lon":25.4586115, "lat":65.0600797, "alt":-1000, "acc":48.38800048828125}, "device":{"ax":0, "ay":0, "az":0, "gx":0, "gy":0, "gz":0, "ra":210.5637, "rb":47.5657, "rg":6.9698, "orientation":"potrait"}, "vwidth":480, "vheight":800}
jsondata5 ={"type":"image", "time":"2014.3.4_14.40.35", "ext":"png", "deviceType":"Mobile", "deviceOS":"Badda", "browsertype":"Firefox", "position":{"lon":25.4587125, "lat":65.0600797, "alt":-1000, "acc":48.38800048828125}, "device":{"ax":0, "ay":0, "az":0, "gx":0, "gy":0, "gz":0, "ra":210.5637, "rb":47.5657, "rg":6.9698, "orientation":"potrait"}, "vwidth":480, "vheight":800}
jsondata6 ={"type":"image", "time":"2014.3.4_14.40.36", "ext":"png", "deviceType":"Mobile", "deviceOS":"Badda", "browsertype":"Firefox", "position":{"lon":25.4588125, "lat":65.0600797, "alt":-1000, "acc":48.38800048828125}, "device":{"ax":0, "ay":0, "az":0, "gx":0, "gy":0, "gz":0, "ra":210.5637, "rb":47.5657, "rg":6.9698, "orientation":"potrait"}, "vwidth":480, "vheight":800}
es.saveData(jsondata1)
es.saveData(jsondata2)
es.saveData(jsondata3)
es.saveData(jsondata4)
es.saveData(jsondata5)
es.saveData(jsondata6)
radius = 0.0001
photoList = es.getClosestImages( 65.0601787, 25.4583107, radius )
self.assertEqual(len(photoList), 4, "Length of the list should be equal of the first test")
for row in photoList:
            assert 'image_2014.3.4_14.40.32.png' in row[0] or 'image_2014.3.4_14.40.31.png' in row[0]
photoList2 = es.getClosestImages( 65.0601787, 25.4587107, radius )
self.assertEqual(len(photoList2), 2, "Length of the list should be equal of the second test")
for row in photoList2:
            assert 'image_2014.3.4_14.40.34.png' in row[0] or 'image_2014.3.4_14.40.35.png' in row[0]
def suite():
testsuit =unittest.TestSuite()
testsuit.addTest(TestWebSockets('test_webSocketServerOBject'))
testsuit.addTest(TestWebSockets('test_valid_WS_Request'))
testsuit.addTest(TestWebSockets('test_invalid_Messge'))
testsuit.addTest(TestWebSockets('test_invalid_Request'))
testsuit.addTest(TestWebSockets('test_malformed_Message'))
testsuit.addTest(TestWebSockets('test_wellformed_Message_for_Text'))
testsuit.addTest(TestWebSockets('test_wellformed_Message_for_Json'))
testsuit.addTest(TestDatabase('test_data_insert_data_Read'))
testsuit.addTest(RestServerTestCase('test_rootpath'))
testsuit.addTest(RestServerTestCase('test_post_image'))
testsuit.addTest(RestServerTestCase('test_start_websocket'))
testsuit.addTest(RestServerTestCase('test_clossing_websocket'))
testsuit.addTest(RestServerTestCase('test_post_binary_image'))
testsuit.addTest(RestServerTestCase('test_get_All_Image_Data'))
testsuit.addTest(RestServerTestCase('test_closest_Image_retrieval'))
return testsuit
suite = suite()
runner = unittest.TextTestRunner(verbosity=3)
runner.run(suite)
# if __name__ == "__main__":
# #import sys;sys.argv = ['', 'Test.testName']
# unittest.main() | 57.368217 | 386 | 0.646375 | 1,841 | 14,801 | 5.128734 | 0.197175 | 0.038551 | 0.008261 | 0.011015 | 0.553908 | 0.46738 | 0.443444 | 0.407223 | 0.384876 | 0.35681 | 0 | 0.077514 | 0.191136 | 14,801 | 258 | 387 | 57.368217 | 0.711159 | 0.066347 | 0 | 0.22439 | 0 | 0.029268 | 0.311163 | 0.093363 | 0 | 0 | 0.0008 | 0 | 0.131707 | 0 | null | null | 0 | 0.063415 | null | null | 0.004878 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
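test_closest_Image_retrieval above relies on es.getClosestImages(lat, lon, radius). The real query lives in EventService; below is a standalone sketch of the idea only — filtering rows by planar distance in degree space, which is an assumption (the actual implementation may use a spatial SQL predicate instead):

```python
import math

def closest_images(rows, lat, lon, radius):
    """rows: iterable of (imagename, lat, lon); keep rows within `radius`."""
    return [row for row in rows
            if math.hypot(row[1] - lat, row[2] - lon) <= radius]

rows = [("a.png", 65.0600797, 25.4583105), ("b.png", 66.0, 25.0)]
near = closest_images(rows, 65.0601, 25.4583, 0.001)
```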
4c1514125da0b6d26946b5990ca8e3d69b019fd3 | 1,369 | py | Python | tests/core/feature_extraction/test_galaxyProcessor.py | EmilioCC/gti770-student-framework | 3cd72da8fe78c7ecfc26c9e688cbe1b7deee353a | [
"MIT"
] | null | null | null | tests/core/feature_extraction/test_galaxyProcessor.py | EmilioCC/gti770-student-framework | 3cd72da8fe78c7ecfc26c9e688cbe1b7deee353a | [
"MIT"
] | null | null | null | tests/core/feature_extraction/test_galaxyProcessor.py | EmilioCC/gti770-student-framework | 3cd72da8fe78c7ecfc26c9e688cbe1b7deee353a | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import numpy as np
from unittest import TestCase
from core.feature_extraction.galaxy.galaxy_processor import GalaxyProcessor
from commons.helpers.dataset.strategies.galaxy_dataset.label_strategy import GalaxyDataSetLabelStrategy
from commons.helpers.dataset.context import Context
class TestGalaxyProcessor(TestCase):
def setUp(self):
validation_size = 0.2
# Get the ground truth CSV file from script's parameters.
self.galaxy_csv_file = os.environ["VIRTUAL_ENV"] + "/data/csv/galaxy/galaxy.csv"
self.galaxy_images_path = os.environ["VIRTUAL_ENV"] + "/data/images/"
# Create instance of data set loading strategies.
galaxy_label_data_set_strategy = GalaxyDataSetLabelStrategy()
# Set the context to galaxy label data set loading strategy.
context = Context(galaxy_label_data_set_strategy)
context.set_strategy(galaxy_label_data_set_strategy)
self.label_dataset = context.load_dataset(csv_file=self.galaxy_csv_file, one_hot=False,
validation_size=np.float32(validation_size))
def testGalaxyProcessor(self):
# Process galaxies.
galaxy_processor = GalaxyProcessor(self.galaxy_images_path)
#features = galaxy_processor.process_galaxy(self.label_dataset) | 42.78125 | 103 | 0.731921 | 166 | 1,369 | 5.807229 | 0.39759 | 0.036307 | 0.062241 | 0.074689 | 0.128631 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004513 | 0.19065 | 1,369 | 32 | 104 | 42.78125 | 0.865523 | 0.208181 | 0 | 0 | 0 | 0 | 0.057514 | 0.025046 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
4c155f18a1b1670b63f094d3de08857496d9f8be | 1,630 | py | Python | gmso/formats/formats_registry.py | chrisiacovella/gmso | c78e2425ccb98ea952f024a569346d36045f6918 | [
"MIT"
] | 20 | 2020-02-28T21:47:54.000Z | 2022-02-14T20:13:56.000Z | gmso/formats/formats_registry.py | chrisiacovella/gmso | c78e2425ccb98ea952f024a569346d36045f6918 | [
"MIT"
] | 364 | 2020-03-02T16:11:57.000Z | 2022-03-29T00:57:00.000Z | gmso/formats/formats_registry.py | chrisiacovella/gmso | c78e2425ccb98ea952f024a569346d36045f6918 | [
"MIT"
] | 28 | 2020-02-28T21:12:30.000Z | 2022-01-31T21:02:30.000Z | """Registry utilities to handle formats for gmso Topology."""
class UnsupportedFileFormatError(Exception):
"""Exception to be raised whenever the file loading or saving is not supported."""
class Registry:
"""A registry to incorporate a callable with a file extension."""
def __init__(self):
self.handlers = {}
def _assert_can_process(self, extension):
if extension not in self.handlers:
raise UnsupportedFileFormatError(
f"Extension {extension} cannot be processed as no utility "
f"is defined in the current API to handle {extension} files."
)
def get_callable(self, extension):
"""Get the callable associated with extension."""
self._assert_can_process(extension)
return self.handlers[extension]
SaversRegistry = Registry()
LoadersRegistry = Registry()
class saves_as:
"""Decorator to aid saving."""
def __init__(self, *extensions):
extension_set = set(extensions)
self.extensions = extension_set
def __call__(self, method):
"""Register the method as saver for an extension."""
for ext in self.extensions:
SaversRegistry.handlers[ext] = method
return method
class loads_as:
"""Decorator to aid loading."""
def __init__(self, *extensions):
extension_set = set(extensions)
self.extensions = extension_set
def __call__(self, method):
"""Register the method as loader for an extension."""
for ext in self.extensions:
LoadersRegistry.handlers[ext] = method
return method
| 28.596491 | 86 | 0.655828 | 183 | 1,630 | 5.661202 | 0.355191 | 0.081081 | 0.088803 | 0.100386 | 0.333977 | 0.277992 | 0.277992 | 0.277992 | 0.208494 | 0.208494 | 0 | 0 | 0.257055 | 1,630 | 56 | 87 | 29.107143 | 0.855491 | 0.234356 | 0 | 0.387097 | 0 | 0 | 0.094449 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 1 | 0.225806 | false | 0 | 0 | 0 | 0.451613 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
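The saves_as/loads_as decorators in formats_registry.py above populate module-level Registry objects keyed by file extension. A self-contained usage sketch of the same pattern (re-declared here rather than imported from gmso.formats; the write_json handler is hypothetical):

```python
class Registry:
    """Map file extensions to handler callables."""
    def __init__(self):
        self.handlers = {}

SaversRegistry = Registry()

class saves_as:
    """Decorator registering a function as the saver for given extensions."""
    def __init__(self, *extensions):
        self.extensions = set(extensions)

    def __call__(self, method):
        for ext in self.extensions:
            SaversRegistry.handlers[ext] = method
        return method

@saves_as(".json", ".jsn")
def write_json(topology, filename):
    return "saved %s" % filename

# Dispatch by extension, as a save() entry point presumably does:
saver = SaversRegistry.handlers[".json"]
```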
4c193e499f0f1632e4dcf16c607003de7e5c3eaa | 14,091 | py | Python | docker/docker-puppet.py | mail2nsrajesh/tripleo-heat-templates | 368b3eadda577f9914d181893df2df96367e8fad | [
"Apache-2.0"
] | null | null | null | docker/docker-puppet.py | mail2nsrajesh/tripleo-heat-templates | 368b3eadda577f9914d181893df2df96367e8fad | [
"Apache-2.0"
] | null | null | null | docker/docker-puppet.py | mail2nsrajesh/tripleo-heat-templates | 368b3eadda577f9914d181893df2df96367e8fad | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# Shell script tool to run puppet inside of the given docker container image.
# Uses the config file at /var/lib/docker-puppet/docker-puppet.json as a source for a JSON
# array of [config_volume, puppet_tags, manifest, config_image, [volumes]] settings
# that can be used to generate config files or run ad-hoc puppet modules
# inside of a container.
import glob
import json
import logging
import os
import sys
import subprocess
import sys
import tempfile
import multiprocessing
log = logging.getLogger()
ch = logging.StreamHandler(sys.stdout)
if os.environ.get('DEBUG', False):
log.setLevel(logging.DEBUG)
ch.setLevel(logging.DEBUG)
else:
log.setLevel(logging.INFO)
ch.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s %(levelname)s: %(message)s')
ch.setFormatter(formatter)
log.addHandler(ch)
# this is to match what we do in deployed-server
def short_hostname():
subproc = subprocess.Popen(['hostname', '-s'],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
cmd_stdout, cmd_stderr = subproc.communicate()
return cmd_stdout.rstrip()
def pull_image(name):
log.info('Pulling image: %s' % name)
subproc = subprocess.Popen(['/usr/bin/docker', 'pull', name],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
cmd_stdout, cmd_stderr = subproc.communicate()
if cmd_stdout:
log.debug(cmd_stdout)
if cmd_stderr:
log.debug(cmd_stderr)
def match_config_volume(prefix, config):
# Match the mounted config volume - we can't just use the
# key as e.g "novacomute" consumes config-data/nova
volumes = config.get('volumes', [])
config_volume=None
for v in volumes:
if v.startswith(prefix):
config_volume = os.path.relpath(
v.split(":")[0], prefix).split("/")[0]
break
return config_volume
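match_config_volume above recovers the config-data volume name from bind-mount strings such as '/var/lib/config-data/nova/etc:/etc/nova:ro'. A lightly restructured standalone copy, with a quick check against a hypothetical mount list:

```python
import os

def match_config_volume(prefix, config):
    """Return the first path component under `prefix` among mounted volumes."""
    for v in config.get('volumes', []):
        if v.startswith(prefix):
            # '/var/lib/config-data/nova/etc:/etc/nova:ro' -> 'nova'
            return os.path.relpath(v.split(":")[0], prefix).split("/")[0]
    return None

config = {'volumes': ['/var/lib/config-data/nova/etc:/etc/nova:ro']}
vol = match_config_volume('/var/lib/config-data', config)
```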
def get_config_hash(prefix, config_volume):
hashfile = os.path.join(prefix, "%s.md5sum" % config_volume)
hash_data = None
if os.path.isfile(hashfile):
with open(hashfile) as f:
hash_data = f.read().rstrip()
return hash_data
def rm_container(name):
if os.environ.get('SHOW_DIFF', None):
log.info('Diffing container: %s' % name)
subproc = subprocess.Popen(['/usr/bin/docker', 'diff', name],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
cmd_stdout, cmd_stderr = subproc.communicate()
if cmd_stdout:
log.debug(cmd_stdout)
if cmd_stderr:
log.debug(cmd_stderr)
log.info('Removing container: %s' % name)
subproc = subprocess.Popen(['/usr/bin/docker', 'rm', name],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
cmd_stdout, cmd_stderr = subproc.communicate()
if cmd_stdout:
log.debug(cmd_stdout)
if cmd_stderr and \
cmd_stderr != 'Error response from daemon: ' \
'No such container: {}\n'.format(name):
log.debug(cmd_stderr)
process_count = int(os.environ.get('PROCESS_COUNT',
                                   multiprocessing.cpu_count()))

log.info('Running docker-puppet')
config_file = os.environ.get('CONFIG', '/var/lib/docker-puppet/docker-puppet.json')
log.debug('CONFIG: %s' % config_file)
with open(config_file) as f:
    json_data = json.load(f)

# To save time we support configuring 'shared' services at the same
# time. For example configuring all of the heat services
# in a single container pass makes sense and will save some time.
# To support this we merge shared settings together here.
#
# We key off of config_volume as this should be the same for a
# given group of services. We are also now specifying the container
# in which the services should be configured. This should match
# in all instances where the volume name is also the same.

configs = {}

for service in (json_data or []):
    if service is None:
        continue
    if isinstance(service, dict):
        service = [
            service.get('config_volume'),
            service.get('puppet_tags'),
            service.get('step_config'),
            service.get('config_image'),
            service.get('volumes', []),
        ]

    config_volume = service[0] or ''
    puppet_tags = service[1] or ''
    manifest = service[2] or ''
    config_image = service[3] or ''
    volumes = service[4] if len(service) > 4 else []

    if not manifest or not config_image:
        continue

    log.info('config_volume %s' % config_volume)
    log.info('puppet_tags %s' % puppet_tags)
    log.info('manifest %s' % manifest)
    log.info('config_image %s' % config_image)
    log.info('volumes %s' % volumes)

    # We key off of config volume for all configs.
    if config_volume in configs:
        # Append puppet tags and manifest.
        log.info("Existing service, appending puppet tags and manifest")
        if puppet_tags:
            configs[config_volume][1] = '%s,%s' % (configs[config_volume][1],
                                                   puppet_tags)
        if manifest:
            configs[config_volume][2] = '%s\n%s' % (configs[config_volume][2],
                                                    manifest)
        if configs[config_volume][3] != config_image:
            log.warn("Config containers do not match even though"
                     " shared volumes are the same!")
    else:
        log.info("Adding new service")
        configs[config_volume] = service

log.info('Service compilation completed.')
def mp_puppet_config((config_volume, puppet_tags, manifest, config_image, volumes)):
    log.debug('config_volume %s' % config_volume)
    log.debug('puppet_tags %s' % puppet_tags)
    log.debug('manifest %s' % manifest)
    log.debug('config_image %s' % config_image)
    log.debug('volumes %s' % volumes)

    sh_script = '/var/lib/docker-puppet/docker-puppet.sh'

    with open(sh_script, 'w') as script_file:
        os.chmod(script_file.name, 0755)
        script_file.write("""#!/bin/bash
set -ex
mkdir -p /etc/puppet
cp -a /tmp/puppet-etc/* /etc/puppet
rm -Rf /etc/puppet/ssl # not in use and causes permission errors
echo "{\\"step\\": $STEP}" > /etc/puppet/hieradata/docker.json
TAGS=""
if [ -n "$PUPPET_TAGS" ]; then
    TAGS="--tags \"$PUPPET_TAGS\""
fi

# workaround LP1696283
mkdir -p /etc/ssh
touch /etc/ssh/ssh_known_hosts

FACTER_hostname=$HOSTNAME FACTER_uuid=docker /usr/bin/puppet apply --verbose $TAGS /etc/config.pp

# Disables archiving
if [ -z "$NO_ARCHIVE" ]; then
    archivedirs=("/etc" "/root" "/var/lib/ironic/tftpboot" "/var/lib/ironic/httpboot" "/var/www")
    rsync_srcs=""
    for d in "${archivedirs[@]}"; do
        if [ -d "$d" ]; then
            rsync_srcs+=" $d"
        fi
    done
    rsync -a -R --delay-updates --delete-after $rsync_srcs /var/lib/config-data/${NAME}

    # Also make a copy of files modified during puppet run
    # This is useful for debugging
    mkdir -p /var/lib/config-data/puppet-generated/${NAME}
    rsync -a -R -0 --delay-updates --delete-after \
        --files-from=<(find $rsync_srcs -newer /etc/ssh/ssh_known_hosts -print0) \
        / /var/lib/config-data/puppet-generated/${NAME}

    # Write a checksum of the config-data dir, this is used as a
    # salt to trigger container restart when the config changes
    tar -c -f - /var/lib/config-data/${NAME} --mtime='1970-01-01' | md5sum | awk '{print $1}' > /var/lib/config-data/${NAME}.md5sum
fi
""")

    with tempfile.NamedTemporaryFile() as tmp_man:
        with open(tmp_man.name, 'w') as man_file:
            man_file.write('include ::tripleo::packages\n')
            man_file.write(manifest)

        rm_container('docker-puppet-%s' % config_volume)
        pull_image(config_image)

        dcmd = ['/usr/bin/docker', 'run',
                '--user', 'root',
                '--name', 'docker-puppet-%s' % config_volume,
                '--env', 'PUPPET_TAGS=%s' % puppet_tags,
                '--env', 'NAME=%s' % config_volume,
                '--env', 'HOSTNAME=%s' % short_hostname(),
                '--env', 'NO_ARCHIVE=%s' % os.environ.get('NO_ARCHIVE', ''),
                '--env', 'STEP=%s' % os.environ.get('STEP', '6'),
                '--volume', '%s:/etc/config.pp:ro' % tmp_man.name,
                '--volume', '/etc/puppet/:/tmp/puppet-etc/:ro',
                '--volume', '/usr/share/openstack-puppet/modules/:/usr/share/openstack-puppet/modules/:ro',
                '--volume', '/var/lib/config-data/:/var/lib/config-data/:rw',
                '--volume', 'tripleo_logs:/var/log/tripleo/',
                # OpenSSL trusted CA injection
                '--volume', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro',
                '--volume', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro',
                '--volume', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro',
                '--volume', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro',
                # script injection
                '--volume', '%s:%s:rw' % (sh_script, sh_script)]

        for volume in volumes:
            if volume:
                dcmd.extend(['--volume', volume])

        dcmd.extend(['--entrypoint', sh_script])

        env = {}
        # NOTE(flaper87): Always copy the DOCKER_* environment variables as
        # they contain the access data for the docker daemon.
        for k in filter(lambda k: k.startswith('DOCKER'), os.environ.keys()):
            env[k] = os.environ.get(k)

        if os.environ.get('NET_HOST', 'false') == 'true':
            log.debug('NET_HOST enabled')
            dcmd.extend(['--net', 'host', '--volume',
                         '/etc/hosts:/etc/hosts:ro'])

        dcmd.append(config_image)

        log.debug('Running docker command: %s' % ' '.join(dcmd))
        subproc = subprocess.Popen(dcmd, stdout=subprocess.PIPE,
                                   stderr=subprocess.PIPE, env=env)
        cmd_stdout, cmd_stderr = subproc.communicate()
        if subproc.returncode != 0:
            log.error('Failed running docker-puppet.py for %s' % config_volume)
            if cmd_stdout:
                log.error(cmd_stdout)
            if cmd_stderr:
                log.error(cmd_stderr)
        else:
            if cmd_stdout:
                log.debug(cmd_stdout)
            if cmd_stderr:
                log.debug(cmd_stderr)
            # only delete successful runs, for debugging
            rm_container('docker-puppet-%s' % config_volume)
        return subproc.returncode
# Holds all the information for each process to consume.
# Instead of starting them all linearly we run them using a process
# pool. This creates a list of arguments for the above function
# to consume.
process_map = []

for config_volume in configs:
    service = configs[config_volume]
    puppet_tags = service[1] or ''
    manifest = service[2] or ''
    config_image = service[3] or ''
    volumes = service[4] if len(service) > 4 else []

    if puppet_tags:
        puppet_tags = "file,file_line,concat,augeas,%s" % puppet_tags
    else:
        puppet_tags = "file,file_line,concat,augeas"

    process_map.append([config_volume, puppet_tags, manifest, config_image, volumes])

for p in process_map:
    log.debug('- %s' % p)

# Fire off processes to perform each configuration. Defaults
# to the number of CPUs on the system.
p = multiprocessing.Pool(process_count)
returncodes = list(p.map(mp_puppet_config, process_map))
config_volumes = [pm[0] for pm in process_map]
success = True
for returncode, config_volume in zip(returncodes, config_volumes):
    if returncode != 0:
        log.error('ERROR configuring %s' % config_volume)
        success = False

# Update the startup configs with the config hash we generated above
config_volume_prefix = os.environ.get('CONFIG_VOLUME_PREFIX', '/var/lib/config-data')
log.debug('CONFIG_VOLUME_PREFIX: %s' % config_volume_prefix)
startup_configs = os.environ.get('STARTUP_CONFIG_PATTERN', '/var/lib/tripleo-config/docker-container-startup-config-step_*.json')
log.debug('STARTUP_CONFIG_PATTERN: %s' % startup_configs)
# Use the (possibly overridden) pattern instead of a hardcoded glob
infiles = glob.glob(startup_configs)
for infile in infiles:
    with open(infile) as f:
        infile_data = json.load(f)

    for k, v in infile_data.iteritems():
        config_volume = match_config_volume(config_volume_prefix, v)
        if config_volume:
            config_hash = get_config_hash(config_volume_prefix, config_volume)
            if config_hash:
                env = v.get('environment', [])
                env.append("TRIPLEO_CONFIG_HASH=%s" % config_hash)
                log.debug("Updating config hash for %s, config_volume=%s hash=%s" % (k, config_volume, config_hash))
                infile_data[k]['environment'] = env

    outfile = os.path.join(os.path.dirname(infile), "hashed-" + os.path.basename(infile))
    with open(outfile, 'w') as out_f:
        json.dump(infile_data, out_f)

if not success:
    sys.exit(1)
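For illustration, here is a minimal sketch of one entry in the `[config_volume, puppet_tags, manifest, config_image, [volumes]]` layout that docker-puppet.py's header comment describes. The service name, tags, and image tag below are made up, not taken from a real deployment:

```python
import json

# Hypothetical docker-puppet.json content: a JSON array of 5-element entries.
sample = json.loads("""
[
  ["heat_api",
   "heat_config,file,concat",
   "include ::tripleo::profile::base::heat::api",
   "tripleoupstream/centos-binary-heat-api:latest",
   ["/var/log/containers/heat:/var/log/heat"]]
]
""")

# Unpack one entry the same way the script's merge loop does.
config_volume, puppet_tags, manifest, config_image, volumes = sample[0]
print(config_volume)  # heat_api
```

The script also accepts dict-shaped entries (`config_volume`, `puppet_tags`, `step_config`, `config_image`, `volumes` keys) and normalizes them into this same 5-element list.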
# ---- File: pylinkcheck.py (repo: clayball/pylinkcheck, license: MIT) ----
#!/usr/bin/env python
# Copyright (c) 2016 Clay Wells
#
# A Python-based link checker.
#
# Usage: pylinkcheck.py -r https://www.example.com
#
# By default, we can spider and check all of the links found at the URL's
# domain. For example, a check of https://foo.example.com will only check
# links with the base URL path of foo.example.com. Links to
# bar.example.com will not be checked.
#
# Fancy run-time options
#   url root (domain): this is simply required
#   generate report file: -o output.txt, --output=output.txt
#   limit depth: -l 2, --limit=2
#   TODO: report format: --format=txt,html,xml
##############################################################################

import argparse
import urllib2
import csv
from datetime import datetime
import re
from urlparse import urlparse
from bs4 import BeautifulSoup

#######################################
# Functions

# Spider the base URL
def spiderURL(baseurl, pathlimit):
    # build a list based on each sub directory found
    print '[spider] path limit set to %d' % pathlimit


# Print an informative summary of the dead links
def printReport(deadlinks):
    # print each item in the deadlinks list or CLEAN if empty
    print '\n\n'
    print '#' * 79
    print ' Link Checker Results\n'
    if not deadlinks:
        print '[+] CLEAN: No dead links found'
    else:
        for item in deadlinks:
            print '[-] NOT FOUND: %s' % item
#######################################
# Main program
#
# Get command line options
parser = argparse.ArgumentParser(description='A Python-based link checker.')
parser.add_argument('-f', '--format', required=False, default='txt',
                    help='Output file format')
parser.add_argument('-l', '--limit', required=False, default=2,
                    help='Limit directory depth, example.com/limit/dir/depth/')
parser.add_argument('-u', '--url', help='Base URL to check', required=True)
parser.add_argument('-o', '--output', help='Output file name', required=False)
args = parser.parse_args()

# Assign program arguments to variables
# - we may want to add a '/' to baseurl if it's not present.
# - if the href links are relative we need to add the baseurl when checking
#   the link.
baseurl = str(args.url)
pathlimit = int(args.limit)

# Show values (use pathlimit: args.limit is a string when given on the CLI,
# which would break the %d format)
print 'Base URL: %s' % args.url
print 'Output file format: %s' % args.format
print 'Output file: %s' % args.output
print 'Limit spider: %d' % pathlimit

# Grab today's date for timestamping output file.
now = datetime.now()
tstamp = now.strftime("%Y%m%d-%H%M")

# Grab all a href links
checkurl = urllib2.urlopen(baseurl).read()
soup = BeautifulSoup(checkurl, 'html.parser')

# Spider the site and build our list of URLs to check
spiderURL(baseurl, pathlimit)

deadlinks = []

# This for loop will completely change once the spiderURL function is working.
# We'll iterate over the various directory paths instead.
outofscope = 0
# Check the URLs
for link in soup("a"):
    # Fetch the link but only return the status code
    # hrefs are unpredictable; we can add a function to 'clean' them up, i.e.,
    # get the proto, domain, path, file (TODO: for a complete solution we
    # need to get all of this)
    #if baseurl[:-1] == '/':
    #    print '[debug] strip last char from baseurl'

    # mailto: is causing an error
    href = link.get('href')
    print '[debug] href: %s' % href
    if re.match('^mailto', href):
        # skip this one
        continue

    # Separate the file from the path
    thisurl = urlparse(href)
    if thisurl.netloc != baseurl and thisurl.netloc != '':
        print '[-] HREF %s is out of scope' % thisurl.netloc
        outofscope = 1
    else:
        print '[debug] path %s' % thisurl.path
        outofscope = 0

    # Build the full URL if the href is relative.
    # - assuming, for now, other protocols are not desired
    # - place this in the Spider function
    try:
        if re.match('^http', href):
            checkurl = href
        else:
            checkurl = baseurl + href
    except:
        print '[-] Unknown error in re.match()'

    try:
        #print '[+] checking %s' % checkurl
        hrefpage = urllib2.urlopen(checkurl)
    except urllib2.HTTPError as e:
        if e.code == 404:
            print '[-] 404 ERROR: %s' % checkurl
            # add this URL to deadlink list
            deadlinks.append(checkurl)
        else:
            print '[-] HTTP ERROR: %d - %s' % (e.code, checkurl)
    except urllib2.URLError as e:
        # Not an HTTP-specific error (e.g. connection refused).
        # URLError has no 'code' attribute, so report the reason instead.
        print '[-] NON-HTTP ERROR: %s - %s' % (e.reason, checkurl)
    else:
        print '[+] Status %d for %s' % (hrefpage.getcode(), checkurl)

printReport(deadlinks)
# EOF
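The scope check in the loop above (an href is in scope when it is relative or its host matches the base) can be sketched as a standalone helper. This uses Python 3's `urllib.parse` rather than the script's Python 2 `urlparse`, and the `in_scope` name is our own, not part of pylinkcheck:

```python
from urllib.parse import urlparse

def in_scope(href, basehost):
    """Return True when href is relative or points at basehost."""
    netloc = urlparse(href).netloc
    return netloc == '' or netloc == basehost

print(in_scope('/about.html', 'foo.example.com'))                # True
print(in_scope('https://bar.example.com/x', 'foo.example.com'))  # False
```

Note that the script compares `thisurl.netloc` against the full `baseurl` (which includes the scheme), so comparing against just the host, as here, is a slightly stricter reading of the intent.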
# ---- File: console.py (repo: aplneto/redes_projeto, license: MIT) ----
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""Módulo de configuração dos consoles
"""
from Crypto.PublicKey import RSA
import socket
import os
import base64
class Console(object):
"""Superclasse Console
Classe base para os terminais de cliente e servidor.
Attributes:
logged (bool): True caso o usuário tenha realizado o login com sucesso,
False caso contrário
"""
def __init__(self, **kwargs):
"""Método construtor do console
Kwargs:
sock (socket): socket de comunicação
key_file (str): arquivo para inicialização de par de chaves
"""
self.sock = kwargs.get('sock',
socket.socket(socket.AF_INET,
socket.SOCK_STREAM))
key_file = kwargs.get('key_file', '')
if key_file:
self.privatekey, self.publickey = Console.start_key(key_file)
def run(self):
"""Método run difere entre o Console do Host e o do Client
O Método run controla o comportamento do objeto como um todo.
Todo o comportamento de um console individual deve ser definido dentro
do método run.
"""
raise NotImplemented
@staticmethod
def start_key(key_file):
"""Método de inicialização das chaves
Esse método inicializa a chave privada e prepara, também, a chave
pública para envio.
Args:
key_file (str): endereço do arquivo da chave privada
Returns:
(tuple) uma tupla contendo um par _RSAobj (chave privada) e byte
(inicializador da chave pública)
"""
try:
keyfile = open(key_file, 'rb')
except FileNotFoundError:
private_key = RSA.generate(1024)
else:
private_key = RSA.importKey(keyfile.read())
keyfile.close()
finally:
public_key = private_key.publickey().exportKey()
return private_key, public_key
    def receive_key(self):
        """Key exchange at the start of the communication.

        Upon connecting, server and client exchange their public keys
        with each other. This method returns a public RSA object built
        from the public key received through a socket.

        Returns:
            (_RSAobj) public key for encryption.
        """
        k = self.sock.recv(1024)
        key = RSA.importKey(k)
        return key

    def send(self, msg):
        """Send simple strings through the socket.

        The send method is used to send simple messages through a
        socket. RSA and base64 encoding happen inside this method
        before sending.

        Args:
            msg (str or bytes): message to be sent
        """
        msg = self.encrypt(msg)
        self.sock.send(msg)

    def receive(self, b=160):
        """Receive simple messages through the socket.

        It is through this method that the user receives simple
        messages through the socket. Messages arrive encrypted and the
        decryption happens inside the receive method.

        Args:
            b (int): number of bytes to receive

        Returns:
            (str) decrypted message
        """
        msg = self.decrypt(self.sock.recv(b))
        return msg.decode('utf-8')

    def encrypt(self, msg):
        """Encrypt a string or chunk of bytes.

        Args:
            msg (str or bytes): string or bytes to be encrypted.

        Returns:
            (bytes) encrypted byte segment
        """
        if isinstance(msg, str):
            msg = msg.encode('utf-8')
        msg = self.publickey.encrypt(msg, 3.14159265359)
        msg = base64.a85encode(msg[0])
        return msg

    def decrypt(self, msg):
        """Decrypt an encrypted chunk.

        Args:
            msg (bytes): message chunk to be decrypted

        Returns:
            (bytes) decrypted bytes
        """
        msg = base64.a85decode(msg)
        msg = self.privatekey.decrypt(msg)
        return msg
    def send_file(self, filename):
        """File-sending routine over sockets.

        This method controls the sequential sending of segments of a
        file through a socket, yielding after each send an integer with
        the number of bytes sent so far.

        The method must be used as a generator. See the example below.

        Example:
            for b in self.send_file('alice.txt'):
                if b == -1:
                    print("There was a transfer error")
                else:
                    print(str(b) + " of " + str(file_size) + " bytes sent")

        Args:
            filename (str): file path

        Yields:
            (int) number of bytes sent, or -1 on error
        """
        size = os.path.getsize(filename)
        self.send(str(size))
        sent = 0
        file = open(filename, 'rb')
        while sent < size:
            ack = self.receive()
            nxt = file.read(1024)
            self.sock.send(nxt)
            sent += len(nxt)
            yield sent
        file.close()

    def receive_file(self, filename):
        """File-receiving routine over sockets.

        This method controls the reception of file segments through a
        socket. It yields the number of bytes received after each new
        message arrives from the socket, so it must be used as a
        generator.

        Example:
            for b in receive_file(filename):
                print(str(b) + " of " + str(filesize) + " bytes received.")

        Args:
            filename (str): file name

        Yields:
            (int) number of bytes received
        """
        size = int(self.receive())
        file = open(filename, 'wb')
        rcvd = 0
        while rcvd < size:
            self.send('ack')
            nxt = self.sock.recv(1024)
            rcvd += len(nxt)
            file.write(nxt)
            yield rcvd
        file.close()

    def __repr__(self):
        # The original format string referenced a nonexistent self.client
        # attribute; report only the attributes this class actually sets.
        return "{0}(sock={1}, key_file={2})".format(self.__class__.__name__,
                                                    self.sock.__repr__(),
                                                    repr(self.key_file))
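As a standalone sketch of the Ascii85 framing that `Console.encrypt`/`Console.decrypt` apply around their payloads, here is the stdlib `base64` round-trip on its own. The RSA step is omitted, so this only illustrates the outer encoding, not the full scheme:

```python
import base64

payload = "hello socket".encode("utf-8")
framed = base64.a85encode(payload)   # what travels on the wire (minus RSA)
restored = base64.a85decode(framed)

print(restored.decode("utf-8"))  # hello socket
```

Ascii85 keeps the ciphertext within printable characters, which is why the class applies it after the RSA step and strips it before decrypting.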
# ---- File: sandbox/settings.py (repo: OmenApps/marion, license: MIT) ----
"""
Django settings for marion project.
"""
from pathlib import Path
from tempfile import mkdtemp
from configurations import Configuration, values
BASE_DIR = Path(__file__).parent.resolve()
DATA_DIR = Path("/data")
# pylint: disable=no-init
class Base(Configuration):
"""
This is the base configuration every configuration (aka environnement)
should inherit from. It is recommended to configure third-party
applications by creating a configuration mixins in ./configurations and
compose the Base configuration with those mixins.
It depends on an environment variable that SHOULD be defined:
* DJANGO_SECRET_KEY
You may also want to override default configuration by setting the
following environment variables:
* DB_NAME
* DB_HOST
* DB_PASSWORD
* DB_USER
"""
DEBUG = False
# Security
ALLOWED_HOSTS = []
SECRET_KEY = values.Value(None)
# SECURE_PROXY_SSL_HEADER allows to fix the scheme in Django's HttpRequest
# object when you application is behind a reverse proxy.
#
# Keep this SECURE_PROXY_SSL_HEADER configuration only if :
# - your Django app is behind a proxy.
# - your proxy strips the X-Forwarded-Proto header from all incoming requests
# - Your proxy sets the X-Forwarded-Proto header and sends it to Django
#
# In other cases, you should comment the following line to avoid security issues.
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
AUTH_PASSWORD_VALIDATORS = [
{
"NAME": (
"django.contrib.auth.password_validation."
"UserAttributeSimilarityValidator"
),
},
{"NAME": "django.contrib.auth.password_validation.MinimumLengthValidator"},
{"NAME": "django.contrib.auth.password_validation.CommonPasswordValidator"},
{"NAME": "django.contrib.auth.password_validation.NumericPasswordValidator"},
]
# Application
ROOT_URLCONF = "urls"
WSGI_APPLICATION = "wsgi.application"
# Database
DATABASES = {
"default": {
"ENGINE": values.Value(
"django.db.backends.postgresql_psycopg2",
environ_name="DB_ENGINE",
environ_prefix=None,
),
"NAME": values.Value("marion", environ_name="DB_NAME", environ_prefix=None),
"USER": values.Value("fun", environ_name="DB_USER", environ_prefix=None),
"PASSWORD": values.Value(
"pass", environ_name="DB_PASSWORD", environ_prefix=None
),
"HOST": values.Value(
"localhost", environ_name="DB_HOST", environ_prefix=None
),
"PORT": values.Value(5432, environ_name="DB_PORT", environ_prefix=None),
}
}
# Static files (CSS, JavaScript, Images)
STATIC_URL = "/static/"
STATIC_ROOT = DATA_DIR.joinpath("static")
MEDIA_URL = "/media/"
MEDIA_ROOT = DATA_DIR.joinpath("media")
# Internationalization
LANGUAGE_CODE = "en-us"
TIME_ZONE = "UTC"
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Application definition
INSTALLED_APPS = [
"django.contrib.admin",
"django.contrib.auth",
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.messages",
"django.contrib.staticfiles",
"rest_framework",
"marion",
]
MIDDLEWARE = [
"django.middleware.security.SecurityMiddleware",
"django.contrib.sessions.middleware.SessionMiddleware",
"django.middleware.common.CommonMiddleware",
"django.middleware.csrf.CsrfViewMiddleware",
"django.contrib.auth.middleware.AuthenticationMiddleware",
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
]
TEMPLATES = [
{
"BACKEND": "django.template.backends.django.DjangoTemplates",
"DIRS": [],
"APP_DIRS": True,
"OPTIONS": {
"context_processors": [
"django.template.context_processors.debug",
"django.template.context_processors.request",
"django.contrib.auth.context_processors.auth",
"django.contrib.messages.context_processors.messages",
],
},
},
]
class Development(Base):
"""
Development environment settings
We set DEBUG to True and configure the server to respond from all hosts.
"""
DEBUG = True
ALLOWED_HOSTS = ["*"]
ROOT_URLCONF = "urls.debug"
# Application definition
INSTALLED_APPS = Base.INSTALLED_APPS + [
"howard",
]
MARION_DOCUMENT_ISSUER_CHOICES_CLASS = "howard.defaults.DocumentIssuerChoices"
class Test(Base):
"""Test environment settings"""
MEDIA_ROOT = Path(mkdtemp())
ROOT_URLCONF = "urls.debug"
# ---- File: coingate/migrations/0004_auto_20200207_1959.py (repo: glitzybunny/coingate_sandbox_payment, license: MIT) ----
# Generated by Django 3.0.3 on 2020-02-07 19:59
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('coingate', '0003_auto_20200207_1513'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='payment',
            name='token',
        ),
        migrations.AddField(
            model_name='payment',
            name='expire_at',
            field=models.DateTimeField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name='payment',
            name='pay_amount',
            field=models.DecimalField(blank=True, decimal_places=1, max_digits=10, null=True),
        ),
        migrations.AddField(
            model_name='payment',
            name='payment_address',
            field=models.CharField(blank=True, max_length=100, null=True),
        ),
        migrations.AlterField(
            model_name='payment',
            name='created_at',
            field=models.DateTimeField(auto_now_add=True, null=True),
        ),
        migrations.AlterField(
            model_name='payment',
            name='price_currency',
            field=models.CharField(choices=[('USD', 'USD'), ('EUR', 'EUR'), ('BTC', 'BTC'), ('LTC', 'LTC'), ('ETH', 'ETH')], default='USD', max_length=10),
        ),
        migrations.AlterField(
            model_name='payment',
            name='receive_currency',
            field=models.CharField(choices=[('USD', 'USD'), ('EUR', 'EUR'), ('BTC', 'BTC'), ('LTC', 'LTC'), ('ETH', 'ETH')], default='BTC', max_length=10),
        ),
        migrations.AlterField(
            model_name='payment',
            name='status',
            field=models.CharField(choices=[('new', 'Newly created invoice'), ('pending', 'Awaiting payment'), ('confirming', 'Awaiting blockchain network confirmation'), ('paid', 'Confirmed'), ('invalid', 'Rejected'), ('expired', 'Expired'), ('canceled', 'Canceled'), ('refunded', 'Refunded')], default='new', max_length=10),
        ),
    ]
# ---- File: space_trace/__init__.py (repo: SpaceTeam/space-event-trace, license: MIT) ----
import toml
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__, instance_relative_config=True)
app.config.from_file("config.toml", load=toml.load)
db = SQLAlchemy(app)


@app.before_first_request
def create_table():
    db.create_all()


# Imported at the bottom on purpose: views and cli import `app` from this
# module, so importing them earlier would create a circular import.
from space_trace import views, cli