hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0a8309eb3ae604dcaac32cbfe6abca6ec54556a5 | 200 | py | Python | tests/IT/fixtures/test_fixtures.py | testandconquer/pytest-conquer | da600c7f5bcd06aa62c5cca9b75370bf1a6ebf05 | [
"MIT"
] | null | null | null | tests/IT/fixtures/test_fixtures.py | testandconquer/pytest-conquer | da600c7f5bcd06aa62c5cca9b75370bf1a6ebf05 | [
"MIT"
] | 5 | 2018-12-27T02:52:01.000Z | 2019-01-02T01:52:55.000Z | tests/IT/fixtures/test_fixtures.py | testandconquer/pytest-conquer | da600c7f5bcd06aa62c5cca9b75370bf1a6ebf05 | [
"MIT"
] | null | null | null | import pytest
@pytest.fixture
def fixture1():
return True
@pytest.fixture
def fixture2():
return True
def test_with_fixtures(fixture1, fixture2):
assert fixture1
assert fixture2
| 11.764706 | 43 | 0.725 | 24 | 200 | 5.958333 | 0.5 | 0.181818 | 0.223776 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037736 | 0.205 | 200 | 16 | 44 | 12.5 | 0.861635 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.3 | false | 0 | 0.1 | 0.2 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
0ad05cfb1d9eb5c322308d777a4b2cb7e8a71ce8 | 1,819 | py | Python | openapi_client/models/__init__.py | osuka/dognews-scraper | 12373064061157083a48ced8e2cabf9d1ace30a5 | [
"MIT"
] | 1 | 2019-11-15T13:19:36.000Z | 2019-11-15T13:19:36.000Z | openapi_client/models/__init__.py | osuka/news-extractor | 12373064061157083a48ced8e2cabf9d1ace30a5 | [
"MIT"
] | null | null | null | openapi_client/models/__init__.py | osuka/news-extractor | 12373064061157083a48ced8e2cabf9d1ace30a5 | [
"MIT"
] | null | null | null | # flake8: noqa
# import all models into this package
# if you have many models here with many references from one model to another this may
# raise a RecursionError
# to avoid this, import only the models that you directly need like:
# from openapi_client.model.pet import Pet
# or import this package, but before doing it, use:
# import sys
# sys.setrecursionlimit(n)
from openapi_client.model.article import Article
from openapi_client.model.auth_token import AuthToken
from openapi_client.model.fetch import Fetch
from openapi_client.model.fetch_status_enum import FetchStatusEnum
from openapi_client.model.moderation import Moderation
from openapi_client.model.moderation_status_enum import ModerationStatusEnum
from openapi_client.model.paginated_article_list import PaginatedArticleList
from openapi_client.model.paginated_fetch_list import PaginatedFetchList
from openapi_client.model.paginated_moderation_list import PaginatedModerationList
from openapi_client.model.paginated_submission_list import PaginatedSubmissionList
from openapi_client.model.paginated_user_list import PaginatedUserList
from openapi_client.model.paginated_vote_list import PaginatedVoteList
from openapi_client.model.patched_fetch import PatchedFetch
from openapi_client.model.patched_moderation import PatchedModeration
from openapi_client.model.patched_submission import PatchedSubmission
from openapi_client.model.patched_user import PatchedUser
from openapi_client.model.submission import Submission
from openapi_client.model.token_obtain_pair import TokenObtainPair
from openapi_client.model.token_refresh import TokenRefresh
from openapi_client.model.token_verify import TokenVerify
from openapi_client.model.user import User
from openapi_client.model.value_enum import ValueEnum
from openapi_client.model.vote import Vote
| 51.971429 | 86 | 0.876306 | 248 | 1,819 | 6.225806 | 0.314516 | 0.170984 | 0.264249 | 0.341969 | 0.324482 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000602 | 0.087411 | 1,819 | 34 | 87 | 53.5 | 0.929518 | 0.195162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
e4012065cb6fc1bc64ef9ee79bd5e8118d1ecf36 | 218 | py | Python | src/bitcoin_rpc.py | perryatdmg/python_essentials | e33b0ff1aa039dc72bdb70b5ba73b60b5ba98a05 | [
"MIT"
] | null | null | null | src/bitcoin_rpc.py | perryatdmg/python_essentials | e33b0ff1aa039dc72bdb70b5ba73b60b5ba98a05 | [
"MIT"
] | null | null | null | src/bitcoin_rpc.py | perryatdmg/python_essentials | e33b0ff1aa039dc72bdb70b5ba73b60b5ba98a05 | [
"MIT"
] | null | null | null | from bitcoin_requests import BitcoinRPC
def bitcoindrpc():
rpc = BitcoinRPC('http://75.157.6.175:4000', 'user', 'pass')
blocks = rpc.generate(101)
# tx = rpc.sendtoaddress(address, 20)
return 1
| 21.8 | 64 | 0.651376 | 28 | 218 | 5.035714 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109827 | 0.206422 | 218 | 9 | 65 | 24.222222 | 0.705202 | 0.16055 | 0 | 0 | 1 | 0 | 0.178771 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
7c25e0d1b02afee3a8e3698f056b478765c67b76 | 46 | py | Python | settings.py | johnelutz/unoagilepythoncourse2 | 043a07f4234c2f98dbcca477102bf9ca0d1ed73e | [
"MIT"
] | null | null | null | settings.py | johnelutz/unoagilepythoncourse2 | 043a07f4234c2f98dbcca477102bf9ca0d1ed73e | [
"MIT"
] | null | null | null | settings.py | johnelutz/unoagilepythoncourse2 | 043a07f4234c2f98dbcca477102bf9ca0d1ed73e | [
"MIT"
] | null | null | null | import os
SECRET_KEY = os.getenv("SECRET_KEY") | 23 | 36 | 0.782609 | 8 | 46 | 4.25 | 0.625 | 0.529412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 46 | 2 | 36 | 23 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0.212766 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
7c340de7ce983734ff9e04b17c7e84164ab2bae1 | 2,276 | py | Python | python/phonenumbers/data/region_CZ.py | Eyepea/python-phonenumbers | 0336e191fda80a21ed5c19d5e029ad8c70f620ee | [
"Apache-2.0"
] | 2 | 2019-03-30T02:12:54.000Z | 2021-03-08T18:59:40.000Z | python/phonenumbers/data/region_CZ.py | Eyepea/python-phonenumbers | 0336e191fda80a21ed5c19d5e029ad8c70f620ee | [
"Apache-2.0"
] | null | null | null | python/phonenumbers/data/region_CZ.py | Eyepea/python-phonenumbers | 0336e191fda80a21ed5c19d5e029ad8c70f620ee | [
"Apache-2.0"
] | 1 | 2018-11-10T03:47:34.000Z | 2018-11-10T03:47:34.000Z | """Auto-generated file, do not edit by hand. CZ metadata"""
from ..phonemetadata import NumberFormat, PhoneNumberDesc, PhoneMetadata
PHONE_METADATA_CZ = PhoneMetadata(id='CZ', country_code=420, international_prefix='00',
general_desc=PhoneNumberDesc(national_number_pattern='[2-8]\\d{8}|9\\d{8,11}', possible_number_pattern='\\d{9,12}'),
fixed_line=PhoneNumberDesc(national_number_pattern='2\\d{8}|(?:3[1257-9]|4[16-9]|5[13-9])\\d{7}', possible_number_pattern='\\d{9,12}', example_number='212345678'),
mobile=PhoneNumberDesc(national_number_pattern='(?:60[1-8]|7(?:0[2-5]|[2379]\\d))\\d{6}', possible_number_pattern='\\d{9,12}', example_number='601123456'),
toll_free=PhoneNumberDesc(national_number_pattern='800\\d{6}', possible_number_pattern='\\d{9,12}', example_number='800123456'),
premium_rate=PhoneNumberDesc(national_number_pattern='9(?:0[05689]|76)\\d{6}', possible_number_pattern='\\d{9,12}', example_number='900123456'),
shared_cost=PhoneNumberDesc(national_number_pattern='8[134]\\d{7}', possible_number_pattern='\\d{9,12}', example_number='811234567'),
personal_number=PhoneNumberDesc(national_number_pattern='70[01]\\d{6}', possible_number_pattern='\\d{9,12}', example_number='700123456'),
voip=PhoneNumberDesc(national_number_pattern='9[17]0\\d{6}', possible_number_pattern='\\d{9,12}', example_number='910123456'),
pager=PhoneNumberDesc(national_number_pattern='NA', possible_number_pattern='NA'),
uan=PhoneNumberDesc(national_number_pattern='9(?:5\\d|7[234])\\d{6}', possible_number_pattern='\\d{9,12}', example_number='972123456'),
voicemail=PhoneNumberDesc(national_number_pattern='9(?:3\\d{9}|6\\d{7,10})', possible_number_pattern='\\d{9,12}', example_number='93123456789'),
no_international_dialling=PhoneNumberDesc(national_number_pattern='NA', possible_number_pattern='NA'),
number_format=[NumberFormat(pattern='([2-9]\\d{2})(\\d{3})(\\d{3})', format=u'\\1 \\2 \\3', leading_digits_pattern=['[2-8]|9[015-7]']),
NumberFormat(pattern='(96\\d)(\\d{3})(\\d{3})(\\d{3})', format=u'\\1 \\2 \\3 \\4', leading_digits_pattern=['96']),
NumberFormat(pattern='(9\\d)(\\d{3})(\\d{3})(\\d{3})', format=u'\\1 \\2 \\3 \\4', leading_digits_pattern=['9[36]'])],
mobile_number_portable_region=True)
| 108.380952 | 167 | 0.711336 | 334 | 2,276 | 4.610778 | 0.278443 | 0.202597 | 0.225974 | 0.280519 | 0.530519 | 0.386364 | 0.37013 | 0.37013 | 0.345455 | 0.257143 | 0 | 0.110225 | 0.063269 | 2,276 | 20 | 168 | 113.8 | 0.612101 | 0.023286 | 0 | 0 | 1 | 0.111111 | 0.249436 | 0.117727 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
7c89a00ba235e8b613721019e9d937d33f7e6b44 | 123 | py | Python | depmgmtsystem/decoders/__init__.py | dm03514/architecting-a-package-manager | 9f97c0ab6879aa055821d4bb4cac8d83e4a6fe3d | [
"Unlicense"
] | null | null | null | depmgmtsystem/decoders/__init__.py | dm03514/architecting-a-package-manager | 9f97c0ab6879aa055821d4bb4cac8d83e4a6fe3d | [
"Unlicense"
] | null | null | null | depmgmtsystem/decoders/__init__.py | dm03514/architecting-a-package-manager | 9f97c0ab6879aa055821d4bb4cac8d83e4a6fe3d | [
"Unlicense"
] | null | null | null | from abc import ABC, abstractmethod
class DepsDecoder(ABC):
@abstractmethod
def decode(self):
return []
| 13.666667 | 35 | 0.666667 | 13 | 123 | 6.307692 | 0.769231 | 0.414634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.252033 | 123 | 8 | 36 | 15.375 | 0.891304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
7ca57c0c542cf53ee32989d65ba99920cd000b6a | 690 | py | Python | src/resistance.py | kyleyarwood/the-resistance | 6d36104fc036408446af39628451bd2e41bafc77 | [
"Apache-2.0"
] | null | null | null | src/resistance.py | kyleyarwood/the-resistance | 6d36104fc036408446af39628451bd2e41bafc77 | [
"Apache-2.0"
] | 6 | 2020-11-29T06:09:31.000Z | 2020-12-02T23:00:47.000Z | src/resistance.py | kyleyarwood/the-resistance | 6d36104fc036408446af39628451bd2e41bafc77 | [
"Apache-2.0"
] | null | null | null | from character import Character
from mission_decision import MissionDecision
from vote import Vote
from typing import List
from game import Game
from controller import Controller
class Resistance(Character):
def __init__(self, game: Game, cpu: bool, controller: Controller):
super().__init__(game, cpu, controller)
def mission_decision(self):
return MissionDecision.SUCCESS
def _cpu_vote(self, team: List[int]):
        # TODO: figure out a strategy for the computer voting
return Vote.ACCEPT
def _cpu_choose_team(self, num_members: int):
        # TODO: figure out how the computer should choose their team
return list(range(num_members))
| 31.363636 | 70 | 0.733333 | 91 | 690 | 5.373626 | 0.43956 | 0.06135 | 0.05317 | 0.06544 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202899 | 690 | 21 | 71 | 32.857143 | 0.889091 | 0.157971 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 0 | 1 | 0.266667 | false | 0 | 0.4 | 0.2 | 0.933333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 5 |
7cb2256ccf21ab27084ce7486182f7f6ad17fbac | 336 | py | Python | s41.py | glcrazier/LeetCodePlay | cf951a079d458e02000d170529cb1e3b049da023 | [
"MIT"
] | 1 | 2018-02-20T13:56:02.000Z | 2018-02-20T13:56:02.000Z | s41.py | glcrazier/LeetCodePlay | cf951a079d458e02000d170529cb1e3b049da023 | [
"MIT"
] | null | null | null | s41.py | glcrazier/LeetCodePlay | cf951a079d458e02000d170529cb1e3b049da023 | [
"MIT"
] | null | null | null | from solution import Solution
if __name__ == '__main__':
sol = Solution()
    print(sol.firstMissingPositive([1,2,0]))
    print(sol.firstMissingPositive([3,4,-1,1]))
    print(sol.firstMissingPositive([]))
    print(sol.firstMissingPositive([1]))
    print(sol.firstMissingPositive([2]))
    print(sol.firstMissingPositive([1,2]))
| 28 | 46 | 0.696429 | 38 | 336 | 5.947368 | 0.368421 | 0.212389 | 0.743363 | 0.384956 | 0.265487 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039427 | 0.169643 | 336 | 12 | 47 | 28 | 0.770609 | 0 | 0 | 0 | 0 | 0 | 0.023739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.111111 | null | null | 0.666667 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
7cea852426ceda046f4ba9ad114292eab9468680 | 4,783 | py | Python | z2/part3/updated_part2_batch/jm/parser_errors_2/637395769.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 1 | 2020-04-16T12:13:47.000Z | 2020-04-16T12:13:47.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/637395769.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:50:15.000Z | 2020-05-19T14:58:30.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/637395769.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:45:13.000Z | 2020-06-09T19:18:31.000Z | from part1 import (
gamma_board,
gamma_busy_fields,
gamma_delete,
gamma_free_fields,
gamma_golden_move,
gamma_golden_possible,
gamma_move,
gamma_new,
)
"""
scenario: test_random_actions
uuid: 637395769
"""
"""
random actions, total chaos
"""
board = gamma_new(5, 5, 4, 7)
assert board is not None
assert gamma_move(board, 1, 0, 3) == 1
assert gamma_move(board, 2, 1, 0) == 1
assert gamma_move(board, 2, 3, 3) == 1
assert gamma_free_fields(board, 2) == 22
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 3, 4, 0) == 1
assert gamma_move(board, 3, 3, 2) == 1
assert gamma_move(board, 4, 1, 0) == 0
assert gamma_move(board, 1, 3, 2) == 0
assert gamma_move(board, 1, 2, 2) == 1
assert gamma_move(board, 2, 0, 2) == 1
assert gamma_free_fields(board, 2) == 18
assert gamma_move(board, 3, 3, 4) == 1
assert gamma_move(board, 3, 0, 2) == 0
assert gamma_golden_possible(board, 3) == 1
assert gamma_move(board, 4, 1, 3) == 1
assert gamma_move(board, 4, 1, 4) == 1
assert gamma_busy_fields(board, 4) == 2
assert gamma_move(board, 1, 1, 1) == 1
assert gamma_busy_fields(board, 1) == 3
assert gamma_move(board, 2, 0, 2) == 0
assert gamma_move(board, 3, 2, 4) == 1
assert gamma_move(board, 4, 3, 2) == 0
assert gamma_free_fields(board, 4) == 13
assert gamma_move(board, 1, 3, 4) == 0
assert gamma_move(board, 2, 4, 4) == 1
assert gamma_move(board, 3, 1, 4) == 0
assert gamma_move(board, 4, 1, 0) == 0
assert gamma_move(board, 1, 1, 4) == 0
assert gamma_move(board, 1, 4, 2) == 1
assert gamma_move(board, 2, 3, 4) == 0
assert gamma_free_fields(board, 2) == 11
board774071270 = gamma_board(board)
assert board774071270 is not None
assert board774071270 == (".4332\n"
"14.2.\n"
"2.131\n"
".1...\n"
".2..3\n")
del board774071270
board774071270 = None
assert gamma_move(board, 3, 3, 2) == 0
assert gamma_move(board, 3, 0, 1) == 1
assert gamma_move(board, 4, 4, 0) == 0
assert gamma_busy_fields(board, 4) == 2
assert gamma_move(board, 1, 3, 4) == 0
assert gamma_move(board, 1, 3, 4) == 0
assert gamma_move(board, 2, 1, 3) == 0
assert gamma_move(board, 2, 4, 2) == 0
assert gamma_free_fields(board, 2) == 10
assert gamma_move(board, 3, 3, 4) == 0
assert gamma_move(board, 3, 2, 1) == 1
assert gamma_move(board, 4, 0, 0) == 1
assert gamma_free_fields(board, 4) == 8
assert gamma_move(board, 1, 0, 4) == 1
assert gamma_move(board, 2, 4, 2) == 0
board555930657 = gamma_board(board)
assert board555930657 is not None
assert board555930657 == ("14332\n"
"14.2.\n"
"2.131\n"
"313..\n"
"42..3\n")
del board555930657
board555930657 = None
assert gamma_move(board, 3, 0, 2) == 0
assert gamma_busy_fields(board, 4) == 3
assert gamma_move(board, 1, 1, 4) == 0
assert gamma_move(board, 1, 3, 1) == 1
assert gamma_move(board, 2, 0, 0) == 0
assert gamma_move(board, 2, 2, 3) == 1
assert gamma_move(board, 3, 1, 4) == 0
assert gamma_move(board, 3, 0, 4) == 0
assert gamma_move(board, 4, 2, 1) == 0
assert gamma_golden_possible(board, 4) == 1
assert gamma_move(board, 1, 3, 4) == 0
assert gamma_move(board, 1, 4, 4) == 0
assert gamma_golden_move(board, 1, 4, 2) == 0
assert gamma_move(board, 2, 4, 2) == 0
assert gamma_move(board, 3, 2, 1) == 0
assert gamma_move(board, 4, 0, 2) == 0
board989072328 = gamma_board(board)
assert board989072328 is not None
assert board989072328 == ("14332\n"
"1422.\n"
"2.131\n"
"3131.\n"
"42..3\n")
del board989072328
board989072328 = None
assert gamma_move(board, 1, 2, 3) == 0
assert gamma_move(board, 2, 4, 3) == 1
assert gamma_move(board, 2, 4, 2) == 0
assert gamma_move(board, 3, 2, 1) == 0
assert gamma_move(board, 3, 3, 0) == 1
assert gamma_move(board, 4, 1, 0) == 0
board102771480 = gamma_board(board)
assert board102771480 is not None
assert board102771480 == ("14332\n"
"14222\n"
"2.131\n"
"3131.\n"
"42.33\n")
del board102771480
board102771480 = None
assert gamma_move(board, 1, 2, 2) == 0
assert gamma_move(board, 1, 2, 2) == 0
assert gamma_move(board, 2, 2, 1) == 0
assert gamma_free_fields(board, 2) == 3
assert gamma_move(board, 3, 0, 0) == 0
assert gamma_move(board, 3, 0, 3) == 0
assert gamma_free_fields(board, 3) == 3
assert gamma_move(board, 4, 2, 2) == 0
assert gamma_move(board, 4, 3, 0) == 0
board521090547 = gamma_board(board)
assert board521090547 is not None
assert board521090547 == ("14332\n"
"14222\n"
"2.131\n"
"3131.\n"
"42.33\n")
del board521090547
board521090547 = None
assert gamma_move(board, 1, 1, 4) == 0
assert gamma_move(board, 1, 3, 4) == 0
assert gamma_move(board, 2, 0, 1) == 0
assert gamma_move(board, 3, 2, 4) == 0
assert gamma_move(board, 4, 2, 1) == 0
assert gamma_move(board, 4, 0, 2) == 0
assert gamma_move(board, 1, 0, 2) == 0
assert gamma_move(board, 2, 1, 4) == 0
gamma_delete(board)
| 28.640719 | 46 | 0.666318 | 859 | 4,783 | 3.566938 | 0.064028 | 0.308747 | 0.342689 | 0.456919 | 0.728786 | 0.70953 | 0.616188 | 0.398499 | 0.350522 | 0.325392 | 0 | 0.166159 | 0.177086 | 4,783 | 166 | 47 | 28.813253 | 0.612297 | 0 | 0 | 0.3125 | 0 | 0 | 0.037274 | 0 | 0 | 0 | 0 | 0 | 0.673611 | 1 | 0 | false | 0 | 0.006944 | 0 | 0.006944 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
6b0768ec19bbd8bac7f3ca8380f0f22bf5cdc1db | 281 | py | Python | tests/helpers/__init__.py | virtuozzo/IMConnectConnector | 50decab95ffde56e8c081426792b5b875bd0e54b | [
"MIT"
] | 2 | 2020-09-25T07:02:27.000Z | 2020-09-27T14:25:43.000Z | cloudblue_connector/__init__.py | virtuozzo/IMConnectConnector | 50decab95ffde56e8c081426792b5b875bd0e54b | [
"MIT"
] | null | null | null | cloudblue_connector/__init__.py | virtuozzo/IMConnectConnector | 50decab95ffde56e8c081426792b5b875bd0e54b | [
"MIT"
] | 1 | 2021-04-28T10:52:53.000Z | 2021-04-28T10:52:53.000Z | # ******************************************************************************
# Copyright (c) 2020-2021, Virtuozzo International GmbH.
# This source code is distributed under MIT software license.
# ******************************************************************************
| 56.2 | 80 | 0.33452 | 16 | 281 | 5.875 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030769 | 0.074733 | 281 | 4 | 81 | 70.25 | 0.330769 | 0.967972 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
6b09786bb7990f2c19e1c5336220f5992dc61e6d | 60 | py | Python | enthought/traits/ui/editors/custom_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 3 | 2016-12-09T06:05:18.000Z | 2018-03-01T13:00:29.000Z | enthought/traits/ui/editors/custom_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 1 | 2020-12-02T00:51:32.000Z | 2020-12-02T08:48:55.000Z | enthought/traits/ui/editors/custom_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | null | null | null | # proxy module
from traitsui.editors.custom_editor import *
| 20 | 44 | 0.816667 | 8 | 60 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116667 | 60 | 2 | 45 | 30 | 0.90566 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
6b0d204de56573964944dda8371db1596d34cb9e | 47 | py | Python | mixnmatchttp/app/exc.py | aayla-secura/simple_CORS_https_server | e32e1c0ce44f3b5a5e5dc830364eddb0b7127040 | [
"MIT"
] | 3 | 2019-03-28T04:17:17.000Z | 2021-08-02T20:15:17.000Z | mixnmatchttp/app/exc.py | aayla-secura/simple_CORS_https_server | e32e1c0ce44f3b5a5e5dc830364eddb0b7127040 | [
"MIT"
] | 1 | 2020-07-02T20:48:18.000Z | 2020-07-07T19:09:24.000Z | mixnmatchttp/app/exc.py | aayla-secura/simple_CORS_https_server | e32e1c0ce44f3b5a5e5dc830364eddb0b7127040 | [
"MIT"
] | 2 | 2020-05-31T11:24:10.000Z | 2021-08-02T20:20:32.000Z | class ArgumentValueError(ValueError):
pass
| 15.666667 | 37 | 0.787234 | 4 | 47 | 9.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148936 | 47 | 2 | 38 | 23.5 | 0.925 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
6b1fa81fe9ed832ac753bb5adc34cc6a7575344f | 387 | py | Python | pycontest/__init__.py | matinhimself/pycontest | e506b99e515847f3d6c1cc9b335ed358d1a85e90 | [
"MIT"
] | 10 | 2021-01-02T13:45:14.000Z | 2021-08-16T12:39:58.000Z | pycontest/__init__.py | matinhimself/pycontest | e506b99e515847f3d6c1cc9b335ed358d1a85e90 | [
"MIT"
] | null | null | null | pycontest/__init__.py | matinhimself/pycontest | e506b99e515847f3d6c1cc9b335ed358d1a85e90 | [
"MIT"
] | null | null | null | from .case import Case
from .generator import IntArray
from .generator import IntVar
from .generator import FloatVar
from .generator import FloatArray
from .generator import Collections
from .generator import CharArray
from .generator import ChoiceList
from .generator import Variables
from .generator import CustomArray
from .generator import Array2d
from .helper import OutputHelper
| 25.8 | 34 | 0.839793 | 48 | 387 | 6.770833 | 0.333333 | 0.4 | 0.584615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002967 | 0.129199 | 387 | 14 | 35 | 27.642857 | 0.961424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
6b312945f307715ce1ba5bf0445649fe3a854dbc | 44 | py | Python | iapcompython/exe_1_1.py | edmilsonlibanio/Ola-Mundo-Python | 33fb08da5878f2784983c623df04d2bbdfb30f25 | [
"MIT"
] | null | null | null | iapcompython/exe_1_1.py | edmilsonlibanio/Ola-Mundo-Python | 33fb08da5878f2784983c623df04d2bbdfb30f25 | [
"MIT"
] | null | null | null | iapcompython/exe_1_1.py | edmilsonlibanio/Ola-Mundo-Python | 33fb08da5878f2784983c623df04d2bbdfb30f25 | [
"MIT"
] | null | null | null | # Primeiro exemplo do livro.
print('Olá!')
| 11 | 28 | 0.681818 | 6 | 44 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.159091 | 44 | 3 | 29 | 14.666667 | 0.810811 | 0.590909 | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
6b36260e3cde738f02413224ccd5354c4c27a586 | 1,091 | py | Python | flask-app/app/app/core/errors.py | mcelisr1/flask-docker-backend-stack | 07c640401c42db843ba3e77bba460224591506ab | [
"MIT"
] | 2 | 2019-04-30T23:48:36.000Z | 2019-07-17T15:26:57.000Z | flask-app/app/app/core/errors.py | mcelisr1/flask-docker-backend-stack | 07c640401c42db843ba3e77bba460224591506ab | [
"MIT"
] | null | null | null | flask-app/app/app/core/errors.py | mcelisr1/flask-docker-backend-stack | 07c640401c42db843ba3e77bba460224591506ab | [
"MIT"
] | null | null | null | from flask import jsonify
from ..main import app
# 400 Bad Request
@app.errorhandler(400)
def custom400(error):
return jsonify({"msg": error.description}), 400
# 401 Unauthorized
@app.errorhandler(401)
def custom401(error):
return jsonify({"msg": error.description}), 401
# 403 Forbidden
@app.errorhandler(403)
def custom403(error):
return jsonify({"msg": error.description}), 403
# 404 Not Found
@app.errorhandler(404)
def custom404(error):
return jsonify({"msg": error.description}), 404
# 405 Method Not Allowed
@app.errorhandler(405)
def custom405(error):
return jsonify({"msg": error.description}), 405
# 406 Not Acceptable
@app.errorhandler(406)
def custom406(error):
return jsonify({"msg": error.description}), 406
# 422 Unprocessable Entity, for flask-apispec, webargs
@app.errorhandler(422)
def custom422(error):
return jsonify(
{"msg": error.description, "errors": error.exc.messages}
), 422
# 500 Internal Server Error
@app.errorhandler(500)
def custom500(error):
return jsonify({"msg": error.description}), 500
| 20.584906 | 68 | 0.708524 | 136 | 1,091 | 5.683824 | 0.345588 | 0.155239 | 0.186287 | 0.217335 | 0.382924 | 0.382924 | 0 | 0 | 0 | 0 | 0 | 0.104348 | 0.156737 | 1,091 | 52 | 69 | 20.980769 | 0.73587 | 0.165903 | 0 | 0 | 0 | 0 | 0.033296 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.071429 | 0.285714 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
8657d3d9ab98269db1532e0289535d92a4176e22 | 8,447 | py | Python | torchensemble/tests/test_all_models.py | Vivdaddy/Ensemble-Pytorch | 6dcd196396bde882ba3390242a20ad4d1eede41d | [
"BSD-3-Clause"
] | 217 | 2021-05-19T12:59:39.000Z | 2022-03-28T07:18:06.000Z | torchensemble/tests/test_all_models.py | Vivdaddy/Ensemble-Pytorch | 6dcd196396bde882ba3390242a20ad4d1eede41d | [
"BSD-3-Clause"
] | 21 | 2020-12-27T20:24:27.000Z | 2021-04-19T14:52:10.000Z | torchensemble/tests/test_all_models.py | Vivdaddy/Ensemble-Pytorch | 6dcd196396bde882ba3390242a20ad4d1eede41d | [
"BSD-3-Clause"
] | 27 | 2021-05-19T10:18:14.000Z | 2022-03-28T19:27:41.000Z | import torch
import pytest
import numpy as np
import torch.nn as nn
from numpy.testing import assert_array_equal
from torch.utils.data import TensorDataset, DataLoader
import torchensemble
from torchensemble.utils import io
from torchensemble.utils.logging import set_logger
# All classifiers
all_clf = [
    torchensemble.FusionClassifier,
    torchensemble.VotingClassifier,
    torchensemble.BaggingClassifier,
    torchensemble.GradientBoostingClassifier,
    torchensemble.SnapshotEnsembleClassifier,
    torchensemble.AdversarialTrainingClassifier,
    torchensemble.FastGeometricClassifier,
    torchensemble.SoftGradientBoostingClassifier,
]

# All regressors
all_reg = [
    torchensemble.FusionRegressor,
    torchensemble.VotingRegressor,
    torchensemble.BaggingRegressor,
    torchensemble.GradientBoostingRegressor,
    torchensemble.SnapshotEnsembleRegressor,
    torchensemble.AdversarialTrainingRegressor,
    torchensemble.FastGeometricRegressor,
    torchensemble.SoftGradientBoostingRegressor,
]
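`pytest.mark.parametrize` is used below to run each test once per class in these lists. The same fan-out can be sketched without pytest as a plain loop over class objects (the two stand-in classes here are hypothetical, not torchensemble classes):

```python
# Hypothetical stand-ins for the ensemble classes collected above; the
# real tests substitute the torchensemble classes via parametrization.
class VotingLike:
    def fit(self, data):
        self.fitted = True

class BaggingLike:
    def fit(self, data):
        self.fitted = True

# One "test run" per class, mirroring pytest's parametrized fan-out.
results = []
for cls in [VotingLike, BaggingLike]:
    model = cls()
    model.fit([1, 2, 3])
    results.append((cls.__name__, model.fitted))
```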
np.random.seed(0)
torch.manual_seed(0)
set_logger("pytest_all_models")
# Base estimator
class MLP_clf(nn.Module):
    def __init__(self):
        super(MLP_clf, self).__init__()
        self.linear1 = nn.Linear(2, 2)
        self.linear2 = nn.Linear(2, 2)

    def forward(self, X):
        X = X.view(X.size()[0], -1)
        output = self.linear1(X)
        output = self.linear2(output)
        return output


class MLP_reg(nn.Module):
    def __init__(self):
        super(MLP_reg, self).__init__()
        self.linear1 = nn.Linear(2, 2)
        self.linear2 = nn.Linear(2, 1)

    def forward(self, X):
        X = X.view(X.size()[0], -1)
        output = self.linear1(X)
        output = self.linear2(output)
        return output
# Training data
X_train = torch.Tensor(
    np.array(([0.1, 0.1], [0.2, 0.2], [0.3, 0.3], [0.4, 0.4]))
)

y_train_clf = torch.LongTensor(np.array(([0, 0, 1, 1])))
y_train_reg = torch.FloatTensor(np.array(([0.1, 0.2, 0.3, 0.4])))
y_train_reg = y_train_reg.view(-1, 1)

# Testing data
numpy_X_test = np.array(([0.5, 0.5], [0.6, 0.6]))
X_test = torch.Tensor(numpy_X_test)

y_test_clf = torch.LongTensor(np.array(([1, 0])))
y_test_reg = torch.FloatTensor(np.array(([0.5, 0.6])))
y_test_reg = y_test_reg.view(-1, 1)
@pytest.mark.parametrize("clf", all_clf)
def test_clf_class(clf):
    """
    This unit test checks the training and evaluating stage of all
    classifiers, with the base estimator passed in as a class.
    """
    epochs = 1
    n_estimators = 2

    model = clf(estimator=MLP_clf, n_estimators=n_estimators, cuda=False)

    # Optimizer
    model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)

    # Scheduler (Snapshot Ensemble excluded)
    if not isinstance(model, torchensemble.SnapshotEnsembleClassifier):
        model.set_scheduler("MultiStepLR", milestones=[2, 4])

    # Prepare data
    train = TensorDataset(X_train, y_train_clf)
    train_loader = DataLoader(train, batch_size=2, shuffle=False)
    test = TensorDataset(X_test, y_test_clf)
    test_loader = DataLoader(test, batch_size=2, shuffle=False)

    # Snapshot Ensemble needs more epochs
    if isinstance(model, torchensemble.SnapshotEnsembleClassifier):
        epochs = 6

    # Train
    model.fit(train_loader, epochs=epochs, test_loader=test_loader)

    # Evaluate
    model.evaluate(test_loader)

    # Predict
    for _, (data, target) in enumerate(test_loader):
        model.predict(data)
        break

    # Reload
    new_model = clf(estimator=MLP_clf, n_estimators=n_estimators, cuda=False)
    io.load(new_model)

    new_model.evaluate(test_loader)

    for _, (data, target) in enumerate(test_loader):
        new_model.predict(data)
        break
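The reload step above (fresh model plus `io.load`) is a save/restore roundtrip. A framework-free sketch of the same pattern, using `pickle` in place of torchensemble's checkpoint I/O (class and field names are hypothetical):

```python
import io
import pickle

class TinyEnsemble:
    """Hypothetical stand-in for a model whose state is a weight list."""
    def __init__(self):
        self.weights = []

# "Train" a model, then serialize its state to an in-memory buffer.
model = TinyEnsemble()
model.weights = [0.1, 0.2]
buffer = io.BytesIO()
pickle.dump(model.weights, buffer)

# Rebuild a fresh instance and restore the saved state into it,
# mirroring new_model = clf(...); io.load(new_model) above.
new_model = TinyEnsemble()
buffer.seek(0)
new_model.weights = pickle.load(buffer)
```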
@pytest.mark.parametrize("clf", all_clf)
def test_clf_object(clf):
    """
    This unit test checks the training and evaluating stage of all
    classifiers, with the base estimator passed in as an instance.
    """
    epochs = 1
    n_estimators = 2

    model = clf(estimator=MLP_clf(), n_estimators=n_estimators, cuda=False)

    # Optimizer
    model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)

    # Scheduler (Snapshot Ensemble excluded)
    if not isinstance(model, torchensemble.SnapshotEnsembleClassifier):
        model.set_scheduler("MultiStepLR", milestones=[2, 4])

    # Prepare data
    train = TensorDataset(X_train, y_train_clf)
    train_loader = DataLoader(train, batch_size=2, shuffle=False)
    test = TensorDataset(X_test, y_test_clf)
    test_loader = DataLoader(test, batch_size=2, shuffle=False)

    # Snapshot Ensemble needs more epochs
    if isinstance(model, torchensemble.SnapshotEnsembleClassifier):
        epochs = 6

    # Train
    model.fit(train_loader, epochs=epochs, test_loader=test_loader)

    # Evaluate
    model.evaluate(test_loader)

    # Predict
    for _, (data, target) in enumerate(test_loader):
        model.predict(data)
        break

    # Reload
    new_model = clf(estimator=MLP_clf(), n_estimators=n_estimators, cuda=False)
    io.load(new_model)

    new_model.evaluate(test_loader)

    for _, (data, target) in enumerate(test_loader):
        new_model.predict(data)
        break
@pytest.mark.parametrize("reg", all_reg)
def test_reg_class(reg):
    """
    This unit test checks the training and evaluating stage of all
    regressors, with the base estimator passed in as a class.
    """
    epochs = 1
    n_estimators = 2

    model = reg(estimator=MLP_reg, n_estimators=n_estimators, cuda=False)

    # Optimizer
    model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)

    # Scheduler (Snapshot Ensemble excluded)
    if not isinstance(model, torchensemble.SnapshotEnsembleRegressor):
        model.set_scheduler("MultiStepLR", milestones=[2, 4])

    # Prepare data
    train = TensorDataset(X_train, y_train_reg)
    train_loader = DataLoader(train, batch_size=2, shuffle=False)
    test = TensorDataset(X_test, y_test_reg)
    test_loader = DataLoader(test, batch_size=2, shuffle=False)

    # Snapshot Ensemble needs more epochs
    if isinstance(model, torchensemble.SnapshotEnsembleRegressor):
        epochs = 6

    # Train
    model.fit(train_loader, epochs=epochs, test_loader=test_loader)

    # Evaluate
    model.evaluate(test_loader)

    # Predict
    for _, (data, target) in enumerate(test_loader):
        model.predict(data)
        break

    # Reload
    new_model = reg(estimator=MLP_reg, n_estimators=n_estimators, cuda=False)
    io.load(new_model)

    new_model.evaluate(test_loader)

    for _, (data, target) in enumerate(test_loader):
        new_model.predict(data)
        break
@pytest.mark.parametrize("reg", all_reg)
def test_reg_object(reg):
    """
    This unit test checks the training and evaluating stage of all
    regressors, with the base estimator passed in as an instance.
    """
    epochs = 1
    n_estimators = 2

    model = reg(estimator=MLP_reg(), n_estimators=n_estimators, cuda=False)

    # Optimizer
    model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)

    # Scheduler (Snapshot Ensemble excluded)
    if not isinstance(model, torchensemble.SnapshotEnsembleRegressor):
        model.set_scheduler("MultiStepLR", milestones=[2, 4])

    # Prepare data
    train = TensorDataset(X_train, y_train_reg)
    train_loader = DataLoader(train, batch_size=2, shuffle=False)
    test = TensorDataset(X_test, y_test_reg)
    test_loader = DataLoader(test, batch_size=2, shuffle=False)

    # Snapshot Ensemble needs more epochs
    if isinstance(model, torchensemble.SnapshotEnsembleRegressor):
        epochs = 6

    # Train
    model.fit(train_loader, epochs=epochs, test_loader=test_loader)

    # Evaluate
    model.evaluate(test_loader)

    # Predict
    for _, (data, target) in enumerate(test_loader):
        model.predict(data)
        break

    # Reload
    new_model = reg(estimator=MLP_reg(), n_estimators=n_estimators, cuda=False)
    io.load(new_model)

    new_model.evaluate(test_loader)

    for _, (data, target) in enumerate(test_loader):
        new_model.predict(data)
        break
def test_predict():
    fusion = all_clf[0]  # FusionClassifier
    model = fusion(estimator=MLP_clf, n_estimators=2, cuda=False)
    model.set_optimizer("Adam", lr=1e-3, weight_decay=5e-4)

    train = TensorDataset(X_train, y_train_clf)
    train_loader = DataLoader(train, batch_size=2, shuffle=False)
    model.fit(train_loader, epochs=1)

    assert_array_equal(model.predict(X_test), model.predict(numpy_X_test))

    with pytest.raises(ValueError) as excinfo:
        model.predict([X_test])  # list
    assert "The type of input X should be one of" in str(excinfo.value)
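`test_predict` asserts that passing a plain Python list raises a `ValueError` with an informative message. A minimal sketch of such an input-type guard (function name and accepted types here are hypothetical, not torchensemble's actual implementation):

```python
def check_input(X):
    # Hypothetical guard mirroring the check exercised by test_predict:
    # plain Python lists are rejected with an informative ValueError.
    if isinstance(X, list):
        raise ValueError(
            'The type of input X should be one of ndarray or Tensor.'
        )
    return X

# Passing a list trips the guard; the message can then be inspected.
try:
    check_input([1, 2])
    message = ''
except ValueError as exc:
    message = str(exc)
```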
| 28.063123 | 79 | 0.699183 | 1,090 | 8,447 | 5.222936 | 0.129358 | 0.049183 | 0.015809 | 0.026875 | 0.765501 | 0.73968 | 0.730195 | 0.72071 | 0.72071 | 0.710346 | 0 | 0.017634 | 0.194389 | 8,447 | 300 | 80 | 28.156667 | 0.818957 | 0.107612 | 0 | 0.603448 | 0 | 0 | 0.017341 | 0 | 0 | 0 | 0 | 0 | 0.017241 | 1 | 0.051724 | false | 0 | 0.051724 | 0 | 0.126437 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
8668c5e45a9022f97a47c9fbb8e2425737a15377 | 77 | py | Python | notifications/__init__.py | rajeevratan84/Human-Detection-using-CCTV | 68a5d04c22f60ac9ff77a6fea32edfba6eb3dea8 | [
"Apache-2.0"
] | null | null | null | notifications/__init__.py | rajeevratan84/Human-Detection-using-CCTV | 68a5d04c22f60ac9ff77a6fea32edfba6eb3dea8 | [
"Apache-2.0"
] | null | null | null | notifications/__init__.py | rajeevratan84/Human-Detection-using-CCTV | 68a5d04c22f60ac9ff77a6fea32edfba6eb3dea8 | [
"Apache-2.0"
] | null | null | null | from .twilionotifier import TwilioNotifier
from .msg import MessageGenerator | 38.5 | 43 | 0.87013 | 8 | 77 | 8.375 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103896 | 77 | 2 | 44 | 38.5 | 0.971014 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
868305cb5cc0b74aaf384e2869e895053b761fcf | 32,302 | py | Python | test/transaction/base_test.py | plenarius/iota.lib.py | ac6167dadb8b60a64b33eeb9db755be32c7cef12 | [
"MIT"
] | 2 | 2018-02-21T12:04:41.000Z | 2018-04-01T18:56:18.000Z | test/transaction/base_test.py | plenarius/iota.lib.py | ac6167dadb8b60a64b33eeb9db755be32c7cef12 | [
"MIT"
] | null | null | null | test/transaction/base_test.py | plenarius/iota.lib.py | ac6167dadb8b60a64b33eeb9db755be32c7cef12 | [
"MIT"
] | 3 | 2018-02-19T09:35:44.000Z | 2018-04-01T19:16:26.000Z | # coding=utf-8
from __future__ import absolute_import, division, print_function, \
unicode_literals
from unittest import TestCase
from iota import Address, Bundle, BundleHash, Fragment, Hash, \
Tag, Transaction, TransactionHash, TransactionTrytes, TryteString, Nonce
class BundleTestCase(TestCase):
def setUp(self):
super(BundleTestCase, self).setUp()
# noinspection SpellCheckingInspection
self.bundle = Bundle([
# This transaction does not have a message.
Transaction(
signature_message_fragment = Fragment(b''),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999A9PG9A'
b'XCQANAWGJBTFWEAEQCN9WBZB9BJAIIY9UDLIGFOAA'
),
current_index = 0,
last_index = 7,
value = 0,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
# This transaction has something that can't be decoded as a UTF-8
# sequence.
Transaction(
signature_message_fragment =
Fragment(b'OHCFVELH9GYEMHCF9GPHBGIEWHZFU'),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999HAA9UA'
b'MHCGKEUGYFUBIARAXBFASGLCHCBEVGTBDCSAEBTBM'
),
current_index = 1,
last_index = 7,
value = 10,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
# This transaction has a message that fits into a single
# fragment.
Transaction(
signature_message_fragment =
Fragment.from_string('Hello, world!'),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999D99HEA'
b'M9XADCPFJDFANCIHR9OBDHTAGGE9TGCI9EO9ZCRBN'
),
current_index = 2,
last_index = 7,
value = 20,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
# This transaction has a message that spans multiple fragments.
Transaction(
signature_message_fragment =
Fragment(
b'J9GAQBCDCDSCEAADCDFDBDXCBDVCQAGAEAGDPCXCSCEANBTCTCDDEACCWCCDIDVC'
b'WCHDEAPCHDEA9DPCGDHDSAJ9GAOBFDSASASAEAQBCDCDSCEAADCDFDBDXCBDVCQA'
b'EAYBEANBTCTCDDEACCWCCDIDVCWCHDQAGAEAGDPCXCSCEAVBCDCDBDEDIDPCKD9D'
b'EABDTCFDJDCDIDGD9DMDSAJ9EAEAGANBCDEAMDCDIDEAWCPCJDTCSASASAEATCFD'
b'QAEAHDWCPCHDEAXCGDSASASAGAJ9GASASASAEAPCBDEAPCBDGDKDTCFDEAUCCDFD'
b'EAMDCDIDIBGAEAXCBDHDTCFDFDIDDDHDTCSCEANBTCTCDDEACCWCCDIDVCWCHDEA'
b'ADPCYCTCGDHDXCRCPC9D9DMDSAEAGAHCTCGDSAEASBEAWCPCJDTCSAGAJ9CCWCTC'
b'EAHDKDCDEAADTCBDEAGDWCXCJDTCFDTCSCEAKDXCHDWCEATCLDDDTCRCHDPCBDRC'
b'MDSAEACCWCTCXCFDEAKDPCXCHDXCBDVCEAWCPCSCEABDCDHDEAQCTCTCBDEAXCBD'
b'EAJDPCXCBDSAJ9GACCWCTCFDTCEAFDTCPC9D9DMDEAXCGDEACDBDTCIBGAEAQCFD'
b'TCPCHDWCTCSCEAZBWCCDIDRCWCVCSAJ9GACCWCTCFDTCEAFDTCPC9D9DMDEAXCGD'
b'EACDBDTCQAGAEARCCDBDUCXCFDADTCSCEANBTCTCDDEACCWCCDIDVCWCHDSAJ9GA'
b'CCCDEAOBJDTCFDMDHDWCXCBDVCIBEACCCDEAHDWCTCEAVCFDTCPCHDEA9CIDTCGD'
b'HDXCCDBDEACDUCEAVBXCUCTCQAEAHDWCTCEADCBDXCJDTCFDGDTCEAPCBDSCEAOB'
b'JDTCFDMDHDWCXCBDVCIBGAJ9GAHCTCGDSAGAJ9LBCDHDWCEACDUCEAHDWCTCEAAD'
b'TCBDEAWCPCSCEAQCTCTCBDEAHDFDPCXCBDTCSCEAUCCDFDEAHDWCXCGDEAADCDAD'
b'TCBDHDEBEAHDWCTCXCFDEA9DXCJDTCGDEAWCPCSCEAQCTCTCBDEAPCJ9EAEADDFD'
b'TCDDPCFDPCHDXCCDBDEAUCCDFDEAXCHDEBEAHDWCTCMDEAWCPCSCEAQCTCTCBDEA'
b'GDTC9DTCRCHDTCSCEAPCHDEAQCXCFDHDWCEAPCGDEAHDWCCDGDTCEAKDWCCDEAKD'
b'CDID9DSCJ9EAEAKDXCHDBDTCGDGDEAHDWCTCEAPCBDGDKDTCFDEBEAQCIDHDEATC'
b'JDTCBDEAGDCDEAHDWCTCMDEAUCCDIDBDSCEAHDWCTCADGDTC9DJDTCGDEAVCPCGD'
b'DDXCBDVCEAPCBDSCEAGDEDIDXCFDADXCBDVCJ9EAEA9DXCZCTCEATCLDRCXCHDTC'
b'SCEARCWCXC9DSCFDTCBDSAJ9GAKBBDSCEAMDCDIDLAFDTCEAFDTCPCSCMDEAHDCD'
b'EAVCXCJDTCEAXCHDEAHDCDEAIDGDIBGAEAIDFDVCTCSCEAVBCDCDBDEDIDPCKD9D'
b'SAJ9GASBEAPCADSAGAJ9GAXBCDKDIBGAJ9GAXBCDKDQAGAEAGDPCXCSCEANBTCTC'
b'DDEACCWCCDIDVCWCHDSAJ9CCWCTCMDEAQCCDHDWCEA9DXCRCZCTCSCEAHDWCTCXC'
b'FDEASCFDMDEA9DXCDDGDSAJ9GACCWCCDIDVCWCEASBEASCCDBDLAHDEAHDWCXCBD'
b'ZCQAGAEAPCSCSCTCSCEANBTCTCDDEACCWCCDIDVCWCHDQAEAGAHDWCPCHDEAMDCD'
b'IDLAFDTCEAVCCDXCBDVCEAHDCDEA9DXCZCTCEAXCHDSAGAJ9GANBCDTCGDBDLAHD'
b'EAADPCHDHDTCFDQAGAEAGDPCXCSCEAZBWCCDIDRCWCVCSAEAGAFCTCEAADIDGDHD'
b'EAZCBDCDKDEAXCHDFAEAXBCDKDFAGAJ9GAXBCDKDIBGAEATCBDEDIDXCFDTCSCEA'
b'NBTCTCDDEACCWCCDIDVCWCHDSAJ9GAHCTCGDFAEAXBCDKDFAGAJ9GAKB9D9DEAFD'
b'XCVCWCHDQAGAEAGDPCXCSCEAHDWCTCEARCCDADDDIDHDTCFDEAPCBDSCEAGDTCHD'
b'HD9DTCSCEAXCBDHDCDEAGDXC9DTCBDRCTCEAPCVCPCXCBDSAJ9EAEACCWCTCEAHD'
b'KDCDEAADTCB'
),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999A9PG9A'
b'XCQANAWGJBTFWEAEQCN9WBZB9BJAIIY9UDLIGFOAA'
),
current_index = 3,
last_index = 7,
value = 30,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
Transaction(
signature_message_fragment =
Fragment(
b'DEAUCXCSCVCTCHDTCSCSAEACCWCTCEAHDTCBDGDXCCDBDEAKDPCGDEAIDBDQCTCP'
b'CFDPCQC9DTCSAJ9GAHCCDIDLAFDTCEAFDTCPC9D9DMDEABDCDHDEAVCCDXCBDVCE'
b'AHDCDEA9DXCZCTCEAXCHDQAGAEACDQCGDTCFDJDTCSCEANBTCTCDDEACCWCCDIDV'
b'CWCHDSAJ9GACCTC9D9DEAIDGDFAGAJ9GAKB9D9DEAFDXCVCWCHDQAGAEAGDPCXCS'
b'CEANBTCTCDDEACCWCCDIDVCWCHDSAEAGACCWCTCEAKBBDGDKDTCFDEAHDCDEAHDW'
b'CTCEAQBFDTCPCHDEA9CIDTCGDHDXCCDBDSASASAGAJ9GAHCTCGDIBGAJ9GAYBUCE'
b'AVBXCUCTCQAEAHDWCTCEADCBDXCJDTCFDGDTCEAPCBDSCEAOBJDTCFDMDHDWCXCB'
b'DVCSASASAGAEAGDPCXCSCEANBTCTCDDEACCWCCDIDVCWCHDSAJ9GAHCTCGDIBIBG'
b'AJ9GASBGDSASASAGAJ9GAHCTCGDIBFAGAJ9GAPBCDFDHDMDRAHDKDCDQAGAEAGDP'
b'CXCSCEANBTCTCDDEACCWCCDIDVCWCHDQAEAKDXCHDWCEAXCBDUCXCBDXCHDTCEAA'
b'DPCYCTCGDHDMDEAPCBDSCEARCPC9DADSAJ9EAEAEAEAEAEAEAEA'
),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999A9PG9A'
b'XCQANAWGJBTFWEAEQCN9WBZB9BJAIIY9UDLIGFOAA'
),
current_index = 4,
last_index = 7,
value = 0,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
# Input, Part 1 of 2
Transaction(
# Make the signature look like a message, so we can verify that
# the Bundle skips it correctly.
signature_message_fragment =
Fragment.from_string('This is a signature, not a message!'),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999WGSBUA'
b'HDVHYHOBHGP9VCGIZHNCAAQFJGE9YHEHEFTDAGXHY'
),
current_index = 5,
last_index = 7,
value = -100,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
# Input, Part 2 of 2
Transaction(
# Make the signature look like a message, so we can verify that
# the Bundle skips it correctly.
signature_message_fragment =
Fragment.from_string('This is a signature, not a message!'),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999WGSBUA'
b'HDVHYHOBHGP9VCGIZHNCAAQFJGE9YHEHEFTDAGXHY'
),
current_index = 6,
last_index = 7,
value = 0,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
# Change
Transaction(
# It's unusual for a change transaction to have a message, but
# half the fun of unit tests is designing unusual scenarios!
signature_message_fragment =
Fragment.from_string('I can haz change?'),
address =
Address(
b'TESTVALUE9DONTUSEINPRODUCTION99999FFYALH'
b'N9ACYCP99GZBSDK9CECFI9RAIH9BRCCAHAIAWEFAN'
),
current_index = 7,
last_index = 7,
value = 40,
# These values are not relevant to the tests.
branch_transaction_hash = TransactionHash(b''),
bundle_hash = BundleHash(b''),
hash_ = TransactionHash(b''),
nonce = Nonce(b''),
timestamp = 1485020456,
trunk_transaction_hash = TransactionHash(b''),
tag = Tag(b''),
attachment_timestamp = 1485020456,
attachment_timestamp_upper_bound = 1485020456,
attachment_timestamp_lower_bound = 1485020456,
),
])
    def test_get_messages_errors_drop(self):
        """
        Decoding messages from a bundle, with ``errors='drop'``.
        """
        messages = self.bundle.get_messages('drop')

        self.assertEqual(len(messages), 3)

        self.assertEqual(messages[0], 'Hello, world!')

        # noinspection SpellCheckingInspection
        self.assertEqual(
            messages[1],

            '''
"Good morning," said Deep Thought at last.
"Er... Good morning, O Deep Thought," said Loonquawl nervously.
"Do you have... er, that is..."
"... an answer for you?" interrupted Deep Thought majestically. "Yes. I have."
The two men shivered with expectancy. Their waiting had not been in vain.
"There really is one?" breathed Phouchg.
"There really is one," confirmed Deep Thought.
"To Everything? To the great Question of Life, the Universe and Everything?"
"Yes."
Both of the men had been trained for this moment; their lives had been a
preparation for it; they had been selected at birth as those who would
witness the answer; but even so they found themselves gasping and squirming
like excited children.
"And you're ready to give it to us?" urged Loonquawl.
"I am."
"Now?"
"Now," said Deep Thought.
They both licked their dry lips.
"Though I don't think," added Deep Thought, "that you're going to like it."
"Doesn't matter," said Phouchg. "We must know it! Now!"
"Now?" enquired Deep Thought.
"Yes! Now!"
"All right," said the computer and settled into silence again.
The two men fidgeted. The tension was unbearable.
"You're really not going to like it," observed Deep Thought.
"Tell us!"
"All right," said Deep Thought. "The Answer to the Great Question..."
"Yes?"
"Of Life, the Universe and Everything..." said Deep Thought.
"Yes??"
"Is..."
"Yes?!"
"Forty-two," said Deep Thought, with infinite majesty and calm.
''',
        )

        self.assertEqual(messages[2], 'I can haz change?')

    def test_get_messages_errors_strict(self):
        """
        Decoding messages from a bundle, with ``errors='strict'``.
        """
        with self.assertRaises(UnicodeDecodeError):
            self.bundle.get_messages('strict')

    def test_get_messages_errors_ignore(self):
        """
        Decoding messages from a bundle, with ``errors='ignore'``.
        """
        messages = self.bundle.get_messages('ignore')

        self.assertEqual(len(messages), 4)

        # The only message that is treated differently is the invalid one.
        self.assertEqual(messages[0], '祝你好运\x15')

    def test_get_messages_errors_replace(self):
        """
        Decoding messages from a bundle, with ``errors='replace'``.
        """
        messages = self.bundle.get_messages('replace')

        self.assertEqual(len(messages), 4)

        # The only message that is treated differently is the invalid one.
        self.assertEqual(messages[0], '祝你好运�\x15')
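The four tests above exercise the standard `bytes.decode` error-handling modes ('strict', 'ignore', 'replace', plus the library's own 'drop'). The underlying decoder behavior can be seen directly on a byte string that mixes valid UTF-8 with one invalid byte, similar in spirit to the fragment built in `setUp`:

```python
# Valid UTF-8 text followed by an invalid byte (0xFF) and a control
# character (0x15) that survives decoding.
raw = '祝你好运'.encode('utf-8') + b'\xff\x15'

ignored = raw.decode('utf-8', errors='ignore')    # invalid byte dropped
replaced = raw.decode('utf-8', errors='replace')  # invalid byte -> U+FFFD

# 'strict' (the default) refuses to decode the invalid byte at all.
try:
    raw.decode('utf-8', errors='strict')
    raised = False
except UnicodeDecodeError:
    raised = True
```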
class TransactionTestCase(TestCase):
    """
    If you need to generate values for these tests using the JS lib, you
    can leverage the following functions:

    - ``lib/utils/utils.js:transactionObject``: Convert a sequence of
      trytes into an object that you can manipulate easily.

    - ``lib/utils/utils.js:transactionTrytes``: Convert an object back
      into a tryte sequence.
    """
# noinspection SpellCheckingInspection
def test_from_tryte_string(self):
"""
Initializing a Transaction object from a TryteString.
References:
- http://iotasupport.com/news/index.php/2016/12/02/fixing-the-latest-solid-subtangle-milestone-issue/
"""
trytes =\
TransactionTrytes(
b'GYPRVHBEZOOFXSHQBLCYW9ICTCISLHDBNMMVYD9JJHQMPQCTIQAQTJNNNJ9IDXLRCC'
b'OYOXYPCLR9PBEY9ORZIEPPDNTI9CQWYZUOTAVBXPSBOFEQAPFLWXSWUIUSJMSJIIIZ'
b'WIKIRH9GCOEVZFKNXEVCUCIIWZQCQEUVRZOCMEL9AMGXJNMLJCIA9UWGRPPHCEOPTS'
b'VPKPPPCMQXYBHMSODTWUOABPKWFFFQJHCBVYXLHEWPD9YUDFTGNCYAKQKVEZYRBQRB'
b'XIAUX9SVEDUKGMTWQIYXRGSWYRK9SRONVGTW9YGHSZRIXWGPCCUCDRMAXBPDFVHSRY'
b'WHGB9DQSQFQKSNICGPIPTRZINYRXQAFSWSEWIFRMSBMGTNYPRWFSOIIWWT9IDSELM9'
b'JUOOWFNCCSHUSMGNROBFJX9JQ9XT9PKEGQYQAWAFPRVRRVQPUQBHLSNTEFCDKBWRCD'
b'X9EYOBB9KPMTLNNQLADBDLZPRVBCKVCYQEOLARJYAGTBFR9QLPKZBOYWZQOVKCVYRG'
b'YI9ZEFIQRKYXLJBZJDBJDJVQZCGYQMROVHNDBLGNLQODPUXFNTADDVYNZJUVPGB9LV'
b'PJIYLAPBOEHPMRWUIAJXVQOEM9ROEYUOTNLXVVQEYRQWDTQGDLEYFIYNDPRAIXOZEB'
b'CS9P99AZTQQLKEILEVXMSHBIDHLXKUOMMNFKPYHONKEYDCHMUNTTNRYVMMEYHPGASP'
b'ZXASKRUPWQSHDMU9VPS99ZZ9SJJYFUJFFMFORBYDILBXCAVJDPDFHTTTIYOVGLRDYR'
b'TKHXJORJVYRPTDH9ZCPZ9ZADXZFRSFPIQKWLBRNTWJHXTOAUOL9FVGTUMMPYGYICJD'
b'XMOESEVDJWLMCVTJLPIEKBE9JTHDQWV9MRMEWFLPWGJFLUXI9BXPSVWCMUWLZSEWHB'
b'DZKXOLYNOZAPOYLQVZAQMOHGTTQEUAOVKVRRGAHNGPUEKHFVPVCOYSJAWHZU9DRROH'
b'BETBAFTATVAUGOEGCAYUXACLSSHHVYDHMDGJP9AUCLWLNTFEVGQGHQXSKEMVOVSKQE'
b'EWHWZUDTYOBGCURRZSJZLFVQQAAYQO9TRLFFN9HTDQXBSPPJYXMNGLLBHOMNVXNOWE'
b'IDMJVCLLDFHBDONQJCJVLBLCSMDOUQCKKCQJMGTSTHBXPXAMLMSXRIPUBMBAWBFNLH'
b'LUJTRJLDERLZFUBUSMF999XNHLEEXEENQJNOFFPNPQ9PQICHSATPLZVMVIWLRTKYPI'
b'XNFGYWOJSQDAXGFHKZPFLPXQEHCYEAGTIWIJEZTAVLNUMAFWGGLXMBNUQTOFCNLJTC'
b'DMWVVZGVBSEBCPFSM99FLOIDTCLUGPSEDLOKZUAEVBLWNMODGZBWOVQT9DPFOTSKRA'
b'BQAVOQ9RXWBMAKFYNDCZOJGTCIDMQSQQSODKDXTPFLNOKSIZEOY9HFUTLQRXQMEPGO'
b'XQGLLPNSXAUCYPGZMNWMQWSWCKAQYKXJTWINSGPPZG9HLDLEAWUWEVCTVRCBDFOXKU'
b'ROXH9HXXAXVPEJFRSLOGRVGYZASTEBAQNXJJROCYRTDPYFUIQJVDHAKEG9YACV9HCP'
b'JUEUKOYFNWDXCCJBIFQKYOXGRDHVTHEQUMHO999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999RKWEEVD99A99999999A99999999NFDPEEZCWVYLKZGSLCQNOFUSENI'
b'XRHWWTZFBXMPSQHEDFWZULBZFEOMNLRNIDQKDNNIELAOXOVMYEI9PGTKORV9IKTJZQ'
b'UBQAWTKBKZ9NEZHBFIMCLV9TTNJNQZUIJDFPTTCTKBJRHAITVSKUCUEMD9M9SQJ999'
b'999TKORV9IKTJZQUBQAWTKBKZ9NEZHBFIMCLV9TTNJNQZUIJDFPTTCTKBJRHAITVSK'
b'UCUEMD9M9SQJ999999999999999999999999999999999RKWEEVD99RKWEEVD99RKW'
b'EEVD99999999999999999999999999999'
)
transaction = Transaction.from_tryte_string(trytes)
self.assertIsInstance(transaction, Transaction)
self.assertEqual(
transaction.hash,
Hash(
b'JBVVEWEPYNZ9KRHNUUTRENXXAVXT9MKAVPAUQ9SJ'
b'NSIHDCPQM9LJHIZGXO9PIRWUUVBOXNCBE9XJGMOZF'
),
)
self.assertEqual(
transaction.signature_message_fragment,
Fragment(
b'GYPRVHBEZOOFXSHQBLCYW9ICTCISLHDBNMMVYD9JJHQMPQCTIQAQTJNNNJ9IDXLRCC'
b'OYOXYPCLR9PBEY9ORZIEPPDNTI9CQWYZUOTAVBXPSBOFEQAPFLWXSWUIUSJMSJIIIZ'
b'WIKIRH9GCOEVZFKNXEVCUCIIWZQCQEUVRZOCMEL9AMGXJNMLJCIA9UWGRPPHCEOPTS'
b'VPKPPPCMQXYBHMSODTWUOABPKWFFFQJHCBVYXLHEWPD9YUDFTGNCYAKQKVEZYRBQRB'
b'XIAUX9SVEDUKGMTWQIYXRGSWYRK9SRONVGTW9YGHSZRIXWGPCCUCDRMAXBPDFVHSRY'
b'WHGB9DQSQFQKSNICGPIPTRZINYRXQAFSWSEWIFRMSBMGTNYPRWFSOIIWWT9IDSELM9'
b'JUOOWFNCCSHUSMGNROBFJX9JQ9XT9PKEGQYQAWAFPRVRRVQPUQBHLSNTEFCDKBWRCD'
b'X9EYOBB9KPMTLNNQLADBDLZPRVBCKVCYQEOLARJYAGTBFR9QLPKZBOYWZQOVKCVYRG'
b'YI9ZEFIQRKYXLJBZJDBJDJVQZCGYQMROVHNDBLGNLQODPUXFNTADDVYNZJUVPGB9LV'
b'PJIYLAPBOEHPMRWUIAJXVQOEM9ROEYUOTNLXVVQEYRQWDTQGDLEYFIYNDPRAIXOZEB'
b'CS9P99AZTQQLKEILEVXMSHBIDHLXKUOMMNFKPYHONKEYDCHMUNTTNRYVMMEYHPGASP'
b'ZXASKRUPWQSHDMU9VPS99ZZ9SJJYFUJFFMFORBYDILBXCAVJDPDFHTTTIYOVGLRDYR'
b'TKHXJORJVYRPTDH9ZCPZ9ZADXZFRSFPIQKWLBRNTWJHXTOAUOL9FVGTUMMPYGYICJD'
b'XMOESEVDJWLMCVTJLPIEKBE9JTHDQWV9MRMEWFLPWGJFLUXI9BXPSVWCMUWLZSEWHB'
b'DZKXOLYNOZAPOYLQVZAQMOHGTTQEUAOVKVRRGAHNGPUEKHFVPVCOYSJAWHZU9DRROH'
b'BETBAFTATVAUGOEGCAYUXACLSSHHVYDHMDGJP9AUCLWLNTFEVGQGHQXSKEMVOVSKQE'
b'EWHWZUDTYOBGCURRZSJZLFVQQAAYQO9TRLFFN9HTDQXBSPPJYXMNGLLBHOMNVXNOWE'
b'IDMJVCLLDFHBDONQJCJVLBLCSMDOUQCKKCQJMGTSTHBXPXAMLMSXRIPUBMBAWBFNLH'
b'LUJTRJLDERLZFUBUSMF999XNHLEEXEENQJNOFFPNPQ9PQICHSATPLZVMVIWLRTKYPI'
b'XNFGYWOJSQDAXGFHKZPFLPXQEHCYEAGTIWIJEZTAVLNUMAFWGGLXMBNUQTOFCNLJTC'
b'DMWVVZGVBSEBCPFSM99FLOIDTCLUGPSEDLOKZUAEVBLWNMODGZBWOVQT9DPFOTSKRA'
b'BQAVOQ9RXWBMAKFYNDCZOJGTCIDMQSQQSODKDXTPFLNOKSIZEOY9HFUTLQRXQMEPGO'
b'XQGLLPNSXAUCYPGZMNWMQWSWCKAQYKXJTWINSGPPZG9HLDLEAWUWEVCTVRCBDFOXKU'
b'ROXH9HXXAXVPEJFRSLOGRVGYZASTEBAQNXJJROCYRTDPYFUIQJVDHAKEG9YACV9HCP'
b'JUEUKOYFNWDXCCJBIFQKYOXGRDHVTHEQUMHO999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999'
),
)
self.assertEqual(
transaction.address,
Address(
b'9999999999999999999999999999999999999999'
b'99999999999999999999999999999999999999999'
),
)
self.assertEqual(transaction.value, 0)
self.assertEqual(transaction.legacy_tag, Tag(b'999999999999999999999999999'))
self.assertEqual(transaction.timestamp, 1480690413)
self.assertEqual(transaction.current_index, 1)
self.assertEqual(transaction.last_index, 1)
self.assertEqual(
transaction.bundle_hash,
BundleHash(
b'NFDPEEZCWVYLKZGSLCQNOFUSENIXRHWWTZFBXMPS'
b'QHEDFWZULBZFEOMNLRNIDQKDNNIELAOXOVMYEI9PG'
),
)
self.assertEqual(
transaction.trunk_transaction_hash,
TransactionHash(
b'TKORV9IKTJZQUBQAWTKBKZ9NEZHBFIMCLV9TTNJN'
b'QZUIJDFPTTCTKBJRHAITVSKUCUEMD9M9SQJ999999'
),
)
self.assertEqual(
transaction.branch_transaction_hash,
TransactionHash(
b'TKORV9IKTJZQUBQAWTKBKZ9NEZHBFIMCLV9TTNJN'
b'QZUIJDFPTTCTKBJRHAITVSKUCUEMD9M9SQJ999999'
),
)
self.assertEqual(transaction.tag, Tag(b'999999999999999999999999999'))
self.assertEqual(transaction.attachment_timestamp, 1480690413)
self.assertEqual(transaction.attachment_timestamp_lower_bound, 1480690413)
self.assertEqual(transaction.attachment_timestamp_upper_bound, 1480690413)
self.assertEqual(
transaction.nonce,
Nonce(
b'999999999999999999999999999'
),
)
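The tryte sequences asserted above are drawn from the 27-character tryte alphabet (`9` plus `A`-`Z`), where each character encodes a balanced-ternary value in [-13, 13]. A small sketch of that mapping (the helper name is hypothetical, not part of the `iota` package):

```python
TRYTE_ALPHABET = '9ABCDEFGHIJKLMNOPQRSTUVWXYZ'

def tryte_value(char):
    # '9' is 0, 'A'..'M' are 1..13, and 'N'..'Z' wrap around to
    # -13..-1, giving the balanced-ternary range [-13, 13].
    index = TRYTE_ALPHABET.index(char)
    return index if index <= 13 else index - 27

values = [tryte_value(c) for c in '9AMZ']
```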
    def test_from_tryte_string_with_hash(self):
        """
        Initializing a Transaction object from a TryteString, with a
        pre-computed hash.
        """
        # noinspection SpellCheckingInspection
        txn_hash =\
            TransactionHash(
                b'TESTVALUE9DONTUSEINPRODUCTION99999VALCXC'
                b'DHTDZBVCAAIEZCQCXGEFYBXHNDJFZEBEVELA9HHEJ'
            )

        txn = Transaction.from_tryte_string(b'', hash_=txn_hash)

        self.assertEqual(txn.hash, txn_hash)
def test_as_tryte_string(self):
"""
Converting a Transaction into a TryteString.
"""
transaction = Transaction(
hash_ =
TransactionHash(
b'SYABNCYPLULQQBTDCUWJPVVMYNWHKEHGAZPKRBGE'
b'QKEHUIKJCHWGAUKLSYMDOUUBMXPKCPTNFWUFU9JKW'
),
signature_message_fragment =
Fragment(
b'GYPRVHBEZOOFXSHQBLCYW9ICTCISLHDBNMMVYD9JJHQMPQCTIQAQTJNNNJ9IDXLRCC'
b'OYOXYPCLR9PBEY9ORZIEPPDNTI9CQWYZUOTAVBXPSBOFEQAPFLWXSWUIUSJMSJIIIZ'
b'WIKIRH9GCOEVZFKNXEVCUCIIWZQCQEUVRZOCMEL9AMGXJNMLJCIA9UWGRPPHCEOPTS'
b'VPKPPPCMQXYBHMSODTWUOABPKWFFFQJHCBVYXLHEWPD9YUDFTGNCYAKQKVEZYRBQRB'
b'XIAUX9SVEDUKGMTWQIYXRGSWYRK9SRONVGTW9YGHSZRIXWGPCCUCDRMAXBPDFVHSRY'
b'WHGB9DQSQFQKSNICGPIPTRZINYRXQAFSWSEWIFRMSBMGTNYPRWFSOIIWWT9IDSELM9'
b'JUOOWFNCCSHUSMGNROBFJX9JQ9XT9PKEGQYQAWAFPRVRRVQPUQBHLSNTEFCDKBWRCD'
b'X9EYOBB9KPMTLNNQLADBDLZPRVBCKVCYQEOLARJYAGTBFR9QLPKZBOYWZQOVKCVYRG'
b'YI9ZEFIQRKYXLJBZJDBJDJVQZCGYQMROVHNDBLGNLQODPUXFNTADDVYNZJUVPGB9LV'
b'PJIYLAPBOEHPMRWUIAJXVQOEM9ROEYUOTNLXVVQEYRQWDTQGDLEYFIYNDPRAIXOZEB'
b'CS9P99AZTQQLKEILEVXMSHBIDHLXKUOMMNFKPYHONKEYDCHMUNTTNRYVMMEYHPGASP'
b'ZXASKRUPWQSHDMU9VPS99ZZ9SJJYFUJFFMFORBYDILBXCAVJDPDFHTTTIYOVGLRDYR'
b'TKHXJORJVYRPTDH9ZCPZ9ZADXZFRSFPIQKWLBRNTWJHXTOAUOL9FVGTUMMPYGYICJD'
b'XMOESEVDJWLMCVTJLPIEKBE9JTHDQWV9MRMEWFLPWGJFLUXI9BXPSVWCMUWLZSEWHB'
b'DZKXOLYNOZAPOYLQVZAQMOHGTTQEUAOVKVRRGAHNGPUEKHFVPVCOYSJAWHZU9DRROH'
b'BETBAFTATVAUGOEGCAYUXACLSSHHVYDHMDGJP9AUCLWLNTFEVGQGHQXSKEMVOVSKQE'
b'EWHWZUDTYOBGCURRZSJZLFVQQAAYQO9TRLFFN9HTDQXBSPPJYXMNGLLBHOMNVXNOWE'
b'IDMJVCLLDFHBDONQJCJVLBLCSMDOUQCKKCQJMGTSTHBXPXAMLMSXRIPUBMBAWBFNLH'
b'LUJTRJLDERLZFUBUSMF999XNHLEEXEENQJNOFFPNPQ9PQICHSATPLZVMVIWLRTKYPI'
b'XNFGYWOJSQDAXGFHKZPFLPXQEHCYEAGTIWIJEZTAVLNUMAFWGGLXMBNUQTOFCNLJTC'
b'DMWVVZGVBSEBCPFSM99FLOIDTCLUGPSEDLOKZUAEVBLWNMODGZBWOVQT9DPFOTSKRA'
b'BQAVOQ9RXWBMAKFYNDCZOJGTCIDMQSQQSODKDXTPFLNOKSIZEOY9HFUTLQRXQMEPGO'
b'XQGLLPNSXAUCYPGZMNWMQWSWCKAQYKXJTWINSGPPZG9HLDLEAWUWEVCTVRCBDFOXKU'
b'ROXH9HXXAXVPEJFRSLOGRVGYZASTEBAQNXJJROCYRTDPYFUIQJVDHAKEG9YACV9HCP'
b'JUEUKOYFNWDXCCJBIFQKYOXGRDHVTHEQUMHO999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999'
),
address =
Address(
b'99999999999999999999999999999999999999999'
b'9999999999999999999999999999999999999999'
),
value = 0,
timestamp = 1480690413,
current_index = 1,
last_index = 1,
bundle_hash =
BundleHash(
b'NFDPEEZCWVYLKZGSLCQNOFUSENIXRHWWTZFBXMPS'
b'QHEDFWZULBZFEOMNLRNIDQKDNNIELAOXOVMYEI9PG'
),
trunk_transaction_hash =
TransactionHash(
b'TKORV9IKTJZQUBQAWTKBKZ9NEZHBFIMCLV9TTNJN'
b'QZUIJDFPTTCTKBJRHAITVSKUCUEMD9M9SQJ999999'
),
branch_transaction_hash =
TransactionHash(
b'TKORV9IKTJZQUBQAWTKBKZ9NEZHBFIMCLV9TTNJN'
b'QZUIJDFPTTCTKBJRHAITVSKUCUEMD9M9SQJ999999'
),
tag = Tag(b'999999999999999999999999999'),
attachment_timestamp = 1480690413,
attachment_timestamp_lower_bound = 1480690413,
attachment_timestamp_upper_bound = 1480690413,
nonce =
Nonce(
b'999999999999999999999999999'
),
)
self.assertEqual(
transaction.as_tryte_string(),
TransactionTrytes(
b'GYPRVHBEZOOFXSHQBLCYW9ICTCISLHDBNMMVYD9JJHQMPQCTIQAQTJNNNJ9IDXLRCC'
b'OYOXYPCLR9PBEY9ORZIEPPDNTI9CQWYZUOTAVBXPSBOFEQAPFLWXSWUIUSJMSJIIIZ'
b'WIKIRH9GCOEVZFKNXEVCUCIIWZQCQEUVRZOCMEL9AMGXJNMLJCIA9UWGRPPHCEOPTS'
b'VPKPPPCMQXYBHMSODTWUOABPKWFFFQJHCBVYXLHEWPD9YUDFTGNCYAKQKVEZYRBQRB'
b'XIAUX9SVEDUKGMTWQIYXRGSWYRK9SRONVGTW9YGHSZRIXWGPCCUCDRMAXBPDFVHSRY'
b'WHGB9DQSQFQKSNICGPIPTRZINYRXQAFSWSEWIFRMSBMGTNYPRWFSOIIWWT9IDSELM9'
b'JUOOWFNCCSHUSMGNROBFJX9JQ9XT9PKEGQYQAWAFPRVRRVQPUQBHLSNTEFCDKBWRCD'
b'X9EYOBB9KPMTLNNQLADBDLZPRVBCKVCYQEOLARJYAGTBFR9QLPKZBOYWZQOVKCVYRG'
b'YI9ZEFIQRKYXLJBZJDBJDJVQZCGYQMROVHNDBLGNLQODPUXFNTADDVYNZJUVPGB9LV'
b'PJIYLAPBOEHPMRWUIAJXVQOEM9ROEYUOTNLXVVQEYRQWDTQGDLEYFIYNDPRAIXOZEB'
b'CS9P99AZTQQLKEILEVXMSHBIDHLXKUOMMNFKPYHONKEYDCHMUNTTNRYVMMEYHPGASP'
b'ZXASKRUPWQSHDMU9VPS99ZZ9SJJYFUJFFMFORBYDILBXCAVJDPDFHTTTIYOVGLRDYR'
b'TKHXJORJVYRPTDH9ZCPZ9ZADXZFRSFPIQKWLBRNTWJHXTOAUOL9FVGTUMMPYGYICJD'
b'XMOESEVDJWLMCVTJLPIEKBE9JTHDQWV9MRMEWFLPWGJFLUXI9BXPSVWCMUWLZSEWHB'
b'DZKXOLYNOZAPOYLQVZAQMOHGTTQEUAOVKVRRGAHNGPUEKHFVPVCOYSJAWHZU9DRROH'
b'BETBAFTATVAUGOEGCAYUXACLSSHHVYDHMDGJP9AUCLWLNTFEVGQGHQXSKEMVOVSKQE'
b'EWHWZUDTYOBGCURRZSJZLFVQQAAYQO9TRLFFN9HTDQXBSPPJYXMNGLLBHOMNVXNOWE'
b'IDMJVCLLDFHBDONQJCJVLBLCSMDOUQCKKCQJMGTSTHBXPXAMLMSXRIPUBMBAWBFNLH'
b'LUJTRJLDERLZFUBUSMF999XNHLEEXEENQJNOFFPNPQ9PQICHSATPLZVMVIWLRTKYPI'
b'XNFGYWOJSQDAXGFHKZPFLPXQEHCYEAGTIWIJEZTAVLNUMAFWGGLXMBNUQTOFCNLJTC'
b'DMWVVZGVBSEBCPFSM99FLOIDTCLUGPSEDLOKZUAEVBLWNMODGZBWOVQT9DPFOTSKRA'
b'BQAVOQ9RXWBMAKFYNDCZOJGTCIDMQSQQSODKDXTPFLNOKSIZEOY9HFUTLQRXQMEPGO'
b'XQGLLPNSXAUCYPGZMNWMQWSWCKAQYKXJTWINSGPPZG9HLDLEAWUWEVCTVRCBDFOXKU'
b'ROXH9HXXAXVPEJFRSLOGRVGYZASTEBAQNXJJROCYRTDPYFUIQJVDHAKEG9YACV9HCP'
b'JUEUKOYFNWDXCCJBIFQKYOXGRDHVTHEQUMHO999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999999999999999999999999999999999999999999999999999999999'
b'999999999999RKWEEVD99A99999999A99999999NFDPEEZCWVYLKZGSLCQNOFUSENI'
b'XRHWWTZFBXMPSQHEDFWZULBZFEOMNLRNIDQKDNNIELAOXOVMYEI9PGTKORV9IKTJZQ'
b'UBQAWTKBKZ9NEZHBFIMCLV9TTNJNQZUIJDFPTTCTKBJRHAITVSKUCUEMD9M9SQJ999'
b'999TKORV9IKTJZQUBQAWTKBKZ9NEZHBFIMCLV9TTNJNQZUIJDFPTTCTKBJRHAITVSK'
b'UCUEMD9M9SQJ999999999999999999999999999999999RKWEEVD99RKWEEVD99RKW'
b'EEVD99999999999999999999999999999'
),
)
| 45.81844 | 107 | 0.702124 | 1,846 | 32,302 | 12.166306 | 0.204225 | 0.107396 | 0.108999 | 0.190926 | 0.698428 | 0.671802 | 0.652166 | 0.637651 | 0.600294 | 0.600294 | 0.000031 | 0.161337 | 0.245898 | 32,302 | 704 | 108 | 45.883523 | 0.760622 | 0.065909 | 0 | 0.708253 | 0 | 0 | 0.506705 | 0.499737 | 0 | 0 | 0 | 0 | 0.053743 | 1 | 0.015355 | false | 0 | 0.005758 | 0 | 0.024952 | 0.001919 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
868f4b9e77b3032398452b3a267f43fce81b75b1 | 975 | py | Python | myadmin/decorators.py | rijalanupraj/halkapan | a1b5964034a4086a890f839ba4d3d2885a54235f | [
"MIT"
] | null | null | null | myadmin/decorators.py | rijalanupraj/halkapan | a1b5964034a4086a890f839ba4d3d2885a54235f | [
"MIT"
] | null | null | null | myadmin/decorators.py | rijalanupraj/halkapan | a1b5964034a4086a890f839ba4d3d2885a54235f | [
"MIT"
] | null | null | null | from functools import wraps

from django.shortcuts import redirect


def unauthenticated_user(view_function):
    """Redirect users who are already logged in away from anonymous-only pages."""
    @wraps(view_function)
    def wrapper_function(request, *args, **kwargs):
        if request.user.is_authenticated:
            # Staff go to the admin dashboard, regular users to the home page.
            if request.user.is_staff:
                return redirect('/myadmin')
            return redirect('/')
        return view_function(request, *args, **kwargs)
    return wrapper_function


def admin_only(view_function):
    """Allow staff users only; everyone else is sent back to the public feed."""
    @wraps(view_function)
    def wrapper_function(request, *args, **kwargs):
        if request.user.is_staff:
            return view_function(request, *args, **kwargs)
        return redirect('posts:feed')
    return wrapper_function


def user_only(view_function):
    """Allow regular users only; staff are sent to the admin dashboard."""
    @wraps(view_function)
    def wrapper_function(request, *args, **kwargs):
        if request.user.is_staff:
            return redirect('myadmin:admin-dashboard')
        return view_function(request, *args, **kwargs)
    return wrapper_function
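The branching in these decorators can be exercised without a running Django server by substituting stand-in objects for the request and for `redirect`. A minimal sketch — `fake_redirect`, `dashboard`, and the `SimpleNamespace` requests below are illustrative stand-ins, not part of the project:

```python
from types import SimpleNamespace


def fake_redirect(target):
    # Stand-in for django.shortcuts.redirect: just record where we would go.
    return "redirect:" + target


def admin_only(view_function, redirect=fake_redirect):
    # Same branching as the decorator above, with `redirect` injectable
    # so the sketch runs without Django installed.
    def wrapper_function(request, *args, **kwargs):
        if request.user.is_staff:
            return view_function(request, *args, **kwargs)
        return redirect('posts:feed')
    return wrapper_function


def dashboard(request):
    return 'dashboard'


staff_request = SimpleNamespace(user=SimpleNamespace(is_staff=True))
visitor_request = SimpleNamespace(user=SimpleNamespace(is_staff=False))

view = admin_only(dashboard)
print(view(staff_request))    # dashboard
print(view(visitor_request))  # redirect:posts:feed
```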
| 30.46875 | 58 | 0.641026 | 108 | 975 | 5.601852 | 0.259259 | 0.119008 | 0.18843 | 0.247934 | 0.74876 | 0.74876 | 0.647934 | 0.647934 | 0.555372 | 0.555372 | 0 | 0 | 0.262564 | 975 | 31 | 59 | 31.451613 | 0.841446 | 0 | 0 | 0.6 | 0 | 0 | 0.043077 | 0.02359 | 0 | 0 | 0 | 0 | 0 | 1 | 0.24 | false | 0 | 0.04 | 0 | 0.68 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
86905d35ede6bfefaea0b37804f6447367f88377 | 49 | py | Python | sktime/transformations/panel/rocket/tests/__init__.py | marcio55afr/sktime | 25ba2f470f037366ca6b0e529137d3d0a6191e2e | [
"BSD-3-Clause"
] | 2 | 2021-12-28T10:48:11.000Z | 2022-03-06T18:08:01.000Z | sktime/transformations/panel/rocket/tests/__init__.py | marcio55afr/sktime | 25ba2f470f037366ca6b0e529137d3d0a6191e2e | [
"BSD-3-Clause"
] | null | null | null | sktime/transformations/panel/rocket/tests/__init__.py | marcio55afr/sktime | 25ba2f470f037366ca6b0e529137d3d0a6191e2e | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""Rocket unit tests."""
| 16.333333 | 24 | 0.510204 | 6 | 49 | 4.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0.163265 | 49 | 2 | 25 | 24.5 | 0.585366 | 0.836735 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
86c59e141c79e782294f06723419a18882ddb001 | 5,552 | py | Python | discovery-provider/integration_tests/queries/test_get_aggregate_app_metrics.py | Tenderize/audius-protocol | aa15844e3f12812fe8aaa81e2cb6e5c5fa89ff51 | [
"Apache-2.0"
] | 1 | 2022-03-27T21:40:36.000Z | 2022-03-27T21:40:36.000Z | discovery-provider/integration_tests/queries/test_get_aggregate_app_metrics.py | Tenderize/audius-protocol | aa15844e3f12812fe8aaa81e2cb6e5c5fa89ff51 | [
"Apache-2.0"
] | null | null | null | discovery-provider/integration_tests/queries/test_get_aggregate_app_metrics.py | Tenderize/audius-protocol | aa15844e3f12812fe8aaa81e2cb6e5c5fa89ff51 | [
"Apache-2.0"
] | null | null | null | from datetime import date, timedelta
from src.models import AggregateDailyAppNameMetrics, AggregateMonthlyAppNameMetrics
from src.queries.get_app_name_metrics import _get_aggregate_app_metrics
from src.utils.db_session import get_db
limit = 2
today = date.today()
yesterday = today - timedelta(days=1)
def test_get_aggregate_app_metrics_week(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateDailyAppNameMetrics(
application_name="will-not-return-because-too-old",
count=3,
timestamp=today - timedelta(days=8),
),
AggregateDailyAppNameMetrics(
application_name="top-app",
count=4,
timestamp=yesterday - timedelta(days=1),
),
AggregateDailyAppNameMetrics(
application_name="will-not-return-because-outside-limit",
count=1,
timestamp=yesterday,
),
AggregateDailyAppNameMetrics(
application_name="best-app", count=5, timestamp=yesterday
),
AggregateDailyAppNameMetrics(
application_name="top-app", count=3, timestamp=yesterday
),
AggregateDailyAppNameMetrics(
application_name="will-not-return-because-too-recent",
count=3,
timestamp=today,
),
]
)
aggregate_metrics = _get_aggregate_app_metrics(session, "week", limit)
assert len(aggregate_metrics) == limit
assert aggregate_metrics[0]["name"] == "top-app"
assert aggregate_metrics[0]["count"] == 7
assert aggregate_metrics[1]["name"] == "best-app"
assert aggregate_metrics[1]["count"] == 5
def test_get_aggregate_app_metrics_month(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateDailyAppNameMetrics(
application_name="will-not-return-because-too-old",
count=20,
timestamp=today - timedelta(days=31),
),
AggregateDailyAppNameMetrics(
application_name="best-app",
count=20,
timestamp=today - timedelta(days=31),
),
AggregateDailyAppNameMetrics(
application_name="will-not-return-because-outside-limit",
count=1,
timestamp=yesterday,
),
AggregateDailyAppNameMetrics(
application_name="top-app",
count=5,
timestamp=yesterday - timedelta(days=8),
),
AggregateDailyAppNameMetrics(
application_name="best-app", count=5, timestamp=yesterday
),
AggregateDailyAppNameMetrics(
application_name="top-app", count=7, timestamp=yesterday
),
AggregateDailyAppNameMetrics(
application_name="will-not-return-because-too-recent",
count=20,
timestamp=today,
),
]
)
aggregate_metrics = _get_aggregate_app_metrics(session, "month", limit)
assert len(aggregate_metrics) == limit
assert aggregate_metrics[0]["name"] == "top-app"
assert aggregate_metrics[0]["count"] == 12
assert aggregate_metrics[1]["name"] == "best-app"
assert aggregate_metrics[1]["count"] == 5
def test_get_aggregate_app_metrics_all_time(app):
with app.app_context():
db_mock = get_db()
with db_mock.scoped_session() as session:
session.bulk_save_objects(
[
AggregateMonthlyAppNameMetrics(
application_name="awesome-app",
count=6,
timestamp=today - timedelta(days=367),
),
AggregateMonthlyAppNameMetrics(
application_name="will-not-return-because-outside-limit",
count=1,
timestamp=yesterday,
),
AggregateMonthlyAppNameMetrics(
application_name="best-app", count=5, timestamp=yesterday
),
AggregateMonthlyAppNameMetrics(
application_name="top-app", count=15, timestamp=yesterday
),
AggregateMonthlyAppNameMetrics(
application_name="awesome-app", count=7, timestamp=yesterday
),
AggregateMonthlyAppNameMetrics(
application_name="will-not-return-because-too-recent",
count=20,
timestamp=today,
),
]
)
aggregate_metrics = _get_aggregate_app_metrics(session, "all_time", limit)
assert len(aggregate_metrics) == limit
assert aggregate_metrics[0]["name"] == "top-app"
assert aggregate_metrics[0]["count"] == 15
assert aggregate_metrics[1]["name"] == "awesome-app"
assert aggregate_metrics[1]["count"] == 13
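The behaviour these tests encode — keep only rows whose timestamp falls inside the window (old rows and today's rows excluded), sum counts per application, and return the top `limit` entries — can be sketched in plain Python. This is an illustrative re-implementation, not the actual query inside `_get_aggregate_app_metrics`:

```python
from collections import Counter
from datetime import date, timedelta


def aggregate_app_metrics(rows, days, limit):
    # rows: (application_name, count, timestamp) tuples.
    # Window is [today - days, today): older rows and today's rows are excluded.
    today = date.today()
    start = today - timedelta(days=days)
    totals = Counter()
    for name, count, ts in rows:
        if start <= ts < today:
            totals[name] += count
    return [{"name": n, "count": c} for n, c in totals.most_common(limit)]


today = date.today()
yesterday = today - timedelta(days=1)
rows = [
    ("too-old", 3, today - timedelta(days=8)),       # outside the week window
    ("top-app", 4, yesterday - timedelta(days=1)),
    ("best-app", 5, yesterday),
    ("top-app", 3, yesterday),
    ("too-recent", 3, today),                        # today is excluded
]
print(aggregate_app_metrics(rows, days=7, limit=2))
# [{'name': 'top-app', 'count': 7}, {'name': 'best-app', 'count': 5}]
```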
| 37.513514 | 83 | 0.540526 | 465 | 5,552 | 6.247312 | 0.144086 | 0.098107 | 0.192427 | 0.060585 | 0.870568 | 0.812048 | 0.719449 | 0.698795 | 0.679174 | 0.658864 | 0 | 0.01629 | 0.369777 | 5,552 | 147 | 84 | 37.768707 | 0.813947 | 0 | 0 | 0.692308 | 0 | 0 | 0.086996 | 0.049532 | 0 | 0 | 0 | 0 | 0.115385 | 1 | 0.023077 | false | 0 | 0.030769 | 0 | 0.053846 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
86d060404aeae68c0df6585de38039fb5e08389b | 84 | py | Python | accounts/models.py | huseyinyilmaz/worklogger | 3ad66bd7f96e977f5d4a1adb2d3e419a08622cd0 | [
"MIT"
] | 1 | 2017-04-25T10:02:53.000Z | 2017-04-25T10:02:53.000Z | accounts/models.py | huseyinyilmaz/worklogger | 3ad66bd7f96e977f5d4a1adb2d3e419a08622cd0 | [
"MIT"
] | null | null | null | accounts/models.py | huseyinyilmaz/worklogger | 3ad66bd7f96e977f5d4a1adb2d3e419a08622cd0 | [
"MIT"
] | null | null | null | from django.db import models
# class Profile(models.Model):
# is_authenticated
| 16.8 | 30 | 0.75 | 11 | 84 | 5.636364 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 84 | 4 | 31 | 21 | 0.885714 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
86f1df211f6f0f46c52d9ee321cad49626b8dcef | 1,264 | py | Python | vizier/pythia.py | google/vizier | 12b64ce191410e1c3a79a98472a1b17811290ed3 | [
"Apache-2.0"
] | 15 | 2022-03-03T21:05:47.000Z | 2022-03-30T17:17:51.000Z | vizier/pythia.py | google/vizier | 12b64ce191410e1c3a79a98472a1b17811290ed3 | [
"Apache-2.0"
] | null | null | null | vizier/pythia.py | google/vizier | 12b64ce191410e1c3a79a98472a1b17811290ed3 | [
"Apache-2.0"
] | null | null | null | """These are the externally-useful symbols for Pythia."""
# pylint: disable=unused-import
# The API for a Pythia Policy -- i.e. the algorithm that Pythia serves.
from vizier._src.pythia.local_policy_supporters import LocalPolicyRunner
from vizier._src.pythia.policy import EarlyStopDecision
from vizier._src.pythia.policy import EarlyStopRequest
from vizier._src.pythia.policy import Policy
from vizier._src.pythia.policy import SuggestDecision
from vizier._src.pythia.policy import SuggestDecisions
from vizier._src.pythia.policy import SuggestRequest
from vizier._src.pythia.policy_supporter import MetadataDelta
from vizier._src.pythia.policy_supporter import PolicySupporter
from vizier._src.pythia.pythia_errors import CachedPolicyIsStaleError
from vizier._src.pythia.pythia_errors import CancelComputeError
from vizier._src.pythia.pythia_errors import CancelledByVizierError
from vizier._src.pythia.pythia_errors import InactivateStudyError
from vizier._src.pythia.pythia_errors import LoadTooLargeError
from vizier._src.pythia.pythia_errors import PythiaError
from vizier._src.pythia.pythia_errors import PythiaProtocolError
from vizier._src.pythia.pythia_errors import TemporaryPythiaError
from vizier._src.pythia.pythia_errors import VizierDatabaseError
| 54.956522 | 72 | 0.866297 | 164 | 1,264 | 6.487805 | 0.268293 | 0.169173 | 0.219925 | 0.321429 | 0.56297 | 0.56297 | 0.388158 | 0 | 0 | 0 | 0 | 0 | 0.076741 | 1,264 | 22 | 73 | 57.454545 | 0.91174 | 0.120253 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
86f8a3d8364205a3748252279b77d5e179442682 | 23 | py | Python | file.py | KaterinPerdom/Python-Exercise- | d28e89a5c2de74d591e49edd138f13b174f7a096 | [
"MIT"
] | null | null | null | file.py | KaterinPerdom/Python-Exercise- | d28e89a5c2de74d591e49edd138f13b174f7a096 | [
"MIT"
] | null | null | null | file.py | KaterinPerdom/Python-Exercise- | d28e89a5c2de74d591e49edd138f13b174f7a096 | [
"MIT"
] | null | null | null | print("Hello Git como") | 23 | 23 | 0.73913 | 4 | 23 | 4.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 23 | 1 | 23 | 23 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
86fd3d2d3329e7145cae307d9437570b09ee2676 | 106 | py | Python | preview/services/__init__.py | arXiv/arxiv-submission-preview | 2f40ee2e1dc6bf2a1b2fb93c3f2fbe79e5d222b8 | [
"MIT"
] | 1 | 2021-11-05T12:24:57.000Z | 2021-11-05T12:24:57.000Z | preview/services/__init__.py | arXiv/arxiv-submission-preview | 2f40ee2e1dc6bf2a1b2fb93c3f2fbe79e5d222b8 | [
"MIT"
] | 16 | 2019-07-15T12:11:26.000Z | 2019-10-24T14:43:17.000Z | preview/services/__init__.py | arXiv/arxiv-submission-preview | 2f40ee2e1dc6bf2a1b2fb93c3f2fbe79e5d222b8 | [
"MIT"
] | 2 | 2020-12-06T16:32:53.000Z | 2021-11-05T12:24:47.000Z | """Service integration modules used by the submission preview service."""
from .store import PreviewStore | 35.333333 | 73 | 0.801887 | 13 | 106 | 6.538462 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122642 | 106 | 3 | 74 | 35.333333 | 0.913978 | 0.632075 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
810d3482824888d6346e1d88b6d8f377a8cc08aa | 143 | py | Python | codewars/8kyu/doha22/kata8/theater/theater.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | null | null | null | codewars/8kyu/doha22/kata8/theater/theater.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | 2 | 2019-01-22T10:53:42.000Z | 2019-01-31T08:02:48.000Z | codewars/8kyu/doha22/kata8/theater/theater.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | 13 | 2019-01-22T10:37:42.000Z | 2019-01-25T13:30:43.000Z | def seats_in_theater(tot_cols, tot_rows, col, row):
    # Seats from your column through the last column (the +1 keeps your own
    # column), in every row strictly behind yours.
    return (tot_cols - col + 1) * (tot_rows - row)
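A quick numeric check of the formula (the sample values below are illustrative): with 16 columns and 11 rows, sitting at column 5 of row 3, there are 12 columns from yours to the last one, in each of the 8 rows behind you.

```python
def seats_in_theater(tot_cols, tot_rows, col, row):
    return (tot_cols - col + 1) * (tot_rows - row)

print(seats_in_theater(16, 11, 5, 3))  # (16 - 5 + 1) * (11 - 3) = 12 * 8 = 96
print(seats_in_theater(1, 1, 1, 1))    # last row, nobody behind you: 0
```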
| 23.833333 | 51 | 0.615385 | 23 | 143 | 3.565217 | 0.652174 | 0.170732 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009615 | 0.272727 | 143 | 5 | 52 | 28.6 | 0.778846 | 0.097902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
81103a7b7a0e4c6a8ad25e210b340ae1e59832fe | 3,489 | py | Python | src/logpipe/tests/unit/kafka/test_producer.py | securingsam/django-logpipe | 8db0a5d7df80dbc33708e7ce2f1c939232d39028 | [
"ISC"
] | null | null | null | src/logpipe/tests/unit/kafka/test_producer.py | securingsam/django-logpipe | 8db0a5d7df80dbc33708e7ce2f1c939232d39028 | [
"ISC"
] | null | null | null | src/logpipe/tests/unit/kafka/test_producer.py | securingsam/django-logpipe | 8db0a5d7df80dbc33708e7ce2f1c939232d39028 | [
"ISC"
] | null | null | null | from django.test import TestCase, override_settings
from kafka.consumer.fetcher import ConsumerRecord
from unittest.mock import MagicMock, patch
from logpipe import Producer
from logpipe.tests.common import StateSerializer, StateModel, TOPIC_STATES
import binascii
LOGPIPE = {
'KAFKA_BOOTSTRAP_SERVERS': ['kafka:9092'],
'KAFKA_SEND_TIMEOUT': 5,
'KAFKA_MAX_SEND_RETRIES': 5,
}
class ProducerTest(TestCase):
@override_settings(LOGPIPE=LOGPIPE)
@patch('kafka.KafkaProducer')
def test_normal_send(self, KafkaProducer):
future = MagicMock()
future.get.return_value = self._get_record_metadata()
def test_send_call(topic, key, value):
self.assertEqual(topic, 'us-states')
self.assertEqual(key, b'NY')
self.assertIn(b'json:', value)
self.assertIn(b'"message":{"', value)
self.assertIn(b'"code":"NY"', value)
self.assertIn(b'"name":"New York"', value)
self.assertIn(b'"version":1', value)
return future
client = MagicMock()
client.send.side_effect = test_send_call
KafkaProducer.return_value = client
producer = Producer(TOPIC_STATES, StateSerializer)
ret = producer.send({
'code': 'NY',
'name': 'New York'
})
self.assertEqual(ret.topic, TOPIC_STATES)
self.assertEqual(ret.partition, 0)
self.assertEqual(ret.offset, 42)
self.assertEqual(KafkaProducer.call_count, 1)
self.assertEqual(client.send.call_count, 1)
self.assertEqual(future.get.call_count, 1)
KafkaProducer.assert_called_with(bootstrap_servers=['kafka:9092'], retries=5)
future.get.assert_called_with(timeout=5)
@override_settings(LOGPIPE=LOGPIPE)
@patch('kafka.KafkaProducer')
def test_object_send(self, KafkaProducer):
future = MagicMock()
future.get.return_value = self._get_record_metadata()
def test_send_call(topic, key, value):
self.assertEqual(topic, 'us-states')
self.assertEqual(key, b'NY')
self.assertIn(b'json:', value)
self.assertIn(b'"message":{"', value)
self.assertIn(b'"code":"NY"', value)
self.assertIn(b'"name":"New York"', value)
self.assertIn(b'"version":1', value)
return future
client = MagicMock()
client.send.side_effect = test_send_call
KafkaProducer.return_value = client
producer = Producer(TOPIC_STATES, StateSerializer)
obj = StateModel()
obj.code = 'NY'
obj.name = 'New York'
ret = producer.send(obj)
self.assertEqual(ret.topic, TOPIC_STATES)
self.assertEqual(ret.partition, 0)
self.assertEqual(ret.offset, 42)
self.assertEqual(KafkaProducer.call_count, 1)
self.assertEqual(client.send.call_count, 1)
self.assertEqual(future.get.call_count, 1)
KafkaProducer.assert_called_with(bootstrap_servers=['kafka:9092'], retries=5)
future.get.assert_called_with(timeout=5)
def _get_record_metadata(self):
return ConsumerRecord(
topic=TOPIC_STATES,
partition=0,
offset=42,
timestamp=1467649216540,
timestamp_type=0,
key=b'NY',
value=b'foo',
checksum=binascii.crc32(b'foo'),
serialized_key_size=b'NY',
serialized_value_size=b'foo')
| 35.242424 | 85 | 0.632559 | 397 | 3,489 | 5.397985 | 0.20403 | 0.111993 | 0.060663 | 0.067196 | 0.710219 | 0.710219 | 0.710219 | 0.710219 | 0.710219 | 0.654223 | 0 | 0.019548 | 0.252221 | 3,489 | 98 | 86 | 35.602041 | 0.80184 | 0 | 0 | 0.595238 | 0 | 0 | 0.087704 | 0.012898 | 0 | 0 | 0 | 0 | 0.357143 | 1 | 0.059524 | false | 0 | 0.071429 | 0.011905 | 0.178571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
81103c972c784628f5bd48ea3ecf5f457b6a4215 | 29 | py | Python | LAUG/nlg/__init__.py | wise-east/LAUG | c5fc674e76a0a20622a77301f9986ad58713d58d | [
"Apache-2.0"
] | 10 | 2021-07-10T12:40:42.000Z | 2022-03-14T07:51:06.000Z | LAUG/nlg/__init__.py | wise-east/LAUG | c5fc674e76a0a20622a77301f9986ad58713d58d | [
"Apache-2.0"
] | 5 | 2021-07-01T11:23:58.000Z | 2021-09-09T05:51:02.000Z | LAUG/nlg/__init__.py | wise-east/LAUG | c5fc674e76a0a20622a77301f9986ad58713d58d | [
"Apache-2.0"
] | 2 | 2021-09-13T16:26:42.000Z | 2021-11-16T09:26:54.000Z | from LAUG.nlg.nlg import NLG
| 14.5 | 28 | 0.793103 | 6 | 29 | 3.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
81143e0a94f135cfbe1142988ccd22771e1c077c | 131 | py | Python | boa3_test/test_sc/reversed_test/ReversedList.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 25 | 2020-07-22T19:37:43.000Z | 2022-03-08T03:23:55.000Z | boa3_test/test_sc/reversed_test/ReversedList.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 419 | 2020-04-23T17:48:14.000Z | 2022-03-31T13:17:45.000Z | boa3_test/test_sc/reversed_test/ReversedList.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 15 | 2020-05-21T21:54:24.000Z | 2021-11-18T06:17:24.000Z | from typing import Any, List
from boa3.builtin import public
@public
def main(a: List[Any]) -> reversed:
return reversed(a)
| 14.555556 | 35 | 0.717557 | 20 | 131 | 4.7 | 0.65 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009346 | 0.183206 | 131 | 8 | 36 | 16.375 | 0.869159 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0.2 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 5 |
d4b6df2e9a3b072f67f14949bf7264bc6b5bc466 | 41 | py | Python | sourceknight/errors.py | tmick0/sourceknight | a60204369b6436cf1f38d0f9f470168c0fd49bd8 | [
"MIT"
] | 1 | 2021-06-16T21:10:12.000Z | 2021-06-16T21:10:12.000Z | sourceknight/errors.py | tmick0/sourceknight | a60204369b6436cf1f38d0f9f470168c0fd49bd8 | [
"MIT"
] | null | null | null | sourceknight/errors.py | tmick0/sourceknight | a60204369b6436cf1f38d0f9f470168c0fd49bd8 | [
"MIT"
] | null | null | null |
class skerror(RuntimeError):
    pass
| 8.2 | 29 | 0.682927 | 4 | 41 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.243902 | 41 | 4 | 30 | 10.25 | 0.903226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
d4f886d97b6d89d3871b0a61eaa5972334765cd9 | 56 | py | Python | 14. Adv Python Packages/project/package2/subpackage/module5.py | penanrajput/PythonCourseContent | 074a4af9c83a8a6b9b4608ce341ed96d1bd2e999 | [
"MIT"
] | null | null | null | 14. Adv Python Packages/project/package2/subpackage/module5.py | penanrajput/PythonCourseContent | 074a4af9c83a8a6b9b4608ce341ed96d1bd2e999 | [
"MIT"
] | null | null | null | 14. Adv Python Packages/project/package2/subpackage/module5.py | penanrajput/PythonCourseContent | 074a4af9c83a8a6b9b4608ce341ed96d1bd2e999 | [
"MIT"
] | 1 | 2020-12-19T19:29:17.000Z | 2020-12-19T19:29:17.000Z | def function5():
print("Inside function5 function")
| 18.666667 | 38 | 0.714286 | 6 | 56 | 6.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 0.160714 | 56 | 2 | 39 | 28 | 0.808511 | 0 | 0 | 0 | 0 | 0 | 0.446429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
be1e6193088175486709473645ee9c4de55f0b35 | 186 | py | Python | app/core/dependencies.py | BoroviyOrest/QuizzesAPI | 4db3de618ef73140a6a1b659ea458d6a90d3d365 | [
"MIT"
] | null | null | null | app/core/dependencies.py | BoroviyOrest/QuizzesAPI | 4db3de618ef73140a6a1b659ea458d6a90d3d365 | [
"MIT"
] | 11 | 2021-02-02T13:12:29.000Z | 2021-04-05T20:50:52.000Z | app/core/dependencies.py | BoroviyOrest/SpanishQuizzesAPI | 4db3de618ef73140a6a1b659ea458d6a90d3d365 | [
"MIT"
] | null | null | null | from fastapi import Request
def init_service(service_class) -> callable:
def wrapper(request: Request):
return service_class(request.app.state.mongodb)
return wrapper
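In the app this factory is typically wired into a route via `fastapi.Depends(init_service(SomeService))`, so FastAPI calls the inner `wrapper` with the live request. The closure can be exercised without FastAPI or MongoDB by faking the `request.app.state.mongodb` chain — `FakeService` and the nested namespaces below are illustrative stand-ins:

```python
from types import SimpleNamespace


def init_service(service_class):
    # Same closure pattern as above, minus the FastAPI type annotation.
    def wrapper(request):
        return service_class(request.app.state.mongodb)
    return wrapper


class FakeService:
    def __init__(self, db):
        self.db = db


# Fake the request -> app -> state -> mongodb attribute chain.
request = SimpleNamespace(app=SimpleNamespace(state=SimpleNamespace(mongodb="fake-db")))

service = init_service(FakeService)(request)
print(service.db)  # fake-db
```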
| 20.666667 | 55 | 0.741935 | 23 | 186 | 5.869565 | 0.608696 | 0.177778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.182796 | 186 | 8 | 56 | 23.25 | 0.888158 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.2 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
0790596fe16b96d3b693fc52b8e6ef731ab3d7b0 | 3,513 | py | Python | tests/test_crypto.py | sapo/securitylib-python | afa176c52fc9effa664be895e86ab9cd07e5018f | [
"MIT"
] | 1 | 2015-01-30T16:22:24.000Z | 2015-01-30T16:22:24.000Z | tests/test_crypto.py | sapo/securitylib-python | afa176c52fc9effa664be895e86ab9cd07e5018f | [
"MIT"
] | null | null | null | tests/test_crypto.py | sapo/securitylib-python | afa176c52fc9effa664be895e86ab9cd07e5018f | [
"MIT"
] | 2 | 2015-06-11T14:30:15.000Z | 2021-12-01T14:28:54.000Z | from securitylib.crypto import *
from nose.tools import ok_, eq_, with_setup
from test_utils import setup_seeded_random, teardown_seeded_random, assert_raises_with_message
def test_generate_authenticator():
eq_(generate_authenticator('KJxyKJaV06', '5f07ec7a02bb0d7dc92d8aae1e0817e2a64a1265797b45f4780b49af11df61e1'.decode('hex')), 'aee1a8fc5443bbaf982b074c755b4e4faee028cc54ecb83868ec3e1a64f45e6f'.decode('hex'))
    assert_raises_with_message(ValueError, 'Parameter authenticator_key must have length 32 bytes.', generate_authenticator, 'KJxyKJaV06', 'cf9021efdfec6a4e3fd8'.decode('hex'))
def test_validate_authenticator():
ok_(validate_authenticator('KJxyKJaV06', '5f07ec7a02bb0d7dc92d8aae1e0817e2a64a1265797b45f4780b49af11df61e1'.decode('hex'), 'aee1a8fc5443bbaf982b074c755b4e4faee028cc54ecb83868ec3e1a64f45e6f'.decode('hex')))
    assert_raises_with_message(ValueError, 'Parameter authenticator_key must have length 32 bytes.', validate_authenticator, 'KJxyKJaV06', 'cf9021efdfec6a4e3fd8'.decode('hex'), '')
@with_setup(setup_seeded_random, teardown_seeded_random)
def test_generate_encryption_key():
eq_(generate_encryption_key(), '9a45076e45211648b857327311a73c1b'.decode('hex'))
@with_setup(setup_seeded_random, teardown_seeded_random)
def test_generate_authenticator_key():
eq_(generate_authenticator_key(), 'bdb4b6e8d792e4c973c0039c8d4f59a79a45076e45211648b857327311a73c1b'.decode('hex'))
def test_generate_encryption_key_from_password():
eq_(generate_encryption_key_from_password('password', 'salt'), '5c3d0b075ebf4e11b346cf18512e8dda'.decode('hex'))
def test_generate_authenticator_key_from_password():
eq_(generate_authenticator_key_from_password('password', 'salt'), '5c3d0b075ebf4e11b346cf18512e8ddaf29f70d67e67a94e6defe076d461e042'.decode('hex'))
@with_setup(setup_seeded_random, teardown_seeded_random)
def test_encrypt():
eq_(encrypt('The quick brown fox was not quick enough and is now an UNFOX!', 'aa79a8ab43636644d77f2b6b34842b98'.decode('hex'), '61d1a03428fd560ddf93734869ad951cb11d643e69ac19301427f16407d8faf8'.decode('hex')),
'0128153d5614aebc47fc2b69331aa1895d70e45fdffa94f04bae7ecef12f9dd4729a45076e45211648b857327311a73c1b00000000eff464a6b51411e7997787049fb0424faecff0786f213652116b4a50022e04cf24ff607d6366b9e3771486f396f8a3dd3d77f5c07bac8d2e0758454e511157e1'.decode('hex'))
assert_raises_with_message(ValueError, 'Parameter key must have length 16 bytes.', encrypt, '', 'ababab'.decode('hex'), 'abcdef'.decode('hex'))
assert_raises_with_message(ValueError, 'Parameter authenticator_key must have length 32 bytes.', encrypt, '', 'aa79a8ab43636644d77f2b6b34842b98'.decode('hex'), 'abcdef'.decode('hex'))
def test_decrypt():
eq_(decrypt('0128153d5614aebc47fc2b69331aa1895d70e45fdffa94f04bae7ecef12f9dd4729a45076e45211648b857327311a73c1b00000000eff464a6b51411e7997787049fb0424faecff0786f213652116b4a50022e04cf24ff607d6366b9e3771486f396f8a3dd3d77f5c07bac8d2e0758454e511157e1'.decode('hex'),
'aa79a8ab43636644d77f2b6b34842b98'.decode('hex'), '61d1a03428fd560ddf93734869ad951cb11d643e69ac19301427f16407d8faf8'.decode('hex')), 'The quick brown fox was not quick enough and is now an UNFOX!')
assert_raises_with_message(ValueError, 'Parameter key must have length 16 bytes.', decrypt, '', 'ababab'.decode('hex'), 'abcdef'.decode('hex'))
assert_raises_with_message(ValueError, 'Parameter authenticator_key must have length 32 bytes.', decrypt, '', 'aa79a8ab43636644d77f2b6b34842b98'.decode('hex'), 'abcdef'.decode('hex'))
| 74.744681 | 267 | 0.829775 | 306 | 3,513 | 9.218954 | 0.20915 | 0.070188 | 0.039702 | 0.057072 | 0.619638 | 0.539171 | 0.40553 | 0.40553 | 0.40234 | 0.40234 | 0 | 0.246933 | 0.072018 | 3,513 | 46 | 268 | 76.369565 | 0.618405 | 0 | 0 | 0.1 | 1 | 0 | 0.512098 | 0.333618 | 0 | 0 | 0 | 0 | 0.233333 | 1 | 0.266667 | true | 0.133333 | 0.1 | 0 | 0.366667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
07a0a747b64e1c884d063a760adb70cf385ead93 | 123 | py | Python | survos/qt_compat.py | paskino/SuRVoS | e01e784442e2e9f724826cdb70f3a50c034c6455 | [
"Apache-2.0"
] | 22 | 2016-09-30T08:04:42.000Z | 2022-03-05T07:24:18.000Z | survos/qt_compat.py | paskino/SuRVoS | e01e784442e2e9f724826cdb70f3a50c034c6455 | [
"Apache-2.0"
] | 81 | 2016-11-21T15:32:14.000Z | 2022-02-20T00:22:27.000Z | survos/qt_compat.py | paskino/SuRVoS | e01e784442e2e9f724826cdb70f3a50c034c6455 | [
"Apache-2.0"
] | 6 | 2018-11-22T10:19:59.000Z | 2022-02-04T06:15:48.000Z |
import matplotlib as mpl
mpl.use("Qt5Agg", force=True)
from matplotlib.backends.qt_compat import QtGui, QtCore, QtWidgets
| 24.6 | 66 | 0.804878 | 18 | 123 | 5.444444 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009091 | 0.105691 | 123 | 4 | 67 | 30.75 | 0.881818 | 0 | 0 | 0 | 0 | 0 | 0.04918 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
07bb3ee0f4e4cf72144c1ff98db3805f48c71715 | 148 | py | Python | src/schctest/pypacket_dissector/defs_L5.py | saguilarDevel/open_schc | ac7f2a84b6120964c8fdaabf9f5c8ca8ae39c289 | [
"MIT"
] | 21 | 2018-11-05T06:48:32.000Z | 2022-02-28T14:38:09.000Z | src/schctest/pypacket_dissector/defs_L5.py | saguilarDevel/open_schc | ac7f2a84b6120964c8fdaabf9f5c8ca8ae39c289 | [
"MIT"
] | 34 | 2019-01-28T01:32:41.000Z | 2021-05-06T09:40:14.000Z | src/schctest/pypacket_dissector/defs_L5.py | saguilarDevel/open_schc | ac7f2a84b6120964c8fdaabf9f5c8ca8ae39c289 | [
"MIT"
] | 28 | 2018-10-31T22:21:26.000Z | 2022-03-17T09:44:40.000Z | try:
from dissector_coap import dissect_coap
except ImportError:
from .dissector_coap import dissect_coap
dissectors_L5 = {
5683: dissect_coap,
}
| 14.8 | 44 | 0.743243 | 19 | 148 | 5.473684 | 0.526316 | 0.317308 | 0.326923 | 0.442308 | 0.653846 | 0.653846 | 0 | 0 | 0 | 0 | 0 | 0.042373 | 0.202703 | 148 | 9 | 45 | 16.444444 | 0.838983 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
07ca79a5586c90e4a0a0a5b55841090555422d20 | 27 | py | Python | buttons/main.py | gregsvo/micropython | 952d0acdc9dafe6a0c5f104b9a09256d85d60489 | [
"MIT"
] | 50 | 2019-05-08T18:17:45.000Z | 2019-05-22T18:00:03.000Z | buttons/main.py | gregsvo/micropython | 952d0acdc9dafe6a0c5f104b9a09256d85d60489 | [
"MIT"
] | 1 | 2017-08-01T02:11:08.000Z | 2017-08-01T02:11:08.000Z | buttons/main.py | gregsvo/micropython | 952d0acdc9dafe6a0c5f104b9a09256d85d60489 | [
"MIT"
] | null | null | null | import button
button.main() | 13.5 | 13 | 0.814815 | 4 | 27 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 27 | 2 | 14 | 13.5 | 0.88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
07e04f5d4c6a465ceab5dd4d3a5127a7104708ff | 2,304 | py | Python | lib/parsedatetime/tests/TestSimpleOffsetsNoon.py | r3tard/BartusBot | baa0e775a8495c696ca274d78f898eb74d8fa961 | [
"Apache-2.0"
] | 1 | 2015-11-01T00:16:41.000Z | 2015-11-01T00:16:41.000Z | lib/parsedatetime-1.5/parsedatetime/tests/TestSimpleOffsetsNoon.py | r3tard/BartusBot | baa0e775a8495c696ca274d78f898eb74d8fa961 | [
"Apache-2.0"
] | 12 | 2015-10-30T20:22:53.000Z | 2016-02-09T21:56:17.000Z | lib/parsedatetime/tests/TestSimpleOffsetsNoon.py | r3tard/BartusBot | baa0e775a8495c696ca274d78f898eb74d8fa961 | [
"Apache-2.0"
] | null | null | null |
"""
Test parsing of 'simple' offsets
"""
import unittest, time, datetime
import parsedatetime as pdt
class test(unittest.TestCase):
@pdt.tests.assertEqualWithComparator
def assertExpectedResult(self, result, check, **kwargs):
return pdt.tests.compareResultByTimeTuplesAndFlags(result, check, **kwargs)
def setUp(self):
self.cal = pdt.Calendar()
self.yr, self.mth, self.dy, self.hr, self.mn, self.sec, self.wd, self.yd, self.isdst = time.localtime()
def testOffsetAfterNoon(self):
s = datetime.datetime(self.yr, self.mth, self.dy, 10, 0, 0)
t = datetime.datetime(self.yr, self.mth, self.dy, 12, 0, 0) + datetime.timedelta(hours=5)
start = s.timetuple()
target = t.timetuple()
self.assertExpectedResult(self.cal.parse('5 hours after 12pm', start), (target, 2))
self.assertExpectedResult(self.cal.parse('five hours after 12pm', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours after 12 pm', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours after 12:00pm', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours after 12:00 pm', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours after noon', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours from noon', start), (target, 2))
def testOffsetBeforeNoon(self):
s = datetime.datetime.now()
t = datetime.datetime(self.yr, self.mth, self.dy, 12, 0, 0) + datetime.timedelta(hours=-5)
start = s.timetuple()
target = t.timetuple()
#self.assertExpectedResult(self.cal.parse('5 hours before noon', start), (target, 2))
self.assertExpectedResult(self.cal.parse('5 hours before 12pm', start), (target, 2))
self.assertExpectedResult(self.cal.parse('five hours before 12pm', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours before 12 pm', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours before 12:00pm', start), (target, 2))
#self.assertExpectedResult(self.cal.parse('5 hours before 12:00 pm', start), (target, 2))
if __name__ == "__main__":
unittest.main()
| 45.176471 | 111 | 0.656684 | 295 | 2,304 | 5.101695 | 0.220339 | 0.223256 | 0.24186 | 0.267774 | 0.704319 | 0.704319 | 0.679734 | 0.679734 | 0.652492 | 0.652492 | 0 | 0.035637 | 0.196181 | 2,304 | 50 | 112 | 46.08 | 0.776998 | 0.355903 | 0 | 0.16 | 0 | 0 | 0.059986 | 0 | 0 | 0 | 0 | 0 | 0.24 | 1 | 0.16 | false | 0 | 0.08 | 0.04 | 0.32 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
ed008431b058aa44cc0071125e78a23c24cf4f75 | 56 | py | Python | build/lib/jhu_primitives/oocase/__init__.py | hhelm10/primitives-interfaces | 15766d77dae016fa699a46bade0fe66711b23459 | [
"Apache-2.0"
] | null | null | null | build/lib/jhu_primitives/oocase/__init__.py | hhelm10/primitives-interfaces | 15766d77dae016fa699a46bade0fe66711b23459 | [
"Apache-2.0"
] | 23 | 2017-09-20T08:12:13.000Z | 2022-03-01T01:49:11.000Z | build/lib/jhu_primitives/oocase/__init__.py | hhelm10/primitives-interfaces | 15766d77dae016fa699a46bade0fe66711b23459 | [
"Apache-2.0"
] | 8 | 2018-05-14T18:44:38.000Z | 2021-03-18T19:53:23.000Z | from .oocase import OutOfCoreAdjacencySpectralEmbedding
| 28 | 55 | 0.910714 | 4 | 56 | 12.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 56 | 1 | 56 | 56 | 0.980769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
ed0213656680f26c676116bd8baf128a66d0b5db | 180 | py | Python | products/urls.py | ianjuma/sunpower | 2ba0ff66687993deae515a4108c834ac44c6aa5f | [
"MIT"
] | 1 | 2016-05-16T08:49:35.000Z | 2016-05-16T08:49:35.000Z | products/urls.py | ianjuma/sunpower | 2ba0ff66687993deae515a4108c834ac44c6aa5f | [
"MIT"
] | null | null | null | products/urls.py | ianjuma/sunpower | 2ba0ff66687993deae515a4108c834ac44c6aa5f | [
"MIT"
] | null | null | null | from sunpower import api
from products.views import ProductDetail, ProductList
api.add_resource(ProductDetail, '/products/<string:_id>')
api.add_resource(ProductList, '/products') | 36 | 57 | 0.816667 | 22 | 180 | 6.545455 | 0.545455 | 0.083333 | 0.194444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072222 | 180 | 5 | 58 | 36 | 0.862275 | 0 | 0 | 0 | 0 | 0 | 0.171271 | 0.121547 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
ed0b00ab7007a7095786c03cafd751039dd45df2 | 35 | py | Python | gluten_diary/food/tests/test_admin.py | gyazbek/Gluten-Diary | e48d5c015c776c86f1b4ba9c363cf7895f020e5b | [
"MIT"
] | null | null | null | gluten_diary/food/tests/test_admin.py | gyazbek/Gluten-Diary | e48d5c015c776c86f1b4ba9c363cf7895f020e5b | [
"MIT"
] | null | null | null | gluten_diary/food/tests/test_admin.py | gyazbek/Gluten-Diary | e48d5c015c776c86f1b4ba9c363cf7895f020e5b | [
"MIT"
] | null | null | null | from test_plus.test import TestCase | 35 | 35 | 0.885714 | 6 | 35 | 5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 35 | 1 | 35 | 35 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
ed2a0dddf3a43568fe78df791fd23dfbb74dd9ae | 241 | py | Python | tests/test_callable/test_mul.py | ariebovenberg/lambdas | 4ecd3cf19fb0c2f55470f79138b989b3b69b3710 | [
"BSD-2-Clause"
] | null | null | null | tests/test_callable/test_mul.py | ariebovenberg/lambdas | 4ecd3cf19fb0c2f55470f79138b989b3b69b3710 | [
"BSD-2-Clause"
] | null | null | null | tests/test_callable/test_mul.py | ariebovenberg/lambdas | 4ecd3cf19fb0c2f55470f79138b989b3b69b3710 | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from lambdas import _
def test_mul():
"""Ensures that add works correctly."""
assert (_ * 20)(10) == 200
def test_rmul():
"""Ensures that add works correctly."""
assert (99 * _)(4) == (_ * 99)(4)
| 17.214286 | 43 | 0.560166 | 31 | 241 | 4.16129 | 0.677419 | 0.108527 | 0.217054 | 0.294574 | 0.527132 | 0.527132 | 0 | 0 | 0 | 0 | 0 | 0.076503 | 0.240664 | 241 | 13 | 44 | 18.538462 | 0.628415 | 0.373444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.4 | 1 | 0.4 | true | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
ed3754da69b3ba19e3c2787c3b64bf98d1219337 | 831 | py | Python | fleet/v1/objects/unit_state.py | simonvanderveldt/python-fleet | a11dcd8bb3986d1d8f0af90d2da7399c9cc54b4d | [
"Apache-2.0"
] | 8 | 2015-03-25T12:08:30.000Z | 2018-12-02T20:10:47.000Z | fleet/v1/objects/unit_state.py | simonvanderveldt/python-fleet | a11dcd8bb3986d1d8f0af90d2da7399c9cc54b4d | [
"Apache-2.0"
] | 5 | 2015-07-22T22:06:21.000Z | 2017-01-08T21:22:33.000Z | fleet/v1/objects/unit_state.py | cnelson/python-fleet | a11dcd8bb3986d1d8f0af90d2da7399c9cc54b4d | [
"Apache-2.0"
] | 3 | 2015-05-08T11:13:17.000Z | 2016-11-17T19:06:57.000Z | from .fleet_object import FleetObject
class UnitState(FleetObject):
"""Whereas Unit entities represent the desired state of units known by fleet,
UnitStates represent the current states of units actually running in the cluster.
The information reported by UnitStates will not always align perfectly with the Units,
as there is a delay between the declaration of desired state and the backend system making
all of the necessary changes.
Attributes:
name: unique identifier of entity
hash: SHA1 hash of underlying unit file
machineID: ID of machine from which this state originated
systemdLoadState: load state as reported by systemd
systemdActiveState: active state as reported by systemd
systemdSubState: sub state as reported by systemd
"""
pass
| 39.571429 | 94 | 0.742479 | 110 | 831 | 5.6 | 0.636364 | 0.064935 | 0.073052 | 0.082792 | 0.116883 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001567 | 0.23225 | 831 | 20 | 95 | 41.55 | 0.96395 | 0.832732 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 5 |
ed3aefd32104d38a9ec5711e95c74248fc72b4c3 | 1,698 | py | Python | tests/test_tools/test_io_tools.py | thrakar9/Evolutron | 1b9b4c364fe531e5001fd9010898b96e0f5907d7 | [
"MIT"
] | 10 | 2017-11-30T20:30:12.000Z | 2021-04-10T21:45:12.000Z | tests/test_tools/test_io_tools.py | thrakar9/Evolutron | 1b9b4c364fe531e5001fd9010898b96e0f5907d7 | [
"MIT"
] | null | null | null | tests/test_tools/test_io_tools.py | thrakar9/Evolutron | 1b9b4c364fe531e5001fd9010898b96e0f5907d7 | [
"MIT"
] | 3 | 2019-06-20T15:13:42.000Z | 2020-03-24T11:44:07.000Z | # coding=utf-8
import pytest
import os
import pandas as pd
from evolutron.tools import io_tools as io
# noinspection PyTypeChecker
def test_fasta_parser():
x_data, y_data = io.fasta_parser('tests/test_tools/samples/no_codes.fasta', codes=False)
assert type(x_data) == pd.Series
assert not y_data
with pytest.raises(IOError):
io.fasta_parser('tests/test_tools/samples/no_codes.fasta', codes=True)
x_data, y_data = io.fasta_parser('tests/test_tools/samples/type2p_codes.fasta', codes=False)
assert type(x_data) == pd.Series
assert not y_data
x_data, y_data = io.fasta_parser('tests/test_tools/samples/type2p_codes.fasta', codes=True, code_key='type2p')
assert type(x_data) == pd.Series
assert type(y_data) == list
def test_tab_parser():
x_data, y_data = io.csv_parser('tests/test_tools/samples/sample.tsv', codes=False)
assert type(x_data) == pd.Series
assert not y_data
x_data, y_data = io.csv_parser('tests/test_tools/samples/sample.tsv', codes=True, code_key='fam')
assert type(x_data) == pd.Series
assert type(y_data) == list
x_data, y_data = io.csv_parser('tests/test_tools/samples/sample.tsv', codes=False)
assert type(x_data) == pd.Series
assert not y_data
# Cleaning up
os.remove('tests/test_tools/samples/sample.h5')
def test_secs_parser():
x_data, y_data = io.secs_parser('tests/test_tools/samples/smallSecS.sec')
assert type(x_data) == pd.Series
assert type(y_data) == list
os.remove('tests/test_tools/samples/smallSecS.h5')
def test_npz_parser():
x_data, y_data = io.npz_parser('/data/datasets/cb513+profile_split1.npy.gz')
    assert x_data is not None
    assert y_data is not None
| 26.123077 | 114 | 0.716137 | 275 | 1,698 | 4.178182 | 0.210909 | 0.069626 | 0.121845 | 0.182768 | 0.749347 | 0.704961 | 0.612707 | 0.612707 | 0.612707 | 0.612707 | 0 | 0.007018 | 0.160777 | 1,698 | 64 | 115 | 26.53125 | 0.799298 | 0.030035 | 0 | 0.457143 | 0 | 0 | 0.261108 | 0.25563 | 0 | 0 | 0 | 0 | 0.428571 | 1 | 0.114286 | true | 0 | 0.114286 | 0 | 0.228571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
ed3fba88dce9424a3e01a99034eb5377c611a8db | 96 | py | Python | venv/lib/python3.8/site-packages/yarg/client.py | GiulianaPola/select_repeats | 17a0d053d4f874e42cf654dd142168c2ec8fbd11 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/yarg/client.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/yarg/client.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/99/53/02/5fc09d620d8c7780684c8ca13e29cb22ef12f8ce4f201b5cebabfb5f09 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.395833 | 0 | 96 | 1 | 96 | 96 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
9c49375f31013786618904cf671fe6a6b8d7025d | 96 | py | Python | venv/lib/python3.8/site-packages/clikit/api/application/__init__.py | GiulianaPola/select_repeats | 17a0d053d4f874e42cf654dd142168c2ec8fbd11 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/clikit/api/application/__init__.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/clikit/api/application/__init__.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/16/83/9c/fc04bb422c9d32e0a89f8e13e1988e0b22048ebdbb707e6e4fad7cf8e1 | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.385417 | 0 | 96 | 1 | 96 | 96 | 0.510417 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
9c4fa1686dae903ef85f919f9b89ab7dbe7c2b0b | 401 | py | Python | bspump/declarative/expression/lookup/lookupexpr.py | thatch/BitSwanPump | 98a5b8d09f9b59d5361611cee0bd45e7b4c69e3f | [
"BSD-3-Clause"
] | null | null | null | bspump/declarative/expression/lookup/lookupexpr.py | thatch/BitSwanPump | 98a5b8d09f9b59d5361611cee0bd45e7b4c69e3f | [
"BSD-3-Clause"
] | null | null | null | bspump/declarative/expression/lookup/lookupexpr.py | thatch/BitSwanPump | 98a5b8d09f9b59d5361611cee0bd45e7b4c69e3f | [
"BSD-3-Clause"
] | null | null | null | from ...abc import Expression
class LOOKUP(Expression):
def __init__(self, app, *, arg_in, arg_what):
super().__init__(app)
svc = app.get_service("bspump.PumpService")
self.Lookup = svc.locate_lookup(arg_in)
self.Key = arg_what
def __call__(self, context, event, *args, **kwargs):
# TODO: Not correct
return self.Lookup.get(self.evaluate(self.Key, context, event, *args, **kwargs))
| 25.0625 | 82 | 0.708229 | 57 | 401 | 4.666667 | 0.54386 | 0.037594 | 0.120301 | 0.165414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142145 | 401 | 15 | 83 | 26.733333 | 0.773256 | 0.042394 | 0 | 0 | 0 | 0 | 0.04712 | 0 | 0 | 0 | 0 | 0.066667 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0.111111 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
9c653b0074c1a0867f652e22f6fdfbb0ee52640b | 2,308 | py | Python | backend/places4students.py | Abdullah9340/Geese-Migration | 2c62b9cb3f077ca31335dbd8372c851a5dcaefdf | [
"MIT"
] | 1 | 2021-12-21T21:48:45.000Z | 2021-12-21T21:48:45.000Z | backend/places4students.py | Abdullah9340/Geese-Migration | 2c62b9cb3f077ca31335dbd8372c851a5dcaefdf | [
"MIT"
] | null | null | null | backend/places4students.py | Abdullah9340/Geese-Migration | 2c62b9cb3f077ca31335dbd8372c851a5dcaefdf | [
"MIT"
] | null | null | null | import requests
from bs4 import BeautifulSoup
from houses import House
from requests_html import HTMLSession
url = "https://www.places4students.com/Places/PropertyListings?SchoolID=j9CaTYeszhs="
s = HTMLSession()
response = s.get(url)
response.html.render(wait=2, sleep=3)
soup = BeautifulSoup(response.html.html, 'html.parser')  # render() stores the rendered page on response.html.html
print(soup)
def get_p4s_listings():
houses = []
for link in soup.find_all('tr', {'class': 'featured'}):
print("test")
posting_name = str(
link.find('div', {'class': 'listing-title'}).get_text()).strip().replace(" ", "")
posting_price = str(
link.find('td', {'class': 'listing-rate'}).get_text()).strip()
posting_desc = str(
link.find('div', {'class': 'listing-description'}).get_text()).strip().replace(" ", "")
posting_beds = "Not Available"
posting_location = ""
posting_link = f"https://www.places4students.com/Places/{ link.find('div',{'class':'title'}).find('a',href=True).get('href').strip()}"
posting_img = f"https://www.places4students.com{link.find('div',{'class':'occupancydate-thumbnail'}).find('img').get('src')}"
houses.append(House(posting_name, posting_price,
posting_desc, posting_beds, posting_location, posting_link, posting_img))
for link in soup.find_all('tr', {'class': 'AltRow'}):
posting_name = str(
link.find('div', {'class': 'listing-title'}).get_text()).strip().replace(" ", "")
posting_price = str(
link.find('td', {'class': 'listing-rate'}).get_text()).strip().replace(" ", "")
posting_desc = str(
link.find('div', {'class': 'listing-description'}).get_text()).strip().replace(" ", "")
posting_beds = "Not Available"
posting_location = ""
posting_link = f"https://www.places4students.com/Places/{link.find('div',{'class':'title'}).find('a',href=True).get('href').strip()}"
posting_img = f"https://www.places4students.com{link.find('div',{'class':'occupancydate-thumbnail'}).find('img').get('src')}"
houses.append(House(posting_name, posting_price,
posting_desc, posting_beds, posting_location, posting_link, posting_img))
return houses
houses = get_p4s_listings()
| 47.102041 | 142 | 0.62305 | 273 | 2,308 | 5.117216 | 0.25641 | 0.057266 | 0.062992 | 0.091625 | 0.775233 | 0.747316 | 0.747316 | 0.747316 | 0.708661 | 0.708661 | 0 | 0.005864 | 0.187175 | 2,308 | 48 | 143 | 48.083333 | 0.738806 | 0 | 0 | 0.536585 | 0 | 0.097561 | 0.319324 | 0.032496 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02439 | false | 0 | 0.097561 | 0 | 0.146341 | 0.04878 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
9c6be5ad89125876027bada3a3c6a4d97484519d | 129 | py | Python | Basics/Lists/File.py | a84885640/Python | 5984f8384abfdbfc83470d1d93b7430297fa654b | [
"Unlicense"
] | 19 | 2018-09-06T01:57:22.000Z | 2022-03-23T04:12:12.000Z | Basics/Lists/File.py | a84885640/Python | 5984f8384abfdbfc83470d1d93b7430297fa654b | [
"Unlicense"
] | null | null | null | Basics/Lists/File.py | a84885640/Python | 5984f8384abfdbfc83470d1d93b7430297fa654b | [
"Unlicense"
] | 43 | 2018-08-02T11:01:11.000Z | 2022-01-03T13:37:27.000Z | awesomeList = ['Hello', 45, 'World', 4.5]
print(awesomeList)
print(awesomeList[0])
print(awesomeList[3])
print(awesomeList[0:2]) | 21.5 | 41 | 0.72093 | 18 | 129 | 5.166667 | 0.555556 | 0.688172 | 0.365591 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067227 | 0.077519 | 129 | 6 | 42 | 21.5 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.8 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
92d419e2f73a1719be3b735a2fc6dbd6196eb6ab | 209 | py | Python | src/backends/nanoleaf.py | MattX/nanospectrum | 42b417d2640d9aecd8c8d5338e4bda09f5c0d6ef | [
"Apache-2.0"
] | null | null | null | src/backends/nanoleaf.py | MattX/nanospectrum | 42b417d2640d9aecd8c8d5338e4bda09f5c0d6ef | [
"Apache-2.0"
] | null | null | null | src/backends/nanoleaf.py | MattX/nanospectrum | 42b417d2640d9aecd8c8d5338e4bda09f5c0d6ef | [
"Apache-2.0"
] | null | null | null | from .backend_base import BackendBase
class NanoleafBackend(BackendBase):
def __init__(self, manager):
self.manager = manager
def show(self, colors):
self.manager.put_colors(colors)
| 20.9 | 39 | 0.708134 | 24 | 209 | 5.916667 | 0.583333 | 0.232394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.205742 | 209 | 9 | 40 | 23.222222 | 0.855422 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
92dd360876020c5fbe08e9036f95688d652d2d35 | 4,722 | py | Python | tests/test_api/test_api_facts.py | blarghmatey/pyinfra | b8287618d66a4e00963c88a3ef191c94e8320f70 | [
"MIT"
] | 1,532 | 2015-06-13T19:48:52.000Z | 2022-03-26T15:32:45.000Z | tests/test_api/test_api_facts.py | blarghmatey/pyinfra | b8287618d66a4e00963c88a3ef191c94e8320f70 | [
"MIT"
] | 729 | 2015-09-24T08:42:39.000Z | 2022-03-31T07:15:44.000Z | tests/test_api/test_api_facts.py | blarghmatey/pyinfra | b8287618d66a4e00963c88a3ef191c94e8320f70 | [
"MIT"
] | 419 | 2015-12-16T21:00:34.000Z | 2022-03-05T21:05:07.000Z | from mock import MagicMock, patch
from pyinfra.api import Config, State
from pyinfra.api.connect import connect_all
from pyinfra.api.exceptions import PyinfraError
from pyinfra.api.facts import get_facts
from ..paramiko_util import (
PatchSSHTestCase,
)
from ..util import make_inventory
class TestOperationsApi(PatchSSHTestCase):
def test_get_fact(self):
inventory = make_inventory(hosts=('anotherhost',))
state = State(inventory, Config())
anotherhost = inventory.get_host('anotherhost')
connect_all(state)
with patch('pyinfra.api.connectors.ssh.run_shell_command') as fake_run_command:
fake_run_command.return_value = MagicMock(), [('stdout', 'some-output')]
fact_data = get_facts(state, 'command', ('yes',))
assert fact_data == {anotherhost: 'some-output'}
fake_run_command.assert_called_with(
state,
anotherhost,
'yes',
print_input=False,
print_output=False,
shell_executable=None,
su_user=None,
sudo=False,
sudo_user=None,
timeout=None,
env={},
use_sudo_password=False,
return_combined_output=True,
)
def test_get_fact_current_op_meta(self):
inventory = make_inventory(hosts=('anotherhost',))
state = State(inventory, Config())
anotherhost = inventory.get_host('anotherhost')
connect_all(state)
state.current_op_global_kwargs = {
'sudo': True,
'sudo_user': 'someuser',
'use_sudo_password': True,
'su_user': 'someuser',
'ignore_errors': False,
'timeout': 10,
'env': {'HELLO': 'WORLD'},
}
with patch('pyinfra.api.connectors.ssh.run_shell_command') as fake_run_command:
fake_run_command.return_value = MagicMock(), [('stdout', 'some-output')]
fact_data = get_facts(state, 'command', ('yes',))
assert fact_data == {anotherhost: 'some-output'}
fake_run_command.assert_called_with(
state,
anotherhost,
'yes',
print_input=False,
print_output=False,
shell_executable=None,
su_user='someuser',
sudo=True,
sudo_user='someuser',
timeout=10,
env={'HELLO': 'WORLD'},
use_sudo_password=True,
return_combined_output=True,
)
def test_get_fact_error(self):
inventory = make_inventory(hosts=('anotherhost',))
state = State(inventory, Config())
anotherhost = inventory.get_host('anotherhost')
connect_all(state)
with patch('pyinfra.api.connectors.ssh.run_shell_command') as fake_run_command:
fake_run_command.return_value = False, MagicMock()
with self.assertRaises(PyinfraError) as context:
get_facts(state, 'command', ('fail command',))
assert context.exception.args[0] == 'No hosts remaining!'
fake_run_command.assert_called_with(
state,
anotherhost,
'fail command',
print_input=False,
print_output=False,
shell_executable=None,
su_user=None,
sudo=False,
sudo_user=None,
timeout=None,
env={},
use_sudo_password=False,
return_combined_output=True,
)
def test_get_fact_error_ignore(self):
inventory = make_inventory(hosts=('anotherhost',))
state = State(inventory, Config())
anotherhost = inventory.get_host('anotherhost')
connect_all(state)
state.current_op_global_kwargs = {
'sudo': False,
'sudo_user': None,
'use_sudo_password': False,
'su_user': None,
'ignore_errors': True,
'timeout': None,
'env': {},
}
with patch('pyinfra.api.connectors.ssh.run_shell_command') as fake_run_command:
fake_run_command.return_value = False, MagicMock()
fact_data = get_facts(state, 'command', ('fail command',))
assert fact_data == {anotherhost: None}
fake_run_command.assert_called_with(
state,
anotherhost,
'fail command',
print_input=False,
print_output=False,
shell_executable=None,
su_user=None,
sudo=False,
sudo_user=None,
timeout=None,
env={},
use_sudo_password=False,
return_combined_output=True,
)
| 30.662338 | 87 | 0.578357 | 481 | 4,722 | 5.39501 | 0.16632 | 0.03237 | 0.06474 | 0.02158 | 0.789981 | 0.749518 | 0.746435 | 0.717919 | 0.717919 | 0.699422 | 0 | 0.001554 | 0.318721 | 4,722 | 153 | 88 | 30.862745 | 0.805098 | 0 | 0 | 0.64 | 0 | 0 | 0.126853 | 0.037272 | 0 | 0 | 0 | 0 | 0.072 | 1 | 0.032 | false | 0.048 | 0.056 | 0 | 0.096 | 0.064 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
92fc0015b0574102495dc45a3e07ae4d019a53e0 | 69 | py | Python | .config/autokey/data/keymaps/Super-c.py | aiya000/dotfiles | 0743dcafc967df0838e2705f0f177de67fa6fadb | [
"MIT"
] | 17 | 2016-02-04T14:23:49.000Z | 2021-01-25T06:12:05.000Z | .config/autokey/data/keymaps/Super-c.py | aiya000/dotfiles | 0743dcafc967df0838e2705f0f177de67fa6fadb | [
"MIT"
] | 1 | 2019-02-21T14:12:31.000Z | 2019-05-22T12:13:58.000Z | .config/autokey/data/keymaps/Super-c.py | aiya000/dotfiles | 0743dcafc967df0838e2705f0f177de67fa6fadb | [
"MIT"
] | null | null | null | keyboard.send_keys("<menu>")
time.sleep(0.1)
keyboard.send_keys("c")
| 17.25 | 28 | 0.724638 | 12 | 69 | 4 | 0.75 | 0.5 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030303 | 0.043478 | 69 | 3 | 29 | 23 | 0.69697 | 0 | 0 | 0 | 0 | 0 | 0.101449 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
13000cec858638963a112e34d009e00ba935bb1a | 1,846 | py | Python | lib/dataframe_parser.py | gwilson253/hypothetical_taco | 7898809817e21b5b8c9a8bd2428927b90e2fc56c | [
"MIT"
] | null | null | null | lib/dataframe_parser.py | gwilson253/hypothetical_taco | 7898809817e21b5b8c9a8bd2428927b90e2fc56c | [
"MIT"
] | null | null | null | lib/dataframe_parser.py | gwilson253/hypothetical_taco | 7898809817e21b5b8c9a8bd2428927b90e2fc56c | [
"MIT"
] | null | null | null |
"""
Will likely use
- openpyxl
- pandas
"""
from abc import ABC, abstractmethod
class DataFrameParser(ABC):
def __init__(self, template_file):
super().__init__()
self.template_file = template_file
self.error_df = None
@abstractmethod
def parse_dataframe(self):
pass
@abstractmethod
def validate_dataframe(self):
pass
def get_dataframe(self):
df = self.parse_dataframe()
self.error_df = self.validate_dataframe()
        # Avoid the ambiguous truth value of a pandas DataFrame; len() works for both lists and frames.
        if self.error_df is not None and len(self.error_df):
            raise ValueError("Value errors were detected in the CBD Template component dataframes.")
return df
class HeaderDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
def check_header_labels(self):
"""Verify header labels are are correct"""
pass
class DateDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
class YouthStylesDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
class LocalCurrencyDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
class FabricDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
class TrimsDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
class NoSewAppDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
class PackagingDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
class SummaryDFParser(DataFrameParser):
def __init__(self, template_file):
super().__init__(template_file)
| 21.218391 | 100 | 0.697725 | 193 | 1,846 | 6.098446 | 0.274611 | 0.214104 | 0.149533 | 0.186916 | 0.508071 | 0.508071 | 0.508071 | 0.480884 | 0.480884 | 0.480884 | 0 | 0 | 0.210726 | 1,846 | 86 | 101 | 21.465116 | 0.807824 | 0.039003 | 0 | 0.510638 | 0 | 0 | 0.038658 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.297872 | false | 0.06383 | 0.021277 | 0 | 0.553191 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
# conflict_simulation/utils.py (appliedprotocolresearch/mahalo, MIT license)

import random
def exponential_latency(avg_latency):
    """Represents the latency to transfer messages."""
    return lambda: 1 + int(random.expovariate(1) * avg_latency)
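A self-contained check of the sampler's behavior (the function is re-declared so the snippet runs on its own; an average of 10 ticks is an arbitrary choice for illustration):

```python
import random

def exponential_latency(avg_latency):
    """Represents the latency to transfer messages."""
    return lambda: 1 + int(random.expovariate(1) * avg_latency)

random.seed(0)
delay = exponential_latency(10)
samples = [delay() for _ in range(1000)]

# Every latency is at least 1 tick, and the sample mean sits near
# 1 + avg_latency because expovariate(1) has mean 1.
assert min(samples) >= 1
assert 5 < sum(samples) / len(samples) < 20
```

The `1 +` offset guarantees a message never arrives in the same tick it was sent, even when the exponential draw truncates to zero.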
# backend/app/test/fixtures.py (kzkaneoka/youtube-money-calculator, MIT license)

import pytest
from app import create_app, db as _db


@pytest.fixture
def app():
    return create_app("test")


@pytest.fixture
def client(app):
    return app.test_client()


@pytest.fixture
def db(app):
    # The import is aliased to _db so this fixture's name does not shadow it.
    with app.app_context():
        _db.drop_all()
        _db.create_all()
        yield _db
# maya/analytics/analyticGPUDeformers.py (arjun-namdeo/py_stubs, MIT license)

from maya.analytics.decorators import addHelp
from maya.analytics.BaseAnalytic import BaseAnalytic
from maya.analytics.decorators import makeAnalytic
from maya.analytics.decorators import addMethodDocs
from maya.debug.emModeManager import emModeManager


class analyticGPUDeformers(BaseAnalytic):
    """
    Analyze the usage mode of deformer nodes.
    """

    def run(self):
        """
        Examine animated deformer nodes and check how they are used.

        If the 'details' option is set the CSV columns are:
            DeformerNode      : Name of the animated deformer node
            Type              : Type for this node
            SupportedGeometry : True if the geometry processed by the animated
                                deformer node is supported by the deformer
                                evaluator

        otherwise the CSV columns are:
            DeformerMode      : Description of the usage for the animated
                                deformer node
            Type              : Deformer type
            SupportedGeometry : True if the geometry processed by the animated
                                deformer nodes is supported by the deformer
                                evaluator
            Count             : Number of animated deformer nodes in this mode

        See is_supported_geometry() for what criteria a geometry must meet to
        be supported.

        One row is output for every animated deformer node.

        Return True if the analysis succeeded, else False.
        """
        pass

    def help():
        """
        Call this method to print the class documentation, including all methods.
        """
        pass

    def is_supported_geometry(geometry):
        """
        Checks if the geometry is supported by the deformer evaluator.

        For it to be supported, it must:
            1) Be a mesh
            2) Not have a connected output
            3) Have at least k vertices, where k=2000 on NVidia hardware
               (hard-coded value)
        """
        pass

    ANALYTIC_NAME = 'GPUDeformers'
    __fulldocs__ = "Analyze the usage mode of deformer nodes.\nBase class for output for analytics.\n\nThe default location for the analytic output is in a subdirectory\ncalled 'MayaAnalytics' in your temp directory. You can change that\nat any time by calling set_output_directory().\n\nClass static member:\n ANALYTIC_NAME : Name of the analytic\n\nClass members:\n directory : Directory the output will go to\n is_static : True means this analytic doesn't require a file to run\n logger : Logging object for errors, warnings, and messages\n plug_namer : Object creating plug names, possibly anonymous\n node_namer : Object creating node names, possibly anonymous\n csv_output : Location to store legacy CSV output\n plug_namer : Set by option 'anonymous' - if True then make plug names anonymous\n node_namer : Set by option 'anonymous' - if True then make node names anonymous\n __options : List of per-analytic options\n\n\tMethods\n\t-------\n\tdebug : Utility to standardize debug messages coming from analytics.\n\n\terror : Utility to standardize errors coming from analytics.\n\n\testablish_baseline : This is run on an empty scene, to give the analytic a chance to\n\t establish any baseline data it might need (e.g. the nodes in an\n\t empty scene could all be ignored by the analytic)\n\t \n\t Base implementation does nothing. Derived classes should call\n\t their super() method though, in case something does get added.\n\n\thelp : Call this method to print the class documentation, including all methods.\n\n\tis_supported_geometry : Checks if the geometry is supported by deformer evaluator.\n\t \n\t For it to be supported, it must:\n\t 1) Be a mesh\n\t 2) Not have a connected output\n\t 3) Have at least k vertices, where k=2000 on NVidia hardware (hard-coded value)\n\n\tjson_file : Although an analytic is free to create any set of output files it\n\t wishes there will always be one master JSON file containing the\n\n\tlog : Utility to standardize logging messages coming from analytics.\n\n\tmarker_file : Returns the name of the marker file used to indicate that the\n\t computation of an analytic is in progress. If this file remains\n\t in a directory after the analytic has run that means it was\n\t interrupted and the data is not up to date.\n\t \n\t This file provides a safety measure against machines going down\n\t or analytics crashing.\n\n\tname : Get the name of this type of analytic\n\n\toption : Return TRUE if the option specified has been set on this analytic.\n\t option: Name of option to check\n\n\toutput_files : This is used to get the list of files the analytic will generate.\n\t There will always be a JSON file generated which contains at minimum\n\t the timing information. An analytic should override this method only\n\t if they are adding more output files (e.g. a .jpg file).\n\t \n\t This should only be called after the final directory has been set.\n\n\trun : Examine animated deformers nodes and check how they are used.\n\t \n\t If the 'details' option is set the CSV columns are:\n\t DeformerNode : Name of the animated deformer node\n\t Type : Type for this node\n\t SupportedGeometry : True if the geometry processed by animated\n\t deformer node is supported by deformer evaluator\n\t \n\t otherwise the CSV columns are:\n\t DeformerMode : Description of the usage for the animated deformer node\n\t Type : Deformer type\n\t SupportedGeometry : True if the geometry processed by animated\n\t deformer nodes is supported by deformer evaluator\n\t Count : Number of animated deformer nodes in this mode\n\t \n\t See is_supported_geometry() for what criteria a geometry must meet to be supported.\n\t \n\t One row is output for every animated deformer node.\n\t \n\t Return True if the analysis succeeded, else False\n\n\tset_options : Modify the settings controlling the run operation of the analytic.\n\t Override this method if your analytic has some different options\n\t available to it, but be sure to call this parent version after since\n\t it sets common options.\n\n\tset_output_directory : Call this method to set a specific directory as the output location.\n\t The special names 'stdout' and 'stderr' are recognized as the\n\t output and error streams respectively rather than a directory.\n\n\twarning : Utility to standardize warnings coming from analytics.\n"
    is_static = False
    OPTION_DETAILS = 'details'
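The support criteria documented in `is_supported_geometry` reduce to three boolean checks. A toy illustration with a stand-in geometry record (the attribute names are invented here; the real check goes through the Maya API, not a namedtuple):

```python
from collections import namedtuple

MIN_VERTICES = 2000  # the hard-coded NVidia threshold from the docstring

# Stand-in for a Maya geometry node; not the real API.
Geometry = namedtuple("Geometry", "node_type has_connected_output vertex_count")

def is_supported_geometry(geometry):
    """Mesh, no connected output, and at least MIN_VERTICES vertices."""
    return (geometry.node_type == "mesh"
            and not geometry.has_connected_output
            and geometry.vertex_count >= MIN_VERTICES)

assert is_supported_geometry(Geometry("mesh", False, 5000))
assert not is_supported_geometry(Geometry("mesh", False, 100))          # too small
assert not is_supported_geometry(Geometry("nurbsSurface", False, 5000)) # not a mesh
```

The vertex-count floor exists because offloading tiny meshes to the GPU costs more in transfer overhead than it saves in evaluation time.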
# src/ggrc/rbac/user_permissions.py (ggrc-core, Apache-2.0 license)

# Copyright (C) 2013 Google Inc., authors, and contributors <see AUTHORS file>
# Licensed under http://www.apache.org/licenses/LICENSE-2.0 <see LICENSE file>
# Created By: david@reciprocitylabs.com
# Maintained By: david@reciprocitylabs.com
class UserPermissions(object):
    """Interface required for extensions providing user rights information for
    role-based access control.
    """

    def is_allowed_create(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to create a resource of the
        specified type in the context."""
        raise NotImplementedError()

    def is_allowed_read(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to read a resource of the
        specified type in the context."""
        raise NotImplementedError()

    def is_allowed_read_for(self, instance):
        """Whether or not the user is allowed to read this particular resource
        instance. This is in contrast to ``is_allowed_read`` which checks that
        the user can read resources of this type, though may not be able to
        read any one particular instance depending upon the permissions
        implementation.
        """
        raise NotImplementedError()

    def is_allowed_update(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to update a resource of the
        specified type in the context."""
        raise NotImplementedError()

    def is_allowed_update_for(self, instance):
        """Whether or not the user is allowed to update this particular
        resource instance. This is in contrast to ``is_allowed_update`` which
        checks that the user can update resources of this type, though may not
        be able to update any one particular instance depending upon the
        permissions implementation.
        """
        raise NotImplementedError()

    def is_allowed_delete(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to delete a resource of the
        specified type in the context."""
        raise NotImplementedError()

    def is_allowed_delete_for(self, instance):
        """Whether or not the user is allowed to delete this particular
        resource instance. This is in contrast to ``is_allowed_delete`` which
        checks that the user can delete resources of this type, though may not
        be able to delete any one particular instance depending upon the
        permissions implementation.
        """
        raise NotImplementedError()

    def create_contexts_for(self, resource_type):
        """All contexts in which the user has create permission."""
        raise NotImplementedError()

    def read_contexts_for(self, resource_type):
        """All contexts in which the user has read permission."""
        raise NotImplementedError()

    def update_contexts_for(self, resource_type):
        """All contexts in which the user has update permission."""
        raise NotImplementedError()

    def delete_contexts_for(self, resource_type):
        """All contexts in which the user has delete permission."""
        raise NotImplementedError()
class BasicUserPermissions(UserPermissions):
    """Basic implementation of a UserPermissions object."""

    def __init__(
            self, create_contexts=None, read_contexts=None,
            update_contexts=None, delete_contexts=None):
        """Args:
          create_contexts (dict of (resource_type,[context_id])): The contexts
            where the user is allowed to create a resource of a given type.
          read_contexts (dict of (resource_type,[context_id])): The contexts
            where the user is allowed to read a resource of a given type.
          update_contexts (dict of (resource_type,[context_id])): The contexts
            where the user is allowed to update a resource of a given type.
          delete_contexts (dict of (resource_type,[context_id])): The contexts
            where the user is allowed to delete a resource of a given type.
        """
        self.create_contexts = create_contexts or {}
        self.read_contexts = read_contexts or {}
        self.update_contexts = update_contexts or {}
        self.delete_contexts = delete_contexts or {}

    def is_allowed_create(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to create a resource of the
        specified type in the context."""
        return resource_type in self.create_contexts and \
            context_id in self.create_contexts[resource_type]

    def is_allowed_read(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to read a resource of the
        specified type in the context."""
        return resource_type in self.read_contexts and \
            context_id in self.read_contexts[resource_type]

    def is_allowed_update(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to update a resource of the
        specified type in the context."""
        return resource_type in self.update_contexts and \
            context_id in self.update_contexts[resource_type]

    def is_allowed_delete(self, resource_type, resource_id, context_id):
        """Whether or not the user is allowed to delete a resource of the
        specified type in the context."""
        return resource_type in self.delete_contexts and \
            context_id in self.delete_contexts[resource_type]

    def is_allowed_delete_for(self, instance):
        return True

    def create_contexts_for(self, resource_type):
        """All contexts in which the user has create permission."""
        return self.create_contexts.get(resource_type) or []

    def read_contexts_for(self, resource_type):
        """All contexts in which the user has read permission."""
        return self.read_contexts.get(resource_type) or []

    def update_contexts_for(self, resource_type):
        """All contexts in which the user has update permission."""
        return self.update_contexts.get(resource_type) or []

    def delete_contexts_for(self, resource_type):
        """All contexts in which the user has delete permission."""
        return self.delete_contexts.get(resource_type) or []
# numba/dppl/tests/dppl/test_numpy_trigonomteric_functions.py (DrTodd13/numba, BSD-2-Clause license)

#! /usr/bin/env python
from __future__ import print_function

from timeit import default_timer as time
import sys

import numpy as np

from numba import dppl, njit
from numba.dppl.testing import unittest
from numba.dppl.testing import DPPLTestCase


class TestNumpy_math_functions(DPPLTestCase):
    N = 10
    a = np.array(np.random.random(N), dtype=np.float32)
    b = np.array(np.random.random(N), dtype=np.float32)

    def test_sin(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.sin(a)
            return c

        c = f(self.a)
        d = np.sin(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_cos(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.cos(a)
            return c

        c = f(self.a)
        d = np.cos(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_tan(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.tan(a)
            return c

        c = f(self.a)
        d = np.tan(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_arcsin(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.arcsin(a)
            return c

        c = f(self.a)
        d = np.arcsin(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_arccos(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.arccos(a)
            return c

        c = f(self.a)
        d = np.arccos(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_arctan(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.arctan(a)
            return c

        c = f(self.a)
        d = np.arctan(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_arctan2(self):
        @njit(parallel={'offload': True})
        def f(a, b):
            c = np.arctan2(a, b)
            return c

        c = f(self.a, self.b)
        d = np.arctan2(self.a, self.b)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_sinh(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.sinh(a)
            return c

        c = f(self.a)
        d = np.sinh(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_cosh(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.cosh(a)
            return c

        c = f(self.a)
        d = np.cosh(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_tanh(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.tanh(a)
            return c

        c = f(self.a)
        d = np.tanh(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_arcsinh(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.arcsinh(a)
            return c

        c = f(self.a)
        d = np.arcsinh(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_arccosh(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.arccosh(a)
            return c

        # arccosh is only defined for values >= 1.
        input_arr = np.random.randint(1, self.N, size=(self.N))
        c = f(input_arr)
        d = np.arccosh(input_arr)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_arctanh(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.arctanh(a)
            return c

        c = f(self.a)
        d = np.arctanh(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_deg2rad(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.deg2rad(a)
            return c

        c = f(self.a)
        d = np.deg2rad(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)

    def test_rad2deg(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.rad2deg(a)
            return c

        c = f(self.a)
        d = np.rad2deg(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-2)

    def test_degrees(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.degrees(a)
            return c

        c = f(self.a)
        d = np.degrees(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-2)

    def test_radians(self):
        @njit(parallel={'offload': True})
        def f(a):
            c = np.radians(a)
            return c

        c = f(self.a)
        d = np.radians(self.a)
        max_abs_err = abs(c.sum() - d.sum())
        self.assertTrue(max_abs_err < 1e-5)


if __name__ == '__main__':
    unittest.main()
# src/munge/__init__.py (20c/munge, Apache-2.0 license)

# namespace imports
from .codec import find_datafile # noqa
from .codec import get_codec # noqa
from .codec import get_codecs # noqa
from .codec import load_datafile # noqa
from .config import Config

if not globals().get("MUNGE_EXPLICIT_IMPORT", False):
    from .codec import all  # noqa
else:
    print(globals())
    assert 0
# CodeWars/8 Kyu/Printing Array elements with Comma delimiters.py (anubhab-code/Competitive-Programming, MIT license)

def print_array(arr):
    return ','.join(str(a) for a in arr)
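A quick sanity check of the kata solution (despite the name, the function returns the comma-joined string rather than printing it):

```python
def print_array(arr):
    return ','.join(str(a) for a in arr)

assert print_array([1, 2, 3]) == '1,2,3'
assert print_array(['a']) == 'a'
assert print_array([]) == ''   # joining an empty iterable yields ''
```

Converting each element with `str()` inside the generator lets the same one-liner handle mixed numeric and string inputs.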
# intents/slice_compare.py (googleinterns/debaised-analysis, Apache-2.0 license)

"""
Copyright 2020 Google LLC
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
"""This module contains the slice-compare intent.

The slice-compare intent presents the results so that the user can easily
compare the data in the way they want.
It also supports some operations, such as cropping based on a date range,
slicing (removing rows that do not satisfy the given conditions), and group by.
Some of these operations are optional.
"""
from util import aspects, oversights_order, rank_oversights
from oversights.simpsons_paradox import simpsons_paradox
from oversights.calendar_vs_experience_time import calendar_vs_experience_time
from oversights.benchmark_set_too_different import benchmark_set_too_different
from oversights.top_down_error import top_down_error
from util.enums import SummaryOperators, Filters
import pandas
def slice_compare(table, metric, all_dimensions, all_metric,
                  slice_compare_column, slice1, slice2,
                  summary_operator, **kwargs):
    """This function returns both the results according to the intent
    as well as the debiasing suggestions.

    Also, if a summary operator is applied, the metric column is renamed to
    "<summary operator> of metric".

    Some of the oversights considered in this intent are-
    1. Simpson's paradox

    Args:
        table: Type-pandas.dataframe
            It has the contents of the csv file
        metric: Type-string
            It is the name of the column according to which grouping will be
            done. The summary operator is applied on the metric. The metric
            could be a column containing strings if we are applying the count
            operator on it.
        dimensions: Type-list of str
            It is the name of the columns we want.
            In 'compare batsman A and B according to total_runs',
            the dimension is 'batsman'. We group by the dimensions.
        all_dimensions: Type-list of str
            It is the list of dimension columns in the initial table
        all_metric: Type-list of str
            It is the list of metric columns in the initial table
        date_range: Type-tuple
            Tuple of start_date and end_date
        date_column_name: Type-str
            It is the name of the column which contains the date
        day_first: Type-bool
            day_first denotes whether the day occurs before the month in the
            dates in the date column.
            Example - '29-02-19', here day_first is True
        slices: Type-List of tuples
            Each tuple represents a condition to keep the row.
            (column_name, filter, value)
            column_name - the column that the condition is applied upon.
            filter - Filters enum members, ex. Filters.IN
        slice_compare_column: Type-list of string
            The first element denotes the column name by which we will do the
            comparison; the rest are the values of that column by which we
            will compare the slices.
        summary_operator: Type-summary_operators enum members
            It denotes the summary operator, after grouping by dimensions.
            ex. SummaryOperators.MAX, SummaryOperators.SUM
            Note-summary_operator is always applied on the metric column
            passed, and only when grouping is done.

    Returns:
        The function will return both suggestions and the results in a tuple.
        (results, suggestions)
        results: Type - pandas dataframe, The results of the intended
            slice-compare
        suggestions: Type - List of strings, List of suggestions.
    """
    date_column_name = kwargs.get('date_column_name', 'date')
    date_range = kwargs.get('date_range', None)
    day_first = kwargs.get('day_first', False)
    slices = kwargs.get('slices', None)
    dimensions = kwargs.get('dimensions', None)

    if slice2 == "*":
        result_tuple = _slice_compare_results_for_all(table, metric,
                                                      slice_compare_column,
                                                      slice1, slice2,
                                                      summary_operator,
                                                      slices=slices,
                                                      dimensions=dimensions,
                                                      date_column_name=date_column_name,
                                                      date_range=date_range,
                                                      day_first=day_first)
        result_table = result_tuple[0]
        suggestions = result_tuple[1]

        if summary_operator in (SummaryOperators.MEAN, SummaryOperators.MEDIAN):
            suggestions = benchmark_set_too_different(table, metric, all_metric,
                                                      slice_compare_column, slice1,
                                                      summary_operator,
                                                      slices=slices,
                                                      dimensions=dimensions,
                                                      date_column_name=date_column_name,
                                                      day_first=day_first,
                                                      date_range=date_range)
        return (result_table, suggestions)

    result_tuple = _slice_compare_results(table, metric, slice_compare_column,
                                          slice1, slice2, summary_operator,
                                          slices=slices, dimensions=dimensions,
                                          date_column_name=date_column_name,
                                          date_range=date_range,
                                          day_first=day_first)
    result_table = result_tuple[0]
    suggestions = result_tuple[1]

    simpsons_paradox_suggestion = simpsons_paradox(table, metric, all_dimensions,
                                                   slice_compare_column, slice1,
                                                   slice2, summary_operator,
                                                   dimensions=dimensions,
                                                   date_column_name=date_column_name,
                                                   date_range=date_range,
                                                   day_first=day_first,
                                                   slices=slices)

    top_down_error_suggestion = top_down_error(table, metric, all_dimensions,
                                               slice_compare_column, slice1,
                                               slice2, summary_operator,
                                               dimensions=dimensions,
                                               date_column_name=date_column_name,
                                               date_range=date_range,
                                               day_first=day_first,
                                               slices=slices)

    calendar_vs_experience_time_suggestion = calendar_vs_experience_time(
        table, metric, all_dimensions,
        slice_compare_column, slice1,
        slice2, summary_operator,
        dimensions=dimensions,
        date_column_name=date_column_name,
        date_range=date_range,
        day_first=day_first,
        slices=slices)

    suggestions = simpsons_paradox_suggestion + top_down_error_suggestion

    if calendar_vs_experience_time_suggestion is not None:
        # Append the computed suggestion, not the oversight function itself.
        suggestions.append(calendar_vs_experience_time_suggestion)

    order = oversights_order.ORDER_IN_SLICE_COMPARE
    suggestions = rank_oversights.rank_oversights(suggestions, order)

    if summary_operator is not None:
        result_table = aspects.update_metric_column_name(result_table,
                                                         summary_operator, metric)

    return (result_table, suggestions)
def _slice_compare_results(table, metric, slice_compare_column,
slice1, slice2, summary_operator, **kwargs):
"""This function will implement the slice-compare intent
Also removes the tuples that do not lie in the given date range.
The arguments 'table, metric,dimension,slices_compare_column,
summary_operator' are not optional, so they are passed as it is,
'date_range','slices' will be passed in kwargs.
If some the optional args are None(not passed),
it is assumed that we don't have to apply them.
Args:
table: Type-pandas.dataframe
It has the contents of the csv file
metric: Type-string
It is the name of the column according to which grouping will be done.
summary operator is applied on metric. Metric could a column
containing strings, if we are applying count operator on it.
dimensions: Type-list of str
It is the name of column we want.
'compare batsman A and B according to total_runs'
dimension is 'batsman'. we group by dimensions.
date_range: Type-tuple
Tuple of start_date and end_date
date_column_name: Type-str
It is the name of column which contains date
day_first: Type-str
It is required by datetime.strp_time to parse the date in the format
Format Codes
https://docs.python.org/3/library/datetime.html#strftime-and-strptime-behavior
slices: Type-List of tuples
Tuple represents the conditon to keep the row.
(column_name, filter, value)
column_name - is the value of the column that the
condition is applied upon.
filter - Filters enum members, ex. Filters.IN
slice_compare_column: Type-list of string
first element denotes the column name by which we will do comparision.
rest elements will the value belongs to that column by which we
will compare the slices.
summary_operator: Type-summary_operators enum members
It denotes the summary operator, after grouping by dimensions.
ex. SummaryOperators.MAX, SummaryOperators.SUM
Note-summary_operator is always applied on metric column passed,
and only when grouping is done
Returns:
The function will return both suggestions and the results in a tuple.
(results, suggestions)
results: Type - pandas dataframe, The results of the intended slice-compare
suggestions: Type - List of strings, List of suggestions.
"""
date_column_name = kwargs.get('date_column_name', 'date')
date_range = kwargs.get('date_range', None)
day_first = kwargs.get('day_first', False)
slices = kwargs.get('slices', None)
dimensions = kwargs.get('dimensions', None)
table = aspects.apply_date_range(table, date_range,
date_column_name,
day_first)
if slices is None:
slices = [(slice_compare_column, Filters.IN, [slice1, slice2])]
else:
slices.append((slice_compare_column, Filters.IN, [slice1, slice2]))
table = aspects.slice_table(table, slices)
# collecting the columns not to be removed
required_columns = []
if dimensions is not None:
required_columns = dimensions.copy()
required_columns.append(slice_compare_column)
required_columns.append(metric)
table = aspects.crop_other_columns(table, required_columns)
# slice_compare_column should be the last element of the group
# so that groupby will show them together for every grouping
grouping_columns = []
if dimensions is not None:
grouping_columns = dimensions.copy()
grouping_columns.append(slice_compare_column)
after_group_by = aspects.group_by(table, grouping_columns, summary_operator)
result_table = after_group_by['table']
suggestions = after_group_by['suggestions']
return (result_table, suggestions)
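The slicing-and-grouping sequence above can be sketched with plain pandas; this is a minimal illustration of what the `aspects.slice_table` / `aspects.group_by` helpers do here, with a made-up table and made-up column names:

```python
import pandas as pd

table = pd.DataFrame({
    'batsman': ['A', 'A', 'B', 'B', 'C'],
    'total_runs': [10, 20, 5, 40, 7],
})

# (slice_compare_column, Filters.IN, [slice1, slice2]) keeps only the
# rows whose value is one of the two slices being compared.
sliced = table[table['batsman'].isin(['A', 'B'])]

# group by dimensions + slice_compare_column, then apply the summary
# operator (SummaryOperators.SUM here) on the metric column.
result = sliced.groupby('batsman', as_index=False)['total_runs'].sum()
print(result.to_dict('records'))
```

This prints one aggregated row per slice, which is the shape of the `result_table` returned above.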
def _slice_compare_results_for_all(table, metric, slice_compare_column,
slice1, slice2, summary_operator, **kwargs):
"""This function will implement the slice-compare intent
Also removes the tuples that do not lie in the given date range.
The arguments 'table, metric, dimension, slice_compare_column,
summary_operator' are not optional, so they are passed as they are;
'date_range', 'slices' will be passed in kwargs.
If some of the optional args are None (not passed),
it is assumed that they don't have to be applied.
Args:
table: Type-pandas.dataframe
It has the contents of the csv file
metric: Type-string
It is the name of the column according to which grouping will be done.
The summary operator is applied on the metric. The metric could be a column
containing strings, if we are applying the count operator on it.
dimensions: Type-list of str
It is the list of the column names we group by. In
'compare batsman A and B according to total_runs',
the dimension is 'batsman'. We group by the dimensions.
date_range: Type-tuple
Tuple of start_date and end_date
date_column_name: Type-str
It is the name of column which contains date
day_first: Type-bool
It is required by datetime.strptime to parse the date in the given format.
Format Codes
https://docs.python.org/3/library/datetime.html#strftime-and-strptime-behavior
slices: Type-List of tuples
Each tuple represents the condition to keep the row.
(column_name, filter, value)
column_name - the name of the column that the
condition is applied upon.
filter - Filters enum members, ex. Filters.IN
slice_compare_column: Type-str
It denotes the column name by which we will do the comparison;
slice1 and slice2 are the values of that column between
which the slices are compared.
summary_operator: Type-summary_operators enum members
It denotes the summary operator, after grouping by dimensions.
ex. SummaryOperators.MAX, SummaryOperators.SUM
Note - summary_operator is always applied on the metric column passed,
and only when grouping is done.
Returns:
The function will return both suggestions and the results in a tuple.
(results, suggestions)
results: Type - pandas dataframe, The results of the intended slice-compare
suggestions: Type - List of strings, List of suggestions.
"""
date_column_name = kwargs.get('date_column_name', 'date')
date_range = kwargs.get('date_range', None)
day_first = kwargs.get('day_first', False)
slices = kwargs.get('slices', None)
dimensions = kwargs.get('dimensions', None)
table = aspects.apply_date_range(table, date_range,
date_column_name,
day_first)
table = aspects.slice_table(table, slices)
# collecting the columns not to be removed
required_columns = []
if dimensions is not None:
required_columns = dimensions.copy()
required_columns.append(slice_compare_column)
required_columns.append(metric)
table = aspects.crop_other_columns(table, required_columns)
required_table_for_one = aspects.slice_table(table, [(slice_compare_column,
Filters.EQUAL_TO, slice1)])
required_table_for_all = table.copy()
required_table_for_all[slice_compare_column] = 'ALL'
updated_table = pandas.concat([required_table_for_all, required_table_for_one])
updated_table = updated_table.reset_index(drop=True)
# collecting the columns on which we shall do grouping
grouping_columns = []
if dimensions is not None:
grouping_columns = dimensions.copy()
grouping_columns.append(slice_compare_column)
after_group_by = aspects.group_by(updated_table, grouping_columns, summary_operator)
result_table = after_group_by['table']
suggestions = after_group_by['suggestions']
return (result_table, suggestions) | 46.727027 | 109 | 0.608017 | 2,003 | 17,289 | 5.073889 | 0.136795 | 0.040146 | 0.031684 | 0.024796 | 0.759913 | 0.745548 | 0.742005 | 0.73433 | 0.730001 | 0.724884 | 0 | 0.003872 | 0.342646 | 17,289 | 370 | 110 | 46.727027 | 0.890365 | 0.47082 | 0 | 0.673913 | 0 | 0 | 0.023424 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021739 | false | 0 | 0.050725 | 0 | 0.101449 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
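The 'ALL' comparison trick in `_slice_compare_results_for_all` above - relabelling a copy of the whole table as 'ALL' and concatenating it with the slice1 subset before grouping - can be sketched with plain pandas (the data and column names are illustrative):

```python
import pandas as pd

table = pd.DataFrame({
    'batsman': ['A', 'A', 'B'],
    'total_runs': [10, 20, 5],
})

# rows matching (slice_compare_column, Filters.EQUAL_TO, slice1)
for_one = table[table['batsman'] == 'A']

# a copy of the whole table relabelled 'ALL', so that grouping
# yields one aggregate row per dimension for the full table
for_all = table.copy()
for_all['batsman'] = 'ALL'

updated = pd.concat([for_all, for_one]).reset_index(drop=True)
result = updated.groupby('batsman', as_index=False)['total_runs'].sum()
print(result.to_dict('records'))
```

Grouping the concatenated table then puts the slice1 aggregate and the 'ALL' aggregate side by side for comparison.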
b93d6b5cc1c679aeac9c89df555e216668e92307 | 196 | py | Python | util/pip_package/open3d/win32/__init__.py | junzhang2016/Open3D | 7489c9243607a40f7cdf41879e492eb92fa9852f | [
"MIT"
] | null | null | null | util/pip_package/open3d/win32/__init__.py | junzhang2016/Open3D | 7489c9243607a40f7cdf41879e492eb92fa9852f | [
"MIT"
] | null | null | null | util/pip_package/open3d/win32/__init__.py | junzhang2016/Open3D | 7489c9243607a40f7cdf41879e492eb92fa9852f | [
"MIT"
] | 1 | 2019-09-18T02:09:23.000Z | 2019-09-18T02:09:23.000Z | # Open3D: www.open3d.org
# The MIT License (MIT)
# See license file or visit www.open3d.org for details
import importlib
globals().update(importlib.import_module('open3d.win32.open3d').__dict__) | 28 | 73 | 0.77551 | 29 | 196 | 5.068966 | 0.655172 | 0.122449 | 0.163265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04 | 0.107143 | 196 | 7 | 73 | 28 | 0.8 | 0.494898 | 0 | 0 | 0 | 0 | 0.197917 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
b9418002e9ff2bbc030f82f21792153c24e63e04 | 102 | py | Python | test/static_analysis/generators.py | mrclary/jedi | 803c3cb271ead297c4fe3ca916b54ed05a623459 | [
"MIT"
] | 4,213 | 2015-01-02T15:43:22.000Z | 2022-03-31T16:15:01.000Z | test/static_analysis/generators.py | mrclary/jedi | 803c3cb271ead297c4fe3ca916b54ed05a623459 | [
"MIT"
] | 1,392 | 2015-01-02T18:43:39.000Z | 2022-03-27T18:43:59.000Z | test/static_analysis/generators.py | PeterJCLaw/jedi | 070f191f550990c23220d7f209df076178307cf6 | [
"MIT"
] | 525 | 2015-01-02T19:07:31.000Z | 2022-03-13T02:03:20.000Z | def generator():
yield 1
#! 11 type-error-not-subscriptable
generator()[0]
list(generator())[0]
| 12.75 | 34 | 0.676471 | 14 | 102 | 4.928571 | 0.785714 | 0.289855 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057471 | 0.147059 | 102 | 7 | 35 | 14.571429 | 0.735632 | 0.323529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
b95852100a286957e5e67f86abe02ff3cfd30d86 | 38,265 | py | Python | sdk/python/pulumi_azure_native/network/v20201101/express_route_circuit_peering.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/network/v20201101/express_route_circuit_peering.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/network/v20201101/express_route_circuit_peering.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
from ._enums import *
from ._inputs import *
__all__ = ['ExpressRouteCircuitPeeringArgs', 'ExpressRouteCircuitPeering']
@pulumi.input_type
class ExpressRouteCircuitPeeringArgs:
def __init__(__self__, *,
circuit_name: pulumi.Input[str],
resource_group_name: pulumi.Input[str],
azure_asn: Optional[pulumi.Input[int]] = None,
connections: Optional[pulumi.Input[Sequence[pulumi.Input['ExpressRouteCircuitConnectionArgs']]]] = None,
gateway_manager_etag: Optional[pulumi.Input[str]] = None,
id: Optional[pulumi.Input[str]] = None,
ipv6_peering_config: Optional[pulumi.Input['Ipv6ExpressRouteCircuitPeeringConfigArgs']] = None,
microsoft_peering_config: Optional[pulumi.Input['ExpressRouteCircuitPeeringConfigArgs']] = None,
name: Optional[pulumi.Input[str]] = None,
peer_asn: Optional[pulumi.Input[float]] = None,
peering_name: Optional[pulumi.Input[str]] = None,
peering_type: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringType']]] = None,
primary_azure_port: Optional[pulumi.Input[str]] = None,
primary_peer_address_prefix: Optional[pulumi.Input[str]] = None,
route_filter: Optional[pulumi.Input['SubResourceArgs']] = None,
secondary_azure_port: Optional[pulumi.Input[str]] = None,
secondary_peer_address_prefix: Optional[pulumi.Input[str]] = None,
shared_key: Optional[pulumi.Input[str]] = None,
state: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringState']]] = None,
stats: Optional[pulumi.Input['ExpressRouteCircuitStatsArgs']] = None,
vlan_id: Optional[pulumi.Input[int]] = None):
"""
The set of arguments for constructing a ExpressRouteCircuitPeering resource.
:param pulumi.Input[str] circuit_name: The name of the express route circuit.
:param pulumi.Input[str] resource_group_name: The name of the resource group.
:param pulumi.Input[int] azure_asn: The Azure ASN.
:param pulumi.Input[Sequence[pulumi.Input['ExpressRouteCircuitConnectionArgs']]] connections: The list of circuit connections associated with Azure Private Peering for this circuit.
:param pulumi.Input[str] gateway_manager_etag: The GatewayManager Etag.
:param pulumi.Input[str] id: Resource ID.
:param pulumi.Input['Ipv6ExpressRouteCircuitPeeringConfigArgs'] ipv6_peering_config: The IPv6 peering configuration.
:param pulumi.Input['ExpressRouteCircuitPeeringConfigArgs'] microsoft_peering_config: The Microsoft peering configuration.
:param pulumi.Input[str] name: The name of the resource that is unique within a resource group. This name can be used to access the resource.
:param pulumi.Input[float] peer_asn: The peer ASN.
:param pulumi.Input[str] peering_name: The name of the peering.
:param pulumi.Input[Union[str, 'ExpressRoutePeeringType']] peering_type: The peering type.
:param pulumi.Input[str] primary_azure_port: The primary port.
:param pulumi.Input[str] primary_peer_address_prefix: The primary address prefix.
:param pulumi.Input['SubResourceArgs'] route_filter: The reference to the RouteFilter resource.
:param pulumi.Input[str] secondary_azure_port: The secondary port.
:param pulumi.Input[str] secondary_peer_address_prefix: The secondary address prefix.
:param pulumi.Input[str] shared_key: The shared key.
:param pulumi.Input[Union[str, 'ExpressRoutePeeringState']] state: The peering state.
:param pulumi.Input['ExpressRouteCircuitStatsArgs'] stats: The peering stats of express route circuit.
:param pulumi.Input[int] vlan_id: The VLAN ID.
"""
pulumi.set(__self__, "circuit_name", circuit_name)
pulumi.set(__self__, "resource_group_name", resource_group_name)
if azure_asn is not None:
pulumi.set(__self__, "azure_asn", azure_asn)
if connections is not None:
pulumi.set(__self__, "connections", connections)
if gateway_manager_etag is not None:
pulumi.set(__self__, "gateway_manager_etag", gateway_manager_etag)
if id is not None:
pulumi.set(__self__, "id", id)
if ipv6_peering_config is not None:
pulumi.set(__self__, "ipv6_peering_config", ipv6_peering_config)
if microsoft_peering_config is not None:
pulumi.set(__self__, "microsoft_peering_config", microsoft_peering_config)
if name is not None:
pulumi.set(__self__, "name", name)
if peer_asn is not None:
pulumi.set(__self__, "peer_asn", peer_asn)
if peering_name is not None:
pulumi.set(__self__, "peering_name", peering_name)
if peering_type is not None:
pulumi.set(__self__, "peering_type", peering_type)
if primary_azure_port is not None:
pulumi.set(__self__, "primary_azure_port", primary_azure_port)
if primary_peer_address_prefix is not None:
pulumi.set(__self__, "primary_peer_address_prefix", primary_peer_address_prefix)
if route_filter is not None:
pulumi.set(__self__, "route_filter", route_filter)
if secondary_azure_port is not None:
pulumi.set(__self__, "secondary_azure_port", secondary_azure_port)
if secondary_peer_address_prefix is not None:
pulumi.set(__self__, "secondary_peer_address_prefix", secondary_peer_address_prefix)
if shared_key is not None:
pulumi.set(__self__, "shared_key", shared_key)
if state is not None:
pulumi.set(__self__, "state", state)
if stats is not None:
pulumi.set(__self__, "stats", stats)
if vlan_id is not None:
pulumi.set(__self__, "vlan_id", vlan_id)
@property
@pulumi.getter(name="circuitName")
def circuit_name(self) -> pulumi.Input[str]:
"""
The name of the express route circuit.
"""
return pulumi.get(self, "circuit_name")
@circuit_name.setter
def circuit_name(self, value: pulumi.Input[str]):
pulumi.set(self, "circuit_name", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the resource group.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="azureASN")
def azure_asn(self) -> Optional[pulumi.Input[int]]:
"""
The Azure ASN.
"""
return pulumi.get(self, "azure_asn")
@azure_asn.setter
def azure_asn(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "azure_asn", value)
@property
@pulumi.getter
def connections(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ExpressRouteCircuitConnectionArgs']]]]:
"""
The list of circuit connections associated with Azure Private Peering for this circuit.
"""
return pulumi.get(self, "connections")
@connections.setter
def connections(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ExpressRouteCircuitConnectionArgs']]]]):
pulumi.set(self, "connections", value)
@property
@pulumi.getter(name="gatewayManagerEtag")
def gateway_manager_etag(self) -> Optional[pulumi.Input[str]]:
"""
The GatewayManager Etag.
"""
return pulumi.get(self, "gateway_manager_etag")
@gateway_manager_etag.setter
def gateway_manager_etag(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "gateway_manager_etag", value)
@property
@pulumi.getter
def id(self) -> Optional[pulumi.Input[str]]:
"""
Resource ID.
"""
return pulumi.get(self, "id")
@id.setter
def id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "id", value)
@property
@pulumi.getter(name="ipv6PeeringConfig")
def ipv6_peering_config(self) -> Optional[pulumi.Input['Ipv6ExpressRouteCircuitPeeringConfigArgs']]:
"""
The IPv6 peering configuration.
"""
return pulumi.get(self, "ipv6_peering_config")
@ipv6_peering_config.setter
def ipv6_peering_config(self, value: Optional[pulumi.Input['Ipv6ExpressRouteCircuitPeeringConfigArgs']]):
pulumi.set(self, "ipv6_peering_config", value)
@property
@pulumi.getter(name="microsoftPeeringConfig")
def microsoft_peering_config(self) -> Optional[pulumi.Input['ExpressRouteCircuitPeeringConfigArgs']]:
"""
The Microsoft peering configuration.
"""
return pulumi.get(self, "microsoft_peering_config")
@microsoft_peering_config.setter
def microsoft_peering_config(self, value: Optional[pulumi.Input['ExpressRouteCircuitPeeringConfigArgs']]):
pulumi.set(self, "microsoft_peering_config", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the resource that is unique within a resource group. This name can be used to access the resource.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="peerASN")
def peer_asn(self) -> Optional[pulumi.Input[float]]:
"""
The peer ASN.
"""
return pulumi.get(self, "peer_asn")
@peer_asn.setter
def peer_asn(self, value: Optional[pulumi.Input[float]]):
pulumi.set(self, "peer_asn", value)
@property
@pulumi.getter(name="peeringName")
def peering_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the peering.
"""
return pulumi.get(self, "peering_name")
@peering_name.setter
def peering_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "peering_name", value)
@property
@pulumi.getter(name="peeringType")
def peering_type(self) -> Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringType']]]:
"""
The peering type.
"""
return pulumi.get(self, "peering_type")
@peering_type.setter
def peering_type(self, value: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringType']]]):
pulumi.set(self, "peering_type", value)
@property
@pulumi.getter(name="primaryAzurePort")
def primary_azure_port(self) -> Optional[pulumi.Input[str]]:
"""
The primary port.
"""
return pulumi.get(self, "primary_azure_port")
@primary_azure_port.setter
def primary_azure_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "primary_azure_port", value)
@property
@pulumi.getter(name="primaryPeerAddressPrefix")
def primary_peer_address_prefix(self) -> Optional[pulumi.Input[str]]:
"""
The primary address prefix.
"""
return pulumi.get(self, "primary_peer_address_prefix")
@primary_peer_address_prefix.setter
def primary_peer_address_prefix(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "primary_peer_address_prefix", value)
@property
@pulumi.getter(name="routeFilter")
def route_filter(self) -> Optional[pulumi.Input['SubResourceArgs']]:
"""
The reference to the RouteFilter resource.
"""
return pulumi.get(self, "route_filter")
@route_filter.setter
def route_filter(self, value: Optional[pulumi.Input['SubResourceArgs']]):
pulumi.set(self, "route_filter", value)
@property
@pulumi.getter(name="secondaryAzurePort")
def secondary_azure_port(self) -> Optional[pulumi.Input[str]]:
"""
The secondary port.
"""
return pulumi.get(self, "secondary_azure_port")
@secondary_azure_port.setter
def secondary_azure_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secondary_azure_port", value)
@property
@pulumi.getter(name="secondaryPeerAddressPrefix")
def secondary_peer_address_prefix(self) -> Optional[pulumi.Input[str]]:
"""
The secondary address prefix.
"""
return pulumi.get(self, "secondary_peer_address_prefix")
@secondary_peer_address_prefix.setter
def secondary_peer_address_prefix(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secondary_peer_address_prefix", value)
@property
@pulumi.getter(name="sharedKey")
def shared_key(self) -> Optional[pulumi.Input[str]]:
"""
The shared key.
"""
return pulumi.get(self, "shared_key")
@shared_key.setter
def shared_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "shared_key", value)
@property
@pulumi.getter
def state(self) -> Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringState']]]:
"""
The peering state.
"""
return pulumi.get(self, "state")
@state.setter
def state(self, value: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringState']]]):
pulumi.set(self, "state", value)
@property
@pulumi.getter
def stats(self) -> Optional[pulumi.Input['ExpressRouteCircuitStatsArgs']]:
"""
The peering stats of express route circuit.
"""
return pulumi.get(self, "stats")
@stats.setter
def stats(self, value: Optional[pulumi.Input['ExpressRouteCircuitStatsArgs']]):
pulumi.set(self, "stats", value)
@property
@pulumi.getter(name="vlanId")
def vlan_id(self) -> Optional[pulumi.Input[int]]:
"""
The VLAN ID.
"""
return pulumi.get(self, "vlan_id")
@vlan_id.setter
def vlan_id(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "vlan_id", value)
class ExpressRouteCircuitPeering(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
azure_asn: Optional[pulumi.Input[int]] = None,
circuit_name: Optional[pulumi.Input[str]] = None,
connections: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ExpressRouteCircuitConnectionArgs']]]]] = None,
gateway_manager_etag: Optional[pulumi.Input[str]] = None,
id: Optional[pulumi.Input[str]] = None,
ipv6_peering_config: Optional[pulumi.Input[pulumi.InputType['Ipv6ExpressRouteCircuitPeeringConfigArgs']]] = None,
microsoft_peering_config: Optional[pulumi.Input[pulumi.InputType['ExpressRouteCircuitPeeringConfigArgs']]] = None,
name: Optional[pulumi.Input[str]] = None,
peer_asn: Optional[pulumi.Input[float]] = None,
peering_name: Optional[pulumi.Input[str]] = None,
peering_type: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringType']]] = None,
primary_azure_port: Optional[pulumi.Input[str]] = None,
primary_peer_address_prefix: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
route_filter: Optional[pulumi.Input[pulumi.InputType['SubResourceArgs']]] = None,
secondary_azure_port: Optional[pulumi.Input[str]] = None,
secondary_peer_address_prefix: Optional[pulumi.Input[str]] = None,
shared_key: Optional[pulumi.Input[str]] = None,
state: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringState']]] = None,
stats: Optional[pulumi.Input[pulumi.InputType['ExpressRouteCircuitStatsArgs']]] = None,
vlan_id: Optional[pulumi.Input[int]] = None,
__props__=None):
"""
Peering in an ExpressRouteCircuit resource.
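## Example Usage

A minimal sketch of creating this resource; the resource names, ASN, address prefixes and VLAN ID below are illustrative placeholders, not values from this module:

```python
import pulumi_azure_native as azure_native

peering = azure_native.network.v20201101.ExpressRouteCircuitPeering("peering",
    resource_group_name="ExampleRG",
    circuit_name="ExampleCircuit",
    peering_name="AzurePrivatePeering",
    peering_type="AzurePrivatePeering",
    peer_asn=200,
    primary_peer_address_prefix="192.168.16.252/30",
    secondary_peer_address_prefix="192.168.18.252/30",
    vlan_id=200)
```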
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[int] azure_asn: The Azure ASN.
:param pulumi.Input[str] circuit_name: The name of the express route circuit.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ExpressRouteCircuitConnectionArgs']]]] connections: The list of circuit connections associated with Azure Private Peering for this circuit.
:param pulumi.Input[str] gateway_manager_etag: The GatewayManager Etag.
:param pulumi.Input[str] id: Resource ID.
:param pulumi.Input[pulumi.InputType['Ipv6ExpressRouteCircuitPeeringConfigArgs']] ipv6_peering_config: The IPv6 peering configuration.
:param pulumi.Input[pulumi.InputType['ExpressRouteCircuitPeeringConfigArgs']] microsoft_peering_config: The Microsoft peering configuration.
:param pulumi.Input[str] name: The name of the resource that is unique within a resource group. This name can be used to access the resource.
:param pulumi.Input[float] peer_asn: The peer ASN.
:param pulumi.Input[str] peering_name: The name of the peering.
:param pulumi.Input[Union[str, 'ExpressRoutePeeringType']] peering_type: The peering type.
:param pulumi.Input[str] primary_azure_port: The primary port.
:param pulumi.Input[str] primary_peer_address_prefix: The primary address prefix.
:param pulumi.Input[str] resource_group_name: The name of the resource group.
:param pulumi.Input[pulumi.InputType['SubResourceArgs']] route_filter: The reference to the RouteFilter resource.
:param pulumi.Input[str] secondary_azure_port: The secondary port.
:param pulumi.Input[str] secondary_peer_address_prefix: The secondary address prefix.
:param pulumi.Input[str] shared_key: The shared key.
:param pulumi.Input[Union[str, 'ExpressRoutePeeringState']] state: The peering state.
:param pulumi.Input[pulumi.InputType['ExpressRouteCircuitStatsArgs']] stats: The peering stats of express route circuit.
:param pulumi.Input[int] vlan_id: The VLAN ID.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ExpressRouteCircuitPeeringArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Peering in an ExpressRouteCircuit resource.
:param str resource_name: The name of the resource.
:param ExpressRouteCircuitPeeringArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ExpressRouteCircuitPeeringArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
azure_asn: Optional[pulumi.Input[int]] = None,
circuit_name: Optional[pulumi.Input[str]] = None,
connections: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ExpressRouteCircuitConnectionArgs']]]]] = None,
gateway_manager_etag: Optional[pulumi.Input[str]] = None,
id: Optional[pulumi.Input[str]] = None,
ipv6_peering_config: Optional[pulumi.Input[pulumi.InputType['Ipv6ExpressRouteCircuitPeeringConfigArgs']]] = None,
microsoft_peering_config: Optional[pulumi.Input[pulumi.InputType['ExpressRouteCircuitPeeringConfigArgs']]] = None,
name: Optional[pulumi.Input[str]] = None,
peer_asn: Optional[pulumi.Input[float]] = None,
peering_name: Optional[pulumi.Input[str]] = None,
peering_type: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringType']]] = None,
primary_azure_port: Optional[pulumi.Input[str]] = None,
primary_peer_address_prefix: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
route_filter: Optional[pulumi.Input[pulumi.InputType['SubResourceArgs']]] = None,
secondary_azure_port: Optional[pulumi.Input[str]] = None,
secondary_peer_address_prefix: Optional[pulumi.Input[str]] = None,
shared_key: Optional[pulumi.Input[str]] = None,
state: Optional[pulumi.Input[Union[str, 'ExpressRoutePeeringState']]] = None,
stats: Optional[pulumi.Input[pulumi.InputType['ExpressRouteCircuitStatsArgs']]] = None,
vlan_id: Optional[pulumi.Input[int]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ExpressRouteCircuitPeeringArgs.__new__(ExpressRouteCircuitPeeringArgs)
__props__.__dict__["azure_asn"] = azure_asn
if circuit_name is None and not opts.urn:
raise TypeError("Missing required property 'circuit_name'")
__props__.__dict__["circuit_name"] = circuit_name
__props__.__dict__["connections"] = connections
__props__.__dict__["gateway_manager_etag"] = gateway_manager_etag
__props__.__dict__["id"] = id
__props__.__dict__["ipv6_peering_config"] = ipv6_peering_config
__props__.__dict__["microsoft_peering_config"] = microsoft_peering_config
__props__.__dict__["name"] = name
__props__.__dict__["peer_asn"] = peer_asn
__props__.__dict__["peering_name"] = peering_name
__props__.__dict__["peering_type"] = peering_type
__props__.__dict__["primary_azure_port"] = primary_azure_port
__props__.__dict__["primary_peer_address_prefix"] = primary_peer_address_prefix
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["route_filter"] = route_filter
__props__.__dict__["secondary_azure_port"] = secondary_azure_port
__props__.__dict__["secondary_peer_address_prefix"] = secondary_peer_address_prefix
__props__.__dict__["shared_key"] = shared_key
__props__.__dict__["state"] = state
__props__.__dict__["stats"] = stats
__props__.__dict__["vlan_id"] = vlan_id
__props__.__dict__["etag"] = None
__props__.__dict__["express_route_connection"] = None
__props__.__dict__["last_modified_by"] = None
__props__.__dict__["peered_connections"] = None
__props__.__dict__["provisioning_state"] = None
__props__.__dict__["type"] = None
        # The alias type tokens follow a fixed "provider:module/version:Resource"
        # pattern, so generate them from the list of historical API versions
        # instead of hand-listing 70+ entries. The alias set is unchanged.
        alias_opts = pulumi.ResourceOptions(aliases=[
            pulumi.Alias(type_="azure-nextgen:network/v20201101:ExpressRouteCircuitPeering"),
            pulumi.Alias(type_="azure-native:network:ExpressRouteCircuitPeering"),
            pulumi.Alias(type_="azure-nextgen:network:ExpressRouteCircuitPeering"),
        ] + [
            pulumi.Alias(type_=f"azure-{provider}:network/{version}:ExpressRouteCircuitPeering")
            for version in (
                "v20150501preview", "v20150615", "v20160330", "v20160601",
                "v20160901", "v20161201", "v20170301", "v20170601",
                "v20170801", "v20170901", "v20171001", "v20171101",
                "v20180101", "v20180201", "v20180401", "v20180601",
                "v20180701", "v20180801", "v20181001", "v20181101",
                "v20181201", "v20190201", "v20190401", "v20190601",
                "v20190701", "v20190801", "v20190901", "v20191101",
                "v20191201", "v20200301", "v20200401", "v20200501",
                "v20200601", "v20200701", "v20200801",
            )
            for provider in ("native", "nextgen")
        ])
        opts = pulumi.ResourceOptions.merge(opts, alias_opts)
        super(ExpressRouteCircuitPeering, __self__).__init__(
            'azure-native:network/v20201101:ExpressRouteCircuitPeering',
            resource_name,
            __props__,
            opts)

    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None) -> 'ExpressRouteCircuitPeering':
        """
        Get an existing ExpressRouteCircuitPeering resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = ExpressRouteCircuitPeeringArgs.__new__(ExpressRouteCircuitPeeringArgs)

        __props__.__dict__["azure_asn"] = None
        __props__.__dict__["connections"] = None
        __props__.__dict__["etag"] = None
        __props__.__dict__["express_route_connection"] = None
        __props__.__dict__["gateway_manager_etag"] = None
        __props__.__dict__["ipv6_peering_config"] = None
        __props__.__dict__["last_modified_by"] = None
        __props__.__dict__["microsoft_peering_config"] = None
        __props__.__dict__["name"] = None
        __props__.__dict__["peer_asn"] = None
        __props__.__dict__["peered_connections"] = None
        __props__.__dict__["peering_type"] = None
        __props__.__dict__["primary_azure_port"] = None
        __props__.__dict__["primary_peer_address_prefix"] = None
        __props__.__dict__["provisioning_state"] = None
        __props__.__dict__["route_filter"] = None
        __props__.__dict__["secondary_azure_port"] = None
        __props__.__dict__["secondary_peer_address_prefix"] = None
        __props__.__dict__["shared_key"] = None
        __props__.__dict__["state"] = None
        __props__.__dict__["stats"] = None
        __props__.__dict__["type"] = None
        __props__.__dict__["vlan_id"] = None
        return ExpressRouteCircuitPeering(resource_name, opts=opts, __props__=__props__)

    @property
    @pulumi.getter(name="azureASN")
    def azure_asn(self) -> pulumi.Output[Optional[int]]:
        """
        The Azure ASN.
        """
        return pulumi.get(self, "azure_asn")

    @property
    @pulumi.getter
    def connections(self) -> pulumi.Output[Optional[Sequence['outputs.ExpressRouteCircuitConnectionResponse']]]:
        """
        The list of circuit connections associated with Azure Private Peering for this circuit.
        """
        return pulumi.get(self, "connections")

    @property
    @pulumi.getter
    def etag(self) -> pulumi.Output[str]:
        """
        A unique read-only string that changes whenever the resource is updated.
        """
        return pulumi.get(self, "etag")

    @property
    @pulumi.getter(name="expressRouteConnection")
    def express_route_connection(self) -> pulumi.Output[Optional['outputs.ExpressRouteConnectionIdResponse']]:
        """
        The ExpressRoute connection.
        """
        return pulumi.get(self, "express_route_connection")

    @property
    @pulumi.getter(name="gatewayManagerEtag")
    def gateway_manager_etag(self) -> pulumi.Output[Optional[str]]:
        """
        The GatewayManager Etag.
        """
        return pulumi.get(self, "gateway_manager_etag")

    @property
    @pulumi.getter(name="ipv6PeeringConfig")
    def ipv6_peering_config(self) -> pulumi.Output[Optional['outputs.Ipv6ExpressRouteCircuitPeeringConfigResponse']]:
        """
        The IPv6 peering configuration.
        """
        return pulumi.get(self, "ipv6_peering_config")

    @property
    @pulumi.getter(name="lastModifiedBy")
    def last_modified_by(self) -> pulumi.Output[str]:
        """
        Who was the last to modify the peering.
        """
        return pulumi.get(self, "last_modified_by")

    @property
    @pulumi.getter(name="microsoftPeeringConfig")
    def microsoft_peering_config(self) -> pulumi.Output[Optional['outputs.ExpressRouteCircuitPeeringConfigResponse']]:
        """
        The Microsoft peering configuration.
        """
        return pulumi.get(self, "microsoft_peering_config")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[Optional[str]]:
        """
        The name of the resource that is unique within a resource group. This name can be used to access the resource.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter(name="peerASN")
    def peer_asn(self) -> pulumi.Output[Optional[float]]:
        """
        The peer ASN.
        """
        return pulumi.get(self, "peer_asn")

    @property
    @pulumi.getter(name="peeredConnections")
    def peered_connections(self) -> pulumi.Output[Sequence['outputs.PeerExpressRouteCircuitConnectionResponse']]:
        """
        The list of peered circuit connections associated with Azure Private Peering for this circuit.
        """
        return pulumi.get(self, "peered_connections")

    @property
    @pulumi.getter(name="peeringType")
    def peering_type(self) -> pulumi.Output[Optional[str]]:
        """
        The peering type.
        """
        return pulumi.get(self, "peering_type")

    @property
    @pulumi.getter(name="primaryAzurePort")
    def primary_azure_port(self) -> pulumi.Output[Optional[str]]:
        """
        The primary port.
        """
        return pulumi.get(self, "primary_azure_port")

    @property
    @pulumi.getter(name="primaryPeerAddressPrefix")
    def primary_peer_address_prefix(self) -> pulumi.Output[Optional[str]]:
        """
        The primary address prefix.
        """
        return pulumi.get(self, "primary_peer_address_prefix")

    @property
    @pulumi.getter(name="provisioningState")
    def provisioning_state(self) -> pulumi.Output[str]:
        """
        The provisioning state of the express route circuit peering resource.
        """
        return pulumi.get(self, "provisioning_state")

    @property
    @pulumi.getter(name="routeFilter")
    def route_filter(self) -> pulumi.Output[Optional['outputs.SubResourceResponse']]:
        """
        The reference to the RouteFilter resource.
        """
        return pulumi.get(self, "route_filter")

    @property
    @pulumi.getter(name="secondaryAzurePort")
    def secondary_azure_port(self) -> pulumi.Output[Optional[str]]:
        """
        The secondary port.
        """
        return pulumi.get(self, "secondary_azure_port")

    @property
    @pulumi.getter(name="secondaryPeerAddressPrefix")
    def secondary_peer_address_prefix(self) -> pulumi.Output[Optional[str]]:
        """
        The secondary address prefix.
        """
        return pulumi.get(self, "secondary_peer_address_prefix")

    @property
    @pulumi.getter(name="sharedKey")
    def shared_key(self) -> pulumi.Output[Optional[str]]:
        """
        The shared key.
        """
        return pulumi.get(self, "shared_key")

    @property
    @pulumi.getter
    def state(self) -> pulumi.Output[Optional[str]]:
        """
        The peering state.
        """
        return pulumi.get(self, "state")

    @property
    @pulumi.getter
    def stats(self) -> pulumi.Output[Optional['outputs.ExpressRouteCircuitStatsResponse']]:
        """
        The peering stats of express route circuit.
        """
        return pulumi.get(self, "stats")

    @property
    @pulumi.getter
    def type(self) -> pulumi.Output[str]:
        """
        Type of the resource.
        """
        return pulumi.get(self, "type")

    @property
    @pulumi.getter(name="vlanId")
    def vlan_id(self) -> pulumi.Output[Optional[int]]:
        """
        The VLAN ID.
        """
        return pulumi.get(self, "vlan_id")
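The alias list at the top of this class enumerates every historical API version of the resource under both the `azure-native` and `azure-nextgen` providers. Since the alias type tokens follow a fixed `provider:module/version:Resource` pattern, they can be generated from a version list. A minimal dependency-free sketch of that pattern (the two-entry version list below is a sample, not the full set):

```python
def alias_types(resource, versions):
    """Build Pulumi-style alias type tokens for every provider/version pair."""
    providers = ("azure-native", "azure-nextgen")
    return [
        f"{provider}:network/{version}:{resource}"
        for provider in providers
        for version in versions
    ]

tokens = alias_types("ExpressRouteCircuitPeering", ["v20200701", "v20200801"])
```

For the 35 versions listed above, this yields 70 tokens, to which the three unversioned/default aliases are appended.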

# File: branch.py (hcd2ha/cs3240-labdemo, MIT)
def addHello(msg):
    print("Hello " + msg)
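A quick usage sketch of the helper above, redefined here so the snippet is self-contained:

```python
def addHello(msg):
    # concatenate the greeting prefix with the caller-supplied message
    print("Hello " + msg)

addHello("world")  # prints: Hello world
```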

# File: server/driver/__init__.py (githubalvin/pandorabox, MIT)
from .exchange import ExchangeAbstract, SubscribeHandle
__all__ = ["ExchangeAbstract", "SubscribeHandle"]

# File: torchpruner/init/init.py (Ocean-627/torch-model-compression, MIT)
from . import onnx_op_regist
from . import module_pruner_regist

# File: src/pythonFEA/structure/node.py (honzatomek/pythonFEA, MIT)
from templates.errors import *
from templates.basic import Basic
import defaults
import numpy as np
class Node2D(Basic):
    command = 'COOR2D'
    type = 'Node2D'

    def __init__(self, id, coor: list, label=None):
        super().__init__(id=id, label=label)
        self.x = coor[0]
        self.y = coor[1]

    def __str__(self):
        if self.label is not None:
            return f'{self.id:8n} : {"".join([" {0:15.4f}".format(x) for x in self.coors])} : \'{self.label:s}\''
        else:
            return f'{self.id:8n} : {"".join([" {0:15.4f}".format(x) for x in self.coors])}'

    def __getitem__(self, id):
        return [self.x, self.y][id]

    def __iter__(self):
        return iter([self.x, self.y])

    @property
    def coors(self):
        return np.asarray([self.x, self.y], dtype=defaults.DEFAULT_FLOAT)

    @coors.setter
    def coors(self, coors):
        if type(coors) in (list, tuple, np.ndarray):
            if type(coors) is np.ndarray:
                coors = coors.flatten()
                coors = coors.astype(defaults.DEFAULT_FLOAT)
            else:
                coors = np.asarray(coors, dtype=defaults.DEFAULT_FLOAT)
            if len(coors) != 2:
                raise MissingCoordinate(f'{self!r}: Coordinate is missing, len(coors) = {len(coors):n} ({str(coors)}).')
            else:
                self.x = coors[0]
                self.y = coors[1]
        else:
            raise WrongType(f'{self!r}: Coordinates are not iterable ({type(coors).__name__:s} != list, tuple, np.ndarray).')

    @property
    def x(self):
        return self.__x

    @x.setter
    def x(self, x):
        self.__x = defaults.DEFAULT_FLOAT(x)

    @property
    def y(self):
        return self.__y

    @y.setter
    def y(self, y):
        self.__y = defaults.DEFAULT_FLOAT(y)
class Node(Node2D):
    command = 'COOR'
    type = 'Node'

    def __init__(self, id, coor: list, label=None):
        super().__init__(id=id, coor=[coor[0], coor[1]], label=label)
        self.z = coor[2]

    def __str__(self):
        if self.label is not None:
            return f'{self.id:8n} : {"".join([" {0:15.4f}".format(x) for x in self.coors])} : \'{self.label:s}\''
        else:
            return f'{self.id:8n} : {"".join([" {0:15.4f}".format(x) for x in self.coors])}'

    def __getitem__(self, id):
        return [self.x, self.y, self.z][id]

    def __iter__(self):
        return iter([self.x, self.y, self.z])

    @property
    def coors(self):
        return np.asarray([self.x, self.y, self.z], dtype=defaults.DEFAULT_FLOAT)

    @coors.setter
    def coors(self, coors):
        if type(coors) in (list, tuple, np.ndarray):
            if type(coors) is np.ndarray:
                coors = coors.flatten()
                coors = coors.astype(defaults.DEFAULT_FLOAT)
            else:
                coors = np.asarray(coors, dtype=defaults.DEFAULT_FLOAT)
            if len(coors) != 3:
                raise MissingCoordinate(f'{self!r}: Coordinate is missing, len(coors) = {len(coors):n} ({str(coors)}).')
            else:
                self.x = coors[0]
                self.y = coors[1]
                self.z = coors[2]
        else:
            raise WrongType(f'{self!r}: Coordinates are not iterable ({type(coors).__name__:s} != list, tuple, np.ndarray).')

    @property
    def z(self):
        return self.__z

    @z.setter
    def z(self, z):
        self.__z = defaults.DEFAULT_FLOAT(z)
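Both node classes share one validation pattern: a `coors` setter that accepts any sequence, coerces each entry to the configured float type, and rejects wrong lengths. A dependency-free sketch of that pattern (plain floats instead of `numpy` and `defaults`, and a hypothetical `Point2D` name):

```python
class Point2D:
    """Minimal stand-in for Node2D's coordinate validation, without numpy."""

    def __init__(self, x, y):
        self.coors = (x, y)

    @property
    def coors(self):
        return (self.x, self.y)

    @coors.setter
    def coors(self, coors):
        # mirror the setter above: type check, then length check, then coerce
        if not isinstance(coors, (list, tuple)):
            raise TypeError(f"coordinates are not iterable: {type(coors).__name__}")
        if len(coors) != 2:
            raise ValueError(f"coordinate is missing, got {len(coors)} values")
        self.x, self.y = (float(c) for c in coors)

p = Point2D(1, 2)
p.coors = [3.5, 4.5]
```

The same structure extends to the 3D subclass by overriding only `coors` and adding a `z` property, exactly as `Node` does above.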

# File: organisation/admin.py (hiyqapp/hiYq, BSD-3-Clause)
from django.contrib import admin
from .models import Organisation
admin.site.register(Organisation)

# File: aries_cloudagent/askar/didcomm/tests/test_v2.py (kuraakhilesh8230/aries-cloudagent-python, Apache-2.0)
import json
from asynctest import mock as async_mock
import pytest
from aries_askar import AskarError, Key, KeyAlg, Session
from ....config.injection_context import InjectionContext
from ....utils.jwe import JweRecipient, b64url, JweEnvelope
from ...profile import AskarProfileManager
from .. import v2 as test_module
ALICE_KID = "did:example:alice#key-1"
BOB_KID = "did:example:bob#key-1"
CAROL_KID = "did:example:carol#key-2"
MESSAGE = b"Expecto patronum"
@pytest.fixture()
async def session():
    context = InjectionContext()
    profile = await AskarProfileManager().provision(
        context,
        {
            "name": ":memory:",
            "key": await AskarProfileManager.generate_store_key(),
            "key_derivation_method": "RAW",  # much faster than using argon-hashed keys
        },
    )
    async with profile.session() as session:
        yield session.handle
        del session
    await profile.close()
@pytest.mark.askar
class TestAskarDidCommV2:
    @pytest.mark.asyncio
    async def test_es_round_trip(self, session: Session):
        alg = KeyAlg.X25519
        bob_sk = Key.generate(alg)
        bob_pk = Key.from_jwk(bob_sk.get_jwk_public())
        carol_sk = Key.generate(KeyAlg.P256)  # testing mixed recipient key types
        carol_pk = Key.from_jwk(carol_sk.get_jwk_public())

        enc_message = test_module.ecdh_es_encrypt(
            {BOB_KID: bob_pk, CAROL_KID: carol_pk}, MESSAGE
        )

        # receiver must have the private keypair accessible
        await session.insert_key("my_sk", bob_sk, tags={"kid": BOB_KID})

        plaintext, recip_kid, sender_kid = await test_module.unpack_message(
            session, enc_message
        )
        assert recip_kid == BOB_KID
        assert sender_kid is None
        assert plaintext == MESSAGE
    @pytest.mark.asyncio
    async def test_es_encrypt_x(self, session: Session):
        alg = KeyAlg.X25519
        bob_sk = Key.generate(alg)
        bob_pk = Key.from_jwk(bob_sk.get_jwk_public())

        with pytest.raises(
            test_module.DidcommEnvelopeError, match="No message recipients"
        ):
            _ = test_module.ecdh_es_encrypt({}, MESSAGE)

        with async_mock.patch(
            "aries_askar.Key.generate",
            async_mock.MagicMock(side_effect=AskarError(99, "")),
        ):
            with pytest.raises(
                test_module.DidcommEnvelopeError,
                match="Error creating content encryption key",
            ):
                _ = test_module.ecdh_es_encrypt({BOB_KID: bob_pk}, MESSAGE)

        with async_mock.patch(
            "aries_askar.Key.aead_encrypt",
            async_mock.MagicMock(side_effect=AskarError(99, "")),
        ):
            with pytest.raises(
                test_module.DidcommEnvelopeError,
                match="Error encrypting",
            ):
                _ = test_module.ecdh_es_encrypt({BOB_KID: bob_pk}, MESSAGE)
    @pytest.mark.asyncio
    async def test_es_decrypt_x(self):
        alg = KeyAlg.X25519
        bob_sk = Key.generate(alg)

        message_unknown_alg = JweEnvelope(
            protected={"alg": "NOT-SUPPORTED"},
        )
        message_unknown_alg.add_recipient(
            JweRecipient(encrypted_key=b"0000", header={"kid": BOB_KID})
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Unsupported ECDH-ES algorithm",
        ):
            _ = test_module.ecdh_es_decrypt(
                message_unknown_alg,
                BOB_KID,
                bob_sk,
            )

        message_unknown_enc = JweEnvelope(
            protected={"alg": "ECDH-ES+A128KW", "enc": "UNKNOWN"},
        )
        message_unknown_enc.add_recipient(
            JweRecipient(encrypted_key=b"0000", header={"kid": BOB_KID})
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Unsupported ECDH-ES content encryption",
        ):
            _ = test_module.ecdh_es_decrypt(
                message_unknown_enc,
                BOB_KID,
                bob_sk,
            )

        message_invalid_epk = JweEnvelope(
            protected={"alg": "ECDH-ES+A128KW", "enc": "A256GCM", "epk": {}},
        )
        message_invalid_epk.add_recipient(
            JweRecipient(encrypted_key=b"0000", header={"kid": BOB_KID})
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Error loading ephemeral key",
        ):
            _ = test_module.ecdh_es_decrypt(
                message_invalid_epk,
                BOB_KID,
                bob_sk,
            )
    @pytest.mark.asyncio
    async def test_1pu_round_trip(self, session: Session):
        alg = KeyAlg.X25519
        alice_sk = Key.generate(alg)
        alice_pk = Key.from_jwk(alice_sk.get_jwk_public())
        bob_sk = Key.generate(alg)
        bob_pk = Key.from_jwk(bob_sk.get_jwk_public())

        enc_message = test_module.ecdh_1pu_encrypt(
            {BOB_KID: bob_pk}, ALICE_KID, alice_sk, MESSAGE
        )

        # receiver must have the private keypair accessible
        await session.insert_key("my_sk", bob_sk, tags={"kid": BOB_KID})
        # for now at least, insert the sender public key so it can be resolved
        await session.insert_key("alice_pk", alice_pk, tags={"kid": ALICE_KID})

        plaintext, recip_kid, sender_kid = await test_module.unpack_message(
            session, enc_message
        )
        assert recip_kid == BOB_KID
        assert sender_kid == ALICE_KID
        assert plaintext == MESSAGE
    @pytest.mark.asyncio
    async def test_1pu_encrypt_x(self, session: Session):
        alg = KeyAlg.X25519
        alice_sk = Key.generate(alg)
        bob_sk = Key.generate(alg)
        bob_pk = Key.from_jwk(bob_sk.get_jwk_public())

        with pytest.raises(
            test_module.DidcommEnvelopeError, match="No message recipients"
        ):
            _ = test_module.ecdh_1pu_encrypt({}, ALICE_KID, alice_sk, MESSAGE)

        alt_sk = Key.generate(KeyAlg.P256)
        alt_pk = Key.from_jwk(alt_sk.get_jwk_public())
        with pytest.raises(
            test_module.DidcommEnvelopeError, match="key types must be consistent"
        ):
            _ = test_module.ecdh_1pu_encrypt(
                {BOB_KID: bob_pk, "alt": alt_pk}, ALICE_KID, alice_sk, MESSAGE
            )

        with async_mock.patch(
            "aries_askar.Key.generate",
            async_mock.MagicMock(side_effect=AskarError(99, "")),
        ):
            with pytest.raises(
                test_module.DidcommEnvelopeError,
                match="Error creating content encryption key",
            ):
                _ = test_module.ecdh_1pu_encrypt(
                    {BOB_KID: bob_pk}, ALICE_KID, alice_sk, MESSAGE
                )

        with async_mock.patch(
            "aries_askar.Key.aead_encrypt",
            async_mock.MagicMock(side_effect=AskarError(99, "")),
        ):
            with pytest.raises(
                test_module.DidcommEnvelopeError,
                match="Error encrypting",
            ):
                _ = test_module.ecdh_1pu_encrypt(
                    {BOB_KID: bob_pk}, ALICE_KID, alice_sk, MESSAGE
                )
    @pytest.mark.asyncio
    async def test_1pu_decrypt_x(self):
        alg = KeyAlg.X25519
        alice_sk = Key.generate(alg)
        alice_pk = Key.from_jwk(alice_sk.get_jwk_public())
        bob_sk = Key.generate(alg)

        message_unknown_alg = JweEnvelope(
            protected={"alg": "NOT-SUPPORTED"},
        )
        message_unknown_alg.add_recipient(
            JweRecipient(encrypted_key=b"0000", header={"kid": BOB_KID})
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Unsupported ECDH-1PU algorithm",
        ):
            _ = test_module.ecdh_1pu_decrypt(
                message_unknown_alg,
                BOB_KID,
                bob_sk,
                alice_pk,
            )

        message_unknown_enc = JweEnvelope(
            protected={"alg": "ECDH-1PU+A128KW", "enc": "UNKNOWN"},
        )
        message_unknown_enc.add_recipient(
            JweRecipient(encrypted_key=b"0000", header={"kid": BOB_KID})
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Unsupported ECDH-1PU content encryption",
        ):
            _ = test_module.ecdh_1pu_decrypt(
                message_unknown_enc, BOB_KID, bob_sk, alice_pk
            )

        message_invalid_epk = JweEnvelope(
            protected={"alg": "ECDH-1PU+A128KW", "enc": "A256CBC-HS512", "epk": {}},
        )
        message_invalid_epk.add_recipient(
            JweRecipient(encrypted_key=b"0000", header={"kid": BOB_KID})
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Error loading ephemeral key",
        ):
            _ = test_module.ecdh_1pu_decrypt(
                message_invalid_epk,
                BOB_KID,
                bob_sk,
                alice_pk,
            )
    @pytest.mark.asyncio
    async def test_unpack_message_any_x(self, session: Session):
        message_invalid = "{}"
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Invalid",
        ):
            _ = await test_module.unpack_message(session, message_invalid)

        message_unknown_alg = json.dumps(
            {
                "protected": b64url(json.dumps({"alg": "NOT-SUPPORTED"})),
                "recipients": [{"header": {"kid": "bob"}, "encrypted_key": "MTIzNA"}],
                "iv": "MTIzNA",
                "ciphertext": "MTIzNA",
                "tag": "MTIzNA",
            }
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Unsupported DIDComm encryption",
        ):
            _ = await test_module.unpack_message(session, message_unknown_alg)

        message_unknown_recip = json.dumps(
            {
                "protected": b64url(json.dumps({"alg": "ECDH-ES+A128KW"})),
                "recipients": [{"header": {"kid": "bob"}, "encrypted_key": "MTIzNA"}],
                "iv": "MTIzNA",
                "ciphertext": "MTIzNA",
                "tag": "MTIzNA",
            }
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="No recognized recipient key",
        ):
            _ = await test_module.unpack_message(session, message_unknown_recip)
    @pytest.mark.asyncio
    async def test_unpack_message_1pu_x(self, session: Session):
        alg = KeyAlg.X25519
        alice_sk = Key.generate(alg)
        alice_pk = Key.from_jwk(alice_sk.get_jwk_public())
        bob_sk = Key.generate(alg)
        bob_pk = Key.from_jwk(bob_sk.get_jwk_public())

        # receiver must have the private keypair accessible
        await session.insert_key("my_sk", bob_sk, tags={"kid": BOB_KID})
        # for now at least, insert the sender public key so it can be resolved
        await session.insert_key("alice_pk", alice_pk, tags={"kid": ALICE_KID})

        message_1pu_no_skid = json.dumps(
            {
                "protected": b64url(json.dumps({"alg": "ECDH-1PU+A128KW"})),
                "recipients": [{"header": {"kid": BOB_KID}, "encrypted_key": "MTIzNA"}],
                "iv": "MTIzNA",
                "ciphertext": "MTIzNA",
                "tag": "MTIzNA",
            }
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Sender key ID not provided",
        ):
            _ = await test_module.unpack_message(session, message_1pu_no_skid)

        message_1pu_unknown_skid = json.dumps(
            {
                "protected": b64url(
                    json.dumps({"alg": "ECDH-1PU+A128KW", "skid": "UNKNOWN"})
                ),
                "recipients": [{"header": {"kid": BOB_KID}, "encrypted_key": "MTIzNA"}],
                "iv": "MTIzNA",
                "ciphertext": "MTIzNA",
                "tag": "MTIzNA",
            }
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Sender public key not found",
        ):
            _ = await test_module.unpack_message(session, message_1pu_unknown_skid)

        message_1pu_apu_invalid = json.dumps(
            {
                "protected": b64url(
                    json.dumps({"alg": "ECDH-1PU+A128KW", "skid": "A", "apu": "A"})
                ),
                "recipients": [{"header": {"kid": BOB_KID}, "encrypted_key": "MTIzNA"}],
                "iv": "MTIzNA",
                "ciphertext": "MTIzNA",
                "tag": "MTIzNA",
            }
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Invalid apu value",
        ):
            _ = await test_module.unpack_message(session, message_1pu_apu_invalid)

        message_1pu_apu_mismatch = json.dumps(
            {
                "protected": b64url(
                    json.dumps(
                        {
                            "alg": "ECDH-1PU+A128KW",
                            "skid": ALICE_KID,
                            "apu": b64url("UNKNOWN"),
                        }
                    )
                ),
                "recipients": [{"header": {"kid": BOB_KID}, "encrypted_key": "MTIzNA"}],
                "iv": "MTIzNA",
                "ciphertext": "MTIzNA",
                "tag": "MTIzNA",
            }
        )
        with pytest.raises(
            test_module.DidcommEnvelopeError,
            match="Mismatch between skid and apu",
        ):
            _ = await test_module.unpack_message(session, message_1pu_apu_mismatch)
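These tests build JWE envelopes whose `protected` header is base64url-encoded JSON with the `=` padding stripped, as required for JOSE headers. The real `b64url` helper lives in `aries_cloudagent.utils.jwe`; a minimal sketch of an equivalent (an assumption, not the library's exact implementation):

```python
import base64
import json

def b64url(value):
    """URL-safe base64 without '=' padding, as used for JWE protected headers."""
    if isinstance(value, str):
        value = value.encode("utf-8")
    return base64.urlsafe_b64encode(value).rstrip(b"=").decode("ascii")

protected = b64url(json.dumps({"alg": "ECDH-ES+A128KW"}))
```

Decoding requires re-adding padding to a multiple of four characters before calling `base64.urlsafe_b64decode`.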

# File: main/admin.py (Lord-sarcastic/inventory, MIT)
from django.contrib import admin
from main.models import Item
admin.site.register(Item)

# File: exercise3.py (LukasHoste/Algorithms_Vives_exercises_test, Apache-2.0)
class BinaryCounter:
    def __init__(self):
        # start the counter at binary 1000 (decimal 8)
        self.__binaryled = '1000'

    def decimal(self):
        # interpret the stored string as a base-2 number
        return int(self.__binaryled, 2)

    def hex(self):
        return hex(self.decimal())

    def increment(self):
        # store the result back instead of discarding it in a local variable
        self.__binaryled = bin(self.decimal() + 1)[2:]

    def decrement(self):
        self.__binaryled = bin(self.decimal() - 1)[2:]
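The counter can be exercised like this. The class is redefined here so the snippet is self-contained, with the original bugs fixed: `decimal` must parse the stored string (`int(self, 2)` is a `TypeError`), and `increment`/`decrement` must write the result back rather than assigning to a local variable:

```python
class BinaryCounter:
    def __init__(self):
        self.__binary = '1000'  # start at decimal 8

    def decimal(self):
        return int(self.__binary, 2)

    def hex(self):
        return hex(self.decimal())

    def increment(self):
        self.__binary = bin(self.decimal() + 1)[2:]

    def decrement(self):
        self.__binary = bin(self.decimal() - 1)[2:]

c = BinaryCounter()
c.increment()
print(c.decimal(), c.hex())  # prints: 9 0x9
```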
e000166503ea598e803b9a0aae7660f17e621f22 | 128 | py | Python | anthill/people/admin.py | nsmgr8/anthill | 25f556bd34efbb5fd0ac36d3904d43b893ac9172 | [
"BSD-3-Clause"
] | 1 | 2015-11-06T01:09:29.000Z | 2015-11-06T01:09:29.000Z | anthill/people/admin.py | nsmgr8/anthill | 25f556bd34efbb5fd0ac36d3904d43b893ac9172 | [
"BSD-3-Clause"
] | null | null | null | anthill/people/admin.py | nsmgr8/anthill | 25f556bd34efbb5fd0ac36d3904d43b893ac9172 | [
"BSD-3-Clause"
] | null | null | null | from django.contrib.gis import admin
from anthill.people.models import Profile
admin.site.register(Profile, admin.OSMGeoAdmin)
| 25.6 | 47 | 0.835938 | 18 | 128 | 5.944444 | 0.722222 | 0.224299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085938 | 128 | 4 | 48 | 32 | 0.91453 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
e0320a394c007ef914bda38f47fec27991f52997 | 46 | py | Python | lets_do_dns/do_domain/__init__.py | Jitsusama/lets-do-dns | faff4bf45e9a4be438e15afbe5caa249fe1e5210 | [
"Apache-2.0"
] | 8 | 2017-05-07T12:17:16.000Z | 2018-07-02T23:35:16.000Z | lets_do_dns/do_domain/__init__.py | Jitsusama/lets-do-dns | faff4bf45e9a4be438e15afbe5caa249fe1e5210 | [
"Apache-2.0"
] | null | null | null | lets_do_dns/do_domain/__init__.py | Jitsusama/lets-do-dns | faff4bf45e9a4be438e15afbe5caa249fe1e5210 | [
"Apache-2.0"
] | null | null | null | """Handles DigitalOcean Domain Record CRD."""
| 23 | 45 | 0.73913 | 5 | 46 | 6.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108696 | 46 | 1 | 46 | 46 | 0.829268 | 0.847826 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
e04a56d9657c04f8474bf454825b20af5f9ed43f | 59 | py | Python | LibFace/__init__.py | iyah4888/DlibFaceDetectorDemo | 160b1df3b502f5ff83d08c2ab676c3cfb6d8b528 | [
"MIT"
] | 2 | 2018-08-24T08:05:00.000Z | 2018-10-04T21:47:56.000Z | LibFace/__init__.py | iyah4888/DlibFaceDetectorDemo | 160b1df3b502f5ff83d08c2ab676c3cfb6d8b528 | [
"MIT"
] | null | null | null | LibFace/__init__.py | iyah4888/DlibFaceDetectorDemo | 160b1df3b502f5ff83d08c2ab676c3cfb6d8b528 | [
"MIT"
] | 1 | 2018-09-14T17:14:45.000Z | 2018-09-14T17:14:45.000Z | from .lib_facedet import FaceDetector, FaceLandmarkDetector | 59 | 59 | 0.898305 | 6 | 59 | 8.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067797 | 59 | 1 | 59 | 59 | 0.945455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
1615d6f5287616d0e6d79a3ee1d3a7efa7431a65 | 184 | py | Python | third_party/fabric/basics/examples/fabfile_2.py | jeremyosborne/examples-python | 5900b3a4f47d59de0a32d3257a8b90a44e80fdcd | [
"MIT"
] | null | null | null | third_party/fabric/basics/examples/fabfile_2.py | jeremyosborne/examples-python | 5900b3a4f47d59de0a32d3257a8b90a44e80fdcd | [
"MIT"
] | null | null | null | third_party/fabric/basics/examples/fabfile_2.py | jeremyosborne/examples-python | 5900b3a4f47d59de0a32d3257a8b90a44e80fdcd | [
"MIT"
] | null | null | null | # Import our environment. The act of importing will set everything.
import fabenv
from fabric.api import run
def host_type():
run('uname -s')
def diskspace():
run('df')
| 16.727273 | 67 | 0.684783 | 27 | 184 | 4.62963 | 0.814815 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.211957 | 184 | 10 | 68 | 18.4 | 0.862069 | 0.353261 | 0 | 0 | 0 | 0 | 0.08547 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
161a95326d33facdb9ea3c3a988fd9337d90fade | 55 | py | Python | models/__init__.py | flowlight0/talkingdata-adtracking-fraud-detection | 18a7d19a260a545a2fd13c4f02afcd27245afd8b | [
"MIT"
] | 214 | 2018-05-16T04:15:57.000Z | 2022-02-19T15:10:50.000Z | models/__init__.py | dayeren/talkingdata-adtracking-fraud-detection | 18a7d19a260a545a2fd13c4f02afcd27245afd8b | [
"MIT"
] | 4 | 2018-06-10T15:10:45.000Z | 2019-07-16T05:50:38.000Z | models/__init__.py | dayeren/talkingdata-adtracking-fraud-detection | 18a7d19a260a545a2fd13c4f02afcd27245afd8b | [
"MIT"
] | 61 | 2018-05-29T09:53:49.000Z | 2021-10-16T22:24:02.000Z | from .lightgbm import LightGBM
from .base import Model
| 18.333333 | 30 | 0.818182 | 8 | 55 | 5.625 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145455 | 55 | 2 | 31 | 27.5 | 0.957447 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
161da5cd93ac13e73f208cda658ece9c4bbf2ed4 | 25,044 | py | Python | tests/test_database.py | mitchellkrogza/PyFunceble | b2b0d3fc18097cd630eed849ee99d961263d607b | [
"MIT"
] | null | null | null | tests/test_database.py | mitchellkrogza/PyFunceble | b2b0d3fc18097cd630eed849ee99d961263d607b | [
"MIT"
] | null | null | null | tests/test_database.py | mitchellkrogza/PyFunceble | b2b0d3fc18097cd630eed849ee99d961263d607b | [
"MIT"
] | null | null | null | # pylint:disable=line-too-long
"""
The tool to check the availability or syntax of domains, IPv4 addresses or URLs.
::
██████╗ ██╗ ██╗███████╗██╗ ██╗███╗ ██╗ ██████╗███████╗██████╗ ██╗ ███████╗
██╔══██╗╚██╗ ██╔╝██╔════╝██║ ██║████╗ ██║██╔════╝██╔════╝██╔══██╗██║ ██╔════╝
██████╔╝ ╚████╔╝ █████╗ ██║ ██║██╔██╗ ██║██║ █████╗ ██████╔╝██║ █████╗
██╔═══╝ ╚██╔╝ ██╔══╝ ██║ ██║██║╚██╗██║██║ ██╔══╝ ██╔══██╗██║ ██╔══╝
██║ ██║ ██║ ╚██████╔╝██║ ╚████║╚██████╗███████╗██████╔╝███████╗███████╗
╚═╝ ╚═╝ ╚═╝ ╚═════╝ ╚═╝ ╚═══╝ ╚═════╝╚══════╝╚═════╝ ╚══════╝╚══════╝
This submodule will test PyFunceble.database.
Author:
Nissar Chababy, @funilrys, contactTATAfunilrysTODTODcom
Special thanks:
https://pyfunceble.readthedocs.io/en/master/special-thanks.html
Contributors:
http://pyfunceble.readthedocs.io/en/master/special-thanks.html
Project link:
https://github.com/funilrys/PyFunceble
Project documentation:
https://pyfunceble.readthedocs.io/en/master/
Project homepage:
https://funilrys.github.io/PyFunceble/
License:
::
MIT License
Copyright (c) 2017-2019 Nissar Chababy
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
# pylint: enable=line-too-long
# pylint: disable=protected-access, import-error
from unittest import TestCase
from unittest import main as launch_tests
import PyFunceble
from PyFunceble.database import Inactive, Whois
from PyFunceble.helpers import Dict, File
class TestDatabaseInactive(TestCase):
"""
Test PyFunceble.database.Inactive
"""
def setUp(self):
"""
Set up everything needed for the test
"""
PyFunceble.load_config(True)
PyFunceble.INTERN["file_to_test"] = "this_file_is_a_ghost"
self.file = (
PyFunceble.CURRENT_DIRECTORY
+ PyFunceble.OUTPUTS["default_files"]["inactive_db"]
)
self.expected_content = {
PyFunceble.INTERN["file_to_test"]: {
"1523447416": ["mÿethèrwallét.com", "||google.com^"],
"to_test": ["myètherwället.com"],
}
}
self.time_past = str(int(PyFunceble.time()) - (365 * 24 * 3600))
self.time_future = str(int(PyFunceble.time()) + (365 * 24 * 3600))
def test_file_not_exist(self):
"""
Test if everything is right with the generated
file.
"""
File(self.file).delete()
expected = False
actual = PyFunceble.path.isfile(self.file)
self.assertEqual(expected, actual)
def test_retrieve_file_not_exist(self):
"""
Test the case that we want to retrieve a file that does not exist.
"""
self.test_file_not_exist()
Inactive()._retrieve()
expected = {}
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_retrieve_file_exist(self):
"""
Test the case that we want to retrieve a file that exists.
"""
self.test_file_not_exist()
Dict(self.expected_content).to_json(self.file)
Inactive()._retrieve()
self.assertEqual(self.expected_content, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_backup(self):
"""
Test the backup of the Inactive.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = self.expected_content
Inactive()._backup()
self.assertEqual(
self.expected_content, Dict().from_json(File(self.file).read())
)
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_add_to_test__path_not_exist(self): # pylint: disable=invalid-name
"""
Test Inactive._add_to_test() for the case that the currently tested
path is not present in the Inactive database.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {}
Inactive()._add_to_test("hello.world")
expected = {PyFunceble.INTERN["file_to_test"]: {"to_test": ["hello.world"]}}
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_add_to_test__path_exist(self): # pylint: disable=invalid-name
"""
Test Inactive._add_to_test() for the case that the path exists
in the Inactive database.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {"to_test": ["hello.world"]}
}
expected = {
PyFunceble.INTERN["file_to_test"]: {
"to_test": ["hello.world", "world.hello"]
}
}
Inactive()._add_to_test("world.hello")
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_add_to_test__path_exist_not_test(self): # pylint: disable=invalid-name
"""
Test Inactive._add_to_test() for the case that the path exists
in the database but the `to_test` index does not.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {PyFunceble.INTERN["file_to_test"]: {}}
expected = {PyFunceble.INTERN["file_to_test"]: {"to_test": ["hello.world"]}}
Inactive()._add_to_test("hello.world")
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_to_test__path_not_exist(self): # pylint: disable=invalid-name
"""
Test Inactive.to_test() for the case that the path does not exist.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {}
expected = {PyFunceble.INTERN["file_to_test"]: {}}
Inactive().to_test()
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_to_test__path_exist_time_past(self): # pylint: disable=invalid-name
"""
Test Inactive.to_test() for the case that the path exists but
the timestamp is in the past.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
self.time_past: ["hello.world", "world.hello"],
"to_test": ["github.com"],
}
}
expected = {
PyFunceble.INTERN["file_to_test"]: {
"to_test": ["github.com", "hello.world", "world.hello"]
}
}
Dict(PyFunceble.INTERN["inactive_db"]).to_json(self.file)
Inactive().to_test()
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_to_test__path_exist_time_future(self): # pylint: disable=invalid-name
"""
Test Inactive.to_test() for the case that the path exists but
the timestamp is in the future.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
self.time_future: ["hello.world", "world.hello"]
}
}
expected = {
PyFunceble.INTERN["file_to_test"]: {
self.time_future: ["hello.world", "world.hello"],
"to_test": [],
}
}
Dict(PyFunceble.INTERN["inactive_db"]).to_json(self.file)
Inactive().to_test()
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_timestamp_path_does_not_exit(self): # pylint: disable=invalid-name
"""
Test Inactive.timestamp() for the case that the path does
not exist.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {}
expected = int(PyFunceble.time())
actual = Inactive()._timestamp()
self.assertGreaterEqual(expected, actual)
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_timestamp_path_exist_time_past(self): # pylint: disable=invalid-name
"""
Test Inactive.timestamp() for the case that the path exists but
the time is in the past.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
self.time_past: ["hello.world", "world.hello"]
}
}
expected = int(PyFunceble.time())
actual = Inactive()._timestamp()
self.assertGreaterEqual(expected, actual)
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_timestamp_path_exist_time_future(self): # pylint: disable=invalid-name
"""
Test Inactive.timestamp() for the case that the path exists but
the time is in the future.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
self.time_future: ["hello.world", "world.hello"]
}
}
expected = int(self.time_future)
actual = Inactive()._timestamp()
self.assertEqual(expected, actual)
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_add_path_does_not_exist(self): # pylint: disable=invalid-name
"""
Test Inactive.add() for the case that the path does not exist.
"""
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {}
PyFunceble.INTERN["to_test"] = "hello.world"
expected = {
PyFunceble.INTERN["file_to_test"]: {
str(Inactive()._timestamp()): ["hello.world"]
}
}
Inactive().add()
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
PyFunceble.INTERN["inactive_db"] = {}
PyFunceble.INTERN["to_test"] = "http://hello.world"
expected = {
PyFunceble.INTERN["file_to_test"]: {
str(Inactive()._timestamp()): ["http://hello.world"]
}
}
Inactive().add()
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
PyFunceble.INTERN["to_test"] = ""
self.test_file_not_exist()
def test_add_file_path_not_present(self): # pylint: disable=invalid-name
"""
Test Inactive.add() for the case that the path is not
present in the Inactive database.
"""
self.test_file_not_exist()
timestamp = str(Inactive()._timestamp())
PyFunceble.INTERN["to_test"] = "hello.world"
expected = {PyFunceble.INTERN["file_to_test"]: {timestamp: ["hello.world"]}}
Inactive().add()
actual = Dict().from_json(File(self.file).read())
self.assertEqual(expected, actual)
del PyFunceble.INTERN["to_test"]
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_add_file_path_present(self): # pylint: disable=invalid-name
"""
Test Inactive.add() for the case that the path is present
in the Inactive database.
"""
self.test_file_not_exist()
timestamp = str(Inactive()._timestamp())
PyFunceble.INTERN["to_test"] = "hello.world"
expected = {
PyFunceble.INTERN["file_to_test"]: {
timestamp: ["world.hello", "hello.world"]
}
}
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {timestamp: ["world.hello"]}
}
Inactive().add()
actual = Dict().from_json(File(self.file).read())
self.assertEqual(expected, actual)
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
str(int(timestamp) - (5 * 24 * 3600)): ["world.hello"]
}
}
expected = {
PyFunceble.INTERN["file_to_test"]: {
str(int(timestamp) - (5 * 24 * 3600)): ["world.hello"],
timestamp: ["hello.world"],
}
}
Inactive().add()
actual = Dict().from_json(File(self.file).read())
self.assertEqual(expected, actual)
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
str(int(timestamp) - (5 * 24 * 3600)): ["world.hello"],
"to_test": [PyFunceble.INTERN["to_test"]],
}
}
expected = {
PyFunceble.INTERN["file_to_test"]: {
str(int(timestamp) - (5 * 24 * 3600)): ["world.hello"],
timestamp: ["hello.world"],
"to_test": [],
}
}
Inactive().add()
actual = Dict().from_json(File(self.file).read())
self.assertEqual(expected, actual)
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_remove(self):
"""
Test Inactive.remove().
"""
timestamp = str(Inactive()._timestamp())
self.test_file_not_exist()
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
timestamp: ["hello.world"],
"to_test": ["hello.world", "world.hello"],
}
}
PyFunceble.INTERN["to_test"] = "hello.world"
expected = {
PyFunceble.INTERN["file_to_test"]: {
timestamp: [],
"to_test": ["world.hello"],
}
}
Inactive().remove()
self.assertEqual(expected, PyFunceble.INTERN["inactive_db"])
del PyFunceble.INTERN["inactive_db"]
self.test_file_not_exist()
def test_content(self):
"""
Test Inactive.content().
"""
self.test_file_not_exist()
# Test of the case that everything goes right!
timestamp = str(Inactive()._timestamp())
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
timestamp: ["hello.world", "world.hello", "hello-world.com"],
"to_test": ["hello.world", "world.hello"],
}
}
PyFunceble.INTERN["to_test"] = "hello.world"
expected = ["hello.world", "world.hello", "hello-world.com"]
actual = Inactive().content()
self.assertEqual(expected, actual)
# Test of the case that the database is not activated
PyFunceble.CONFIGURATION["inactive_database"] = False
expected = []
actual = Inactive().content()
self.assertEqual(expected, actual)
# Test of the case that there is nothing in the Inactive.
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
"to_test": ["hello.world", "world.hello"]
}
}
actual = Inactive().content()
self.assertEqual(expected, actual)
del PyFunceble.INTERN["inactive_db"]
del PyFunceble.INTERN["to_test"]
self.test_file_not_exist()
def test_is_present(self):
"""
Test Inactive.is_present().
"""
PyFunceble.CONFIGURATION["inactive_database"] = True
self.test_file_not_exist()
# Test of the case that everything goes right!
timestamp = str(Inactive()._timestamp())
PyFunceble.INTERN["inactive_db"] = {
PyFunceble.INTERN["file_to_test"]: {
timestamp: ["hello.world", "world.hello", "hello-world.com"],
"to_test": ["hello.world", "world.hello"],
}
}
PyFunceble.INTERN["to_test"] = "hello.world"
expected = True
actual = Inactive().is_present()
self.assertEqual(expected, actual)
PyFunceble.INTERN["to_test"] = "github.com"
expected = False
actual = Inactive().is_present()
self.assertEqual(expected, actual)
del PyFunceble.INTERN["inactive_db"]
del PyFunceble.INTERN["to_test"]
self.test_file_not_exist()
class TestDatabaseWhois(TestCase):
"""
Test PyFunceble.database.Whois
"""
def setUp(self):
"""
Set up everything needed for the test
"""
PyFunceble.INTERN["file_to_test"] = "this_file_is_a_ghost"
self.file = (
PyFunceble.CURRENT_DIRECTORY
+ PyFunceble.OUTPUTS["default_files"]["whois_db"]
)
self.expected_content = {
PyFunceble.INTERN["file_to_test"]: {
"google.com": {
"epoch": "1600034400",
"expiration_date": "14-sep-2020",
"state": "future",
},
"github.com": {
"epoch": "1602194400",
"expiration_date": "09-oct-2020",
"state": "future",
},
}
}
def test_file_not_exist(self):
"""
Test if everything is right with the generated
file.
"""
File(self.file).delete()
expected = False
actual = PyFunceble.path.isfile(self.file)
self.assertEqual(expected, actual)
def test_authorization(self):
"""
Test the authorization method.
"""
PyFunceble.CONFIGURATION["no_whois"] = True
PyFunceble.CONFIGURATION["whois_database"] = False
expected = False
self.assertEqual(expected, Whois()._authorization())
PyFunceble.CONFIGURATION["no_whois"] = False
PyFunceble.CONFIGURATION["whois_database"] = False
self.assertEqual(expected, Whois()._authorization())
PyFunceble.CONFIGURATION["no_whois"] = True
PyFunceble.CONFIGURATION["whois_database"] = True
self.assertEqual(expected, Whois()._authorization())
PyFunceble.CONFIGURATION["no_whois"] = False
PyFunceble.CONFIGURATION["whois_database"] = True
expected = True
self.assertEqual(expected, Whois()._authorization())
def test_retrieve_file_not_exist(self):
"""
Test the case that we want to retrieve a file that does not exist.
"""
self.test_file_not_exist()
Whois()._retrieve()
expected = {}
self.assertEqual(expected, PyFunceble.INTERN["whois_db"])
del PyFunceble.INTERN["whois_db"]
self.test_file_not_exist()
def test_retrieve_file_exist(self):
"""
Test the case that we want to retrieve a file that exists.
"""
self.test_file_not_exist()
Dict(self.expected_content).to_json(self.file)
Whois()._retrieve()
self.assertEqual(self.expected_content, PyFunceble.INTERN["whois_db"])
del PyFunceble.INTERN["whois_db"]
self.test_file_not_exist()
def test_backup(self):
"""
Test the backup of the database.
"""
self.test_file_not_exist()
PyFunceble.INTERN["whois_db"] = self.expected_content
Whois()._backup()
expected = True
actual = PyFunceble.path.isfile(self.file)
self.assertEqual(expected, actual)
del PyFunceble.INTERN["whois_db"]
Whois()._retrieve()
self.assertEqual(self.expected_content, PyFunceble.INTERN["whois_db"])
del PyFunceble.INTERN["whois_db"]
self.test_file_not_exist()
def test_is_in_database(self):
"""
Test the check.
"""
self.test_file_not_exist()
PyFunceble.INTERN["whois_db"] = self.expected_content
PyFunceble.INTERN["to_test"] = "google.com"
expected = True
actual = Whois().is_in_database()
self.assertEqual(expected, actual)
PyFunceble.INTERN["to_test"] = "hello.google.com"
expected = False
actual = Whois().is_in_database()
self.assertEqual(expected, actual)
del PyFunceble.INTERN["to_test"]
del PyFunceble.INTERN["whois_db"]
self.test_file_not_exist()
def test_is_time_older(self):
"""
Test whether a time is older than the current date.
"""
self.test_file_not_exist()
PyFunceble.INTERN["whois_db"] = self.expected_content
PyFunceble.INTERN["to_test"] = "google.com"
PyFunceble.INTERN["whois_db"][PyFunceble.INTERN["file_to_test"]]["google.com"][
"epoch"
] = PyFunceble.time() - (15 * (60 * 60 * 24))
expected = True
actual = Whois().is_time_older()
self.assertEqual(expected, actual)
PyFunceble.INTERN["whois_db"][PyFunceble.INTERN["file_to_test"]]["google.com"][
"epoch"
] = PyFunceble.time() + (15 * (60 * 60 * 24))
expected = False
actual = Whois().is_time_older()
self.assertEqual(expected, actual)
del PyFunceble.INTERN["to_test"]
del PyFunceble.INTERN["whois_db"]
self.test_file_not_exist()
def test_get_expiration_date(self):
"""
Test the way we get the expiration date from the database.
"""
self.test_file_not_exist()
PyFunceble.INTERN["whois_db"] = self.expected_content
PyFunceble.INTERN["to_test"] = "google.com"
expected = "14-sep-2020"
actual = Whois().get_expiration_date()
self.assertEqual(expected, actual)
PyFunceble.INTERN["to_test"] = "hello.google.com"
expected = None
actual = Whois().get_expiration_date()
self.assertEqual(expected, actual)
del PyFunceble.INTERN["to_test"]
del PyFunceble.INTERN["whois_db"]
self.test_file_not_exist()
def test_add(self):
"""
Test the addition for the case that the element is not in
the database.
"""
self.test_file_not_exist()
del PyFunceble.INTERN["file_to_test"]
PyFunceble.INTERN["whois_db"] = {}
PyFunceble.INTERN["to_test"] = "microsoft.google.com"
epoch = str(
int(PyFunceble.mktime(PyFunceble.strptime("25-dec-2022", "%d-%b-%Y")))
)
expected = {
"single_testing": {
"microsoft.google.com": {
"epoch": epoch,
"expiration_date": "25-dec-2022",
"state": "future",
}
}
}
Whois("25-dec-2022").add()
self.assertEqual(expected, PyFunceble.INTERN["whois_db"])
PyFunceble.INTERN["whois_db"]["single_testing"]["microsoft.google.com"][
"state"
] = "hello"
Whois("25-dec-2022").add()
self.assertEqual(expected, PyFunceble.INTERN["whois_db"])
epoch = str(
int(PyFunceble.mktime(PyFunceble.strptime("25-dec-2007", "%d-%b-%Y")))
)
expected = {
"single_testing": {
"microsoft.google.com": {
"epoch": epoch,
"expiration_date": "25-dec-2007",
"state": "past",
}
}
}
Whois("25-dec-2007").add()
self.assertEqual(expected, PyFunceble.INTERN["whois_db"])
del PyFunceble.INTERN["to_test"]
del PyFunceble.INTERN["whois_db"]
self.test_file_not_exist()
if __name__ == "__main__":
launch_tests()
| 28.819333 | 88 | 0.576306 | 2,711 | 25,044 | 5.259314 | 0.104758 | 0.143639 | 0.045448 | 0.058353 | 0.78854 | 0.772899 | 0.758311 | 0.74239 | 0.71658 | 0.682284 | 0 | 0.009043 | 0.289131 | 25,044 | 868 | 89 | 28.852535 | 0.76987 | 0.191902 | 0 | 0.678959 | 0 | 0 | 0.147823 | 0 | 0 | 0 | 0 | 0 | 0.093275 | 1 | 0.065076 | false | 0 | 0.010846 | 0 | 0.08026 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
1620d30f3c01960e3562df1891c1f2b6bddfd613 | 42 | py | Python | python/testData/breadcrumbs/functionBodyCaret.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/breadcrumbs/functionBodyCaret.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/breadcrumbs/functionBodyCaret.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | class A:
def foo(self):
pass<caret>
| 10.5 | 16 | 0.595238 | 7 | 42 | 3.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.261905 | 42 | 3 | 17 | 14 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.333333 | 0 | null | null | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
1621a7449f45c65c56fe6bc96c2cebc0c60903d3 | 4,540 | py | Python | pyVmomi/_typeinfo_core.py | xweichu/pyvmomi | 77aedef02974a63517a079c482e49fd9890c09a4 | [
"Apache-2.0"
] | null | null | null | pyVmomi/_typeinfo_core.py | xweichu/pyvmomi | 77aedef02974a63517a079c482e49fd9890c09a4 | [
"Apache-2.0"
] | null | null | null | pyVmomi/_typeinfo_core.py | xweichu/pyvmomi | 77aedef02974a63517a079c482e49fd9890c09a4 | [
"Apache-2.0"
] | null | null | null | # ******* WARNING - AUTO GENERATED CODE - DO NOT EDIT *******
from .VmomiSupport import CreateDataType, CreateManagedType
from .VmomiSupport import CreateEnumType
from .VmomiSupport import AddVersion, AddVersionParent
from .VmomiSupport import AddBreakingChangesInfo
from .VmomiSupport import F_LINK, F_LINKABLE
from .VmomiSupport import F_OPTIONAL, F_SECRET
from .VmomiSupport import newestVersions, ltsVersions
from .VmomiSupport import dottedVersions, oldestVersions
AddVersion("vmodl.version.version0", "", "", 0, "vim25")
AddVersion("vmodl.version.version1", "", "", 0, "vim25")
AddVersion("vmodl.version.version2", "", "", 0, "vim25")
AddVersionParent("vmodl.version.version0", "vmodl.version.version0")
AddVersionParent("vmodl.version.version1", "vmodl.version.version0")
AddVersionParent("vmodl.version.version1", "vmodl.version.version1")
AddVersionParent("vmodl.version.version2", "vmodl.version.version0")
AddVersionParent("vmodl.version.version2", "vmodl.version.version1")
AddVersionParent("vmodl.version.version2", "vmodl.version.version2")
CreateDataType("vmodl.DynamicArray", "DynamicArray", "vmodl.DataObject", "vmodl.version.version0", [("dynamicType", "string", "vmodl.version.version0", F_OPTIONAL), ("val", "anyType[]", "vmodl.version.version0", 0)])
CreateDataType("vmodl.DynamicData", "DynamicData", "vmodl.DataObject", "vmodl.version.version0", [("dynamicType", "string", "vmodl.version.version0", F_OPTIONAL), ("dynamicProperty", "vmodl.DynamicProperty[]", "vmodl.version.version0", F_OPTIONAL)])
CreateDataType("vmodl.DynamicProperty", "DynamicProperty", "vmodl.DataObject", "vmodl.version.version0", [("name", "vmodl.PropertyPath", "vmodl.version.version0", 0), ("val", "anyType", "vmodl.version.version0", 0)])
CreateDataType("vmodl.KeyAnyValue", "KeyAnyValue", "vmodl.DynamicData", "vmodl.version.version1", [("key", "string", "vmodl.version.version1", 0), ("value", "anyType", "vmodl.version.version1", 0)])
CreateDataType("vmodl.LocalizableMessage", "LocalizableMessage", "vmodl.DynamicData", "vmodl.version.version1", [("key", "string", "vmodl.version.version1", 0), ("arg", "vmodl.KeyAnyValue[]", "vmodl.version.version1", F_OPTIONAL), ("message", "string", "vmodl.version.version1", F_OPTIONAL)])
CreateDataType("vmodl.fault.HostCommunication", "HostCommunication", "vmodl.RuntimeFault", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.HostNotConnected", "HostNotConnected", "vmodl.fault.HostCommunication", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.HostNotReachable", "HostNotReachable", "vmodl.fault.HostCommunication", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.InvalidArgument", "InvalidArgument", "vmodl.RuntimeFault", "vmodl.version.version0", [("invalidProperty", "vmodl.PropertyPath", "vmodl.version.version0", F_OPTIONAL)])
CreateDataType("vmodl.fault.InvalidRequest", "InvalidRequest", "vmodl.RuntimeFault", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.InvalidType", "InvalidType", "vmodl.fault.InvalidRequest", "vmodl.version.version0", [("argument", "vmodl.PropertyPath", "vmodl.version.version0", F_OPTIONAL)])
CreateDataType("vmodl.fault.ManagedObjectNotFound", "ManagedObjectNotFound", "vmodl.RuntimeFault", "vmodl.version.version0", [("obj", "vmodl.ManagedObject", "vmodl.version.version0", 0)])
CreateDataType("vmodl.fault.MethodNotFound", "MethodNotFound", "vmodl.fault.InvalidRequest", "vmodl.version.version0", [("receiver", "vmodl.ManagedObject", "vmodl.version.version0", 0), ("method", "string", "vmodl.version.version0", 0)])
CreateDataType("vmodl.fault.NotEnoughLicenses", "NotEnoughLicenses", "vmodl.RuntimeFault", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.NotImplemented", "NotImplemented", "vmodl.RuntimeFault", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.NotSupported", "NotSupported", "vmodl.RuntimeFault", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.RequestCanceled", "RequestCanceled", "vmodl.RuntimeFault", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.SecurityError", "SecurityError", "vmodl.RuntimeFault", "vmodl.version.version0", None)
CreateDataType("vmodl.fault.SystemError", "SystemError", "vmodl.RuntimeFault", "vmodl.version.version0", [("reason", "string", "vmodl.version.version0", 0)])
CreateDataType("vmodl.fault.UnexpectedFault", "UnexpectedFault", "vmodl.RuntimeFault", "vmodl.version.version0", [("faultName", "vmodl.TypeName", "vmodl.version.version0", 0), ("fault", "vmodl.MethodFault", "vmodl.version.version0", F_OPTIONAL)])
| 110.731707 | 292 | 0.76652 | 448 | 4,540 | 7.741071 | 0.169643 | 0.186851 | 0.213379 | 0.091984 | 0.622261 | 0.510669 | 0.456171 | 0.429354 | 0.371107 | 0.168397 | 0 | 0.017225 | 0.053744 | 4,540 | 40 | 293 | 113.5 | 0.790037 | 0.012996 | 0 | 0 | 1 | 0 | 0.611074 | 0.399196 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.216216 | 0 | 0.216216 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
1698dbe35ace674de4ad3b5eb3dae98e1839002c | 2,811 | py | Python | win32_event_log/datadog_checks/win32_event_log/config_models/defaults.py | gaffneyd4/integrations-core | 4c7725c9f1be4985381aad9740e7186f16a87976 | [
"BSD-3-Clause"
] | null | null | null | win32_event_log/datadog_checks/win32_event_log/config_models/defaults.py | gaffneyd4/integrations-core | 4c7725c9f1be4985381aad9740e7186f16a87976 | [
"BSD-3-Clause"
] | null | null | null | win32_event_log/datadog_checks/win32_event_log/config_models/defaults.py | gaffneyd4/integrations-core | 4c7725c9f1be4985381aad9740e7186f16a87976 | [
"BSD-3-Clause"
] | null | null | null | # (C) Datadog, Inc. 2021-present
# All rights reserved
# Licensed under a 3-clause BSD style license (see LICENSE)
from datadog_checks.base.utils.models.fields import get_default_field_value
def shared_default_event_priority(field, value):
return 'normal'
def shared_event_priority(field, value):
return 'normal'
def shared_interpret_messages(field, value):
return True
def shared_service(field, value):
return get_default_field_value(field, value)
def shared_tag_event_id(field, value):
return False
def shared_tag_sid(field, value):
return False
def instance_auth_type(field, value):
return 'default'
def instance_bookmark_frequency(field, value):
return '<PAYLOAD_SIZE>'
def instance_domain(field, value):
return get_default_field_value(field, value)
def instance_empty_default_hostname(field, value):
return False
def instance_event_format(field, value):
return ['Message']
def instance_event_id(field, value):
return get_default_field_value(field, value)
def instance_event_priority(field, value):
return 'normal'
def instance_excluded_messages(field, value):
return get_default_field_value(field, value)
def instance_filters(field, value):
return get_default_field_value(field, value)
def instance_host(field, value):
return 'localhost'
def instance_included_messages(field, value):
return get_default_field_value(field, value)
def instance_interpret_messages(field, value):
return True
def instance_legacy_mode(field, value):
return True
def instance_log_file(field, value):
return get_default_field_value(field, value)
def instance_message_filters(field, value):
return get_default_field_value(field, value)
def instance_min_collection_interval(field, value):
return 15
def instance_password(field, value):
return get_default_field_value(field, value)
def instance_path(field, value):
return get_default_field_value(field, value)
def instance_payload_size(field, value):
return 10
def instance_query(field, value):
return get_default_field_value(field, value)
def instance_server(field, value):
return 'localhost'
def instance_service(field, value):
return get_default_field_value(field, value)
def instance_source_name(field, value):
return get_default_field_value(field, value)
def instance_start(field, value):
return 'now'
def instance_tag_event_id(field, value):
return False
def instance_tag_sid(field, value):
return False
def instance_tags(field, value):
return get_default_field_value(field, value)
def instance_timeout(field, value):
return 5
def instance_type(field, value):
return ['information']
def instance_user(field, value):
return get_default_field_value(field, value)
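The file above is a table of default-value callbacks keyed by name (`shared_<option>` for init_config options, `instance_<option>` for instance options). A hedged sketch of how such defaults might be resolved by naming convention — the resolver and its lookup here are hypothetical, not the actual datadog_checks API:

```python
def instance_timeout(field, value):
    return 5

def instance_server(field, value):
    return 'localhost'

def resolve_default(scope, option, field=None, value=None):
    # Hypothetical lookup: find "<scope>_<option>" among the default
    # callbacks, else fall back to the provided value.
    fn = globals().get(f'{scope}_{option}')
    return fn(field, value) if fn is not None else value

print(resolve_default('instance', 'timeout'))  # 5
print(resolve_default('instance', 'missing', value='fallback'))  # fallback
```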
| 18.865772 | 75 | 0.763785 | 385 | 2,811 | 5.283117 | 0.21039 | 0.3294 | 0.283186 | 0.157325 | 0.682399 | 0.678958 | 0.60472 | 0.546706 | 0.432645 | 0.41003 | 0 | 0.004207 | 0.154393 | 2,811 | 148 | 76 | 18.993243 | 0.851493 | 0.03842 | 0 | 0.383562 | 0 | 0 | 0.0289 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.493151 | false | 0.013699 | 0.013699 | 0.493151 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
1699c8e78fb5ee135d87e7d41add2ada6c45cc35 | 124,762 | py | Python | Rules.py | draguscloud/MultiWorld-Utilities | 0157f348cd46fce1bc6ff58a7eaec9f0a482f4be | [
"MIT"
] | null | null | null | Rules.py | draguscloud/MultiWorld-Utilities | 0157f348cd46fce1bc6ff58a7eaec9f0a482f4be | [
"MIT"
] | null | null | null | Rules.py | draguscloud/MultiWorld-Utilities | 0157f348cd46fce1bc6ff58a7eaec9f0a482f4be | [
"MIT"
] | null | null | null | import collections
import logging
import OverworldGlitchRules
from BaseClasses import RegionType, World, Entrance
from Items import ItemFactory, progression_items, item_name_groups
from OverworldGlitchRules import overworld_glitches_rules, no_logic_rules
def set_rules(world, player):
locality_rules(world, player)
if world.logic[player] == 'nologic':
logging.getLogger('').info(
'WARNING! Seeds generated under this logic often require major glitches and may be impossible!')
world.get_region('Menu', player).can_reach_private = lambda state: True
no_logic_rules(world, player)
for exit in world.get_region('Menu', player).exits:
exit.hide_path = True
return
crossover_logic(world, player)
global_rules(world, player)
if world.mode[player] != 'inverted':
default_rules(world, player)
if world.mode[player] == 'open':
open_rules(world, player)
elif world.mode[player] == 'standard':
standard_rules(world, player)
elif world.mode[player] == 'inverted':
open_rules(world, player)
inverted_rules(world, player)
else:
raise NotImplementedError('Not implemented yet')
if world.logic[player] == 'noglitches':
no_glitches_rules(world, player)
elif world.logic[player] == 'owglitches':
# Initially setting no_glitches_rules to set the baseline rules for some
# entrances. The overworld_glitches_rules set is primarily additive.
no_glitches_rules(world, player)
fake_flipper_rules(world, player)
overworld_glitches_rules(world, player)
elif world.logic[player] == 'minorglitches':
no_glitches_rules(world, player)
fake_flipper_rules(world, player)
else:
raise NotImplementedError('Not implemented yet')
if world.goal[player] == 'dungeons':
# require all dungeons to beat ganon
add_rule(world.get_location('Ganon', player), lambda state: state.can_reach('Master Sword Pedestal', 'Location', player) and state.has('Beat Agahnim 1', player) and state.has('Beat Agahnim 2', player) and state.has_crystals(7, player))
elif world.goal[player] == 'ganon':
# require aga2 to beat ganon
add_rule(world.get_location('Ganon', player), lambda state: state.has('Beat Agahnim 2', player))
if world.mode[player] != 'inverted':
set_big_bomb_rules(world, player)
if world.logic[player] == 'owglitches' and world.shuffle not in ('insanity', 'insanity_legacy'):
path_to_courtyard = mirrorless_path_to_castle_courtyard(world, player)
add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.world.get_entrance('Dark Death Mountain Offset Mirror', player).can_reach(state) and all(rule(state) for rule in path_to_courtyard), 'or')
else:
set_inverted_big_bomb_rules(world, player)
# if swamp and dam have not been moved we require mirror for swamp palace
if not world.swamp_patch_required[player]:
add_rule(world.get_entrance('Swamp Palace Moat', player), lambda state: state.has_Mirror(player))
# GT Entrance may be required for Turtle Rock for OWG and < 7 required
ganons_tower = world.get_entrance('Inverted Ganons Tower' if world.mode[player] == 'inverted' else 'Ganons Tower', player)
if world.crystals_needed_for_gt[player] == 7 and not (world.logic[player] == 'owglitches' and world.mode[player] != 'inverted'):
set_rule(ganons_tower, lambda state: False)
set_trock_key_rules(world, player)
set_rule(ganons_tower, lambda state: state.has_crystals(world.crystals_needed_for_gt[player], player))
if world.mode[player] != 'inverted' and world.logic[player] == 'owglitches':
add_rule(world.get_entrance('Ganons Tower', player), lambda state: state.world.get_entrance('Ganons Tower Ascent', player).can_reach(state), 'or')
set_bunny_rules(world, player, world.mode[player] == 'inverted')
def mirrorless_path_to_castle_courtyard(world, player):
# If Agahnim is defeated then the courtyard needs to be accessible without using the mirror for the mirror offset glitch.
# Only considering the secret passage for now (in non-insanity shuffle). Basically, if it's Ganon you need the master sword.
start = world.get_entrance('Hyrule Castle Secret Entrance Drop', player)
target = world.get_region('Hyrule Castle Courtyard', player)
seen = {start.parent_region, start.connected_region}
queue = collections.deque([(start.connected_region, [])])
while queue:
(current, path) = queue.popleft()
for entrance in current.exits:
if entrance.connected_region not in seen:
new_path = path + [entrance.access_rule]
if entrance.connected_region == target:
return new_path
else:
queue.append((entrance.connected_region, new_path))
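The search above walks region exits breadth-first, accumulating each entrance's access_rule so the caller can require every rule along the path. A toy version on a plain adjacency dict (stand-in node and rule names; this sketch also marks nodes as seen when enqueued, a safeguard the real function skips on its fixed region graph):

```python
import collections

def path_rules(graph, start, target):
    # graph: node -> list of (neighbor, rule) edges. Returns the rules along
    # the first path found from start to target, or None if unreachable.
    seen = {start}
    queue = collections.deque([(start, [])])
    while queue:
        current, path = queue.popleft()
        for nxt, rule in graph.get(current, []):
            if nxt in seen:
                continue
            seen.add(nxt)
            new_path = path + [rule]
            if nxt == target:
                return new_path
            queue.append((nxt, new_path))

graph = {'A': [('B', 'needs boots')], 'B': [('C', 'needs mirror')]}
print(path_rules(graph, 'A', 'C'))  # ['needs boots', 'needs mirror']
```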
def set_rule(spot, rule):
spot.access_rule = rule
def set_defeat_dungeon_boss_rule(location):
# Lambda required to defer evaluation of dungeon.boss since it will change later if boss shuffle is used

set_rule(location, lambda state: location.parent_region.dungeon.boss.can_defeat(state))
def set_always_allow(spot, rule):
spot.always_allow = rule
def add_rule(spot, rule, combine='and'):
old_rule = spot.access_rule
if combine == 'or':
spot.access_rule = lambda state: rule(state) or old_rule(state)
else:
spot.access_rule = lambda state: rule(state) and old_rule(state)
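add_rule composes access predicates by closing over the previous access_rule, so each call layers one more condition with 'and' or 'or'. A minimal self-contained sketch, using a hypothetical Spot stand-in and states modeled as plain sets of item names:

```python
class Spot:
    def __init__(self):
        self.access_rule = lambda state: True  # open by default

def add_rule(spot, rule, combine='and'):
    # Each wrap captures the previous rule in a closure.
    old_rule = spot.access_rule
    if combine == 'or':
        spot.access_rule = lambda state: rule(state) or old_rule(state)
    else:
        spot.access_rule = lambda state: rule(state) and old_rule(state)

spot = Spot()
add_rule(spot, lambda state: 'Hammer' in state)       # now: Hammer and True
add_rule(spot, lambda state: 'Flute' in state, 'or')  # now: Flute or (Hammer and True)
print(spot.access_rule({'Hammer'}))  # True
print(spot.access_rule(set()))       # False
```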
def add_lamp_requirement(spot, player):
add_rule(spot, lambda state: state.has('Lamp', player, state.world.lamps_needed_for_dark_rooms))
def forbid_item(location, item, player: int):
old_rule = location.item_rule
location.item_rule = lambda i: (i.name != item or i.player != player) and old_rule(i)
def forbid_items(location, items: set, player: int):
old_rule = location.item_rule
location.item_rule = lambda i: (i.player != player or i.name not in items) and old_rule(i)
def add_item_rule(location, rule):
old_rule = location.item_rule
location.item_rule = lambda item: rule(item) and old_rule(item)
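The wrappers above (forbid_item, forbid_items, add_item_rule) all narrow a location's item_rule the same way, by closing over the previous rule. A self-contained sketch with hypothetical stand-in Item/Location classes rather than the real BaseClasses types:

```python
class Item:
    def __init__(self, name, player):
        self.name = name
        self.player = player

class Location:
    def __init__(self):
        self.item_rule = lambda item: True  # everything allowed by default

def forbid_items(location, items, player):
    # Wrap the old rule: the named items are rejected for this player only.
    old_rule = location.item_rule
    location.item_rule = lambda i: (i.player != player or i.name not in items) and old_rule(i)

loc = Location()
forbid_items(loc, {'Triforce Piece'}, 1)
print(loc.item_rule(Item('Triforce Piece', 1)))  # False: forbidden for player 1
print(loc.item_rule(Item('Triforce Piece', 2)))  # True: other players unaffected
```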
def item_in_locations(state, item, player, locations):
for location in locations:
if item_name(state, location[0], location[1]) == (item, player):
return True
return False
def item_name(state, location, player):
location = state.world.get_location(location, player)
if location.item is None:
return None
return (location.item.name, location.item.player)
def locality_rules(world, player):
if world.goal[player] in ["localtriforcehunt", "localganontriforcehunt"]:
world.local_items[player].add('Triforce Piece')
if world.local_items[player]:
for location in world.get_locations():
if location.player != player:
forbid_items(location, world.local_items[player], player)
non_crossover_items = (item_name_groups["Small Keys"] | item_name_groups["Big Keys"] | progression_items) - {
"Small Key (Universal)"}
def crossover_logic(world, player):
""" Simple and not graceful solution to logic loops if you mix no logic and logic.
Making it so that logical progression cannot be placed in no logic worlds."""
no_logic_players = set()
for other_player in world.player_ids:
if world.logic[other_player] == 'nologic':
no_logic_players.add(other_player)
if no_logic_players:
for location in world.get_locations():
if location.player in no_logic_players:
forbid_items(location, non_crossover_items, player)
def global_rules(world, player):
# ganon can only carry triforce
add_item_rule(world.get_location('Ganon', player), lambda item: item.name == 'Triforce' and item.player == player)
# determines which S&Q locations are available - hide from paths since it isn't an in-game location
world.get_region('Menu', player).can_reach_private = lambda state: True
for exit in world.get_region('Menu', player).exits:
exit.hide_path = True
set_rule(world.get_entrance('Old Man S&Q', player), lambda state: state.can_reach('Old Man', 'Location', player))
set_rule(world.get_location('Sunken Treasure', player), lambda state: state.has('Open Floodgate', player))
set_rule(world.get_location('Dark Blacksmith Ruins', player), lambda state: state.has('Return Smith', player))
set_rule(world.get_location('Purple Chest', player),
lambda state: state.has('Pick Up Purple Chest', player)) # Can S&Q with chest
set_rule(world.get_location('Ether Tablet', player), lambda state: state.has('Book of Mudora', player) and state.has_beam_sword(player))
set_rule(world.get_location('Master Sword Pedestal', player), lambda state: state.has('Red Pendant', player) and state.has('Blue Pendant', player) and state.has('Green Pendant', player))
set_rule(world.get_location('Missing Smith', player), lambda state: state.has('Get Frog', player) and state.can_reach('Blacksmiths Hut', 'Region', player)) # Can't S&Q with smith
set_rule(world.get_location('Blacksmith', player), lambda state: state.has('Return Smith', player))
set_rule(world.get_location('Magic Bat', player), lambda state: state.has('Magic Powder', player))
set_rule(world.get_location('Sick Kid', player), lambda state: state.has_bottle(player))
set_rule(world.get_location('Library', player), lambda state: state.has_Boots(player))
set_rule(world.get_location('Mimic Cave', player), lambda state: state.has('Hammer', player))
set_rule(world.get_location('Sahasrahla', player), lambda state: state.has('Green Pendant', player))
set_rule(world.get_location('Spike Cave', player), lambda state:
state.has('Hammer', player) and state.can_lift_rocks(player) and
((state.has('Cape', player) and state.can_extend_magic(player, 16, True)) or
(state.has('Cane of Byrna', player) and
(state.can_extend_magic(player, 12, True) or
(state.world.can_take_damage[player] and (state.has_Boots(player) or state.has_hearts(player, 4))))))
)
set_rule(world.get_location('Hookshot Cave - Top Right', player), lambda state: state.has('Hookshot', player))
set_rule(world.get_location('Hookshot Cave - Top Left', player), lambda state: state.has('Hookshot', player))
set_rule(world.get_location('Hookshot Cave - Bottom Right', player),
lambda state: state.has('Hookshot', player) or state.has('Pegasus Boots', player))
set_rule(world.get_location('Hookshot Cave - Bottom Left', player), lambda state: state.has('Hookshot', player))
set_rule(world.get_entrance('Sewers Door', player),
lambda state: state.has_key('Small Key (Hyrule Castle)', player) or (world.retro[player] and world.mode[
player] == 'standard')) # standard retro cannot access the shop
set_rule(world.get_entrance('Sewers Back Door', player),
lambda state: state.has_key('Small Key (Hyrule Castle)', player))
set_rule(world.get_entrance('Agahnim 1', player),
lambda state: state.has_sword(player) and state.has_key('Small Key (Agahnims Tower)', player, 2))
set_defeat_dungeon_boss_rule(world.get_location('Agahnim 1', player))
set_rule(world.get_location('Castle Tower - Room 03', player), lambda state: state.can_kill_most_things(player, 8))
set_rule(world.get_location('Castle Tower - Dark Maze', player),
lambda state: state.can_kill_most_things(player, 8) and state.has_key('Small Key (Agahnims Tower)',
player))
set_rule(world.get_location('Eastern Palace - Big Chest', player),
lambda state: state.has('Big Key (Eastern Palace)', player))
set_rule(world.get_location('Eastern Palace - Boss', player),
lambda state: state.can_shoot_arrows(player) and state.has('Big Key (Eastern Palace)',
player) and state.world.get_location(
'Eastern Palace - Boss', player).parent_region.dungeon.boss.can_defeat(state))
set_rule(world.get_location('Eastern Palace - Prize', player),
lambda state: state.can_shoot_arrows(player) and state.has('Big Key (Eastern Palace)',
player) and state.world.get_location(
'Eastern Palace - Prize', player).parent_region.dungeon.boss.can_defeat(state))
for location in ['Eastern Palace - Boss', 'Eastern Palace - Big Chest']:
forbid_item(world.get_location(location, player), 'Big Key (Eastern Palace)', player)
set_rule(world.get_location('Desert Palace - Big Chest', player), lambda state: state.has('Big Key (Desert Palace)', player))
set_rule(world.get_location('Desert Palace - Torch', player), lambda state: state.has_Boots(player))
set_rule(world.get_entrance('Desert Palace East Wing', player), lambda state: state.has_key('Small Key (Desert Palace)', player))
set_rule(world.get_location('Desert Palace - Prize', player), lambda state: state.has_key('Small Key (Desert Palace)', player) and state.has('Big Key (Desert Palace)', player) and state.has_fire_source(player) and state.world.get_location('Desert Palace - Prize', player).parent_region.dungeon.boss.can_defeat(state))
set_rule(world.get_location('Desert Palace - Boss', player), lambda state: state.has_key('Small Key (Desert Palace)', player) and state.has('Big Key (Desert Palace)', player) and state.has_fire_source(player) and state.world.get_location('Desert Palace - Boss', player).parent_region.dungeon.boss.can_defeat(state))
for location in ['Desert Palace - Boss', 'Desert Palace - Big Chest']:
forbid_item(world.get_location(location, player), 'Big Key (Desert Palace)', player)
for location in ['Desert Palace - Boss', 'Desert Palace - Big Key Chest', 'Desert Palace - Compass Chest']:
forbid_item(world.get_location(location, player), 'Small Key (Desert Palace)', player)
# logic patch to prevent placing a crystal in Desert that's required to reach the required keys
if not (world.keyshuffle[player] and world.bigkeyshuffle[player]):
add_rule(world.get_location('Desert Palace - Prize', player), lambda state: state.world.get_region('Desert Palace Main (Outer)', player).can_reach(state))
set_rule(world.get_entrance('Tower of Hera Small Key Door', player), lambda state: state.has_key('Small Key (Tower of Hera)', player) or item_name(state, 'Tower of Hera - Big Key Chest', player) == ('Small Key (Tower of Hera)', player))
set_rule(world.get_entrance('Tower of Hera Big Key Door', player), lambda state: state.has('Big Key (Tower of Hera)', player))
set_rule(world.get_location('Tower of Hera - Big Chest', player), lambda state: state.has('Big Key (Tower of Hera)', player))
set_rule(world.get_location('Tower of Hera - Big Key Chest', player), lambda state: state.has_fire_source(player))
if world.accessibility[player] != 'locations':
set_always_allow(world.get_location('Tower of Hera - Big Key Chest', player), lambda state, item: item.name == 'Small Key (Tower of Hera)' and item.player == player)
set_defeat_dungeon_boss_rule(world.get_location('Tower of Hera - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Tower of Hera - Prize', player))
for location in ['Tower of Hera - Boss', 'Tower of Hera - Big Chest', 'Tower of Hera - Compass Chest']:
forbid_item(world.get_location(location, player), 'Big Key (Tower of Hera)', player)
# for location in ['Tower of Hera - Big Key Chest']:
# forbid_item(world.get_location(location, player), 'Small Key (Tower of Hera)', player)
set_rule(world.get_entrance('Swamp Palace Moat', player), lambda state: state.has('Flippers', player) and state.has('Open Floodgate', player))
set_rule(world.get_entrance('Swamp Palace Small Key Door', player), lambda state: state.has_key('Small Key (Swamp Palace)', player))
set_rule(world.get_entrance('Swamp Palace (Center)', player), lambda state: state.has('Hammer', player))
set_rule(world.get_location('Swamp Palace - Big Chest', player), lambda state: state.has('Big Key (Swamp Palace)', player) or item_name(state, 'Swamp Palace - Big Chest', player) == ('Big Key (Swamp Palace)', player))
if world.accessibility[player] != 'locations':
set_always_allow(world.get_location('Swamp Palace - Big Chest', player), lambda state, item: item.name == 'Big Key (Swamp Palace)' and item.player == player)
set_rule(world.get_entrance('Swamp Palace (North)', player), lambda state: state.has('Hookshot', player))
set_defeat_dungeon_boss_rule(world.get_location('Swamp Palace - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Swamp Palace - Prize', player))
for location in ['Swamp Palace - Entrance']:
forbid_item(world.get_location(location, player), 'Big Key (Swamp Palace)', player)
set_rule(world.get_entrance('Thieves Town Big Key Door', player), lambda state: state.has('Big Key (Thieves Town)', player))
set_rule(world.get_entrance('Blind Fight', player), lambda state: state.has_key('Small Key (Thieves Town)', player))
set_defeat_dungeon_boss_rule(world.get_location('Thieves\' Town - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Thieves\' Town - Prize', player))
set_rule(world.get_location('Thieves\' Town - Big Chest', player), lambda state: (state.has_key('Small Key (Thieves Town)', player) or item_name(state, 'Thieves\' Town - Big Chest', player) == ('Small Key (Thieves Town)', player)) and state.has('Hammer', player))
if world.accessibility[player] != 'locations':
set_always_allow(world.get_location('Thieves\' Town - Big Chest', player), lambda state, item: item.name == 'Small Key (Thieves Town)' and item.player == player and state.has('Hammer', player))
set_rule(world.get_location('Thieves\' Town - Attic', player), lambda state: state.has_key('Small Key (Thieves Town)', player))
for location in ['Thieves\' Town - Attic', 'Thieves\' Town - Big Chest', 'Thieves\' Town - Blind\'s Cell', 'Thieves\' Town - Boss']:
forbid_item(world.get_location(location, player), 'Big Key (Thieves Town)', player)
for location in ['Thieves\' Town - Attic', 'Thieves\' Town - Boss']:
forbid_item(world.get_location(location, player), 'Small Key (Thieves Town)', player)
set_rule(world.get_entrance('Skull Woods First Section South Door', player), lambda state: state.has_key('Small Key (Skull Woods)', player))
set_rule(world.get_entrance('Skull Woods First Section (Right) North Door', player), lambda state: state.has_key('Small Key (Skull Woods)', player))
set_rule(world.get_entrance('Skull Woods First Section West Door', player), lambda state: state.has_key('Small Key (Skull Woods)', player, 2)) # ideally would only be one key, but we may have spent that key already on escaping the right section
set_rule(world.get_entrance('Skull Woods First Section (Left) Door to Exit', player), lambda state: state.has_key('Small Key (Skull Woods)', player, 2))
set_rule(world.get_location('Skull Woods - Big Chest', player), lambda state: state.has('Big Key (Skull Woods)', player) or item_name(state, 'Skull Woods - Big Chest', player) == ('Big Key (Skull Woods)', player))
if world.accessibility[player] != 'locations':
set_always_allow(world.get_location('Skull Woods - Big Chest', player), lambda state, item: item.name == 'Big Key (Skull Woods)' and item.player == player)
set_rule(world.get_entrance('Skull Woods Torch Room', player), lambda state: state.has_key('Small Key (Skull Woods)', player, 3) and state.has('Fire Rod', player) and state.has_sword(player)) # sword required for curtain
set_defeat_dungeon_boss_rule(world.get_location('Skull Woods - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Skull Woods - Prize', player))
for location in ['Skull Woods - Boss']:
forbid_item(world.get_location(location, player), 'Small Key (Skull Woods)', player)
set_rule(world.get_entrance('Ice Palace Entrance Room', player), lambda state: state.can_melt_things(player))
set_rule(world.get_location('Ice Palace - Big Chest', player), lambda state: state.has('Big Key (Ice Palace)', player))
set_rule(world.get_entrance('Ice Palace (Kholdstare)', player), lambda state: state.can_lift_rocks(player) and state.has('Hammer', player) and state.has('Big Key (Ice Palace)', player) and (state.has_key('Small Key (Ice Palace)', player, 2) or (state.has('Cane of Somaria', player) and state.has_key('Small Key (Ice Palace)', player, 1))))
# TODO: investigate change from VT. Changed to hookshot or 2 keys (no checking for big key in specific chests)
set_rule(world.get_entrance('Ice Palace (East)', player), lambda state: (state.has('Hookshot', player) or (item_in_locations(state, 'Big Key (Ice Palace)', player, [('Ice Palace - Spike Room', player), ('Ice Palace - Big Key Chest', player), ('Ice Palace - Map Chest', player)]) and state.has_key('Small Key (Ice Palace)', player))) and (state.world.can_take_damage[player] or state.has('Hookshot', player) or state.has('Cape', player) or state.has('Cane of Byrna', player)))
set_rule(world.get_entrance('Ice Palace (East Top)', player), lambda state: state.can_lift_rocks(player) and state.has('Hammer', player))
set_defeat_dungeon_boss_rule(world.get_location('Ice Palace - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Ice Palace - Prize', player))
for location in ['Ice Palace - Big Chest', 'Ice Palace - Boss']:
forbid_item(world.get_location(location, player), 'Big Key (Ice Palace)', player)
set_rule(world.get_entrance('Misery Mire Entrance Gap', player), lambda state: (state.has_Boots(player) or state.has('Hookshot', player)) and (state.has_sword(player) or state.has('Fire Rod', player) or state.has('Ice Rod', player) or state.has('Hammer', player) or state.has('Cane of Somaria', player) or state.can_shoot_arrows(player))) # need to defeat wizzrobes, bombs don't work ...
set_rule(world.get_location('Misery Mire - Big Chest', player), lambda state: state.has('Big Key (Misery Mire)', player))
set_rule(world.get_location('Misery Mire - Spike Chest', player), lambda state: (state.world.can_take_damage[player] and state.has_hearts(player, 4)) or state.has('Cane of Byrna', player) or state.has('Cape', player))
set_rule(world.get_entrance('Misery Mire Big Key Door', player), lambda state: state.has('Big Key (Misery Mire)', player))
# you can squander the free small key from the pot by opening the south door to the north west switch room, locking you out of accessing a color switch ...
# big key gives backdoor access to that from the teleporter in the north west
set_rule(world.get_location('Misery Mire - Map Chest', player), lambda state: state.has_key('Small Key (Misery Mire)', player, 1) or state.has('Big Key (Misery Mire)', player))
set_rule(world.get_location('Misery Mire - Main Lobby', player), lambda state: state.has_key('Small Key (Misery Mire)', player, 1) or state.has_key('Big Key (Misery Mire)', player))
# we can place a small key in the West wing iff it also contains/blocks the Big Key, as we cannot reach and softlock with the basement key door yet
set_rule(world.get_entrance('Misery Mire (West)', player), lambda state: state.has_key('Small Key (Misery Mire)', player, 2) if ((item_name(state, 'Misery Mire - Compass Chest', player) in [('Big Key (Misery Mire)', player)]) or
(item_name(state, 'Misery Mire - Big Key Chest', player) in [('Big Key (Misery Mire)', player)])) else state.has_key('Small Key (Misery Mire)', player, 3))
set_rule(world.get_location('Misery Mire - Compass Chest', player), lambda state: state.has_fire_source(player))
set_rule(world.get_location('Misery Mire - Big Key Chest', player), lambda state: state.has_fire_source(player))
set_rule(world.get_entrance('Misery Mire (Vitreous)', player), lambda state: state.has('Cane of Somaria', player))
set_defeat_dungeon_boss_rule(world.get_location('Misery Mire - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Misery Mire - Prize', player))
for location in ['Misery Mire - Big Chest', 'Misery Mire - Boss']:
forbid_item(world.get_location(location, player), 'Big Key (Misery Mire)', player)
set_rule(world.get_entrance('Turtle Rock Entrance Gap', player), lambda state: state.has('Cane of Somaria', player))
set_rule(world.get_entrance('Turtle Rock Entrance Gap Reverse', player), lambda state: state.has('Cane of Somaria', player))
set_rule(world.get_location('Turtle Rock - Compass Chest', player), lambda state: state.has('Cane of Somaria', player)) # We could get here from the middle section without Cane as we don't cross the entrance gap!
set_rule(world.get_location('Turtle Rock - Roller Room - Left', player), lambda state: state.has('Cane of Somaria', player) and state.has('Fire Rod', player))
set_rule(world.get_location('Turtle Rock - Roller Room - Right', player), lambda state: state.has('Cane of Somaria', player) and state.has('Fire Rod', player))
set_rule(world.get_location('Turtle Rock - Big Chest', player), lambda state: state.has('Big Key (Turtle Rock)', player) and (state.has('Cane of Somaria', player) or state.has('Hookshot', player)))
set_rule(world.get_entrance('Turtle Rock (Big Chest) (North)', player), lambda state: state.has('Cane of Somaria', player) or state.has('Hookshot', player))
set_rule(world.get_entrance('Turtle Rock Big Key Door', player), lambda state: state.has('Big Key (Turtle Rock)', player))
set_rule(world.get_entrance('Turtle Rock (Dark Room) (North)', player), lambda state: state.has('Cane of Somaria', player))
set_rule(world.get_entrance('Turtle Rock (Dark Room) (South)', player), lambda state: state.has('Cane of Somaria', player))
set_rule(world.get_location('Turtle Rock - Eye Bridge - Bottom Left', player), lambda state: state.has('Cane of Byrna', player) or state.has('Cape', player) or state.has('Mirror Shield', player))
set_rule(world.get_location('Turtle Rock - Eye Bridge - Bottom Right', player), lambda state: state.has('Cane of Byrna', player) or state.has('Cape', player) or state.has('Mirror Shield', player))
set_rule(world.get_location('Turtle Rock - Eye Bridge - Top Left', player), lambda state: state.has('Cane of Byrna', player) or state.has('Cape', player) or state.has('Mirror Shield', player))
set_rule(world.get_location('Turtle Rock - Eye Bridge - Top Right', player), lambda state: state.has('Cane of Byrna', player) or state.has('Cape', player) or state.has('Mirror Shield', player))
set_rule(world.get_entrance('Turtle Rock (Trinexx)', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 4) and state.has('Big Key (Turtle Rock)', player) and state.has('Cane of Somaria', player))
set_defeat_dungeon_boss_rule(world.get_location('Turtle Rock - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Turtle Rock - Prize', player))
set_rule(world.get_entrance('Palace of Darkness Bonk Wall', player), lambda state: state.can_shoot_arrows(player))
set_rule(world.get_entrance('Palace of Darkness Hammer Peg Drop', player), lambda state: state.has('Hammer', player))
set_rule(world.get_entrance('Palace of Darkness Bridge Room', player), lambda state: state.has_key('Small Key (Palace of Darkness)', player, 1)) # If we can reach any other small key door, we already have back door access to this area
set_rule(world.get_entrance('Palace of Darkness Big Key Door', player), lambda state: state.has_key('Small Key (Palace of Darkness)', player, 6) and state.has('Big Key (Palace of Darkness)', player) and state.can_shoot_arrows(player) and state.has('Hammer', player))
set_rule(world.get_entrance('Palace of Darkness (North)', player), lambda state: state.has_key('Small Key (Palace of Darkness)', player, 4))
set_rule(world.get_location('Palace of Darkness - Big Chest', player), lambda state: state.has('Big Key (Palace of Darkness)', player))
set_rule(world.get_entrance('Palace of Darkness Big Key Chest Staircase', player), lambda state: state.has_key('Small Key (Palace of Darkness)', player, 6) or (item_name(state, 'Palace of Darkness - Big Key Chest', player) in [('Small Key (Palace of Darkness)', player)] and state.has_key('Small Key (Palace of Darkness)', player, 3)))
if world.accessibility[player] != 'locations':
set_always_allow(world.get_location('Palace of Darkness - Big Key Chest', player), lambda state, item: item.name == 'Small Key (Palace of Darkness)' and item.player == player and state.has_key('Small Key (Palace of Darkness)', player, 5))
else:
forbid_item(world.get_location('Palace of Darkness - Big Key Chest', player), 'Small Key (Palace of Darkness)', player)
set_rule(world.get_entrance('Palace of Darkness Spike Statue Room Door', player), lambda state: state.has_key('Small Key (Palace of Darkness)', player, 6) or (item_name(state, 'Palace of Darkness - Harmless Hellway', player) in [('Small Key (Palace of Darkness)', player)] and state.has_key('Small Key (Palace of Darkness)', player, 4)))
if world.accessibility[player] != 'locations':
set_always_allow(world.get_location('Palace of Darkness - Harmless Hellway', player), lambda state, item: item.name == 'Small Key (Palace of Darkness)' and item.player == player and state.has_key('Small Key (Palace of Darkness)', player, 5))
else:
forbid_item(world.get_location('Palace of Darkness - Harmless Hellway', player), 'Small Key (Palace of Darkness)', player)
set_rule(world.get_entrance('Palace of Darkness Maze Door', player), lambda state: state.has_key('Small Key (Palace of Darkness)', player, 6))
set_defeat_dungeon_boss_rule(world.get_location('Palace of Darkness - Boss', player))
set_defeat_dungeon_boss_rule(world.get_location('Palace of Darkness - Prize', player))
# these key rules are conservative, you might be able to get away with more lenient rules
randomizer_room_chests = ['Ganons Tower - Randomizer Room - Top Left', 'Ganons Tower - Randomizer Room - Top Right', 'Ganons Tower - Randomizer Room - Bottom Left', 'Ganons Tower - Randomizer Room - Bottom Right']
compass_room_chests = ['Ganons Tower - Compass Room - Top Left', 'Ganons Tower - Compass Room - Top Right', 'Ganons Tower - Compass Room - Bottom Left', 'Ganons Tower - Compass Room - Bottom Right']
set_rule(world.get_location('Ganons Tower - Bob\'s Torch', player), lambda state: state.has_Boots(player))
set_rule(world.get_entrance('Ganons Tower (Tile Room)', player), lambda state: state.has('Cane of Somaria', player))
set_rule(world.get_entrance('Ganons Tower (Hookshot Room)', player), lambda state: state.has('Hammer', player) and (state.has('Hookshot', player) or state.has_Boots(player)))
set_rule(world.get_entrance('Ganons Tower (Map Room)', player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 4) or (item_name(state, 'Ganons Tower - Map Chest', player) in [('Big Key (Ganons Tower)', player), ('Small Key (Ganons Tower)', player)] and state.has_key('Small Key (Ganons Tower)', player, 3)))
if world.accessibility[player] != 'locations':
    set_always_allow(world.get_location('Ganons Tower - Map Chest', player), lambda state, item: item.name == 'Small Key (Ganons Tower)' and item.player == player and state.has_key('Small Key (Ganons Tower)', player, 3))
else:
    forbid_item(world.get_location('Ganons Tower - Map Chest', player), 'Small Key (Ganons Tower)', player)
# It is possible to need more than 2 keys to get through this entrance if you spend keys elsewhere. We reflect this in the chest requirements.
# However we need to leave these at the lower values to derive that with 3 keys it is always possible to reach Bob and Ice Armos.
set_rule(world.get_entrance('Ganons Tower (Double Switch Room)', player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 2))
# It is possible to need more than 3 keys to get through this entrance if you spend keys elsewhere, as above.
set_rule(world.get_entrance('Ganons Tower (Firesnake Room)', player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 3))
# The actual requirements for these rooms, to avoid key-lock
set_rule(world.get_location('Ganons Tower - Firesnake Room', player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 3) or ((item_in_locations(state, 'Big Key (Ganons Tower)', player, zip(randomizer_room_chests, [player] * len(randomizer_room_chests))) or item_in_locations(state, 'Small Key (Ganons Tower)', player, [('Ganons Tower - Firesnake Room', player)])) and state.has_key('Small Key (Ganons Tower)', player, 2)))
for location in randomizer_room_chests:
    set_rule(world.get_location(location, player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 4) or (item_in_locations(state, 'Big Key (Ganons Tower)', player, zip(randomizer_room_chests, [player] * len(randomizer_room_chests))) and state.has_key('Small Key (Ganons Tower)', player, 3)))
# Once again it is possible to need more than 3 keys...
set_rule(world.get_entrance('Ganons Tower (Tile Room) Key Door', player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 3) and state.has('Fire Rod', player))
# Actual requirements
for location in compass_room_chests:
    set_rule(world.get_location(location, player), lambda state: state.has('Fire Rod', player) and (state.has_key('Small Key (Ganons Tower)', player, 4) or (item_in_locations(state, 'Big Key (Ganons Tower)', player, zip(compass_room_chests, [player] * len(compass_room_chests))) and state.has_key('Small Key (Ganons Tower)', player, 3))))
set_rule(world.get_location('Ganons Tower - Big Chest', player), lambda state: state.has('Big Key (Ganons Tower)', player))
set_rule(world.get_location('Ganons Tower - Big Key Room - Left', player), lambda state: state.world.get_location('Ganons Tower - Big Key Room - Left', player).parent_region.dungeon.bosses['bottom'].can_defeat(state))
set_rule(world.get_location('Ganons Tower - Big Key Chest', player), lambda state: state.world.get_location('Ganons Tower - Big Key Chest', player).parent_region.dungeon.bosses['bottom'].can_defeat(state))
set_rule(world.get_location('Ganons Tower - Big Key Room - Right', player), lambda state: state.world.get_location('Ganons Tower - Big Key Room - Right', player).parent_region.dungeon.bosses['bottom'].can_defeat(state))
set_rule(world.get_entrance('Ganons Tower Big Key Door', player), lambda state: state.has('Big Key (Ganons Tower)', player) and state.can_shoot_arrows(player))
set_rule(world.get_entrance('Ganons Tower Torch Rooms', player), lambda state: state.has_fire_source(player) and state.world.get_entrance('Ganons Tower Torch Rooms', player).parent_region.dungeon.bosses['middle'].can_defeat(state))
set_rule(world.get_location('Ganons Tower - Pre-Moldorm Chest', player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 3))
set_rule(world.get_entrance('Ganons Tower Moldorm Door', player), lambda state: state.has_key('Small Key (Ganons Tower)', player, 4))
set_rule(world.get_entrance('Ganons Tower Moldorm Gap', player), lambda state: state.has('Hookshot', player) and state.world.get_entrance('Ganons Tower Moldorm Gap', player).parent_region.dungeon.bosses['top'].can_defeat(state))
set_defeat_dungeon_boss_rule(world.get_location('Agahnim 2', player))
for location in ['Ganons Tower - Big Chest', 'Ganons Tower - Mini Helmasaur Room - Left', 'Ganons Tower - Mini Helmasaur Room - Right',
                 'Ganons Tower - Pre-Moldorm Chest', 'Ganons Tower - Validation Chest']:
    forbid_item(world.get_location(location, player), 'Big Key (Ganons Tower)', player)
if world.goal[player] in ['ganontriforcehunt', 'localganontriforcehunt']:
    set_rule(world.get_location('Ganon', player), lambda state: state.has_beam_sword(player) and state.has_fire_source(player) and state.has_triforce_pieces(world.treasure_hunt_count[player], player)
             and (state.has('Tempered Sword', player) or state.has('Golden Sword', player) or (state.has('Silver Bow', player) and state.can_shoot_arrows(player)) or state.has('Lamp', player) or state.can_extend_magic(player, 12)))  # need to light the torches a sufficient number of times
else:
    set_rule(world.get_location('Ganon', player), lambda state: state.has_beam_sword(player) and state.has_fire_source(player) and state.has_crystals(world.crystals_needed_for_ganon[player], player)
             and (state.has('Tempered Sword', player) or state.has('Golden Sword', player) or (state.has('Silver Bow', player) and state.can_shoot_arrows(player)) or state.has('Lamp', player) or state.can_extend_magic(player, 12)))  # need to light the torches a sufficient number of times
set_rule(world.get_entrance('Ganon Drop', player), lambda state: state.has_beam_sword(player)) # need to damage ganon to get tiles to drop
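# The rules above all follow one pattern: attach a predicate over the
# collection state to an entrance or location. A minimal self-contained
# sketch of that pattern follows; `State`, `Entrance`, and this local
# `set_rule` are simplified stand-ins for illustration, not the
# randomizer's real classes.

```python
from collections import Counter


class State:
    """Tracks collected items; has() mirrors the key-counting checks above."""

    def __init__(self, items=()):
        self.items = Counter(items)

    def has(self, item, count=1):
        return self.items[item] >= count


class Entrance:
    def __init__(self, name):
        self.name = name
        self.rule = lambda state: True  # default: always passable


def set_rule(entrance, rule):
    entrance.rule = rule


door = Entrance('Ganons Tower Moldorm Door')
# Mirrors: state.has_key('Small Key (Ganons Tower)', player, 4)
set_rule(door, lambda state: state.has('Small Key (Ganons Tower)', 4))

assert not door.rule(State(['Small Key (Ganons Tower)'] * 3))
assert door.rule(State(['Small Key (Ganons Tower)'] * 4))
```

Because the rule is just a closure over the state, rules compose with plain `and`/`or`, which is exactly how the longer lambdas above are built.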
def default_rules(world, player):
# overworld requirements
set_rule(world.get_entrance('Kings Grave', player), lambda state: state.has_Boots(player))
set_rule(world.get_entrance('Kings Grave Outer Rocks', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Kings Grave Inner Rocks', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Kings Grave Mirror Spot', player), lambda state: state.has_Pearl(player) and state.has_Mirror(player))
# Caution: if King's Grave is ever relaxed to account for reaching it via a two-way cave's exit in insanity mode, the bomb shop logic will need to be updated (that would involve creating a small ledge-like Region for it)
set_rule(world.get_entrance('Bonk Fairy (Light)', player), lambda state: state.has_Boots(player))
set_rule(world.get_entrance('Lumberjack Tree Tree', player), lambda state: state.has_Boots(player) and state.has('Beat Agahnim 1', player))
set_rule(world.get_entrance('Bonk Rock Cave', player), lambda state: state.has_Boots(player))
set_rule(world.get_entrance('Desert Palace Stairs', player), lambda state: state.has('Book of Mudora', player))
set_rule(world.get_entrance('Sanctuary Grave', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('20 Rupee Cave', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('50 Rupee Cave', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Death Mountain Entrance Rock', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Bumper Cave Entrance Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Flute Spot 1', player), lambda state: state.has('Flute', player))
set_rule(world.get_entrance('Lake Hylia Central Island Teleporter', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Dark Desert Teleporter', player), lambda state: state.has('Flute', player) and state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('East Hyrule Teleporter', player), lambda state: state.has('Hammer', player) and state.can_lift_rocks(player) and state.has_Pearl(player)) # bunny cannot use hammer
set_rule(world.get_entrance('South Hyrule Teleporter', player), lambda state: state.has('Hammer', player) and state.can_lift_rocks(player) and state.has_Pearl(player)) # bunny cannot use hammer
set_rule(world.get_entrance('Kakariko Teleporter', player), lambda state: ((state.has('Hammer', player) and state.can_lift_rocks(player)) or state.can_lift_heavy_rocks(player)) and state.has_Pearl(player)) # bunny cannot lift bushes
set_rule(world.get_location('Flute Spot', player), lambda state: state.has('Shovel', player))
set_rule(world.get_entrance('Bat Cave Drop Ledge', player), lambda state: state.has('Hammer', player))
set_rule(world.get_location('Zora\'s Ledge', player), lambda state: state.has('Flippers', player))
set_rule(world.get_entrance('Waterfall of Wishing', player), lambda state: state.has('Flippers', player))
set_rule(world.get_location('Frog', player), lambda state: state.can_lift_heavy_rocks(player)) # will get automatic moon pearl requirement
set_rule(world.get_location('Potion Shop', player), lambda state: state.has('Mushroom', player))
set_rule(world.get_entrance('Desert Palace Entrance (North) Rocks', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Desert Ledge Return Rocks', player), lambda state: state.can_lift_rocks(player)) # in case something that is not a dungeon ends up there at some point
set_rule(world.get_entrance('Checkerboard Cave', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Agahnims Tower', player), lambda state: state.has('Cape', player) or state.has_beam_sword(player) or state.has('Beat Agahnim 1', player)) # barrier gets removed after killing agahnim, relevant for entrance shuffle
set_rule(world.get_entrance('Top of Pyramid', player), lambda state: state.has('Beat Agahnim 1', player))
set_rule(world.get_entrance('Old Man Cave Exit (West)', player), lambda state: False) # drop cannot be climbed up
set_rule(world.get_entrance('Broken Bridge (West)', player), lambda state: state.has('Hookshot', player))
set_rule(world.get_entrance('Broken Bridge (East)', player), lambda state: state.has('Hookshot', player))
set_rule(world.get_entrance('East Death Mountain Teleporter', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Fairy Ascension Rocks', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Paradox Cave Push Block Reverse', player), lambda state: state.has('Mirror', player)) # can erase block
set_rule(world.get_entrance('Death Mountain (Top)', player), lambda state: state.has('Hammer', player))
set_rule(world.get_entrance('Turtle Rock Teleporter', player), lambda state: state.can_lift_heavy_rocks(player) and state.has('Hammer', player))
set_rule(world.get_entrance('East Death Mountain (Top)', player), lambda state: state.has('Hammer', player))
set_rule(world.get_entrance('Catfish Exit Rock', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Catfish Entrance Rock', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Northeast Dark World Broken Bridge Pass', player), lambda state: state.has_Pearl(player) and (state.can_lift_rocks(player) or state.has('Hammer', player) or state.has('Flippers', player)))
set_rule(world.get_entrance('East Dark World Broken Bridge Pass', player), lambda state: state.has_Pearl(player) and (state.can_lift_rocks(player) or state.has('Hammer', player)))
set_rule(world.get_entrance('South Dark World Bridge', player), lambda state: state.has('Hammer', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Bonk Fairy (Dark)', player), lambda state: state.has_Pearl(player) and state.has_Boots(player))
set_rule(world.get_entrance('West Dark World Gap', player), lambda state: state.has_Pearl(player) and state.has('Hookshot', player))
set_rule(world.get_entrance('Palace of Darkness', player), lambda state: state.has_Pearl(player)) # kiki needs pearl
set_rule(world.get_entrance('Hyrule Castle Ledge Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Hyrule Castle Main Gate', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Dark Lake Hylia Drop (East)', player), lambda state: (state.has_Pearl(player) and state.has('Flippers', player)) or state.has_Mirror(player)) # Overworld Bunny Revival
set_rule(world.get_location('Bombos Tablet', player), lambda state: state.has('Book of Mudora', player) and state.has_beam_sword(player))
set_rule(world.get_entrance('Dark Lake Hylia Drop (South)', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player)) # ToDo any fake flipper set up?
set_rule(world.get_entrance('Dark Lake Hylia Ledge Fairy', player), lambda state: state.has_Pearl(player)) # bomb required
set_rule(world.get_entrance('Dark Lake Hylia Ledge Spike Cave', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Dark Lake Hylia Teleporter', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Village of Outcasts Heavy Rock', player), lambda state: state.has_Pearl(player) and state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Hype Cave', player), lambda state: state.has_Pearl(player)) # bomb required
set_rule(world.get_entrance('Brewery', player), lambda state: state.has_Pearl(player)) # bomb required
set_rule(world.get_entrance('Thieves Town', player), lambda state: state.has_Pearl(player)) # bunny cannot pull
set_rule(world.get_entrance('Skull Woods First Section Hole (North)', player), lambda state: state.has_Pearl(player)) # bunny cannot lift bush
set_rule(world.get_entrance('Skull Woods Second Section Hole', player), lambda state: state.has_Pearl(player)) # bunny cannot lift bush
set_rule(world.get_entrance('Maze Race Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Cave 45 Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Bombos Tablet Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('East Dark World Bridge', player), lambda state: state.has_Pearl(player) and state.has('Hammer', player))
set_rule(world.get_entrance('Lake Hylia Island Mirror Spot', player), lambda state: state.has_Pearl(player) and state.has_Mirror(player) and state.has('Flippers', player))
set_rule(world.get_entrance('Lake Hylia Central Island Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('East Dark World River Pier', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player)) # ToDo any fake flipper set up?
set_rule(world.get_entrance('Graveyard Ledge Mirror Spot', player), lambda state: state.has_Pearl(player) and state.has_Mirror(player))
set_rule(world.get_entrance('Bumper Cave Entrance Rock', player), lambda state: state.has_Pearl(player) and state.can_lift_rocks(player))
set_rule(world.get_entrance('Bumper Cave Ledge Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Bat Cave Drop Ledge Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Dark World Hammer Peg Cave', player), lambda state: state.has_Pearl(player) and state.has('Hammer', player))
set_rule(world.get_entrance('Village of Outcasts Eastern Rocks', player), lambda state: state.has_Pearl(player) and state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Peg Area Rocks', player), lambda state: state.has_Pearl(player) and state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Village of Outcasts Pegs', player), lambda state: state.has_Pearl(player) and state.has('Hammer', player))
set_rule(world.get_entrance('Grassy Lawn Pegs', player), lambda state: state.has_Pearl(player) and state.has('Hammer', player))
set_rule(world.get_entrance('Bumper Cave Exit (Top)', player), lambda state: state.has('Cape', player))
set_rule(world.get_entrance('Bumper Cave Exit (Bottom)', player), lambda state: state.has('Cape', player) or state.has('Hookshot', player))
set_rule(world.get_entrance('Skull Woods Final Section', player), lambda state: state.has('Fire Rod', player) and state.has_Pearl(player)) # bunny cannot use fire rod
set_rule(world.get_entrance('Misery Mire', player), lambda state: state.has_Pearl(player) and state.has_sword(player) and state.has_misery_mire_medallion(player)) # sword required to cast magic (!)
set_rule(world.get_entrance('Desert Ledge (Northeast) Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Desert Ledge Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Desert Palace Stairs Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Desert Palace Entrance (North) Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Spectacle Rock Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Hookshot Cave', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('East Death Mountain (Top) Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Mimic Cave Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Spiral Cave Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Fairy Ascension Mirror Spot', player), lambda state: state.has_Mirror(player) and state.has_Pearl(player)) # need to lift flowers
set_rule(world.get_entrance('Isolated Ledge Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Superbunny Cave Exit (Bottom)', player), lambda state: False) # Cannot get to bottom exit from top. Just exists for shuffling
set_rule(world.get_entrance('Floating Island Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Turtle Rock', player), lambda state: state.has_Pearl(player) and state.has_sword(player) and state.has_turtle_rock_medallion(player) and state.can_reach('Turtle Rock (Top)', 'Region', player)) # sword required to cast magic (!)
set_rule(world.get_entrance('Pyramid Hole', player), lambda state: state.has('Beat Agahnim 2', player) or world.open_pyramid[player])
if world.swords[player] == 'swordless':
    swordless_rules(world, player)
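# The set_always_allow / forbid_item pairs above handle chests that could
# contain their own small key. A hedged sketch of that idea follows; the
# names (`Location`, `forbid_item`, `set_always_allow`, `can_fill`) are
# simplified stand-ins for illustration, not the randomizer's actual fill API.

```python
class Location:
    def __init__(self, name):
        self.name = name
        self.forbidden = set()
        # By default nothing gets special permission to be placed here.
        self.always_allow = lambda state, item: False


def forbid_item(location, item):
    location.forbidden.add(item)


def set_always_allow(location, allow):
    location.always_allow = allow


def can_fill(location, item, state):
    # The always-allow override takes precedence over the forbid list.
    if location.always_allow(state, item):
        return True
    return item not in location.forbidden


hellway = Location('Palace of Darkness - Harmless Hellway')

# 'locations' accessibility: simply never place the dungeon's key here.
forbid_item(hellway, 'Small Key (Palace of Darkness)')
assert not can_fill(hellway, 'Small Key (Palace of Darkness)', None)
assert can_fill(hellway, 'Hammer', None)

# Other accessibility modes: allow the chest to hold its own key, since a
# player who can open it no longer needs the key it contains.
set_always_allow(hellway, lambda state, item: item == 'Small Key (Palace of Darkness)')
assert can_fill(hellway, 'Small Key (Palace of Darkness)', None)
```

This mirrors the two branches of the `world.accessibility[player]` checks above: forbidding is the conservative choice, while the override keeps more fill placements legal.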
def inverted_rules(world, player):
# S&Q regions.
set_rule(world.get_entrance('Castle Ledge S&Q', player), lambda state: state.has_Mirror(player) and state.has('Beat Agahnim 1', player))
# overworld requirements
set_rule(world.get_location('Maze Race', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Mini Moldorm Cave', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Ice Rod Cave', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Light Hype Fairy', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Potion Shop Pier', player), lambda state: state.has('Flippers', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Light World Pier', player), lambda state: state.has('Flippers', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Kings Grave', player), lambda state: state.has_Boots(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Kings Grave Outer Rocks', player), lambda state: state.can_lift_heavy_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Kings Grave Inner Rocks', player), lambda state: state.can_lift_heavy_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Potion Shop Inner Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Potion Shop Outer Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Potion Shop Outer Rock', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Potion Shop Inner Rock', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Graveyard Cave Inner Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Graveyard Cave Outer Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Secret Passage Inner Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Secret Passage Outer Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Bonk Fairy (Light)', player), lambda state: state.has_Boots(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Bat Cave Drop Ledge', player), lambda state: state.has('Hammer', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Lumberjack Tree Tree', player), lambda state: state.has_Boots(player) and state.has_Pearl(player) and state.has('Beat Agahnim 1', player))
set_rule(world.get_entrance('Bonk Rock Cave', player), lambda state: state.has_Boots(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Desert Palace Stairs', player), lambda state: state.has('Book of Mudora', player)) # bunny can use book
set_rule(world.get_entrance('Sanctuary Grave', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('20 Rupee Cave', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('50 Rupee Cave', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Death Mountain Entrance Rock', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Bumper Cave Entrance Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Lake Hylia Central Island Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Dark Lake Hylia Central Island Teleporter', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Dark Desert Teleporter', player), lambda state: state.can_flute(player) and state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('East Dark World Teleporter', player), lambda state: state.has('Hammer', player) and state.can_lift_rocks(player) and state.has_Pearl(player)) # bunny cannot use hammer
set_rule(world.get_entrance('South Dark World Teleporter', player), lambda state: state.has('Hammer', player) and state.can_lift_rocks(player) and state.has_Pearl(player)) # bunny cannot use hammer
set_rule(world.get_entrance('West Dark World Teleporter', player), lambda state: ((state.has('Hammer', player) and state.can_lift_rocks(player)) or state.can_lift_heavy_rocks(player)) and state.has_Pearl(player))
set_rule(world.get_location('Flute Spot', player), lambda state: state.has('Shovel', player) and state.has_Pearl(player))
set_rule(world.get_location('Zora\'s Ledge', player), lambda state: state.has('Flippers', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Waterfall of Wishing Cave', player), lambda state: state.has('Flippers', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Northeast Light World Return', player), lambda state: state.has('Flippers', player) and state.has_Pearl(player))
set_rule(world.get_location('Frog', player), lambda state: state.can_lift_heavy_rocks(player) and (state.has_Pearl(player) or state.has('Beat Agahnim 1', player)) or (state.can_reach('Light World', 'Region', player) and state.has_Mirror(player))) # Need LW access using Mirror or Portal
set_rule(world.get_location('Missing Smith', player), lambda state: state.has('Get Frog', player) and state.can_reach('Blacksmiths Hut', 'Region', player)) # Can't S&Q with smith
set_rule(world.get_location('Blacksmith', player), lambda state: state.has('Return Smith', player))
set_rule(world.get_location('Magic Bat', player), lambda state: state.has('Magic Powder', player) and state.has_Pearl(player))
set_rule(world.get_location('Sick Kid', player), lambda state: state.has_bottle(player))
set_rule(world.get_location('Mushroom', player), lambda state: state.has_Pearl(player)) # need pearl to pick up bushes
set_rule(world.get_entrance('Bush Covered Lawn Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('Bush Covered Lawn Inner Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Bush Covered Lawn Outer Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Bomb Hut Inner Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Bomb Hut Outer Bushes', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Light World Bomb Hut', player), lambda state: state.has_Pearl(player)) # need bomb
set_rule(world.get_entrance('North Fairy Cave Drop', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Lost Woods Hideout Drop', player), lambda state: state.has_Pearl(player))
set_rule(world.get_location('Potion Shop', player), lambda state: state.has('Mushroom', player) and (state.can_reach('Potion Shop Area', 'Region', player))) # new inverted region, need pearl for bushes or access to potion shop door/waterfall fairy
set_rule(world.get_entrance('Desert Palace Entrance (North) Rocks', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Desert Ledge Return Rocks', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player)) # in case something that is not a dungeon ends up there at some point
set_rule(world.get_entrance('Checkerboard Cave', player), lambda state: state.can_lift_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Hyrule Castle Secret Entrance Drop', player), lambda state: state.has_Pearl(player))
set_rule(world.get_entrance('Old Man Cave Exit (West)', player), lambda state: False) # drop cannot be climbed up
set_rule(world.get_entrance('Broken Bridge (West)', player), lambda state: state.has('Hookshot', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Broken Bridge (East)', player), lambda state: state.has('Hookshot', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Dark Death Mountain Teleporter (East Bottom)', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('Fairy Ascension Rocks', player), lambda state: state.can_lift_heavy_rocks(player) and state.has_Pearl(player))
set_rule(world.get_entrance('Paradox Cave Push Block Reverse', player), lambda state: state.has('Mirror', player)) # can erase block
set_rule(world.get_entrance('Death Mountain (Top)', player), lambda state: state.has('Hammer', player) and state.has_Pearl(player))
set_rule(world.get_entrance('Dark Death Mountain Teleporter (East)', player), lambda state: state.can_lift_heavy_rocks(player) and state.has('Hammer', player) and state.has_Pearl(player)) # bunny cannot use hammer
set_rule(world.get_entrance('East Death Mountain (Top)', player), lambda state: state.has('Hammer', player) and state.has_Pearl(player)) # bunny cannot use hammer
set_rule(world.get_entrance('Catfish Entrance Rock', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Northeast Dark World Broken Bridge Pass', player), lambda state: ((state.can_lift_rocks(player) or state.has('Hammer', player)) or state.has('Flippers', player)))
set_rule(world.get_entrance('East Dark World Broken Bridge Pass', player), lambda state: (state.can_lift_rocks(player) or state.has('Hammer', player)))
set_rule(world.get_entrance('South Dark World Bridge', player), lambda state: state.has('Hammer', player))
set_rule(world.get_entrance('Bonk Fairy (Dark)', player), lambda state: state.has_Boots(player))
set_rule(world.get_entrance('West Dark World Gap', player), lambda state: state.has('Hookshot', player))
set_rule(world.get_entrance('Dark Lake Hylia Drop (East)', player), lambda state: state.has('Flippers', player))
set_rule(world.get_location('Bombos Tablet', player), lambda state: state.has('Book of Mudora', player) and state.has_beam_sword(player))
set_rule(world.get_entrance('Dark Lake Hylia Drop (South)', player), lambda state: state.has('Flippers', player)) # ToDo any fake flipper set up?
set_rule(world.get_entrance('Dark Lake Hylia Ledge Pier', player), lambda state: state.has('Flippers', player))
set_rule(world.get_entrance('Dark Lake Hylia Ledge Spike Cave', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Dark Lake Hylia Teleporter', player), lambda state: state.has('Flippers', player)) # Fake Flippers
set_rule(world.get_entrance('Dark Lake Hylia Shallows', player), lambda state: state.has('Flippers', player))
set_rule(world.get_entrance('Village of Outcasts Heavy Rock', player), lambda state: state.can_lift_heavy_rocks(player))
set_rule(world.get_entrance('East Dark World Bridge', player), lambda state: state.has('Hammer', player))
set_rule(world.get_entrance('Lake Hylia Central Island Mirror Spot', player), lambda state: state.has_Mirror(player))
set_rule(world.get_entrance('East Dark World River Pier', player), lambda state: state.has('Flippers', player)) # ToDo any fake flipper set up? (Qirn Jump)
set_rule(world.get_entrance('Bumper Cave Entrance Rock', player), lambda state: state.can_lift_rocks(player))
set_rule(world.get_entrance('Bumper Cave Ledge Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Hammer Peg Area Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Dark World Hammer Peg Cave', player), lambda state: state.has('Hammer', player))
    set_rule(world.get_entrance('Village of Outcasts Eastern Rocks', player), lambda state: state.can_lift_heavy_rocks(player))
    set_rule(world.get_entrance('Peg Area Rocks', player), lambda state: state.can_lift_heavy_rocks(player))
    set_rule(world.get_entrance('Village of Outcasts Pegs', player), lambda state: state.has('Hammer', player))
    set_rule(world.get_entrance('Grassy Lawn Pegs', player), lambda state: state.has('Hammer', player))
    set_rule(world.get_entrance('Bumper Cave Exit (Top)', player), lambda state: state.has('Cape', player))
    set_rule(world.get_entrance('Bumper Cave Exit (Bottom)', player), lambda state: state.has('Cape', player) or state.has('Hookshot', player))
    set_rule(world.get_entrance('Skull Woods Final Section', player), lambda state: state.has('Fire Rod', player))
    set_rule(world.get_entrance('Misery Mire', player), lambda state: state.has_sword(player) and state.has_misery_mire_medallion(player))  # sword required to cast magic (!)
    set_rule(world.get_entrance('Hookshot Cave', player), lambda state: state.can_lift_rocks(player))
    set_rule(world.get_entrance('East Death Mountain Mirror Spot (Top)', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Death Mountain (Top) Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('East Death Mountain Mirror Spot (Bottom)', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Dark Death Mountain Ledge Mirror Spot (East)', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Dark Death Mountain Ledge Mirror Spot (West)', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Laser Bridge Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Floating Island Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Turtle Rock', player), lambda state: state.has_sword(player) and state.has_turtle_rock_medallion(player) and state.can_reach('Turtle Rock (Top)', 'Region', player))  # sword required to cast magic (!)
    # new inverted spots
    set_rule(world.get_entrance('Post Aga Teleporter', player), lambda state: state.has('Beat Agahnim 1', player))
    set_rule(world.get_entrance('Mire Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Desert Palace Stairs Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Death Mountain Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('East Dark World Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('West Dark World Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('South Dark World Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Catfish Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Potion Shop Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Shopping Mall Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Maze Race Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Desert Palace North Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Death Mountain (Top) Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Graveyard Cave Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Bomb Hut Mirror Spot', player), lambda state: state.has_Mirror(player))
    set_rule(world.get_entrance('Skull Woods Mirror Spot', player), lambda state: state.has_Mirror(player))
    # inverted flute spots
    set_rule(world.get_entrance('DDM Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('NEDW Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('WDW Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('SDW Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('EDW Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('DLHL Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('DD Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('EDDM Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('Dark Grassy Lawn Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('Hammer Peg Area Flute', player), lambda state: state.can_flute(player))
    set_rule(world.get_entrance('Inverted Pyramid Hole', player), lambda state: state.has('Beat Agahnim 2', player) or world.open_pyramid[player])

    if world.swords[player] == 'swordless':
        swordless_rules(world, player)


def no_glitches_rules(world, player):
    if world.mode[player] != 'inverted':
        set_rule(world.get_entrance('Zoras River', player), lambda state: state.has('Flippers', player) or state.can_lift_rocks(player))
        set_rule(world.get_entrance('Lake Hylia Central Island Pier', player), lambda state: state.has('Flippers', player))  # can be fake flippered to
        set_rule(world.get_entrance('Hobo Bridge', player), lambda state: state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Drop (East)', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Teleporter', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Ledge Drop', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))
    else:
        set_rule(world.get_entrance('Zoras River', player), lambda state: state.has_Pearl(player) and (state.has('Flippers', player) or state.can_lift_rocks(player)))
        set_rule(world.get_entrance('Lake Hylia Central Island Pier', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))  # can be fake flippered to
        set_rule(world.get_entrance('Lake Hylia Island Pier', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))  # can be fake flippered to
        set_rule(world.get_entrance('Lake Hylia Warp', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))  # can be fake flippered to
        set_rule(world.get_entrance('Northeast Light World Warp', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))  # can be fake flippered to
        set_rule(world.get_entrance('Hobo Bridge', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Drop (East)', player), lambda state: state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Teleporter', player), lambda state: state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Ledge Drop', player), lambda state: state.has('Flippers', player))
        set_rule(world.get_entrance('East Dark World Pier', player), lambda state: state.has('Flippers', player))
    add_rule(world.get_entrance('Ganons Tower (Double Switch Room)', player), lambda state: state.has('Hookshot', player))
    set_rule(world.get_entrance('Paradox Cave Push Block Reverse', player), lambda state: False)  # no glitches does not require block override
    forbid_bomb_jump_requirements(world, player)
    add_conditional_lamps(world, player)


def fake_flipper_rules(world, player):
    if world.mode[player] != 'inverted':
        set_rule(world.get_entrance('Zoras River', player), lambda state: True)
        set_rule(world.get_entrance('Lake Hylia Central Island Pier', player), lambda state: True)
        set_rule(world.get_entrance('Hobo Bridge', player), lambda state: True)
        set_rule(world.get_entrance('Dark Lake Hylia Drop (East)', player), lambda state: state.has_Pearl(player) and state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Teleporter', player), lambda state: state.has_Pearl(player))
        set_rule(world.get_entrance('Dark Lake Hylia Ledge Drop', player), lambda state: state.has_Pearl(player))
    else:
        set_rule(world.get_entrance('Zoras River', player), lambda state: state.has_Pearl(player))
        set_rule(world.get_entrance('Lake Hylia Central Island Pier', player), lambda state: state.has_Pearl(player))
        set_rule(world.get_entrance('Lake Hylia Island Pier', player), lambda state: state.has_Pearl(player))
        set_rule(world.get_entrance('Lake Hylia Warp', player), lambda state: state.has_Pearl(player))
        set_rule(world.get_entrance('Northeast Light World Warp', player), lambda state: state.has_Pearl(player))
        set_rule(world.get_entrance('Hobo Bridge', player), lambda state: state.has_Pearl(player))
        set_rule(world.get_entrance('Dark Lake Hylia Drop (East)', player), lambda state: state.has('Flippers', player))
        set_rule(world.get_entrance('Dark Lake Hylia Teleporter', player), lambda state: True)
        set_rule(world.get_entrance('Dark Lake Hylia Ledge Drop', player), lambda state: True)
        set_rule(world.get_entrance('East Dark World Pier', player), lambda state: True)


def forbid_bomb_jump_requirements(world, player):
    DMs_room_chests = ['Ganons Tower - DMs Room - Top Left', 'Ganons Tower - DMs Room - Top Right', 'Ganons Tower - DMs Room - Bottom Left', 'Ganons Tower - DMs Room - Bottom Right']
    for location in DMs_room_chests:
        add_rule(world.get_location(location, player), lambda state: state.has('Hookshot', player))
    set_rule(world.get_entrance('Paradox Cave Bomb Jump', player), lambda state: False)
    set_rule(world.get_entrance('Skull Woods First Section Bomb Jump', player), lambda state: False)


DW_Entrances = ['Bumper Cave (Bottom)',
                'Superbunny Cave (Top)',
                'Superbunny Cave (Bottom)',
                'Hookshot Cave',
                'Bumper Cave (Top)',
                'Hookshot Cave Back Entrance',
                'Dark Death Mountain Ledge (East)',
                'Turtle Rock Isolated Ledge Entrance',
                'Thieves Town',
                'Skull Woods Final Section',
                'Ice Palace',
                'Misery Mire',
                'Palace of Darkness',
                'Swamp Palace',
                'Turtle Rock',
                'Dark Death Mountain Ledge (West)']


def check_is_dark_world(region):
    for entrance in region.entrances:
        if entrance.name in DW_Entrances:
            return True
    return False
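

# A minimal, self-contained restatement of the check above for illustration
# (a hypothetical helper, not used by the randomizer): a region counts as
# dark world if any of its entrance names appears in the given dark-world
# entrance name collection.
def _is_dark_world_by_names(entrance_names, dw_names):
    # True as soon as one entrance name is a known dark-world entrance.
    return any(name in dw_names for name in entrance_names)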


def add_conditional_lamps(world, player):
    # Light cones in standard depend on which world we actually are in, not which one the location would normally be in.
    # We add Lamp requirements only to those locations which lie in the dark world (or to everything, depending on the light-cone settings).
    def add_conditional_lamp(spot, region, spottype='Location'):
        if spottype == 'Location':
            spot = world.get_location(spot, player)
        else:
            spot = world.get_entrance(spot, player)
        if (not world.dark_world_light_cone and check_is_dark_world(world.get_region(region, player))) or (not world.light_world_light_cone and not check_is_dark_world(world.get_region(region, player))):
            add_lamp_requirement(spot, player)
    add_conditional_lamp('Misery Mire (Vitreous)', 'Misery Mire (Entrance)', 'Entrance')
    add_conditional_lamp('Turtle Rock (Dark Room) (North)', 'Turtle Rock (Entrance)', 'Entrance')
    add_conditional_lamp('Turtle Rock (Dark Room) (South)', 'Turtle Rock (Entrance)', 'Entrance')
    add_conditional_lamp('Palace of Darkness Big Key Door', 'Palace of Darkness (Entrance)', 'Entrance')
    add_conditional_lamp('Palace of Darkness Maze Door', 'Palace of Darkness (Entrance)', 'Entrance')
    add_conditional_lamp('Palace of Darkness - Dark Basement - Left', 'Palace of Darkness (Entrance)', 'Location')
    add_conditional_lamp('Palace of Darkness - Dark Basement - Right', 'Palace of Darkness (Entrance)', 'Location')
    if world.mode[player] != 'inverted':
        add_conditional_lamp('Agahnim 1', 'Agahnims Tower', 'Entrance')
        add_conditional_lamp('Castle Tower - Dark Maze', 'Agahnims Tower', 'Location')
    else:
        add_conditional_lamp('Agahnim 1', 'Inverted Agahnims Tower', 'Entrance')
        add_conditional_lamp('Castle Tower - Dark Maze', 'Inverted Agahnims Tower', 'Location')
    add_conditional_lamp('Old Man', 'Old Man Cave', 'Location')
    add_conditional_lamp('Old Man Cave Exit (East)', 'Old Man Cave', 'Entrance')
    add_conditional_lamp('Death Mountain Return Cave Exit (East)', 'Death Mountain Return Cave', 'Entrance')
    add_conditional_lamp('Death Mountain Return Cave Exit (West)', 'Death Mountain Return Cave', 'Entrance')
    add_conditional_lamp('Old Man House Front to Back', 'Old Man House', 'Entrance')
    add_conditional_lamp('Old Man House Back to Front', 'Old Man House', 'Entrance')
    add_conditional_lamp('Eastern Palace - Big Key Chest', 'Eastern Palace', 'Location')
    add_conditional_lamp('Eastern Palace - Boss', 'Eastern Palace', 'Location')
    add_conditional_lamp('Eastern Palace - Prize', 'Eastern Palace', 'Location')

    if not world.sewer_light_cone[player]:
        add_lamp_requirement(world.get_location('Sewers - Dark Cross', player), player)
        add_lamp_requirement(world.get_entrance('Sewers Back Door', player), player)
        add_lamp_requirement(world.get_entrance('Throne Room', player), player)


def open_rules(world, player):
    # softlock protection as you can reach the sewers small key door with a guard drop key
    set_rule(world.get_location('Hyrule Castle - Boomerang Chest', player),
             lambda state: state.has_key('Small Key (Hyrule Castle)', player))
    set_rule(world.get_location('Hyrule Castle - Zelda\'s Chest', player),
             lambda state: state.has_key('Small Key (Hyrule Castle)', player))


def swordless_rules(world, player):
    set_rule(world.get_entrance('Agahnim 1', player), lambda state: (state.has('Hammer', player) or state.has('Fire Rod', player) or state.can_shoot_arrows(player) or state.has('Cane of Somaria', player)) and state.has_key('Small Key (Agahnims Tower)', player, 2))
    set_rule(world.get_location('Ether Tablet', player), lambda state: state.has('Book of Mudora', player) and state.has('Hammer', player))
    set_rule(world.get_entrance('Skull Woods Torch Room', player), lambda state: state.has_key('Small Key (Skull Woods)', player, 3) and state.has('Fire Rod', player))  # no curtain
    set_rule(world.get_entrance('Ice Palace Entrance Room', player), lambda state: state.has('Fire Rod', player) or state.has('Bombos', player))  # in swordless mode bombos pads are present in the relevant parts of ice palace

    if world.goal[player] in ['ganontriforcehunt', 'localganontriforcehunt']:
        set_rule(world.get_location('Ganon', player), lambda state: state.has('Hammer', player) and state.has_fire_source(player) and state.has('Silver Bow', player) and state.can_shoot_arrows(player) and state.has_triforce_pieces(world.treasure_hunt_count[player], player))
    else:
        set_rule(world.get_location('Ganon', player), lambda state: state.has('Hammer', player) and state.has_fire_source(player) and state.has('Silver Bow', player) and state.can_shoot_arrows(player) and state.has_crystals(world.crystals_needed_for_ganon[player], player))
    set_rule(world.get_entrance('Ganon Drop', player), lambda state: state.has('Hammer', player))  # need to damage ganon to get tiles to drop

    if world.mode[player] != 'inverted':
        set_rule(world.get_entrance('Agahnims Tower', player), lambda state: state.has('Cape', player) or state.has('Hammer', player) or state.has('Beat Agahnim 1', player))  # barrier gets removed after killing agahnim, relevant for entrance shuffle
        set_rule(world.get_entrance('Turtle Rock', player), lambda state: state.has_Pearl(player) and state.has_turtle_rock_medallion(player) and state.can_reach('Turtle Rock (Top)', 'Region', player))  # sword not required to use medallion for opening in swordless (!)
        set_rule(world.get_entrance('Misery Mire', player), lambda state: state.has_Pearl(player) and state.has_misery_mire_medallion(player))  # sword not required to use medallion for opening in swordless (!)
        set_rule(world.get_location('Bombos Tablet', player), lambda state: state.has('Book of Mudora', player) and state.has('Hammer', player) and state.has_Mirror(player))
    else:
        # only need ddm access for aga tower in inverted
        set_rule(world.get_entrance('Turtle Rock', player), lambda state: state.has_turtle_rock_medallion(player) and state.can_reach('Turtle Rock (Top)', 'Region', player))  # sword not required to use medallion for opening in swordless (!)
        set_rule(world.get_entrance('Misery Mire', player), lambda state: state.has_misery_mire_medallion(player))  # sword not required to use medallion for opening in swordless (!)
        set_rule(world.get_location('Bombos Tablet', player), lambda state: state.has('Book of Mudora', player) and state.has('Hammer', player))


def add_connection(parent_name, target_name, entrance_name, world, player):
    parent = world.get_region(parent_name, player)
    target = world.get_region(target_name, player)
    connection = Entrance(player, entrance_name, parent)
    parent.exits.append(connection)
    connection.connect(target)
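

# Hypothetical sketch (illustration only, not used by the randomizer) of the
# one-way wiring add_connection performs, with plain dicts standing in for
# the Region/Entrance objects: the parent region gains an exit, and the exit
# records both of its endpoints.
def _connect_regions(parent, target, entrance_name):
    connection = {'name': entrance_name, 'parent': parent, 'target': target}
    parent['exits'].append(connection)
    return connection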


def standard_rules(world, player):
    add_connection('Menu', 'Hyrule Castle Secret Entrance', 'Uncle S&Q', world, player)
    world.get_entrance('Uncle S&Q', player).hide_path = True
    set_rule(world.get_entrance('Hyrule Castle Exit (East)', player), lambda state: state.can_reach('Sanctuary', 'Region', player))
    set_rule(world.get_entrance('Hyrule Castle Exit (West)', player), lambda state: state.can_reach('Sanctuary', 'Region', player))
    set_rule(world.get_entrance('Links House S&Q', player), lambda state: state.can_reach('Sanctuary', 'Region', player))
    set_rule(world.get_entrance('Sanctuary S&Q', player), lambda state: state.can_reach('Sanctuary', 'Region', player))


def set_trock_key_rules(world, player):
    # First, set all relevant locked doors to impassable.
    for entrance in ['Turtle Rock Dark Room Staircase', 'Turtle Rock (Chain Chomp Room) (North)', 'Turtle Rock (Chain Chomp Room) (South)', 'Turtle Rock Pokey Room']:
        set_rule(world.get_entrance(entrance, player), lambda state: False)
    all_state = world.get_all_state(True)

    # Check if each of the four main regions of the dungeon can be reached. The previous code section prevents key-costing moves within the dungeon.
    can_reach_back = all_state.can_reach(world.get_region('Turtle Rock (Eye Bridge)', player)) if world.can_access_trock_eyebridge[player] is None else world.can_access_trock_eyebridge[player]
    world.can_access_trock_eyebridge[player] = can_reach_back
    can_reach_front = all_state.can_reach(world.get_region('Turtle Rock (Entrance)', player)) if world.can_access_trock_front[player] is None else world.can_access_trock_front[player]
    world.can_access_trock_front[player] = can_reach_front
    can_reach_big_chest = all_state.can_reach(world.get_region('Turtle Rock (Big Chest)', player)) if world.can_access_trock_big_chest[player] is None else world.can_access_trock_big_chest[player]
    world.can_access_trock_big_chest[player] = can_reach_big_chest
    can_reach_middle = all_state.can_reach(world.get_region('Turtle Rock (Second Section)', player)) if world.can_access_trock_middle[player] is None else world.can_access_trock_middle[player]
    world.can_access_trock_middle[player] = can_reach_middle

    # The following represent the common key rules.
    # No matter what, the key requirement for going from the middle to the bottom should be three keys.
    set_rule(world.get_entrance('Turtle Rock Dark Room Staircase', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 3))

    # No matter what, the Big Key cannot be in the Big Chest or held by Trinexx.
    non_big_key_locations = ['Turtle Rock - Big Chest', 'Turtle Rock - Boss']

    # Now we need to set rules based on which entrances we have access to. The most important point is whether we have back access. If we have back access, we
    # might open all the locked doors in any order so we need maximally restrictive rules.
    if can_reach_back:
        set_rule(world.get_location('Turtle Rock - Big Key Chest', player), lambda state: (state.has_key('Small Key (Turtle Rock)', player, 4) or item_name(state, 'Turtle Rock - Big Key Chest', player) == ('Small Key (Turtle Rock)', player)))
        set_rule(world.get_entrance('Turtle Rock (Chain Chomp Room) (South)', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 4))
        # Only consider wasting the key on the Trinexx door for going from the front entrance to middle section. If other key doors are accessible, then these doors can be avoided
        set_rule(world.get_entrance('Turtle Rock (Chain Chomp Room) (North)', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 3))
        set_rule(world.get_entrance('Turtle Rock Pokey Room', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 2))
    else:
        # Middle to front requires 2 keys if the back is locked, otherwise 4
        set_rule(world.get_entrance('Turtle Rock (Chain Chomp Room) (South)', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 2)
                 if item_in_locations(state, 'Big Key (Turtle Rock)', player, [('Turtle Rock - Compass Chest', player), ('Turtle Rock - Roller Room - Left', player), ('Turtle Rock - Roller Room - Right', player)])
                 else state.has_key('Small Key (Turtle Rock)', player, 4))
        # Front to middle requires 2 keys (if the middle is accessible then these doors can be avoided, otherwise no keys can be wasted)
        set_rule(world.get_entrance('Turtle Rock (Chain Chomp Room) (North)', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 2))
        set_rule(world.get_entrance('Turtle Rock Pokey Room', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, 1))
        set_rule(world.get_location('Turtle Rock - Big Key Chest', player), lambda state: state.has_key('Small Key (Turtle Rock)', player, tr_big_key_chest_keys_needed(state)))
        def tr_big_key_chest_keys_needed(state):
            # This function handles the key requirements for the TR Big Key Chest: holding the Big Key should logically
            # require 2 keys, holding its own small key should logically require no keys, and anything else should logically require 4 keys.
            item = item_name(state, 'Turtle Rock - Big Key Chest', player)
            if item in [('Small Key (Turtle Rock)', player)]:
                return 0
            if item in [('Big Key (Turtle Rock)', player)]:
                return 2
            return 4
        non_big_key_locations += ['Turtle Rock - Crystaroller Room', 'Turtle Rock - Eye Bridge - Bottom Left',
                                  'Turtle Rock - Eye Bridge - Bottom Right', 'Turtle Rock - Eye Bridge - Top Left',
                                  'Turtle Rock - Eye Bridge - Top Right']

    # If TR is only accessible from the middle, the big key must be further restricted to prevent softlock potential
    if not can_reach_front and not world.keyshuffle[player] and not world.retro[player]:
        # Must not go in the Big Key Chest - only 1 other chest available and 2+ keys required for all other chests
        non_big_key_locations += ['Turtle Rock - Big Key Chest']
        if not can_reach_big_chest:
            # Must not go in the Chain Chomps chest - only 2 other chests available and 3+ keys required for all other chests
            non_big_key_locations += ['Turtle Rock - Chain Chomps']
        if world.accessibility[player] == 'locations':
            if world.bigkeyshuffle[player] and can_reach_big_chest:
                # Must not go in the dungeon - all 3 available chests (Chomps, Big Chest, Crystaroller) must be keys to access laser bridge, and the big key is required first
                non_big_key_locations += ['Turtle Rock - Chain Chomps', 'Turtle Rock - Compass Chest', 'Turtle Rock - Roller Room - Left', 'Turtle Rock - Roller Room - Right']
            else:
                # A key is required in the Big Key Chest to prevent a possible softlock. Place an extra key to ensure 100% locations still works
                world.push_item(world.get_location('Turtle Rock - Big Key Chest', player), ItemFactory('Small Key (Turtle Rock)', player), False)
                world.get_location('Turtle Rock - Big Key Chest', player).event = True
                big20 = next(i for i in world.itempool if i.name == "Rupees (20)" and i.player == player)
                world.itempool.remove(big20)
        if world.accessibility[player] != 'locations':
            set_always_allow(world.get_location('Turtle Rock - Big Key Chest', player), lambda state, item: item.name == 'Small Key (Turtle Rock)' and item.player == player
                             and state.can_reach(state.world.get_region('Turtle Rock (Second Section)', player)))
        else:
            forbid_item(world.get_location('Turtle Rock - Big Key Chest', player), 'Small Key (Turtle Rock)', player)

    # set big key restrictions
    for location in non_big_key_locations:
        forbid_item(world.get_location(location, player), 'Big Key (Turtle Rock)', player)

    # small key restriction
    for location in ['Turtle Rock - Boss']:
        forbid_item(world.get_location(location, player), 'Small Key (Turtle Rock)', player)
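

# A self-contained restatement of the Big Key Chest key tiers used above
# (a hypothetical helper for illustration, not called by the randomizer):
# the chest logically needs no keys if it holds its own small key, 2 keys
# if it holds the big key, and all 4 small keys for anything else.
def _tr_big_key_chest_tier(item_at_chest):
    if item_at_chest == 'Small Key (Turtle Rock)':
        return 0
    if item_at_chest == 'Big Key (Turtle Rock)':
        return 2
    return 4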


def set_big_bomb_rules(world, player):
    # this is a mess
    bombshop_entrance = world.get_region('Big Bomb Shop', player).entrances[0]
    Normal_LW_entrances = ['Blinds Hideout',
                           'Bonk Fairy (Light)',
                           'Lake Hylia Fairy',
                           'Light Hype Fairy',
                           'Desert Fairy',
                           'Chicken House',
                           'Aginahs Cave',
                           'Sahasrahlas Hut',
                           'Cave Shop (Lake Hylia)',
                           'Blacksmiths Hut',
                           'Sick Kids House',
                           'Lost Woods Gamble',
                           'Fortune Teller (Light)',
                           'Snitch Lady (East)',
                           'Snitch Lady (West)',
                           'Bush Covered House',
                           'Tavern (Front)',
                           'Light World Bomb Hut',
                           'Kakariko Shop',
                           'Mini Moldorm Cave',
                           'Long Fairy Cave',
                           'Good Bee Cave',
                           '20 Rupee Cave',
                           '50 Rupee Cave',
                           'Ice Rod Cave',
                           'Bonk Rock Cave',
                           'Library',
                           'Potion Shop',
                           'Dam',
                           'Lumberjack House',
                           'Lake Hylia Fortune Teller',
                           'Eastern Palace',
                           'Kakariko Gamble Game',
                           'Kakariko Well Cave',
                           'Bat Cave Cave',
                           'Elder House (East)',
                           'Elder House (West)',
                           'North Fairy Cave',
                           'Lost Woods Hideout Stump',
                           'Lumberjack Tree Cave',
                           'Two Brothers House (East)',
                           'Sanctuary',
                           'Hyrule Castle Entrance (South)',
                           'Hyrule Castle Secret Entrance Stairs']
    LW_walkable_entrances = ['Dark Lake Hylia Ledge Fairy',
                             'Dark Lake Hylia Ledge Spike Cave',
                             'Dark Lake Hylia Ledge Hint',
                             'Mire Shed',
                             'Dark Desert Hint',
                             'Dark Desert Fairy',
                             'Misery Mire']
    Northern_DW_entrances = ['Brewery',
                             'C-Shaped House',
                             'Chest Game',
                             'Dark World Hammer Peg Cave',
                             'Red Shield Shop',
                             'Dark Sanctuary Hint',
                             'Fortune Teller (Dark)',
                             'Dark World Shop',
                             'Dark World Lumberjack Shop',
                             'Thieves Town',
                             'Skull Woods First Section Door',
                             'Skull Woods Second Section Door (East)']
    Southern_DW_entrances = ['Hype Cave',
                             'Bonk Fairy (Dark)',
                             'Archery Game',
                             'Big Bomb Shop',
                             'Dark Lake Hylia Shop',
                             'Swamp Palace']
    Isolated_DW_entrances = ['Spike Cave',
                             'Cave Shop (Dark Death Mountain)',
                             'Dark Death Mountain Fairy',
                             'Mimic Cave',
                             'Skull Woods Second Section Door (West)',
                             'Skull Woods Final Section',
                             'Ice Palace',
                             'Turtle Rock',
                             'Dark Death Mountain Ledge (West)',
                             'Dark Death Mountain Ledge (East)',
                             'Bumper Cave (Top)',
                             'Superbunny Cave (Top)',
                             'Superbunny Cave (Bottom)',
                             'Hookshot Cave',
                             'Ganons Tower',
                             'Turtle Rock Isolated Ledge Entrance',
                             'Hookshot Cave Back Entrance']
    Isolated_LW_entrances = ['Capacity Upgrade',
                             'Tower of Hera',
                             'Death Mountain Return Cave (West)',
                             'Paradox Cave (Top)',
                             'Fairy Ascension Cave (Top)',
                             'Spiral Cave',
                             'Desert Palace Entrance (East)']
    West_LW_DM_entrances = ['Old Man Cave (East)',
                            'Old Man House (Bottom)',
                            'Old Man House (Top)',
                            'Death Mountain Return Cave (East)',
                            'Spectacle Rock Cave Peak',
                            'Spectacle Rock Cave',
                            'Spectacle Rock Cave (Bottom)']
    East_LW_DM_entrances = ['Paradox Cave (Bottom)',
                            'Paradox Cave (Middle)',
                            'Hookshot Fairy',
                            'Spiral Cave (Bottom)']
    Mirror_from_SDW_entrances = ['Two Brothers House (West)',
                                 'Cave 45']
    Castle_ledge_entrances = ['Hyrule Castle Entrance (West)',
                              'Hyrule Castle Entrance (East)',
                              'Agahnims Tower']
    Desert_mirrorable_ledge_entrances = ['Desert Palace Entrance (West)',
                                         'Desert Palace Entrance (North)',
                                         'Desert Palace Entrance (South)',
                                         'Checkerboard Cave']
    set_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_reach('East Dark World', 'Region', player) and state.can_reach('Big Bomb Shop', 'Region', player) and state.has('Crystal 5', player) and state.has('Crystal 6', player))

    # crossing peg bridge starting from the southern dark world
    def cross_peg_bridge(state):
        return state.has('Hammer', player) and state.has_Pearl(player)
    # returning via the eastern and southern teleporters needs the same items, so we use the southern teleporter for our routing.
    # crossing the peg bridge already requires the hammer, so we just add the gloves to the requirement
    def southern_teleporter(state):
        return state.can_lift_rocks(player) and cross_peg_bridge(state)

    # the basic routes assume you can reach eastern light world with the bomb.
    # you can then use the southern teleporter, or (if you have beaten Aga1) the hyrule castle gate warp
    def basic_routes(state):
        return southern_teleporter(state) or state.has('Beat Agahnim 1', player)

    # Key for below abbreviations:
    # P = Pearl
    # A = Aga1
    # H = Hammer
    # M = Mirror
    # G = Glove
    if bombshop_entrance.name in Normal_LW_entrances:
        # 1. basic routes
        # 2. Can reach Eastern dark world some other way, mirror, get bomb, return to mirror spot, walk to pyramid: Needs mirror
        # -> M or BR
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: basic_routes(state) or state.has_Mirror(player))
    elif bombshop_entrance.name in LW_walkable_entrances:
        # 1. Mirror then basic routes
        # -> M and BR
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has_Mirror(player) and basic_routes(state))
    elif bombshop_entrance.name in Northern_DW_entrances:
        # 1. Mirror and basic routes
        # 2. Go to south DW and then cross peg bridge: Need Mitts and hammer and moon pearl
        # -> (Mitts and CPB) or (M and BR)
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.can_lift_heavy_rocks(player) and cross_peg_bridge(state)) or (state.has_Mirror(player) and basic_routes(state)))
    elif bombshop_entrance.name == 'Bumper Cave (Bottom)':
        # 1. Mirror and Lift rock and basic_routes
        # 2. Mirror and Flute and basic routes (can make a difference if accessed via insanity or w/ mirror from a connector, and then via hyrule castle gate, because no gloves are needed in that case)
        # 3. Go to south DW and then cross peg bridge: Need Mitts and hammer and moon pearl
        # -> (Mitts and CPB) or ((G or Flute) and M and BR)
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.can_lift_heavy_rocks(player) and cross_peg_bridge(state)) or (((state.can_lift_rocks(player) or state.has('Flute', player)) and state.has_Mirror(player)) and basic_routes(state)))
    elif bombshop_entrance.name in Southern_DW_entrances:
        # 1. Mirror and enter via gate: Need mirror and Aga1
        # 2. cross peg bridge: Need hammer and moon pearl
        # -> CPB or (M and A)
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: cross_peg_bridge(state) or (state.has_Mirror(player) and state.has('Beat Agahnim 1', player)))
    elif bombshop_entrance.name in Isolated_DW_entrances:
        # 1. mirror then flute then basic routes
        # -> M and Flute and BR
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has_Mirror(player) and state.has('Flute', player) and basic_routes(state))
    elif bombshop_entrance.name in Isolated_LW_entrances:
        # 1. flute then basic routes
        # Pre-existing mirror spot is not permitted, because the mirror might have been needed to reach these isolated locations.
        # -> Flute and BR
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has('Flute', player) and basic_routes(state))
    elif bombshop_entrance.name in West_LW_DM_entrances:
        # 1. flute then basic routes or mirror
        # Pre-existing mirror spot is permitted, because the flute can be used to reach west DM directly.
        # -> Flute and (M or BR)
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has('Flute', player) and (state.has_Mirror(player) or basic_routes(state)))
    elif bombshop_entrance.name in East_LW_DM_entrances:
        # 1. flute then basic routes or mirror and hookshot
        # Pre-existing mirror spot is permitted, because the flute can be used to reach west DM directly and then east DM via Hookshot.
        # -> Flute and ((M and Hookshot) or BR)
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has('Flute', player) and ((state.has_Mirror(player) and state.has('Hookshot', player)) or basic_routes(state)))
    elif bombshop_entrance.name == 'Fairy Ascension Cave (Bottom)':
        # Same as East_LW_DM_entrances, except navigation without BR requires Mitts
        # -> Flute and ((M and Hookshot and Mitts) or BR)
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has('Flute', player) and ((state.has_Mirror(player) and state.has('Hookshot', player) and state.can_lift_heavy_rocks(player)) or basic_routes(state)))
    elif bombshop_entrance.name in Castle_ledge_entrances:
        # 1. mirror on pyramid to castle ledge, grab bomb, return through mirror spot: Needs mirror
        # 2. flute then basic routes
        # -> M or (Flute and BR)
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has_Mirror(player) or (state.has('Flute', player) and basic_routes(state)))
    elif bombshop_entrance.name in Desert_mirrorable_ledge_entrances:
        # 1. Have Mire access: mirror to reach the locations, return via mirror spot, move to the center of the desert, mirror again, and then basic routes
        # 2. flute then basic routes
        # -> ((Mire access and M) or Flute) and BR
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: ((state.can_reach('Dark Desert', 'Region', player) and state.has_Mirror(player)) or state.has('Flute', player)) and basic_routes(state))
    elif bombshop_entrance.name == 'Old Man Cave (West)':
        # 1. Lift rock then basic_routes
        # 2. flute then basic_routes
        # -> (Flute or G) and BR
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.has('Flute', player) or state.can_lift_rocks(player)) and basic_routes(state))
    elif bombshop_entrance.name == 'Graveyard Cave':
        # 1. flute then basic routes
        # 2. (has west dark world access) use existing mirror spot (requires Pearl), mirror again off the ledge
        # -> (Flute or (M and P and West Dark World access)) and BR
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.has('Flute', player) or (state.can_reach('West Dark World', 'Region', player) and state.has_Pearl(player) and state.has_Mirror(player))) and basic_routes(state))
elif bombshop_entrance.name in Mirror_from_SDW_entrances:
# 1. flute then basic routes
# 2. (has South dark world access) use existing mirror spot, mirror again off ledge
# -> (Flute or (M and South Dark World access) and BR
add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.has('Flute', player) or (state.can_reach('South Dark World', 'Region', player) and state.has_Mirror(player))) and basic_routes(state))
elif bombshop_entrance.name == 'Dark World Potion Shop':
# 1. walk down by lifting rock: needs gloves and pearl`
# 2. walk down by hammering peg: needs hammer and pearl
# 3. mirror and basic routes
# -> (P and (H or Gloves)) or (M and BR)
add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.has_Pearl(player) and (state.has('Hammer', player) or state.can_lift_rocks(player))) or (state.has_Mirror(player) and basic_routes(state)))
elif bombshop_entrance.name == 'Kings Grave':
# same as the Normal_LW_entrances case except that the pre-existing mirror is only possible if you have mitts
# (because otherwise mirror was used to reach the grave, so would cancel a pre-existing mirror spot)
# to account for insanity, must consider a way to escape without a cave for basic_routes
# -> (M and Mitts) or ((Mitts or Flute or (M and P and West Dark World access)) and BR)
add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.can_lift_heavy_rocks(player) and state.has_Mirror(player)) or ((state.can_lift_heavy_rocks(player) or state.has('Flute', player) or (state.can_reach('West Dark World', 'Region', player) and state.has_Pearl(player) and state.has_Mirror(player))) and basic_routes(state)))
elif bombshop_entrance.name == 'Waterfall of Wishing':
# same as the Normal_LW_entrances case except in insanity it's possible you could be here without Flippers which
# means you need an escape route of either Flippers or Flute
add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.has('Flippers', player) or state.has('Flute', player)) and (basic_routes(state) or state.has_Mirror(player)))


def set_inverted_big_bomb_rules(world, player):
    bombshop_entrance = world.get_region('Inverted Big Bomb Shop', player).entrances[0]
    Normal_LW_entrances = [
        'Blinds Hideout',
        'Bonk Fairy (Light)',
        'Lake Hylia Fairy',
        'Light Hype Fairy',
        'Desert Fairy',
        'Chicken House',
        'Aginahs Cave',
        'Sahasrahlas Hut',
        'Cave Shop (Lake Hylia)',
        'Blacksmiths Hut',
        'Sick Kids House',
        'Lost Woods Gamble',
        'Fortune Teller (Light)',
        'Snitch Lady (East)',
        'Snitch Lady (West)',
        'Tavern (Front)',
        'Kakariko Shop',
        'Mini Moldorm Cave',
        'Long Fairy Cave',
        'Good Bee Cave',
        '20 Rupee Cave',
        '50 Rupee Cave',
        'Ice Rod Cave',
        'Bonk Rock Cave',
        'Library',
        'Potion Shop',
        'Dam',
        'Lumberjack House',
        'Lake Hylia Fortune Teller',
        'Eastern Palace',
        'Kakariko Gamble Game',
        'Kakariko Well Cave',
        'Bat Cave Cave',
        'Elder House (East)',
        'Elder House (West)',
        'North Fairy Cave',
        'Lost Woods Hideout Stump',
        'Lumberjack Tree Cave',
        'Two Brothers House (East)',
        'Sanctuary',
        'Hyrule Castle Entrance (South)',
        'Hyrule Castle Secret Entrance Stairs',
        'Hyrule Castle Entrance (West)',
        'Hyrule Castle Entrance (East)',
        'Inverted Ganons Tower',
        'Cave 45',
        'Checkerboard Cave',
        'Inverted Big Bomb Shop',
    ]
    Isolated_LW_entrances = [
        'Old Man Cave (East)',
        'Old Man House (Bottom)',
        'Old Man House (Top)',
        'Death Mountain Return Cave (East)',
        'Spectacle Rock Cave Peak',
        'Tower of Hera',
        'Death Mountain Return Cave (West)',
        'Paradox Cave (Top)',
        'Fairy Ascension Cave (Top)',
        'Spiral Cave',
        'Paradox Cave (Bottom)',
        'Paradox Cave (Middle)',
        'Hookshot Fairy',
        'Spiral Cave (Bottom)',
        'Mimic Cave',
        'Fairy Ascension Cave (Bottom)',
        'Desert Palace Entrance (West)',
        'Desert Palace Entrance (North)',
        'Desert Palace Entrance (South)',
    ]
    Eastern_DW_entrances = [
        'Palace of Darkness',
        'Palace of Darkness Hint',
        'Dark Lake Hylia Fairy',
        'East Dark World Hint',
    ]
    Northern_DW_entrances = [
        'Brewery',
        'C-Shaped House',
        'Chest Game',
        'Dark World Hammer Peg Cave',
        'Inverted Dark Sanctuary',
        'Fortune Teller (Dark)',
        'Dark World Lumberjack Shop',
        'Thieves Town',
        'Skull Woods First Section Door',
        'Skull Woods Second Section Door (East)',
    ]
    Southern_DW_entrances = [
        'Hype Cave',
        'Bonk Fairy (Dark)',
        'Archery Game',
        'Inverted Links House',
        'Dark Lake Hylia Shop',
        'Swamp Palace',
    ]
    Isolated_DW_entrances = [
        'Spike Cave',
        'Cave Shop (Dark Death Mountain)',
        'Dark Death Mountain Fairy',
        'Skull Woods Second Section Door (West)',
        'Skull Woods Final Section',
        'Turtle Rock',
        'Dark Death Mountain Ledge (West)',
        'Dark Death Mountain Ledge (East)',
        'Bumper Cave (Top)',
        'Superbunny Cave (Top)',
        'Superbunny Cave (Bottom)',
        'Hookshot Cave',
        'Turtle Rock Isolated Ledge Entrance',
        'Hookshot Cave Back Entrance',
        'Inverted Agahnims Tower',
    ]
    LW_walkable_entrances = [
        'Dark Lake Hylia Ledge Fairy',
        'Dark Lake Hylia Ledge Spike Cave',
        'Dark Lake Hylia Ledge Hint',
        'Mire Shed',
        'Dark Desert Hint',
        'Dark Desert Fairy',
        'Misery Mire',
        'Red Shield Shop',
    ]
    LW_bush_entrances = [
        'Bush Covered House',
        'Light World Bomb Hut',
        'Graveyard Cave',
    ]
    LW_inaccessible_entrances = [
        'Desert Palace Entrance (East)',
        'Spectacle Rock Cave',
        'Spectacle Rock Cave (Bottom)',
    ]
    set_rule(world.get_entrance('Pyramid Fairy', player),
             lambda state: state.can_reach('East Dark World', 'Region', player) and state.can_reach('Inverted Big Bomb Shop', 'Region', player) and state.has('Crystal 5', player) and state.has('Crystal 6', player))

    # Key for below abbreviations:
    # P = pearl
    # A = Aga1
    # H = hammer
    # M = Mirror
    # G = Glove
    if bombshop_entrance.name in Eastern_DW_entrances:
        # Just walk to the pyramid
        pass
    elif bombshop_entrance.name in Normal_LW_entrances:
        # Just walk to the castle and mirror.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has_Mirror(player))
    elif bombshop_entrance.name in Isolated_LW_entrances:
        # For these entrances, you cannot walk to the castle/pyramid and thus must use Mirror and then Flute.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_flute(player) and state.has_Mirror(player))
    elif bombshop_entrance.name in Northern_DW_entrances:
        # You can just fly with the Flute, you can take a long walk with Mitts and Hammer,
        # or you can leave a Mirror portal nearby and then walk to the castle to Mirror again.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_flute(player) or (state.can_lift_heavy_rocks(player) and state.has('Hammer', player)) or (state.has_Mirror(player) and state.can_reach('Light World', 'Region', player)))
    elif bombshop_entrance.name in Southern_DW_entrances:
        # This is the same as north DW without the Mitts rock present.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has('Hammer', player) or state.can_flute(player) or (state.has_Mirror(player) and state.can_reach('Light World', 'Region', player)))
    elif bombshop_entrance.name in Isolated_DW_entrances:
        # There's just no way to escape these places with the bomb and no Flute.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_flute(player))
    elif bombshop_entrance.name in LW_walkable_entrances:
        # You can fly with the flute, or leave a mirror portal and walk through the light world
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_flute(player) or (state.has_Mirror(player) and state.can_reach('Light World', 'Region', player)))
    elif bombshop_entrance.name in LW_bush_entrances:
        # These entrances are behind bushes in LW, so you need either Pearl or the tools to solve NDW bomb shop locations.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has_Mirror(player) and (state.can_flute(player) or state.has_Pearl(player) or (state.can_lift_heavy_rocks(player) and state.has('Hammer', player))))
    elif bombshop_entrance.name == 'Dark World Shop':
        # This is mostly the same as NDW, but the Mirror path requires the Pearl or using the Hammer.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_flute(player) or (state.can_lift_heavy_rocks(player) and state.has('Hammer', player)) or (state.has_Mirror(player) and state.can_reach('Light World', 'Region', player) and (state.has_Pearl(player) or state.has('Hammer', player))))
    elif bombshop_entrance.name == 'Bumper Cave (Bottom)':
        # This is mostly the same as NDW, but the Mirror path requires being able to lift a rock.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_flute(player) or (state.can_lift_heavy_rocks(player) and state.has('Hammer', player)) or (state.has_Mirror(player) and state.can_lift_rocks(player) and state.can_reach('Light World', 'Region', player)))
    elif bombshop_entrance.name == 'Old Man Cave (West)':
        # The three paths back are Mirror and DW walk, Mirror and Flute, or LW walk and then Mirror.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has_Mirror(player) and ((state.can_lift_heavy_rocks(player) and state.has('Hammer', player)) or (state.can_lift_rocks(player) and state.has_Pearl(player)) or state.can_flute(player)))
    elif bombshop_entrance.name == 'Dark World Potion Shop':
        # You either need to Flute to spot 5 or cross the rock/hammer choice pass to the south.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.can_flute(player) or state.has('Hammer', player) or state.can_lift_rocks(player))
    elif bombshop_entrance.name == 'Kings Grave':
        # Either lift the rock and walk to the castle to Mirror, or Mirror immediately and Flute.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.can_flute(player) or state.can_lift_heavy_rocks(player)) and state.has_Mirror(player))
    elif bombshop_entrance.name == 'Waterfall of Wishing':
        # You absolutely must be able to swim to return from here.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has('Flippers', player) and state.has_Pearl(player) and state.has_Mirror(player))
    elif bombshop_entrance.name == 'Ice Palace':
        # You can swim to the dock or use the Flute to get off the island.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: state.has('Flippers', player) or state.can_flute(player))
    elif bombshop_entrance.name == 'Capacity Upgrade':
        # You must Mirror, but then can use either Ice Palace return path.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.has('Flippers', player) or state.can_flute(player)) and state.has_Mirror(player))
    elif bombshop_entrance.name == 'Two Brothers House (West)':
        # First you must Mirror. Then you can either Flute, cross the peg bridge, or use the Agah 1 portal to Mirror again.
        add_rule(world.get_entrance('Pyramid Fairy', player), lambda state: (state.can_flute(player) or state.has('Hammer', player) or state.has('Beat Agahnim 1', player)) and state.has_Mirror(player))
    elif bombshop_entrance.name in LW_inaccessible_entrances:
        # You can't get to the pyramid from these entrances without bomb duping.
        raise Exception('No valid path to open Pyramid Fairy. (Could not route from %s)' % bombshop_entrance.name)
    elif bombshop_entrance.name == 'Pyramid Fairy':
        # Self-locking. The shuffles don't put the bomb shop here, but it doesn't lock anything important.
        set_rule(world.get_entrance('Pyramid Fairy', player), lambda state: False)
    else:
        raise Exception('No logic found for routing from %s to the pyramid.' % bombshop_entrance.name)


def set_bunny_rules(world: World, player: int, inverted: bool):
    # regions for the exits of multi-entrance caves/drops that bunny cannot pass
    # Note: spiral cave and two brothers house are passable in superbunny state for glitch logic with extra requirements.
    bunny_impassable_caves = ['Bumper Cave', 'Two Brothers House', 'Hookshot Cave', 'Skull Woods First Section (Right)',
                              'Skull Woods First Section (Left)', 'Skull Woods First Section (Top)', 'Turtle Rock (Entrance)',
                              'Turtle Rock (Second Section)', 'Turtle Rock (Big Chest)', 'Skull Woods Second Section (Drop)',
                              'Turtle Rock (Eye Bridge)', 'Sewers', 'Pyramid', 'Spiral Cave (Top)', 'Desert Palace Main (Inner)',
                              'Fairy Ascension Cave (Drop)']
    bunny_accessible_locations = ['Link\'s Uncle', 'Sahasrahla', 'Sick Kid', 'Lost Woods Hideout', 'Lumberjack Tree',
                                  'Checkerboard Cave', 'Potion Shop', 'Spectacle Rock Cave', 'Pyramid', 'Hype Cave - Generous Guy',
                                  'Peg Cave', 'Bumper Cave Ledge', 'Dark Blacksmith Ruins', 'Spectacle Rock', 'Bombos Tablet',
                                  'Ether Tablet', 'Purple Chest', 'Blacksmith', 'Missing Smith', 'Master Sword Pedestal',
                                  'Bottle Merchant', 'Sunken Treasure', 'Desert Ledge']

    def path_to_access_rule(path, entrance):
        return lambda state: state.can_reach(entrance.name, 'Entrance', entrance.player) and all(rule(state) for rule in path)

    def options_to_access_rule(options):
        return lambda state: any(rule(state) for rule in options)

    # Helper functions to determine if the moon pearl is required
    def is_bunny(region):
        if inverted:
            return region.is_light_world
        else:
            return region.is_dark_world

    def is_link(region):
        if inverted:
            return region.is_dark_world
        else:
            return region.is_light_world

    def get_rule_to_add(region, location=None, connecting_entrance=None):
        # In OWG, a location can potentially be superbunny-mirror accessible or
        # bunny revival accessible.
        if world.logic[player] == 'owglitches':
            if region.name == 'Swamp Palace (Entrance)':
                return lambda state: state.has_Pearl(player)
            if region.name in OverworldGlitchRules.get_invalid_bunny_revival_dungeons():
                return lambda state: state.has_Mirror(player) or state.has_Pearl(player)
            if region.type == RegionType.Dungeon:
                return lambda state: True
            if (((location is None or location.name not in OverworldGlitchRules.get_superbunny_accessible_locations())
                    or (connecting_entrance is not None and connecting_entrance.name in OverworldGlitchRules.get_invalid_bunny_revival_dungeons()))
                    and not is_link(region)):
                return lambda state: state.has_Pearl(player)
        else:
            if not is_link(region):
                return lambda state: state.has_Pearl(player)

        # In this case we are in a mixed region, so we collect the possible options.
        # The base option is having the moon pearl.
        possible_options = [lambda state: state.has_Pearl(player)]

        # We will search entrances recursively until we find
        # one that leads to an exclusively link-state region.
        # For each such entrance a new option is added that consists of:
        #  a) being able to reach it, and
        #  b) being able to access all entrances from there to `region`
        seen = set([region])
        queue = collections.deque([(region, [])])
        while queue:
            (current, path) = queue.popleft()
            for entrance in current.entrances:
                new_region = entrance.parent_region
                if new_region in seen:
                    continue
                new_path = path + [entrance.access_rule]
                seen.add(new_region)
                if not is_link(new_region):
                    # For OWG, establish superbunny and revival rules.
                    if world.logic[player] == 'owglitches' and entrance.name not in OverworldGlitchRules.get_invalid_bunny_revival_dungeons():
                        # Bind the path rule eagerly via a default argument and call it with
                        # `state`; otherwise the lambda would capture the loop variables late
                        # and the (always truthy) inner lambda would never be evaluated.
                        if region.name in OverworldGlitchRules.get_sword_required_superbunny_mirror_regions():
                            possible_options.append(lambda state, rule=path_to_access_rule(new_path, entrance): rule(state) and state.has_Mirror(player) and state.has_sword(player))
                        elif (region.name in OverworldGlitchRules.get_boots_required_superbunny_mirror_regions()
                                or location is not None and location.name in OverworldGlitchRules.get_boots_required_superbunny_mirror_locations()):
                            possible_options.append(lambda state, rule=path_to_access_rule(new_path, entrance): rule(state) and state.has_Boots(player) and state.has_Mirror(player))
                        elif location is not None and location.name in OverworldGlitchRules.get_superbunny_accessible_locations():
                            if new_region.name == 'Superbunny Cave (Bottom)' or region.name == 'Kakariko Well (top)':
                                possible_options.append(path_to_access_rule(new_path, entrance))
                            else:
                                possible_options.append(lambda state, rule=path_to_access_rule(new_path, entrance): rule(state) and state.has_Mirror(player))
                        if new_region.type != RegionType.Cave:
                            continue
                    else:
                        continue
                if is_bunny(new_region):
                    queue.append((new_region, new_path))
                else:
                    # we have reached a pure link-state region, so we have a new possible option
                    possible_options.append(path_to_access_rule(new_path, entrance))
        return options_to_access_rule(possible_options)

    # Add requirements for bunny-impassable caves if link is a bunny in them
    for region in [world.get_region(name, player) for name in bunny_impassable_caves]:
        if not is_bunny(region):
            continue
        rule = get_rule_to_add(region)
        for exit in region.exits:
            add_rule(exit, rule)

    paradox_shop = world.get_region('Light World Death Mountain Shop', player)
    if is_bunny(paradox_shop):
        add_rule(paradox_shop.entrances[0], get_rule_to_add(paradox_shop))

    # Add requirements for all locations that are actually in the dark world, except those available to the bunny, including dungeon revival
    for entrance in world.get_entrances():
        if entrance.player == player and is_bunny(entrance.connected_region):
            if world.logic[player] == 'owglitches':
                if entrance.connected_region.type == RegionType.Dungeon:
                    if entrance.parent_region.type != RegionType.Dungeon and entrance.connected_region.name in OverworldGlitchRules.get_invalid_bunny_revival_dungeons():
                        add_rule(entrance, get_rule_to_add(entrance.connected_region, None, entrance))
                    continue
                if entrance.connected_region.name == 'Turtle Rock (Entrance)':
                    add_rule(world.get_entrance('Turtle Rock Entrance Gap', player), get_rule_to_add(entrance.connected_region, None, entrance))
            for location in entrance.connected_region.locations:
                if world.logic[player] == 'owglitches' and entrance.name in OverworldGlitchRules.get_invalid_mirror_bunny_entrances():
                    continue
                if location.name in bunny_accessible_locations:
                    continue
                add_rule(location, get_rule_to_add(entrance.connected_region, location))
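`add_rule`, `set_rule`, and `basic_routes` are defined elsewhere in this module and are not visible in this chunk. As a rough, self-contained sketch of the AND-composition of access rules that `add_rule` performs (the `Entrance` stand-in below is hypothetical, not the real class):

```python
# Minimal sketch of lambda-based rule composition, mirroring how add_rule
# AND-combines an entrance's existing access_rule with a new requirement.

class Entrance:
    def __init__(self):
        # default rule: always reachable
        self.access_rule = lambda state: True

def add_rule(spot, rule):
    old_rule = spot.access_rule
    # the new rule must hold in addition to everything required before
    spot.access_rule = lambda state: rule(state) and old_rule(state)

entrance = Entrance()
# model "state" as a simple set of item names for this sketch
add_rule(entrance, lambda state: 'Hammer' in state)
add_rule(entrance, lambda state: 'Flute' in state)

print(entrance.access_rule({'Hammer', 'Flute'}))  # True
print(entrance.access_rule({'Hammer'}))           # False
```

Because each call closes over the previous `access_rule`, rules stack: every requirement ever added must still be satisfied.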
16d544f75f2378538f4097e46314a174064c5ab7 | 156 | py | Python | location/app/points.py | madpin/renthub | 6ea0f1eefe1fcdb13a9b11d00a3c8f1a1f8ad439 | ["MIT"]
from schemas import Point

indeed = Point(lat=53.34545621516955, long=-6.231801040391591)
bank_house = Point(lat=53.34347027177946, long=-6.276045630904159)
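`schemas.Point` is imported from elsewhere; assuming it is a plain lat/long container, the distance between the two Dublin points above can be estimated with a haversine sketch (the `Point` dataclass below is a hypothetical stand-in for `schemas.Point`):

```python
import math
from dataclasses import dataclass

@dataclass
class Point:  # stand-in for schemas.Point
    lat: float
    long: float

def haversine_km(a: Point, b: Point) -> float:
    """Great-circle distance in kilometres between two lat/long points."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(a.lat), math.radians(b.lat)
    dphi = math.radians(b.lat - a.lat)
    dlmb = math.radians(b.long - a.long)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

indeed = Point(lat=53.34545621516955, long=-6.231801040391591)
bank_house = Point(lat=53.34347027177946, long=-6.276045630904159)
print(round(haversine_km(indeed, bank_house), 2))  # roughly 3 km apart
```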
16dab816d06d29d61e0b4904fe815b5f4cdb4dad | 114 | py | Python | prep_directory.py | fraser-lab/plumed_em_md | d561793f9a2444085a97249a3e1e97a05336395e | ["MIT"]
import os

plumed_working_dir = "~/plumed_em_md/MDP"
os.system("cp {dir}/*.mdp .".format(dir=plumed_working_dir))
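`os.system` with a shell glob works here, but its exit status is easy to ignore and errors pass silently. A hedged sketch of the same copy done with `glob` and `shutil` instead of a subshell (the directory layout is assumed, as above):

```python
import glob
import os
import shutil

def copy_mdp_files(src_dir: str, dst_dir: str = ".") -> list:
    """Copy every .mdp file from src_dir into dst_dir; return the copied paths."""
    copied = []
    # expanduser handles the "~" that the shell would otherwise expand
    for path in glob.glob(os.path.join(os.path.expanduser(src_dir), "*.mdp")):
        copied.append(shutil.copy(path, dst_dir))
    return copied
```

Unlike `os.system("cp ...")`, a failed copy here raises an exception instead of only setting a shell exit code.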
16eabd024fda6238f29a7697cb894ae08d8c1d06 | 4,706 | py | Python | tests/artifacts/test_dask_to_dataset.py | Hedingber/mlrun | e2269718fcc7caa7e1aa379ac28495830b45f9da | ["Apache-2.0"]
import dask.dataframe as dd
import numpy
import pandas
import mlrun.artifacts.dataset


def test_dataset_preview_size_limit_from_large_dask_dataframe(monkeypatch):
    """
    To simplify testing the behavior of a large Dask DataFrame as an mlrun
    Dataset, we lower the default max_ddf_size parameter. Default behavior is
    to convert any Dask DataFrame of size <1GB to pandas, and otherwise use
    Dask to create the artifact.
    """
    # Set a small max_ddf_size to simplify testing
    monkeypatch.setattr(mlrun.artifacts.dataset, "max_ddf_size", 0.001)

    print("Creating dataframe and setting memory limit")
    A = numpy.random.random_sample(size=(50000, 6))
    df = pandas.DataFrame(data=A, columns=list("ABCDEF"))
    print("Verify the dataframe is larger than the patched size limit")
    assert (df.memory_usage().sum() // 1e3) > 200
    ddf = dd.from_pandas(df, npartitions=4)
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=ddf)
    assert len(artifact.preview) == mlrun.artifacts.dataset.default_preview_rows_length

    # override limit
    limit = 25
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=ddf, preview=limit)
    assert len(artifact.preview) == limit

    # ignore limits
    artifact = mlrun.artifacts.dataset.DatasetArtifact(
        df=ddf, ignore_preview_limits=True
    )
    # For a large DDF, sample 20% of the rows for creating the preview
    assert len(artifact.preview) == len(ddf.sample(frac=0.2).compute().values.tolist())
    print("Passed assertion on preview")

    # more than allowed columns
    number_of_columns = mlrun.artifacts.dataset.max_preview_columns * 3
    data_frame = pandas.DataFrame(
        numpy.random.randint(0, 10, size=(2000, number_of_columns)),
        columns=list(range(number_of_columns)),
    )
    ddf = dd.from_pandas(data_frame, npartitions=4)
    ddf = ddf.repartition(partition_size="1MB")
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=ddf)
    assert len(artifact.preview[0]) == mlrun.artifacts.dataset.max_preview_columns
    assert artifact.stats is None

    # ignore limits
    artifact = mlrun.artifacts.dataset.DatasetArtifact(
        df=ddf, ignore_preview_limits=True
    )
    assert len(artifact.preview[0]) == number_of_columns + 1

    # too many rows for stats computation
    data_frame = pandas.DataFrame(
        numpy.random.randint(0, 10, size=(mlrun.artifacts.dataset.max_csv * 3, 1)),
        columns=["A"],
    )
    ddf = dd.from_pandas(data_frame, npartitions=2)
    ddf = ddf.repartition(partition_size="100MB")
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=data_frame)
    assert artifact.stats is None


def test_dataset_preview_size_limit_from_small_dask_dataframe():
    print("Starting preview for small dask dataframe")
    A = numpy.random.random_sample(size=(100, 6))
    df = pandas.DataFrame(data=A, columns=list("ABCDEF"))
    ddf = dd.from_pandas(df, npartitions=4).persist()
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=ddf)
    assert len(artifact.preview) == mlrun.artifacts.dataset.default_preview_rows_length

    # override limit
    limit = 25
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=ddf, preview=limit)
    assert len(artifact.preview) == limit

    # ignore limits
    artifact = mlrun.artifacts.dataset.DatasetArtifact(
        df=ddf, ignore_preview_limits=True
    )
    # For a small DDF (<1GB), convert to pandas
    assert len(artifact.preview) == len(ddf)
    print("passed length assertion on preview")

    # more than allowed columns
    number_of_columns = mlrun.artifacts.dataset.max_preview_columns * 3
    data_frame = pandas.DataFrame(
        numpy.random.randint(0, 10, size=(10, number_of_columns)),
        columns=list(range(number_of_columns)),
    )
    ddf = dd.from_pandas(data_frame, npartitions=4)
    ddf = ddf.repartition(partition_size="100MB").persist()
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=ddf)
    assert len(artifact.preview[0]) == mlrun.artifacts.dataset.max_preview_columns
    assert artifact.stats is None

    # ignore limits
    artifact = mlrun.artifacts.dataset.DatasetArtifact(
        df=ddf, ignore_preview_limits=True
    )
    assert len(artifact.preview[0]) == number_of_columns + 1

    # too many rows for stats computation
    data_frame = pandas.DataFrame(
        numpy.random.randint(0, 10, size=(mlrun.artifacts.dataset.max_csv * 3, 1)),
        columns=["A"],
    )
    ddf = dd.from_pandas(data_frame, npartitions=2)
    ddf = ddf.repartition(partition_size="100MB").persist()
    artifact = mlrun.artifacts.dataset.DatasetArtifact(df=data_frame)
    assert artifact.stats is None
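The tests above exercise mlrun's `max_ddf_size` gate: Dask frames under the limit are converted to pandas before the artifact is built, larger ones stay in Dask. Stripped of the dask/mlrun machinery, the gate reduces to a size comparison; a dependency-free sketch (the function name is hypothetical, not mlrun's API):

```python
def choose_backend(ddf_size_gb: float, max_ddf_size_gb: float = 1.0) -> str:
    """Mirror the gating logic: frames under the limit are converted to
    pandas for artifact creation, larger ones stay in Dask."""
    return "pandas" if ddf_size_gb < max_ddf_size_gb else "dask"

print(choose_backend(0.43))         # pandas (under the default 1 GB limit)
print(choose_backend(0.43, 0.001))  # dask (limit lowered, as the monkeypatch does)
```

Lowering the threshold, as `monkeypatch.setattr(..., "max_ddf_size", 0.001)` does above, is what forces even a small frame down the Dask code path.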
bc6febf8d65b5f67533f8263e05d269a2ab87b06 | 87 | py | Python | allopy/optimize/portfolio/__init__.py | wangcj05/allopy | 0d97127e5132df1449283198143994b45fb11214 | ["MIT"]
from .active import ActivePortfolioOptimizer
from .portfolio import PortfolioOptimizer
bc71f7fe895a2fc2fc666b63dd1a70b3fa4256f7 | 75 | py | Python | awspider/resources2/__init__.py | wehriam/awspider | 3d3dc40208fb334a6b6cdaae92f5d5ea07295616 | ["MIT"]
from .scheduler import SchedulerResource
from .worker import WorkerResource
bc890b250b2f2388203397132c09ab38ee3a6f8d | 20 | py | Python | python/testData/psi/FStringBackslashAfterExpression.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | ["Apache-2.0"]
s = f'foo{42 \ }bar'
bcd5b85bbafedcb05d43ccd74f0d05062ba58611 | 215 | py | Python | lightex/mulogger/__init__.py | ofnote/lightex | 86aa1306356d20b714f1970fddc981f668ca06e5 | ["Apache-2.0"]
from .config import MLFlowConfig, PytorchTBConfig, LoggerConfig
from .multi_logger import MLFlowLogger, PytorchTBLogger, MultiLogger
from .abstract_logger import AbstractLogger, get_project_name, get_experiment_name
bcfa8607964b06a308094bd655322c9699edbd36 | 162 | py | Python | flask_oauthlib/contrib/__init__.py | mdxs/flask-oauthlib | 21f5c3e73a023d399890d91d84f69c4ac0ea284b | ["BSD-3-Clause"]
# coding: utf-8
"""
flask_oauthlib.contrib
~~~~~~~~~~~~~~~~~~~~~~
Contributions for Flask OAuthlib.
:copyright: (c) 2013 by Hsiaoming Yang.
"""
bcfe7ae4c84c817e3f5769bdb9a53c388f4c4883 | 271 | py | Python | prettyqt/multimediawidgets/videowidgetcontrol.py | phil65/PrettyQt | 26327670c46caa039c9bd15cb17a35ef5ad72e6c | ["MIT"]
from __future__ import annotations

from prettyqt import multimedia
from prettyqt.qt import QtMultimediaWidgets

QtMultimediaWidgets.QVideoWidgetControl.__bases__ = (multimedia.MediaControl,)


class VideoWidgetControl(QtMultimediaWidgets.QVideoWidgetControl):
    pass
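The `__bases__` assignment above splices `multimedia.MediaControl` into the Qt control class's inheritance chain at import time. A toy illustration of the same monkey-patching technique with plain classes (all names below are stand-ins; note that CPython only permits this when the old and new bases have compatible layouts, which is why everything shares a common ancestor here):

```python
class Control:  # common ancestor keeps the instance layouts compatible
    pass

class PlainControl(Control):
    pass

class MediaControl(Control):
    def is_media(self):
        return True

class VideoWidget(PlainControl):
    pass

# Splice MediaControl into VideoWidget's inheritance chain after the fact.
VideoWidget.__bases__ = (MediaControl,)

print(VideoWidget().is_media())  # instances now inherit MediaControl's methods
```

The reassignment rewrites the MRO for the whole class, so existing instances pick up the new behaviour too.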
4c0d844c57527ea5d788dddf598a87ccb59359f5 | 252 | py | Python | gym-traffic/gym_traffic/__init__.py | xiaoyum2/RL-Traffic-Dynamics | f9710a71b613f88d47f60566d3b82b5a251a4e8d | [
"MIT"
] | null | null | null | gym-traffic/gym_traffic/__init__.py | xiaoyum2/RL-Traffic-Dynamics | f9710a71b613f88d47f60566d3b82b5a251a4e8d | [
"MIT"
] | null | null | null | gym-traffic/gym_traffic/__init__.py | xiaoyum2/RL-Traffic-Dynamics | f9710a71b613f88d47f60566d3b82b5a251a4e8d | [
"MIT"
] | null | null | null | from gym.envs.registration import register
register(id='traffic-v0', entry_point='gym_traffic.envs:TrafficEnv')
register(id='traffic-v1', entry_point='gym_traffic.envs:TrafficMidEnv')
# register(id='basic-v2',entry_point='gym_basic.envs:BasicEnv2',) | 63 | 72 | 0.793651 | 36 | 252 | 5.388889 | 0.472222 | 0.154639 | 0.201031 | 0.206186 | 0.247423 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016529 | 0.039683 | 252 | 4 | 73 | 63 | 0.785124 | 0.25 | 0 | 0 | 0 | 0 | 0.409574 | 0.303191 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
4c4c0f4cb39db83bc6991145c641173076bc596f | 98 | py | Python | DQM/SiStripCommon/python/TkHistoMap_cff.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | 3 | 2018-08-24T19:10:26.000Z | 2019-02-19T11:45:32.000Z | DQM/SiStripCommon/python/TkHistoMap_cff.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | 3 | 2018-08-23T13:40:24.000Z | 2019-12-05T21:16:03.000Z | DQM/SiStripCommon/python/TkHistoMap_cff.py | nistefan/cmssw | ea13af97f7f2117a4f590a5e654e06ecd9825a5b | [
"Apache-2.0"
] | 5 | 2018-08-21T16:37:52.000Z | 2020-01-09T13:33:17.000Z | from CalibTracker.SiStripCommon.TkDetMap_cff import *
from DQMServices.Core.DQMStore_cfg import *
| 32.666667 | 53 | 0.857143 | 12 | 98 | 6.833333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 98 | 2 | 54 | 49 | 0.911111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
4c52026c85f4157a1d3621a26a14a2af94eb5d2c | 47,550 | py | Python | tests/test_compiler.py | dmort27/morphotactics | 23fee6eb28a59858fdd68ca5a4aa7c4605ea1c8c | [
"MIT"
] | 2 | 2021-09-02T18:32:29.000Z | 2021-12-03T07:41:19.000Z | tests/test_compiler.py | dmort27/morphotactics | 23fee6eb28a59858fdd68ca5a4aa7c4605ea1c8c | [
"MIT"
] | null | null | null | tests/test_compiler.py | dmort27/morphotactics | 23fee6eb28a59858fdd68ca5a4aa7c4605ea1c8c | [
"MIT"
] | 1 | 2021-05-19T00:01:00.000Z | 2021-05-19T00:01:00.000Z | from morphotactics.morphotactics import compile
from morphotactics.slot import Slot
from morphotactics.stem_guesser import StemGuesser
import pytest
import pynini
import pywrapfst
import math
import random
from typing import List, Tuple
# helpers
def accepts(fsa: pynini.Fst, input_str: str) -> bool:
"""
  Check whether input_str is in the language of the acceptor fsa.
  Pynini converts input_str into a linear chain automaton and composes it
  with the FSA; input_str is accepted iff the composition is non-empty (has at least one state)
Args:
fsa (Fst): a finite-state acceptor
input_str (string): the string in question
Returns:
(bool): True if input_str is in fsa's language
"""
return pynini.compose(input_str, fsa).num_states() != 0
def analyze(fst: pynini.Fst, input_str: str) -> str:
"""
  Transduce input_str, over the lower (input) alphabet, to a string over the upper (output) alphabet.
  Pynini converts input_str into a linear chain automaton and composes it
  with the FST, then calls string() to convert the composed FST into a string.
  string() only succeeds when the composition has a single successful path (i.e. input_str has exactly one analysis)
Args:
fst (Fst): an FST
input_str (string): the string in question, made of symbols from the FST's input alphabet
Returns:
(string): the transduced output string
"""
return pynini.compose(input_str, fst).string()
def all_strings_from_chain(automaton: pynini.Fst) -> List[str]:
"""
Return all strings implied by a non-cyclic automaton.
Adapted from fststr library by David Mortensen
Source: https://github.com/dmort27/fststr/blob/master/fststr/fststr.py
Args:
    automaton (Fst): a non-cyclic finite state automaton
  Returns:
    (list): a list of (string, weight) tuples
"""
  def dfs(graph: pynini.Fst, path: List[Tuple[int, str, float]], paths: List[List[Tuple[int, str, float]]] = None):
    # default to a fresh list on each top-level call; a mutable default argument would leak paths between calls
    if paths is None:
      paths = []
    target, label, weight = path[-1]
if graph.num_arcs(target):
for arc in graph.arcs(target):
new_target = arc.nextstate
new_label = arc.olabel
new_weight = arc.weight
new_path = path + [(new_target, new_label, float(new_weight))]
paths = dfs(graph, new_path, paths)
else:
path = path[:-1]
path += [(target, label, weight + float(graph.final(target)))]
paths += [path]
return paths
if automaton.properties(pywrapfst.CYCLIC, True) == pywrapfst.CYCLIC:
raise Exception('FST is cyclic.')
start = automaton.start()
paths = dfs(automaton, [(start, 0, 0.0)])
strings = []
for path in paths:
chars = []
weight = 0.0
for (_, k, w) in path:
if k:
chars.append(chr(k))
weight += w # semiring product in the tropical semiring is addition
strings.append((''.join(chars), weight))
return strings
def correct_transduction_and_weights(fst: pynini.Fst, input_str: str, expected_paths: List[Tuple[str, float]]) -> bool:
"""Calculate all possible output paths of fst applied to input_str
and see if they match in both symbol and weights with expected_paths
Args:
fst (Fst): the FST
    input_str (string): the string to be transduced
expected_paths (list): a list of (string, weight) tuples
Returns:
(boolean): True if output paths matched expected_paths, False otherwise
"""
output_paths = all_strings_from_chain(pynini.compose(input_str, fst))
if len(output_paths) != len(expected_paths):
return False
output_paths = sorted(output_paths, key=lambda x: (x[1], x[0]))
expected_paths = sorted(expected_paths, key=lambda x: (x[1], x[0]))
for ((str1, weight1), (str2, weight2)) in zip(output_paths, expected_paths):
if str1 != str2:
print(str1 + ' does not match ' + str2)
return False
if not math.isclose(weight1, weight2, abs_tol=1e-5):
print('path ' + str(str1) + ': ' + str(weight1) + ' does not match ' + str(weight2))
return False
return True
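The order-insensitive, tolerance-based comparison above can be exercised on plain (string, weight) lists without building any FST; a small standalone sketch of the same check:

```python
import math
from typing import List, Tuple

def paths_match(actual: List[Tuple[str, float]],
                expected: List[Tuple[str, float]],
                tol: float = 1e-5) -> bool:
    """Compare two lists of (string, weight) pairs, ignoring order."""
    if len(actual) != len(expected):
        return False
    key = lambda x: (x[1], x[0])  # sort by weight, then string, as above
    for (s1, w1), (s2, w2) in zip(sorted(actual, key=key),
                                  sorted(expected, key=key)):
        if s1 != s2 or not math.isclose(w1, w2, abs_tol=tol):
            return False
    return True

# same multiset of paths, listed in a different order, weights within tolerance
ok = paths_match([('ac', 4.0), ('ac', 6.0)], [('ac', 6.0), ('ac', 4.0 + 1e-7)])
```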
def test_no_starting_slot_raises_exception():
with pytest.raises(Exception) as excinfo:
compile({ Slot('name', []) }) # start=False by default
assert 'need at least 1 slot to be a starting slot' in str(excinfo.value)
def test_single_starting_class_no_continuation():
fst = compile({ Slot('name', [('a', 'b', [(None, 0.0)], 0.0)], start=True) })
assert analyze(fst, 'b') == 'a' # direction of morphological analysis
# FST does not do morphological generation (FST rejects upper alphabet symbols)
with pytest.raises(Exception):
analyze(fst, 'a')
def test_single_starting_class_single_continuation():
fst = compile({
Slot('class1', [('a', 'b', [('class2', 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [(None, 0.0)], 0.0)]),
})
assert analyze(fst, 'bd') == 'ac'
def test_single_starting_class_multiple_continuations():
fst = compile({
Slot('class1', [('a', 'b', [('class2', 0.0), ('class3', 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [(None, 0.0)], 0.0)]),
Slot('class3', [('e', 'f', [(None, 0.0)], 0.0)]),
})
assert analyze(fst, 'bd') == 'ac'
assert analyze(fst, 'bf') == 'ae'
# must start with the starting class
with pytest.raises(Exception):
analyze(fst, 'd')
with pytest.raises(Exception):
analyze(fst, 'f')
def test_single_starting_class_multiple_classes():
fst = compile({
Slot('class1', [('a', 'b', [('class2', 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [('class3', 0.0)], 0.0)]),
Slot('class3', [('e', 'f', [('class4', 0.0)], 0.0)]),
Slot('class4', [('g', 'h', [(None, 0.0)], 0.0)])
})
assert analyze(fst, 'bdfh') == 'aceg'
# must start with the starting class
with pytest.raises(Exception):
analyze(fst, 'd')
with pytest.raises(Exception):
analyze(fst, 'f')
with pytest.raises(Exception):
analyze(fst, 'h')
def test_multiple_starting_classes_no_continuation():
fst = compile({
Slot('class1', [('a', 'b', [(None, 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [(None, 0.0)], 0.0)], start=True)
})
assert analyze(fst, 'b') == 'a'
assert analyze(fst, 'd') == 'c'
# starting classes do not connect
with pytest.raises(Exception):
analyze(fst, 'bd')
with pytest.raises(Exception):
analyze(fst, 'db')
def test_multiple_starting_classes_same_continuation():
fst = compile({
Slot('class1', [('a', 'b', [('class3', 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [('class3', 0.0)], 0.0)], start=True),
Slot('class3', [('e', 'f', [(None, 0.0)], 0.0)])
})
assert analyze(fst, 'bf') == 'ae'
assert analyze(fst, 'df') == 'ce'
# not a starting class
with pytest.raises(Exception):
analyze(fst, 'f')
# starting classes do not connect
with pytest.raises(Exception):
analyze(fst, 'bd')
with pytest.raises(Exception):
analyze(fst, 'db')
def test_multiple_starting_classes_some_have_continuation_others_do_not():
fst = compile({
Slot('class1', [('a', 'b', [('class3', 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [(None, 0.0)], 0.0)], start=True),
Slot('class3', [('e', 'f', [(None, 0.0)], 0.0)])
})
assert analyze(fst, 'bf') == 'ae'
assert analyze(fst, 'd') == 'c'
# class2 has no transitions
with pytest.raises(Exception):
analyze(fst, 'df')
# not a starting class
with pytest.raises(Exception):
analyze(fst, 'f')
def test_multiple_starting_classes_different_continuation():
fst = compile({
Slot('class1', [('a', 'b', [('class3', 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [('class4', 0.0)], 0.0)], start=True),
Slot('class3', [('e', 'f', [(None, 0.0)], 0.0)]),
Slot('class4', [('g', 'h', [(None, 0.0)], 0.0)])
})
assert analyze(fst, 'bf') == 'ae'
assert analyze(fst, 'dh') == 'cg'
# class1 should not transition to class4
with pytest.raises(Exception):
analyze(fst, 'bh')
# class2 should not transition to class3
with pytest.raises(Exception):
analyze(fst, 'df')
# must start with a starting class
with pytest.raises(Exception):
analyze(fst, 'f')
with pytest.raises(Exception):
analyze(fst, 'h')
def test_multiple_starting_classes_single_rule_per_class_multiple_continuations():
fst = compile({
Slot('class1', [('a', 'b', [('class2', 0.0), ('class3', 0.0), ('class4', 0.0)], 0.0)], start=True),
Slot('class2', [('c', 'd', [(None, 0.0)], 0.0)]),
Slot('class3', [('e', 'f', [(None, 0.0)], 0.0)]),
Slot('class4', [('g', 'h', [(None, 0.0)], 0.0)]),
Slot('class5', [('i', 'j', [(None, 0.0)], 0.0)], start=True)
})
assert analyze(fst, 'bd') == 'ac'
assert analyze(fst, 'bf') == 'ae'
assert analyze(fst, 'bh') == 'ag'
assert analyze(fst, 'j') == 'i'
# multiple continuation classes do not interfere with each other
with pytest.raises(Exception):
analyze(fst, 'bfh') # class3 not joined with class4
with pytest.raises(Exception):
analyze(fst, 'bdf') # class2 not joined with class3
with pytest.raises(Exception):
analyze(fst, 'bdh') # class2 not joined with class4
# must start with a starting class
for non_starting_class_symbol in ['b', 'd', 'f']:
with pytest.raises(Exception):
analyze(fst, non_starting_class_symbol)
def test_multiple_rules_single_class_no_continuations():
fst = compile({
Slot('class1',
[
('a', 'b', [(None, 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [(None, 0.0)], 0.0),
('g', 'h', [(None, 0.0)], 0.0),
],
start=True),
})
assert analyze(fst, 'b') == 'a'
assert analyze(fst, 'd') == 'c'
assert analyze(fst, 'f') == 'e'
assert analyze(fst, 'h') == 'g'
# FST does not accept upper alphabet symbols
for input_symbol in ['a', 'c', 'e', 'g']:
with pytest.raises(Exception):
analyze(fst, input_symbol)
# a slot is a union of rules, not a concatenation
for not_in_lang in ['bd', 'df', 'fh', 'bh', 'dh', 'bf']:
with pytest.raises(Exception):
analyze(fst, not_in_lang)
def test_multiple_rules_single_starting_class_with_multiple_continuations():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0), ('class3', 0.0)], 0.0),
('c', 'd', [('class4', 0.0)], 0.0),
('e', 'f', [(None, 0.0)], 0.0),
('g', 'h', [(None, 0.0)], 0.0)
],
start=True),
Slot('class2', [('i', 'j', [(None, 0.0)], 0.0)]),
Slot('class3', [('k', 'l', [(None, 0.0)], 0.0)]),
Slot('class4', [('m', 'n', [(None, 0.0)], 0.0)])
})
assert analyze(fst, 'bj') == 'ai'
assert analyze(fst, 'bl') == 'ak'
assert analyze(fst, 'dn') == 'cm'
assert analyze(fst, 'h') == 'g'
# rules within a slot should not be concatenated with wrong continuation class
for not_in_lang in ['bf', 'bh', 'bd', 'bn', 'df', 'dh', 'db', 'dj', 'dl']:
with pytest.raises(Exception):
analyze(fst, not_in_lang)
def test_multiple_rules_multiple_classes_multiple_continuations():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], 0.0),
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [(None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# class1 alone
assert analyze(fst, 'd') == 'c'
# class1 to class2
assert analyze(fst, 'bh') == 'ag'
assert analyze(fst, 'bj') == 'ai'
assert analyze(fst, 'fh') == 'eg'
assert analyze(fst, 'fj') == 'ei'
# class1 to class2 to class3
assert analyze(fst, 'bln') == 'akm'
assert analyze(fst, 'blp') == 'ako'
assert analyze(fst, 'fln') == 'ekm'
assert analyze(fst, 'flp') == 'eko'
# class1 to class3
assert analyze(fst, 'fn') == 'em'
assert analyze(fst, 'fp') == 'eo'
# class4
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
def test_multiple_rules_multiple_classes_multiple_continuations_with_stem_guesser_starting():
nahuatl_alphabet = {
'C': ['m', 'n', 'p', 't', 'k', 'kw', 'h', 'ts', 'tl', 'ch', 's', 'l', 'x', 'j', 'w'],
'V': ['a', 'e', 'i', 'o']
}
bimoraic_fsa = StemGuesser('.*V.*V', 'VerbStem', [('class2', 0.0), ('class3', 0.0)],
alphabet=nahuatl_alphabet, start=True)
fst = compile({
bimoraic_fsa,
Slot('class2',
[
('g', 'h', [(None, 0.0)], 0.0),
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [(None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# non-bimoraic stem rejected
with pytest.raises(Exception):
analyze(fst, 'pak' + 'h')
# paki = fictitious verb stem
# valid verb stem by itself not accepted
with pytest.raises(Exception):
analyze(fst, 'paaki')
# class2 and class3
for upper, lower in [('g', 'h'), ('i', 'j'), ('m', 'n'), ('o', 'p')]:
assert analyze(fst, 'paaki' + lower) == 'paaki' + upper
# class2 then class3
for upper, lower in [('m', 'n'), ('o', 'p')]:
assert analyze(fst, 'paakil' + lower) == 'paakik' + upper
# the other starting class (class4) accepted
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
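The StemGuesser template `.*V.*V` used above accepts any stem containing at least two vowels (and ending in one). Outside pynini, the same bimoraic check can be approximated with an ordinary regular expression over the vowel class from `nahuatl_alphabet`:

```python
import re

# 'V' class from the hypothetical nahuatl_alphabet above
VOWELS = 'aeio'

def is_bimoraic(stem: str) -> bool:
    # approximates the StemGuesser pattern '.*V.*V': two vowels, ending in one
    return re.fullmatch(f'.*[{VOWELS}].*[{VOWELS}]', stem) is not None
```

For example, `is_bimoraic('paaki')` holds while `is_bimoraic('pak')` does not, matching the accept/reject behavior the tests expect from the compiled FSA.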
def test_multiple_rules_multiple_classes_multiple_continuations_with_stem_guesser_in_middle():
nahuatl_alphabet = {
'C': ['m', 'n', 'p', 't', 'k', 'kw', 'h', 'ts', 'tl', 'ch', 's', 'l', 'x', 'j', 'w'],
'V': ['a', 'e', 'i', 'o']
}
bimoraic_fsa = StemGuesser('.*V.*V', 'VerbStem', [('class3', 0.0)],
alphabet=nahuatl_alphabet)
fst = compile({
Slot('class1',
[
('a', 'b', [('VerbStem', 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('VerbStem', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
bimoraic_fsa,
Slot('class3',
[
('m', 'n', [(None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# non-bimoraic stem (with valid prefix) rejected
with pytest.raises(Exception):
analyze(fst, 'b' + 'pak')
# paki = fictitious verb stem
# valid verb stem by itself not accepted
with pytest.raises(Exception):
analyze(fst, 'paaki')
# class1 alone
assert analyze(fst, 'd') == 'c'
# class1 then VerbStem then class3
assert analyze(fst, 'b' + 'paaki' + 'n') == 'a' + 'paaki' + 'm'
assert analyze(fst, 'b' + 'paaki' + 'p') == 'a' + 'paaki' + 'o'
assert analyze(fst, 'f' + 'paaki' + 'n') == 'e' + 'paaki' + 'm'
assert analyze(fst, 'f' + 'paaki' + 'p') == 'e' + 'paaki' + 'o'
# class1 then class3
assert analyze(fst, 'fn') == 'em'
assert analyze(fst, 'fp') == 'eo'
# the other starting class (class4) accepted
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
def test_multiple_rules_multiple_classes_multiple_continuations_with_stem_guesser_ending():
nahuatl_alphabet = {
'C': ['m', 'n', 'p', 't', 'k', 'kw', 'h', 'ts', 'tl', 'ch', 's', 'l', 'x', 'j', 'w'],
'V': ['a', 'e', 'i', 'o']
}
bimoraic_fsa = StemGuesser('.*V.*V', 'VerbStem', [(None, 0.0)],
alphabet=nahuatl_alphabet)
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 0.0),
('c', 'd', [('VerbStem', 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('m', 'n', [('VerbStem', 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class3',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [('VerbStem', 0.0)], 0.0),
]),
bimoraic_fsa
})
# non-bimoraic stem (with valid prefix) rejected
with pytest.raises(Exception):
analyze(fst, 'd' + 'pak')
# class1 to VerbStem
assert analyze(fst, 'dpaki') == 'cpaki'
# class1 to class2
assert analyze(fst, 'bp') == 'ao'
assert analyze(fst, 'fp') == 'eo'
# class1 to class3
assert analyze(fst, 'fr') == 'eq'
# class1 to class2 to VerbStem
assert analyze(fst, 'bn' + 'paki') == 'am' + 'paki'
assert analyze(fst, 'fn' + 'paki') == 'em' + 'paki'
# class1 to class3 to VerbStem
assert analyze(fst, 'ft' + 'paki') == 'es' + 'paki'
def test_single_cyclic_class():
# starting class connects to itself
fst = compile({
Slot('class1',
[
('a', 'b', [('class1', 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [(None, 0.0)], 0.0)
],
start=True),
})
# need another transition to reach accepting state
with pytest.raises(Exception):
assert analyze(fst, 'b')
# repeat transitions
for i in range(1, 5):
assert analyze(fst, ('b' * i) + 'd') == ('a' * i) + 'c'
assert analyze(fst, ('b' * i) + 'f') == ('a' * i) + 'e'
# not all transitions repeat
assert analyze(fst, 'd') == 'c'
assert analyze(fst, 'f') == 'e'
for repeat in (['d' * i for i in range(2, 6)] + ['f' * i for i in range(2, 6)]):
with pytest.raises(Exception):
assert analyze(fst, repeat)
with pytest.raises(Exception):
assert analyze(fst, 'b' + repeat)
with pytest.raises(Exception):
assert analyze(fst, 'bb' + repeat)
with pytest.raises(Exception):
assert analyze(fst, 'bbb' + repeat)
def test_cyclic_class_starting():
fst = compile({
Slot('class1',
[
('a', 'b', [('class1', 0.0)], 0.0), # the cyclic rule
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], 0.0),
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [(None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# cyclic class' non-cyclic (and terminal) rule
assert analyze(fst, 'd') == 'c'
# need another transition to reach accepting state
with pytest.raises(Exception):
assert analyze(fst, 'b')
# repeat applications of the cyclic rule
for i in range(1, 5):
assert analyze(fst, ('b' * i) + 'd') == ('a' * i) + 'c'
for i in range(0, 5): # i = 0 means no b's prepended
prepend_input = 'b' * i
prepend_output = 'a' * i
# class1 to class2
assert analyze(fst, prepend_input + 'fh') == prepend_output + 'eg'
assert analyze(fst, prepend_input + 'fj') == prepend_output + 'ei'
# class1 to class2 to class3
assert analyze(fst, prepend_input + 'fln') == prepend_output + 'ekm'
assert analyze(fst, prepend_input + 'flp') == prepend_output + 'eko'
# class1 to class3
assert analyze(fst, prepend_input + 'fn') == prepend_output + 'em'
assert analyze(fst, prepend_input + 'fp') == prepend_output + 'eo'
# cannot get to class4 from class1
if i > 0:
with pytest.raises(Exception):
assert analyze(fst, prepend_input + 'r') == prepend_output + 'q'
with pytest.raises(Exception):
assert analyze(fst, prepend_input + 't') == prepend_output + 's'
# class4
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
def test_cyclic_class_in_middle():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [('class2', 0.0)], 0.0), # cyclic rule
('G', 'H', [('class2', 0.0)], 0.0), # cyclic rule
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [(None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# class1 alone
assert analyze(fst, 'd') == 'c'
# class1 to class2, non-cyclic and terminal
assert analyze(fst, 'bj') == 'ai'
assert analyze(fst, 'fj') == 'ei'
# class1 to class2, cyclic
# need another transition to reach accepting state
with pytest.raises(Exception):
assert analyze(fst, 'bh')
with pytest.raises(Exception):
assert analyze(fst, 'bH')
with pytest.raises(Exception):
assert analyze(fst, 'fh')
with pytest.raises(Exception):
assert analyze(fst, 'fH')
for i in range(1, 5):
assert analyze(fst, 'b' + ('h' * i) + 'j') == 'a' + ('g' * i) + 'i'
assert analyze(fst, 'b' + ('H' * i) + 'j') == 'a' + ('G' * i) + 'i'
assert analyze(fst, 'f' + ('h' * i) + 'j') == 'e' + ('g' * i) + 'i'
assert analyze(fst, 'f' + ('H' * i) + 'j') == 'e' + ('G' * i) + 'i'
# class1 to class2 (non-cyclic) to class3
assert analyze(fst, 'bln') == 'akm'
assert analyze(fst, 'blp') == 'ako'
assert analyze(fst, 'fln') == 'ekm'
assert analyze(fst, 'flp') == 'eko'
# class1 to class2 (cyclic) to class3
for i in range(1, 5):
assert analyze(fst, 'b' + ('h' * i) + 'ln') == 'a' + ('g' * i) + 'km'
assert analyze(fst, 'b' + ('h' * i) + 'lp') == 'a' + ('g' * i) + 'ko'
assert analyze(fst, 'b' + ('H' * i) + 'ln') == 'a' + ('G' * i) + 'km'
assert analyze(fst, 'b' + ('H' * i) + 'lp') == 'a' + ('G' * i) + 'ko'
assert analyze(fst, 'f' + ('h' * i) + 'ln') == 'e' + ('g' * i) + 'km'
assert analyze(fst, 'f' + ('h' * i) + 'lp') == 'e' + ('g' * i) + 'ko'
assert analyze(fst, 'f' + ('H' * i) + 'ln') == 'e' + ('G' * i) + 'km'
assert analyze(fst, 'f' + ('H' * i) + 'lp') == 'e' + ('G' * i) + 'ko'
# class1 to class3
assert analyze(fst, 'fn') == 'em'
assert analyze(fst, 'fp') == 'eo'
# class4
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
def test_cyclic_class_ending():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], 0.0),
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [('class3', 0.0)], 0.0), # cyclic rule
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# class1 alone
assert analyze(fst, 'd') == 'c'
# class1 to class2
assert analyze(fst, 'bh') == 'ag'
assert analyze(fst, 'bj') == 'ai'
assert analyze(fst, 'fh') == 'eg'
assert analyze(fst, 'fj') == 'ei'
# class1 to class2 to class3 (non-cyclic and terminal)
assert analyze(fst, 'blp') == 'ako'
assert analyze(fst, 'flp') == 'eko'
# class1 to class2 to class3 (cyclic)
for i in range(1, 5):
assert analyze(fst, 'bl' + ('n' * i) + 'p') == 'ak' + ('m' * i) + 'o'
assert analyze(fst, 'fl' + ('n' * i) + 'p') == 'ek' + ('m' * i) + 'o'
# class1 to class3 (non-cyclic and terminal)
assert analyze(fst, 'fp') == 'eo'
# class1 to class3 (cyclic)
for i in range(1, 5):
assert analyze(fst, 'f' + ('n' * i) + 'p') == 'e' + ('m' * i) + 'o'
# class4
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
# class1 -> class2 -> class3 -> class1
def test_cycle_period_at_least_two_cycle_includes_starting_class():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], 0.0),
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [('class1', 0.0)], 0.0), # cycle
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# the cycle is from class1 to class2 to class3
# i = 0 means no cycle
for i in range(5):
# class1 to class2 to class3 (cyclic) or class1 to class3 (cyclic)
for cyclic_lower, cyclic_upper in [('bln', 'akm'), ('fln', 'ekm')] + [('fn', 'em')]:
# class1 alone
assert analyze(fst, (cyclic_lower * i) + 'd') == (cyclic_upper * i) + 'c'
# class1 to class2
assert analyze(fst, (cyclic_lower * i) + 'bh') == (cyclic_upper * i) + 'ag'
assert analyze(fst, (cyclic_lower * i) + 'bj') == (cyclic_upper * i) + 'ai'
assert analyze(fst, (cyclic_lower * i) + 'fh') == (cyclic_upper * i) + 'eg'
assert analyze(fst, (cyclic_lower * i) + 'fj') == (cyclic_upper * i) + 'ei'
# class1 to class2 to class3 (non-cyclic and terminal)
assert analyze(fst, (cyclic_lower * i) + 'blp') == (cyclic_upper * i) + 'ako'
assert analyze(fst, (cyclic_lower * i) + 'flp') == (cyclic_upper * i) + 'eko'
# class1 to class3 (non-cyclic and terminal)
assert analyze(fst, (cyclic_lower * i) + 'fp') == (cyclic_upper * i) + 'eo'
# class4
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
# class1 -> class2 -> class3 -> class4 -> class2
def test_cycle_period_at_least_two_cycle_excludes_starting_class():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], 0.0),
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [('class4', 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [('class2', 0.0)], 0.0), # cycle
])
})
# class1 alone
assert analyze(fst, 'd') == 'c'
# class1 to class3 (terminal) - impossible for cycle to go back to class1
assert analyze(fst, 'fp') == 'eo'
# class1 to class3 (non-terminal) to class4 (terminal) - impossible for cycle to go back to class1
assert analyze(fst, 'fnr') == 'emq'
# the cycle is from class2 to class3 to class4
# i = 0 means no cycle
for i in range(5):
# class2 to class3 to class4 (cyclic), class3 to class4 (cyclic)
cyclic_lower, cyclic_upper = ('lnt', 'kms')
# class1 to class2 (terminal)
assert analyze(fst, 'b' + (cyclic_lower * i) + 'h') == 'a' + (cyclic_upper * i) + 'g'
assert analyze(fst, 'b' + (cyclic_lower * i) + 'j') == 'a' + (cyclic_upper * i) + 'i'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'h') == 'e' + (cyclic_upper * i) + 'g'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'j') == 'e' + (cyclic_upper * i) + 'i'
# class1 to class2 to class3 (terminal)
assert analyze(fst, 'b' + (cyclic_lower * i) + 'lp') == 'a' + (cyclic_upper * i) + 'ko'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'lp') == 'e' + (cyclic_upper * i) + 'ko'
# class1 to class2 to class3 (non-terminal)
# class3 to class4 (terminal)
assert analyze(fst, 'b' + (cyclic_lower * i) + 'ln' + 'r') == 'a' + (cyclic_upper * i) + 'km' + 'q'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'ln' + 'r') == 'e' + (cyclic_upper * i) + 'km' + 'q'
def test_single_weighted_class():
fst = compile({
Slot('class1',
[
('a', 'b', [(None, 0.0)], 0.5),
('c', 'd', [(None, 0.0)], 0.25),
('e', 'f', [(None, 0.0)], 0.75),
('g', 'h', [(None, 0.0)], 0.1)
],
start=True)
})
# shortest distance from the start to final state is 0.1
assert math.isclose(float(pywrapfst.shortestdistance(fst)[1]), 0.1, abs_tol=1e-5)
# correct transduction and correct weight
assert correct_transduction_and_weights(fst, 'b', [('a', 0.5)])
assert correct_transduction_and_weights(fst, 'd', [('c', 0.25)])
assert correct_transduction_and_weights(fst, 'f', [('e', 0.75)])
assert correct_transduction_and_weights(fst, 'h', [('g', 0.1)])
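The shortest-distance assertion above relies on the tropical semiring: along a path, weights combine by addition (the semiring product), and across alternative paths the semiring sum is min. With four competing single-rule paths, the shortest distance is therefore just the minimum rule weight:

```python
# the four rule weights from the slot above; each forms a one-arc path
rule_weights = [0.5, 0.25, 0.75, 0.1]

# tropical-semiring "sum" across alternative paths is min,
# mirroring what pywrapfst.shortestdistance computes on the compiled FST
shortest = min(rule_weights)
```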
def test_multiple_weighted_classes():
weights = {}
for transition in ['ba', 'dc', 'fe', 'hg', 'ji', 'lk', 'nm', 'po', 'rq', 'ts']:
weights[transition] = random.random()
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], weights['ba']),
('c', 'd', [(None, 0.0)], weights['dc']),
('e', 'f', [('class2', 0.0), ('class3', 0.0)], weights['fe'])
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], weights['hg']),
('i', 'j', [(None, 0.0)], weights['ji']),
('k', 'l', [('class3', 0.0)], weights['lk']),
]
),
Slot('class3',
[
('m', 'n', [(None, 0.0)], weights['nm']),
('o', 'p', [(None, 0.0)], weights['po']),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], weights['rq']),
('s', 't', [(None, 0.0)], weights['ts']),
], start=True)
})
# class1 alone
assert correct_transduction_and_weights(fst, 'd', [('c', weights['dc'])])
# class1 to class2
assert correct_transduction_and_weights(fst, 'bh', [('ag', weights['ba'] + weights['hg'])])
assert correct_transduction_and_weights(fst, 'bj', [('ai', weights['ba'] + weights['ji'])])
assert correct_transduction_and_weights(fst, 'fh', [('eg', weights['fe'] + weights['hg'])])
assert correct_transduction_and_weights(fst, 'fj', [('ei', weights['fe'] + weights['ji'])])
# class1 to class2 to class3
assert correct_transduction_and_weights(fst, 'bln', [('akm', weights['ba'] + weights['lk'] + weights['nm'])])
assert correct_transduction_and_weights(fst, 'blp', [('ako', weights['ba'] + weights['lk'] + weights['po'])])
assert correct_transduction_and_weights(fst, 'fln', [('ekm', weights['fe'] + weights['lk'] + weights['nm'])])
assert correct_transduction_and_weights(fst, 'flp', [('eko', weights['fe'] + weights['lk'] + weights['po'])])
# class1 to class3
assert correct_transduction_and_weights(fst, 'fn', [('em', weights['fe'] + weights['nm'])])
assert correct_transduction_and_weights(fst, 'fp', [('eo', weights['fe'] + weights['po'])])
# class4
assert correct_transduction_and_weights(fst, 'r', [('q', weights['rq'])])
assert correct_transduction_and_weights(fst, 't', [('s', weights['ts'])])
def test_three_non_deterministic_classes():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 1.0),
('a', 'b', [('class3', 0.0)], 2.0)
],
start=True),
Slot('class2',
[
('c', 'd', [(None, 0.0)], 3.0)
]),
Slot('class3',
[
('c', 'd', [(None, 0.0)], 4.0)
]),
})
assert correct_transduction_and_weights(fst, 'bd', [('ac', 1.0 + 3.0), ('ac', 2.0 + 4.0)])
def test_three_non_deterministic_classes_equal_weights():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 1.0),
('a', 'b', [('class3', 0.0)], 1.0)
],
start=True),
Slot('class2',
[
('c', 'd', [(None, 0.0)], 2.0)
]),
Slot('class3',
[
('c', 'd', [(None, 0.0)], 2.0)
]),
})
assert correct_transduction_and_weights(fst, 'bd', [('ac', 1.0 + 2.0), ('ac', 1.0 + 2.0)])
def test_three_non_deterministic_classes_different_outputs():
fst = compile({
Slot('class1',
[
('c', 'b', [('class2', 0.0)], 1.0),
('a', 'b', [('class3', 0.0)], 2.0)
],
start=True),
Slot('class2',
[
('d', 'd', [(None, 0.0)], 3.0)
]),
Slot('class3',
[
('f', 'f', [(None, 0.0)], 4.0)
]),
})
assert correct_transduction_and_weights(fst, 'bd', [('cd', 1.0 + 3.0)])
assert correct_transduction_and_weights(fst, 'bf', [('af', 2.0 + 4.0)])
def test_three_non_deterministic_classes_different_inputs():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0)], 1.0),
('a', 'd', [('class3', 0.0)], 2.0)
],
start=True),
Slot('class2',
[
('f', 'f', [(None, 0.0)], 3.0)
]),
Slot('class3',
[
('h', 'h', [(None, 0.0)], 4.0)
]),
})
assert correct_transduction_and_weights(fst, 'bf', [('af', 1.0 + 3.0)])
assert correct_transduction_and_weights(fst, 'dh', [('ah', 2.0 + 4.0)])
def test_both_terminal_and_non_terminal_rule():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0), (None, 0.0)], 1.0),
],
start=True),
Slot('class2',
[
('c', 'd', [(None, 0.0)], 2.0)
]),
})
assert correct_transduction_and_weights(fst, 'b', [('a', 1.0)])
assert correct_transduction_and_weights(fst, 'bd', [('ac', 1.0 + 2.0)])
def test_non_deterministic_both_terminal_non_terminal_rule():
fst = compile({
Slot('class1',
[
('c', 'b', [('class2', 0.0), (None, 0.0)], 1.0),
('a', 'b', [('class3', 0.0), (None, 0.0)], 2.0)
],
start=True),
Slot('class2',
[
('d', 'd', [(None, 0.0)], 3.0)
]),
Slot('class3',
[
('f', 'f', [(None, 0.0)], 4.0)
]),
})
# non-terminal rules
assert correct_transduction_and_weights(fst, 'bd', [('cd', 1.0 + 3.0)])
assert correct_transduction_and_weights(fst, 'bf', [('af', 2.0 + 4.0)])
# terminal rules
assert correct_transduction_and_weights(fst, 'b', [('c', 1.0), ('a', 2.0)])
def test_multiple_weighted_classes_both_terminal_non_terminal_rules():
weights = {}
for transition in ['ba', 'dc', 'fe', 'hg', 'ji', 'lk', 'nm', 'po', 'rq', 'ts']:
weights[transition] = random.random()
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0), (None, 0.0)], weights['ba']),
('c', 'd', [(None, 0.0)], weights['dc']),
('e', 'f', [('class2', 0.0), ('class3', 0.0), (None, 0.0)], weights['fe'])
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], weights['hg']),
('i', 'j', [(None, 0.0)], weights['ji']),
('k', 'l', [('class3', 0.0), (None, 0.0)], weights['lk']),
]
),
Slot('class3',
[
('m', 'n', [(None, 0.0)], weights['nm']),
('o', 'p', [(None, 0.0)], weights['po']),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], weights['rq']),
('s', 't', [(None, 0.0)], weights['ts']),
], start=True)
})
# class1 alone
assert correct_transduction_and_weights(fst, 'd', [('c', weights['dc'])])
assert correct_transduction_and_weights(fst, 'b', [('a', weights['ba'])])
assert correct_transduction_and_weights(fst, 'f', [('e', weights['fe'])])
# class1 to class2
assert correct_transduction_and_weights(fst, 'bh', [('ag', weights['ba'] + weights['hg'])])
assert correct_transduction_and_weights(fst, 'bj', [('ai', weights['ba'] + weights['ji'])])
assert correct_transduction_and_weights(fst, 'bl', [('ak', weights['ba'] + weights['lk'])])
assert correct_transduction_and_weights(fst, 'fh', [('eg', weights['fe'] + weights['hg'])])
assert correct_transduction_and_weights(fst, 'fj', [('ei', weights['fe'] + weights['ji'])])
assert correct_transduction_and_weights(fst, 'fl', [('ek', weights['fe'] + weights['lk'])])
# class1 to class2 to class3
assert correct_transduction_and_weights(fst, 'bln', [('akm', weights['ba'] + weights['lk'] + weights['nm'])])
assert correct_transduction_and_weights(fst, 'blp', [('ako', weights['ba'] + weights['lk'] + weights['po'])])
assert correct_transduction_and_weights(fst, 'fln', [('ekm', weights['fe'] + weights['lk'] + weights['nm'])])
assert correct_transduction_and_weights(fst, 'flp', [('eko', weights['fe'] + weights['lk'] + weights['po'])])
# class1 to class3
assert correct_transduction_and_weights(fst, 'fn', [('em', weights['fe'] + weights['nm'])])
assert correct_transduction_and_weights(fst, 'fp', [('eo', weights['fe'] + weights['po'])])
# class4
assert correct_transduction_and_weights(fst, 'r', [('q', weights['rq'])])
assert correct_transduction_and_weights(fst, 't', [('s', weights['ts'])])
def test_stem_guesser_both_terminal_non_terminal():
nahuatl_alphabet = {
'C': ['m', 'n', 'p', 't', 'k', 'kw', 'h', 'ts', 'tl', 'ch', 's', 'l', 'x', 'j', 'w'],
'V': ['a', 'e', 'i', 'o']
}
bimoraic_fsa = StemGuesser('.*V.*V', 'VerbStem', [('class3', 0.0), (None, 0.0)],
alphabet=nahuatl_alphabet)
fst = compile({
Slot('class1',
[
('a', 'b', [('VerbStem', 0.0), (None, 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('VerbStem', 0.0), ('class3', 0.0), (None, 0.0)], 0.0)
],
start=True),
bimoraic_fsa,
Slot('class3',
[
('m', 'n', [(None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# non-bimoraic stem (with valid prefix) rejected
with pytest.raises(Exception):
analyze(fst, 'b' + 'pak')
# paaki = fictitious verb stem
# valid verb stem by itself not accepted (need a prefix in this case)
with pytest.raises(Exception):
analyze(fst, 'paaki')
# class1 alone (terminal)
assert analyze(fst, 'd') == 'c'
assert analyze(fst, 'f') == 'e'
# class1 then VerbStem (non-terminal) then class3
assert analyze(fst, 'b' + 'paaki' + 'n') == 'a' + 'paaki' + 'm'
assert analyze(fst, 'b' + 'paaki' + 'p') == 'a' + 'paaki' + 'o'
assert analyze(fst, 'f' + 'paaki' + 'n') == 'e' + 'paaki' + 'm'
assert analyze(fst, 'f' + 'paaki' + 'p') == 'e' + 'paaki' + 'o'
# class1 then VerbStem (terminal)
assert analyze(fst, 'b' + 'paaki') == 'a' + 'paaki'
assert analyze(fst, 'f' + 'paaki') == 'e' + 'paaki'
# class1 then class3
assert analyze(fst, 'fn') == 'em'
assert analyze(fst, 'fp') == 'eo'
# the other starting class (class4) accepted
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
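The `.*V.*V` bimoraic condition enforced by `StemGuesser` above can be sanity-checked outside the FST machinery with a plain regular expression over the same vowel inventory. This is an illustrative sketch (the names here are hypothetical, and it assumes the pattern means "contains at least two vowels"):

```python
import re

# Vowel inventory from the test's nahuatl_alphabet; 'V' in the
# StemGuesser pattern '.*V.*V' ranges over these symbols.
VOWELS = "aeio"

# "Contains at least two vowels" as a plain regex.
BIMORAIC = re.compile("[%s].*[%s]" % (VOWELS, VOWELS))

def is_bimoraic(stem):
    """Return True if the stem contains at least two vowels."""
    return BIMORAIC.search(stem) is not None
```

Under this reading, 'paaki' is accepted while 'pak' is rejected, matching the accept/reject pairs asserted above.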
def test_cycle_period_one_both_terminal_non_terminal_rules():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0), (None, 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0), (None, 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [('class2', 0.0), (None, 0.0)], 0.0), # cyclic rule
('G', 'H', [('class2', 0.0), (None, 0.0)], 0.0), # cyclic rule
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0), (None, 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [(None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [(None, 0.0)], 0.0),
], start=True)
})
# class1 alone
assert analyze(fst, 'b') == 'a'
assert analyze(fst, 'd') == 'c'
assert analyze(fst, 'f') == 'e'
# class1 to class2, non-cyclic and terminal
assert analyze(fst, 'bj') == 'ai'
assert analyze(fst, 'fj') == 'ei'
for i in range(5):
# class1 to class2, cyclic to class2 (terminal)
# i = 0 means no cycle - class1 to class2 (terminal)
assert analyze(fst, 'b' + ('h' * i)) == 'a' + ('g' * i)
assert analyze(fst, 'b' + ('H' * i)) == 'a' + ('G' * i)
assert analyze(fst, 'f' + ('h' * i)) == 'e' + ('g' * i)
assert analyze(fst, 'f' + ('H' * i)) == 'e' + ('G' * i)
assert analyze(fst, 'b' + ('h' * i) + 'j') == 'a' + ('g' * i) + 'i'
assert analyze(fst, 'b' + ('H' * i) + 'j') == 'a' + ('G' * i) + 'i'
assert analyze(fst, 'f' + ('h' * i) + 'j') == 'e' + ('g' * i) + 'i'
assert analyze(fst, 'f' + ('H' * i) + 'j') == 'e' + ('G' * i) + 'i'
assert analyze(fst, 'b' + ('h' * i) + 'l') == 'a' + ('g' * i) + 'k'
assert analyze(fst, 'b' + ('H' * i) + 'l') == 'a' + ('G' * i) + 'k'
assert analyze(fst, 'f' + ('h' * i) + 'l') == 'e' + ('g' * i) + 'k'
assert analyze(fst, 'f' + ('H' * i) + 'l') == 'e' + ('G' * i) + 'k'
# class1 to class2 (cyclic) to class3
# i = 0 means no cycle - class1 to class2 (non-cyclic) to class3
assert analyze(fst, 'b' + ('h' * i) + 'ln') == 'a' + ('g' * i) + 'km'
assert analyze(fst, 'b' + ('h' * i) + 'lp') == 'a' + ('g' * i) + 'ko'
assert analyze(fst, 'b' + ('H' * i) + 'ln') == 'a' + ('G' * i) + 'km'
assert analyze(fst, 'b' + ('H' * i) + 'lp') == 'a' + ('G' * i) + 'ko'
assert analyze(fst, 'f' + ('h' * i) + 'ln') == 'e' + ('g' * i) + 'km'
assert analyze(fst, 'f' + ('h' * i) + 'lp') == 'e' + ('g' * i) + 'ko'
assert analyze(fst, 'f' + ('H' * i) + 'ln') == 'e' + ('G' * i) + 'km'
assert analyze(fst, 'f' + ('H' * i) + 'lp') == 'e' + ('G' * i) + 'ko'
# class1 to class3
assert analyze(fst, 'fn') == 'em'
assert analyze(fst, 'fp') == 'eo'
# class4
assert analyze(fst, 'r') == 'q'
assert analyze(fst, 't') == 's'
# class1 -> class2 -> class3 -> class4 -> class2
def test_cycle_period_two_both_terminal_non_terminal_rules():
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 0.0), (None, 0.0)], 0.0),
('c', 'd', [(None, 0.0)], 0.0),
('e', 'f', [('class2', 0.0), ('class3', 0.0), (None, 0.0)], 0.0)
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 0.0)], 0.0),
('i', 'j', [(None, 0.0)], 0.0),
('k', 'l', [('class3', 0.0), (None, 0.0)], 0.0),
]
),
Slot('class3',
[
('m', 'n', [('class4', 0.0), (None, 0.0)], 0.0),
('o', 'p', [(None, 0.0)], 0.0),
]
),
Slot('class4',
[
('q', 'r', [(None, 0.0)], 0.0),
('s', 't', [('class2', 0.0), (None, 0.0)], 0.0), # cycle
])
})
# class1 alone
assert analyze(fst, 'b') == 'a'
assert analyze(fst, 'd') == 'c'
assert analyze(fst, 'f') == 'e'
# class1 to class3 (terminal) - impossible for cycle to go back to class1
assert analyze(fst, 'fp') == 'eo'
# class1 to class3 (non-terminal) to class4 (terminal) - impossible for cycle to go back to class1
assert analyze(fst, 'fnr') == 'emq'
assert analyze(fst, 'fnt') == 'ems'
# the cycle is from class2 to class3 to class4
# i = 0 means no cycle
for i in range(5):
# class2 to class3 to class4 (cyclic), class3 to class4 (cyclic)
cyclic_lower, cyclic_upper = ('lnt', 'kms')
# class1 to class2 to class3 to class4 (terminal)
assert analyze(fst, 'b' + (cyclic_lower * i)) == 'a' + (cyclic_upper * i)
assert analyze(fst, 'f' + (cyclic_lower * i)) == 'e' + (cyclic_upper * i)
# class1 to class2 (terminal)
assert analyze(fst, 'b' + (cyclic_lower * i) + 'h') == 'a' + (cyclic_upper * i) + 'g'
assert analyze(fst, 'b' + (cyclic_lower * i) + 'j') == 'a' + (cyclic_upper * i) + 'i'
assert analyze(fst, 'b' + (cyclic_lower * i) + 'l') == 'a' + (cyclic_upper * i) + 'k'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'h') == 'e' + (cyclic_upper * i) + 'g'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'j') == 'e' + (cyclic_upper * i) + 'i'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'l') == 'e' + (cyclic_upper * i) + 'k'
# class1 to class2 to class3 (terminal)
assert analyze(fst, 'b' + (cyclic_lower * i) + 'lp') == 'a' + (cyclic_upper * i) + 'ko'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'lp') == 'e' + (cyclic_upper * i) + 'ko'
assert analyze(fst, 'b' + (cyclic_lower * i) + 'ln') == 'a' + (cyclic_upper * i) + 'km'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'ln') == 'e' + (cyclic_upper * i) + 'km'
# class1 to class2 to class3 (non-terminal)
# class3 to class4 (terminal)
assert analyze(fst, 'b' + (cyclic_lower * i) + 'ln' + 'r') == 'a' + (cyclic_upper * i) + 'km' + 'q'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'ln' + 'r') == 'e' + (cyclic_upper * i) + 'km' + 'q'
assert analyze(fst, 'b' + (cyclic_lower * i) + 'ln' + 't') == 'a' + (cyclic_upper * i) + 'km' + 's'
assert analyze(fst, 'f' + (cyclic_lower * i) + 'ln' + 't') == 'e' + (cyclic_upper * i) + 'km' + 's'
def test_weight_continuation_classes():
weights = {}
for transition in ['ba', 'dc', 'fe', 'hg', 'ji', 'lk', 'nm', 'po', 'rq', 'ts']:
weights[transition] = random.random()
fst = compile({
Slot('class1',
[
('a', 'b', [('class2', 1.0), (None, 2.0)], weights['ba']),
('c', 'd', [(None, 3.0)], weights['dc']),
('e', 'f', [('class2', 4.0), ('class3', 5.0), (None, 6.0)], weights['fe'])
],
start=True),
Slot('class2',
[
('g', 'h', [(None, 7.0)], weights['hg']),
('i', 'j', [(None, 8.0)], weights['ji']),
('k', 'l', [('class3', 9.0), (None, 10.0)], weights['lk']),
]
),
Slot('class3',
[
('m', 'n', [(None, 11.0)], weights['nm']),
('o', 'p', [(None, 12.0)], weights['po']),
]
),
Slot('class4',
[
('q', 'r', [(None, 13.0)], weights['rq']),
('s', 't', [(None, 14.0)], weights['ts']),
], start=True)
})
# class1 alone
assert correct_transduction_and_weights(fst, 'd', [('c', weights['dc'] + 3.0)])
assert correct_transduction_and_weights(fst, 'b', [('a', weights['ba'] + 2.0)])
assert correct_transduction_and_weights(fst, 'f', [('e', weights['fe'] + 6.0)])
# class1 to class2
assert correct_transduction_and_weights(fst, 'bh', [('ag', weights['ba'] + 1.0 + weights['hg'] + 7.0)])
assert correct_transduction_and_weights(fst, 'bj', [('ai', weights['ba'] + 1.0 + weights['ji'] + 8.0)])
assert correct_transduction_and_weights(fst, 'bl', [('ak', weights['ba'] + 1.0 + weights['lk'] + 10.0)])
assert correct_transduction_and_weights(fst, 'fh', [('eg', weights['fe'] + 4.0 + weights['hg'] + 7.0)])
assert correct_transduction_and_weights(fst, 'fj', [('ei', weights['fe'] + 4.0 + weights['ji'] + 8.0)])
assert correct_transduction_and_weights(fst, 'fl', [('ek', weights['fe'] + 4.0 + weights['lk'] + 10.0)])
# class1 to class2 to class3
assert correct_transduction_and_weights(fst, 'bln', [('akm', weights['ba'] + 1.0 + weights['lk'] + 9.0 + weights['nm'] + 11.0)])
assert correct_transduction_and_weights(fst, 'blp', [('ako', weights['ba'] + 1.0 + weights['lk'] + 9.0 + weights['po'] + 12.0)])
assert correct_transduction_and_weights(fst, 'fln', [('ekm', weights['fe'] + 4.0 + weights['lk'] + 9.0 + weights['nm'] + 11.0)])
assert correct_transduction_and_weights(fst, 'flp', [('eko', weights['fe'] + 4.0 + weights['lk'] + 9.0 + weights['po'] + 12.0)])
# class1 to class3
assert correct_transduction_and_weights(fst, 'fn', [('em', weights['fe'] + 5.0 + weights['nm'] + 11.0)])
assert correct_transduction_and_weights(fst, 'fp', [('eo', weights['fe'] + 5.0 + weights['po'] + 12.0)])
# class4
assert correct_transduction_and_weights(fst, 'r', [('q', weights['rq'] + 13.0)])
assert correct_transduction_and_weights(fst, 't', [('s', weights['ts'] + 14.0)])
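The weight bookkeeping asserted throughout these tests (each analysis weight is the sum of the rule weights and the continuation weights along the slot path) can be sketched with a small recursive enumerator over a dict-based slot table. This toy is hypothetical and independent of the `Slot`/`compile` API under test:

```python
def analyses(slots, starts, surface):
    """Enumerate (underlying, weight) pairs for a surface string by walking
    rules slot by slot; weights add up along the path, mirroring the sums
    checked by correct_transduction_and_weights."""
    results = []
    def walk(slot, remaining, underlying, weight):
        for upper, lower, continuations, rule_w in slots[slot]:
            if not remaining.startswith(lower):
                continue
            rest = remaining[len(lower):]
            for cont, cont_w in continuations:
                total = weight + rule_w + cont_w
                if cont is None:
                    # Terminal continuation: accept only if input is consumed.
                    if not rest:
                        results.append((underlying + upper, total))
                else:
                    walk(cont, rest, underlying + upper, total)
    for s in starts:
        walk(s, surface, '', 0.0)
    return results
```

For slots `{'class1': [('a','b',[('class2',0.0),(None,0.0)],1.0)], 'class2': [('c','d',[(None,0.0)],2.0)]}`, the surface 'bd' yields `[('ac', 3.0)]` and 'b' yields `[('a', 1.0)]`, mirroring test_both_terminal_and_non_terminal_rule.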

# ---------------------------------------------------------------------------
# bruteforce.py  (atanasoff-yordan/talon, Apache-2.0)
# ---------------------------------------------------------------------------
from talon.bruteforce import extract_signature

if __name__ == '__main__':
print(extract_signature("\nI just placed an order. \nDo i get credits?\nThx \nSent from my iPhone"))
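The script above runs talon's brute-force signature extractor on an email body ending in a mobile signature. The underlying idea (scan the trailing lines for signature-like markers) can be sketched self-containedly; `extract_signature_toy` and its marker list are hypothetical simplifications, not talon's actual heuristics:

```python
import re

# Markers that commonly open a signature block (a small illustrative
# subset; talon's real heuristics are considerably richer).
SIGNATURE_RE = re.compile(r"^(--|__|Sent from my \w+|Thx|Thanks|Best regards)",
                          re.IGNORECASE)

def extract_signature_toy(body):
    """Split body into (text, signature) by scanning the last few lines
    for a signature-like marker; signature is None if none is found."""
    lines = body.splitlines()
    for i in range(max(0, len(lines) - 3), len(lines)):
        if SIGNATURE_RE.match(lines[i].strip()):
            text = "\n".join(lines[:i]).strip()
            signature = "\n".join(lines[i:]).strip()
            return text, signature
    return body.strip(), None
```

On the sample body used above, this toy splits off "Thx" and "Sent from my iPhone" as the signature block.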

# ---------------------------------------------------------------------------
# mmtbx/hydrogens/tst_riding_fd_5.py  (rimmartin/cctbx_project, BSD-3-Clause-LBNL)
# ---------------------------------------------------------------------------
from __future__ import division, print_function
import time
import iotbx.pdb
import mmtbx.model
from cctbx.array_family import flex
from libtbx.utils import null_out
from libtbx.test_utils import approx_equal
#-----------------------------------------------------------------------------
# This finite difference test checks the transformation of riding H gradients
# for several H geometries with H/D exchange.
# C-H exchange might not be physically sensible, but it is always good to test.
#-----------------------------------------------------------------------------
#
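The check implemented by exercise() below is a standard central-difference test: each coordinate is shifted by ±eps, the target is re-evaluated (after re-idealizing the riding H positions), and t'(x) ≈ (t(x+eps) - t(x-eps)) / (2*eps) is compared against the analytical gradient. A minimal self-contained sketch of the same idea, with a toy target in place of the restraints machinery (`central_difference_check` is a hypothetical helper, not part of mmtbx):

```python
def central_difference_check(target, gradient, x, eps=1.e-4, tol=1.e-4):
    """Compare an analytical gradient against central finite differences,
    component by component, as exercise() does for riding-H gradients."""
    g_analytical = gradient(x)
    for i in range(len(x)):
        x_plus = list(x); x_plus[i] += eps
        x_minus = list(x); x_minus[i] -= eps
        g_fd = (target(x_plus) - target(x_minus)) / (2 * eps)
        assert abs(g_analytical[i] - g_fd) < tol, (i, g_analytical[i], g_fd)

# Toy example: t(x) = sum(x_i^2), with dt/dx_i = 2*x_i.
central_difference_check(
    target=lambda x: sum(xi * xi for xi in x),
    gradient=lambda x: [2 * xi for xi in x],
    x=[0.3, -1.2, 2.5])
```

For a quadratic target the central difference is exact up to rounding, so the check passes with a tight tolerance; for the restraints target below, eps=1.e-4 and tol=1.e-4 play the same roles.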
def exercise(pdb_str, eps):
pdb_inp = iotbx.pdb.input(lines=pdb_str.split("\n"), source_info=None)
model = mmtbx.model.manager(
model_input = pdb_inp,
build_grm = True,
log = null_out())
geometry_restraints = model.restraints_manager.geometry
xray_structure = model.get_xray_structure()
model.setup_riding_h_manager()
riding_h_manager = model.get_riding_h_manager()
riding_h_manager.idealize(xray_structure=xray_structure)
sites_cart = xray_structure.sites_cart()
g_analytical = geometry_restraints.energies_sites(
sites_cart = sites_cart,
compute_gradients = True).gradients
hd_selection = xray_structure.hd_selection()
g_analytical_reduced = riding_h_manager.gradients_reduced_cpp(
gradients = g_analytical,
sites_cart = sites_cart,
hd_selection = hd_selection)
#
ex = [eps,0,0]
ey = [0,eps,0]
ez = [0,0,eps]
g_fd = flex.vec3_double()
for i_site in xrange(sites_cart.size()):
g_fd_i = []
for e in [ex,ey,ez]:
ts = []
for sign in [-1,1]:
sites_cart_ = sites_cart.deep_copy()
xray_structure_ = xray_structure.deep_copy_scatterers()
sites_cart_[i_site] = [
sites_cart_[i_site][j]+e[j]*sign for j in xrange(3)]
xray_structure_.set_sites_cart(sites_cart_)
# after shift, recalculate H position
riding_h_manager.idealize(
xray_structure=xray_structure_)
sites_cart_ = xray_structure_.sites_cart()
ts.append(geometry_restraints.energies_sites(
sites_cart = sites_cart_,
compute_gradients = False).target)
g_fd_i.append((ts[1]-ts[0])/(2*eps))
g_fd.append(g_fd_i)
g_fd_reduced = g_fd.select(~hd_selection)
for g1, g2 in zip(g_analytical_reduced, g_fd_reduced):
assert approx_equal(g1,g2, 1.e-4)
# flat 2 neighbors
pdb_str_01 = """
CRYST1 15.083 11.251 14.394 90.00 90.00 90.00 P 1
SCALE1 0.066300 0.000000 0.000000 0.00000
SCALE2 0.000000 0.088881 0.000000 0.00000
SCALE3 0.000000 0.000000 0.069473 0.00000
ATOM 1 C ARG A 2 5.253 20.304 25.155 1.00 0.00 C
ATOM 2 N HIS A 3 6.067 20.098 24.123 1.00 0.00 N
ATOM 3 CA HIS A 3 7.516 20.107 24.260 1.00 0.00 C
ATOM 4 H AHIS A 3 5.747 19.921 23.171 0.50 0.00 H
ATOM 5 D BHIS A 3 5.747 19.921 23.171 0.50 0.00 D
"""
# flat 2 neighbors shaked
pdb_str_02 = """
CRYST1 15.083 11.251 14.394 90.00 90.00 90.00 P 1
SCALE1 0.066300 0.000000 0.000000 0.00000
SCALE2 0.000000 0.088881 0.000000 0.00000
SCALE3 0.000000 0.000000 0.069473 0.00000
ATOM 1 C ARG A 2 5.099 20.402 25.169 1.00 0.00 C
ATOM 2 N HIS A 3 6.078 20.274 24.222 1.00 0.00 N
ATOM 3 CA HIS A 3 7.421 19.969 24.222 1.00 0.00 C
ATOM 4 H AHIS A 3 5.582 19.974 23.131 0.50 0.00 H
ATOM 5 D BHIS A 3 5.926 20.100 23.155 0.50 0.00 D
"""
# alg1a minimized
pdb_str_03 = """
CRYST1 14.446 16.451 11.913 90.00 90.00 90.00 P 1
SCALE1 0.069223 0.000000 0.000000 0.00000
SCALE2 0.000000 0.060787 0.000000 0.00000
SCALE3 0.000000 0.000000 0.083942 0.00000
ATOM 1 CD ARG A 50 4.896 7.973 6.011 1.00 10.00 C
ATOM 2 NE ARG A 50 5.406 6.701 5.508 1.00 10.00 N
ATOM 3 CZ ARG A 50 6.515 6.111 5.944 1.00 10.00 C
ATOM 4 NH1 ARG A 50 7.243 6.676 6.898 1.00 10.00 N
ATOM 5 NH2 ARG A 50 6.898 4.953 5.424 1.00 10.00 N
ATOM 6 HE AARG A 50 4.876 6.233 4.773 0.50 10.00 H
ATOM 7 HH11AARG A 50 6.956 7.567 7.304 0.50 10.00 H
ATOM 8 HH12AARG A 50 8.093 6.218 7.227 0.50 10.00 H
ATOM 9 HH21AARG A 50 6.342 4.515 4.690 0.50 10.00 H
ATOM 10 HH22AARG A 50 7.749 4.500 5.758 0.50 10.00 H
ATOM 11 DE BARG A 50 4.876 6.233 4.773 0.50 10.00 D
ATOM 12 DH11BARG A 50 6.956 7.567 7.304 0.50 10.00 D
ATOM 13 DH12BARG A 50 8.093 6.218 7.227 0.50 10.00 D
ATOM 14 DH21BARG A 50 6.342 4.515 4.690 0.50 10.00 D
ATOM 15 DH22BARG A 50 7.749 4.500 5.758 0.50 10.00 D
TER
"""
# alg1a shaked
pdb_str_04 = """
CRYST1 14.446 16.451 11.913 90.00 90.00 90.00 P 1
SCALE1 0.069223 0.000000 0.000000 0.00000
SCALE2 0.000000 0.060787 0.000000 0.00000
SCALE3 0.000000 0.000000 0.083942 0.00000
ATOM 1 CD ARG A 50 4.830 7.928 6.084 1.00 10.00 C
ATOM 2 NE ARG A 50 5.434 6.841 5.374 1.00 10.00 N
ATOM 3 CZ ARG A 50 6.450 6.116 6.035 1.00 10.00 C
ATOM 4 NH1 ARG A 50 7.392 6.694 6.803 1.00 10.00 N
ATOM 5 NH2 ARG A 50 6.898 5.026 5.305 1.00 10.00 N
ATOM 6 HE AARG A 50 4.867 6.348 4.684 0.50 10.00 H
ATOM 7 HH11AARG A 50 7.140 7.516 7.352 0.50 10.00 H
ATOM 8 HH12AARG A 50 8.199 6.141 7.091 0.50 10.00 H
ATOM 9 HH21AARG A 50 6.308 4.628 4.574 0.50 10.00 H
ATOM 10 HH22AARG A 50 7.759 4.551 5.577 0.50 10.00 H
ATOM 11 DE BARG A 50 4.867 6.348 4.684 0.50 10.00 D
ATOM 12 DH11BARG A 50 7.140 7.516 7.352 0.50 10.00 D
ATOM 13 DH12BARG A 50 8.199 6.141 7.091 0.50 10.00 D
ATOM 14 DH21BARG A 50 6.308 4.628 4.574 0.50 10.00 D
ATOM 15 DH22BARG A 50 7.759 4.551 5.577 0.50 10.00 D
"""
# 2tetra
pdb_str_05 = """
CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 CA SER A 3 11.057 14.432 13.617 1.00 5.95 C
ANISOU 1 CA SER A 3 582 671 1007 -168 135 158 C
ATOM 2 CB SER A 3 10.277 13.648 12.559 1.00 6.41 C
ANISOU 2 CB SER A 3 638 591 1207 -162 148 106 C
ATOM 3 OG SER A 3 9.060 14.298 12.238 1.00 7.60 O
ANISOU 3 OG SER A 3 680 955 1251 -244 -164 339 O
ATOM 4 HB2ASER A 3 10.818 13.578 11.757 0.40 6.41 H
ATOM 5 HB3ASER A 3 10.081 12.763 12.904 0.40 6.41 H
ATOM 6 DB2BSER A 3 10.818 13.578 11.757 0.60 6.41 D
ATOM 7 DB3BSER A 3 10.081 12.763 12.904 0.60 6.41 D
"""
# 2tetra distorted
pdb_str_06 = """
CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 CA SER A 3 10.856 14.444 13.455 1.00 5.95 C
ANISOU 1 CA SER A 3 582 671 1007 -168 135 158 C
ATOM 2 CB SER A 3 10.225 13.565 12.597 1.00 6.41 C
ANISOU 2 CB SER A 3 638 591 1207 -162 148 106 C
ATOM 3 OG SER A 3 9.123 14.383 12.193 1.00 7.60 O
ANISOU 3 OG SER A 3 680 955 1251 -244 -164 339 O
ATOM 4 HB2ASER A 3 10.969 13.436 11.739 0.40 6.41 H
ATOM 5 HB3ASER A 3 10.028 12.954 13.077 0.40 6.41 H
ATOM 6 DB2BSER A 3 10.925 13.557 11.772 0.60 6.41 D
ATOM 7 DB3BSER A 3 10.287 12.654 12.803 0.60 6.41 D
"""
# 3tetra minimized
pdb_str_07 = """
CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 N SER A 4 7.601 20.006 12.837 1.00 5.10 N
ANISOU 1 N SER A 4 500 632 808 -107 58 104 N
ATOM 2 CA SER A 4 7.145 18.738 13.391 1.00 5.95 C
ANISOU 2 CA SER A 4 582 671 1007 -168 135 158 C
ATOM 3 C SER A 4 8.313 17.949 13.971 1.00 5.79 C
ANISOU 3 C SER A 4 661 588 952 -116 206 135 C
ATOM 4 CB SER A 4 6.433 17.910 12.319 1.00 6.41 C
ANISOU 4 CB SER A 4 638 591 1207 -162 148 106 C
ATOM 5 HA ASER A 4 6.513 18.912 14.106 0.55 5.95 H
ATOM 5 DA BSER A 4 6.513 18.912 14.106 0.45 5.95 D
"""
# 3tetra distorted
pdb_str_08 = """
CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 N SER A 4 7.694 20.012 12.917 1.00 5.10 N
ANISOU 1 N SER A 4 500 632 808 -107 58 104 N
ATOM 2 CA SER A 4 7.004 18.841 13.534 1.00 5.95 C
ANISOU 2 CA SER A 4 582 671 1007 -168 135 158 C
ATOM 3 C SER A 4 8.178 17.985 14.068 1.00 5.79 C
ANISOU 3 C SER A 4 661 588 952 -116 206 135 C
ATOM 4 CB SER A 4 6.270 18.108 12.379 1.00 6.41 C
ANISOU 4 CB SER A 4 638 591 1207 -162 148 106 C
ATOM 5 HA ASER A 4 6.524 18.724 14.126 0.55 5.95 H
ATOM      5  DA BSER A   4       6.513  18.912  14.106  0.45  5.95           D
"""
# alg1b minimized
pdb_str_09 = """CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 CA SER A 6 10.253 21.454 16.783 1.00 5.95 C
ANISOU 1 CA SER A 6 582 671 1007 -168 135 158 C
ATOM 2 CB SER A 6 9.552 20.703 15.649 1.00 6.41 C
ANISOU 2 CB SER A 6 638 591 1207 -162 148 106 C
ATOM 3 OG SER A 6 8.345 21.347 15.280 1.00 7.60 O
ANISOU 3 OG SER A 6 680 955 1251 -244 -164 339 O
ATOM 4 HB2 SER A 6 10.214 20.670 14.784 0.45 6.41 H
ATOM 5 HB3 SER A 6 9.325 19.690 15.982 0.45 6.41 H
ATOM 6 HG ASER A 6 7.910 20.853 14.554 0.55 7.60 H
ATOM 7 DG BSER A 6 7.910 20.853 14.554 0.55 7.60 D
TER
"""
# alg1b distorted
pdb_str_10 = """
CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 CA SER A 6 10.083 21.475 16.848 1.00 5.95 C
ANISOU 1 CA SER A 6 582 671 1007 -168 135 158 C
ATOM 2 CB SER A 6 9.437 20.751 15.546 1.00 6.41 C
ANISOU 2 CB SER A 6 638 591 1207 -162 148 106 C
ATOM 3 OG SER A 6 8.368 21.346 15.146 1.00 7.60 O
ANISOU 3 OG SER A 6 680 955 1251 -244 -164 339 O
ATOM 4 HB2 SER A 6 10.034 20.616 14.977 0.45 6.41 H
ATOM 5 HB3 SER A 6 9.158 19.775 16.137 0.45 6.41 H
ATOM 6 HG ASER A 6 7.964 20.906 14.679 0.55 7.60 H
ATOM 7 DG BSER A 6 7.989 20.672 14.454 0.55 7.60 D
"""
# propeller minimized
pdb_str_11 = """
CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 CA VAL A 7 17.924 8.950 14.754 1.00 10.37 C
ANISOU 1 CA VAL A 7 1783 902 1255 67 -148 -14 C
ATOM 2 CB VAL A 7 16.678 8.886 13.852 1.00 11.26 C
ANISOU 2 CB VAL A 7 1571 963 1743 96 -176 62 C
ATOM 3 CG1 VAL A 7 16.660 10.063 12.888 1.00 11.06 C
ANISOU 3 CG1 VAL A 7 1704 1072 1425 125 -105 37 C
ATOM 4 HG11AVAL A 7 16.643 10.888 13.399 0.35 13.27 H
ATOM 5 HG12AVAL A 7 15.868 10.002 12.331 0.35 13.27 H
ATOM 6 HG13AVAL A 7 17.457 10.030 12.336 0.35 13.27 H
ATOM 7 DG11BVAL A 7 16.643 10.888 13.399 0.65 13.27 D
ATOM 8 DG12BVAL A 7 15.868 10.002 12.331 0.65 13.27 D
ATOM 9 DG13BVAL A 7 17.457 10.030 12.336 0.65 13.27 D
"""
# propeller shaked
pdb_str_12 = """
CRYST1 27.805 30.931 25.453 90.00 90.00 90.00 P 1
SCALE1 0.035965 0.000000 0.000000 0.00000
SCALE2 0.000000 0.032330 0.000000 0.00000
SCALE3 0.000000 0.000000 0.039288 0.00000
ATOM 1 CA VAL A 7 17.967 9.002 14.926 1.00 10.37 C
ANISOU 1 CA VAL A 7 1783 902 1255 67 -148 -14 C
ATOM 2 CB VAL A 7 16.698 8.861 13.864 1.00 11.26 C
ANISOU 2 CB VAL A 7 1571 963 1743 96 -176 62 C
ATOM 3 CG1 VAL A 7 16.655 10.154 13.014 1.00 11.06 C
ANISOU 3 CG1 VAL A 7 1704 1072 1425 125 -105 37 C
ATOM 4 HG11AVAL A 7 16.508 10.702 13.486 0.35 13.27 H
ATOM 5 HG12AVAL A 7 15.946 9.926 12.504 0.35 13.27 H
ATOM 6 HG13AVAL A 7 17.430 9.954 12.137 0.35 13.27 H
ATOM 7 DG11BVAL A 7 16.842 11.073 13.530 0.65 13.27 D
ATOM 8 DG12BVAL A 7 15.776 10.049 12.295 0.65 13.27 D
ATOM 9 DG13BVAL A 7 17.656 9.940 12.381 0.65 13.27 D
"""
pdb_list = [pdb_str_01, pdb_str_02, pdb_str_03, pdb_str_04, pdb_str_05,
pdb_str_06, pdb_str_07, pdb_str_08, pdb_str_09, pdb_str_10, pdb_str_11,
pdb_str_12]
#
pdb_list_name = ['pdb_str_01', 'pdb_str_02', 'pdb_str_03', 'pdb_str_04', 'pdb_str_05',
'pdb_str_06', 'pdb_str_07', 'pdb_str_08', 'pdb_str_09', 'pdb_str_10', 'pdb_str_11',
'pdb_str_12']
#pdb_list = [pdb_str_02]
#pdb_list_name = ['pdb_str_02']
def run():
#for idealize in [True, False]:
for pdb_str, str_name in zip(pdb_list,pdb_list_name):
#print 'pdb_string:', str_name#, 'idealize =', idealize
exercise(pdb_str=pdb_str, eps=1.e-4)
if (__name__ == "__main__"):
t0 = time.time()
run()
print("OK. Time:", round(time.time()-t0, 2), "seconds")

# ---------------------------------------------------------------------------
# custos-client-sdks/custos-python-sdk/build/lib/custos/server/integration/SharingManagementService_pb2.py
# (apache/airavata-custos, Apache-2.0)
# ---------------------------------------------------------------------------
# -*- coding: utf-8 -*-
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: SharingManagementService.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
import custos.server.core.SharingService_pb2 as SharingService__pb2
from google.api import annotations_pb2 as google_dot_api_dot_annotations__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='SharingManagementService.proto',
package='org.apache.custos.sharing.management.service',
syntax='proto3',
serialized_options=b'P\001Z\004./pb',
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x1eSharingManagementService.proto\x12,org.apache.custos.sharing.management.service\x1a\x14SharingService.proto\x1a\x1cgoogle/api/annotations.proto2\xe8\"\n\x18SharingManagementService\x12\xa3\x01\n\x10\x63reateEntityType\x12\x34.org.apache.custos.sharing.service.EntityTypeRequest\x1a).org.apache.custos.sharing.service.Status\".\x82\xd3\xe4\x93\x02(\"&/sharing-management/v1.0.0/entity/type\x12\xa3\x01\n\x10updateEntityType\x12\x34.org.apache.custos.sharing.service.EntityTypeRequest\x1a).org.apache.custos.sharing.service.Status\".\x82\xd3\xe4\x93\x02(\x1a&/sharing-management/v1.0.0/entity/type\x12\xa3\x01\n\x10\x64\x65leteEntityType\x12\x34.org.apache.custos.sharing.service.EntityTypeRequest\x1a).org.apache.custos.sharing.service.Status\".\x82\xd3\xe4\x93\x02(*&/sharing-management/v1.0.0/entity/type\x12\xa4\x01\n\rgetEntityType\x12\x34.org.apache.custos.sharing.service.EntityTypeRequest\x1a-.org.apache.custos.sharing.service.EntityType\".\x82\xd3\xe4\x93\x02(\x12&/sharing-management/v1.0.0/entity/type\x12\xa3\x01\n\x0egetEntityTypes\x12\x30.org.apache.custos.sharing.service.SearchRequest\x1a..org.apache.custos.sharing.service.EntityTypes\"/\x82\xd3\xe4\x93\x02)\x12\'/sharing-management/v1.0.0/entity/types\x12\xaf\x01\n\x14\x63reatePermissionType\x12\x38.org.apache.custos.sharing.service.PermissionTypeRequest\x1a).org.apache.custos.sharing.service.Status\"2\x82\xd3\xe4\x93\x02,\"*/sharing-management/v1.0.0/permission/type\x12\xaf\x01\n\x14updatePermissionType\x12\x38.org.apache.custos.sharing.service.PermissionTypeRequest\x1a).org.apache.custos.sharing.service.Status\"2\x82\xd3\xe4\x93\x02,\x1a*/sharing-management/v1.0.0/permission/type\x12\xaf\x01\n\x14\x64\x65letePermissionType\x12\x38.org.apache.custos.sharing.service.PermissionTypeRequest\x1a).org.apache.custos.sharing.service.Status\"2\x82\xd3\xe4\x93\x02,**/sharing-management/v1.0.0/permission/type\x12\xb4\x01\n\x11getPermissionType\x12\x38.org.apache.custos.sharing.service.PermissionTypeReques
t\x1a\x31.org.apache.custos.sharing.service.PermissionType\"2\x82\xd3\xe4\x93\x02,\x12*/sharing-management/v1.0.0/permission/type\x12\xaf\x01\n\x12getPermissionTypes\x12\x30.org.apache.custos.sharing.service.SearchRequest\x1a\x32.org.apache.custos.sharing.service.PermissionTypes\"3\x82\xd3\xe4\x93\x02-\x12+/sharing-management/v1.0.0/permission/types\x12\x96\x01\n\x0c\x63reateEntity\x12\x30.org.apache.custos.sharing.service.EntityRequest\x1a).org.apache.custos.sharing.service.Status\")\x82\xd3\xe4\x93\x02#\"!/sharing-management/v1.0.0/entity\x12\x96\x01\n\x0cupdateEntity\x12\x30.org.apache.custos.sharing.service.EntityRequest\x1a).org.apache.custos.sharing.service.Status\")\x82\xd3\xe4\x93\x02#\x1a!/sharing-management/v1.0.0/entity\x12\xa2\x01\n\x0eisEntityExists\x12\x30.org.apache.custos.sharing.service.EntityRequest\x1a).org.apache.custos.sharing.service.Status\"3\x82\xd3\xe4\x93\x02-\x12+/sharing-management/v1.0.0/entity/existence\x12\x93\x01\n\tgetEntity\x12\x30.org.apache.custos.sharing.service.EntityRequest\x1a).org.apache.custos.sharing.service.Entity\")\x82\xd3\xe4\x93\x02#\x12!/sharing-management/v1.0.0/entity\x12\x96\x01\n\x0c\x64\x65leteEntity\x12\x30.org.apache.custos.sharing.service.EntityRequest\x1a).org.apache.custos.sharing.service.Status\")\x82\xd3\xe4\x93\x02#*!/sharing-management/v1.0.0/entity\x12\x9c\x01\n\x0esearchEntities\x12\x30.org.apache.custos.sharing.service.SearchRequest\x1a+.org.apache.custos.sharing.service.Entities\"+\x82\xd3\xe4\x93\x02%\"#/sharing-management/v1.0.0/entities\x12\xaa\x01\n\x14getListOfSharedUsers\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a/.org.apache.custos.sharing.service.SharedOwners\".\x82\xd3\xe4\x93\x02(\x12&/sharing-management/v1.0.0/users/share\x12\xb9\x01\n\x1cgetListOfDirectlySharedUsers\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a/.org.apache.custos.sharing.service.SharedOwners\"5\x82\xd3\xe4\x93\x02/\x12-/sharing-management/v1.0.0/users/share/direct\x12\xac\x01\n\x15getListOf
SharedGroups\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a/.org.apache.custos.sharing.service.SharedOwners\"/\x82\xd3\xe4\x93\x02)\x12\'/sharing-management/v1.0.0/groups/share\x12\xbb\x01\n\x1dgetListOfDirectlySharedGroups\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a/.org.apache.custos.sharing.service.SharedOwners\"6\x82\xd3\xe4\x93\x02\x30\x12./sharing-management/v1.0.0/groups/share/direct\x12\xbb\x01\n\x14getAllDirectSharings\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a?.org.apache.custos.sharing.service.GetAllDirectSharingsResponse\"/\x82\xd3\xe4\x93\x02)\x12\'/sharing-management/v1.0.0/share/direct\x12\xa4\x01\n\x14shareEntityWithUsers\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a).org.apache.custos.sharing.service.Status\".\x82\xd3\xe4\x93\x02(\"&/sharing-management/v1.0.0/users/share\x12\xa6\x01\n\x15shareEntityWithGroups\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a).org.apache.custos.sharing.service.Status\"/\x82\xd3\xe4\x93\x02)\"\'/sharing-management/v1.0.0/groups/share\x12\xac\x01\n\x1crevokeEntitySharingFromUsers\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a).org.apache.custos.sharing.service.Status\".\x82\xd3\xe4\x93\x02(*&/sharing-management/v1.0.0/users/share\x12\xae\x01\n\x1drevokeEntitySharingFromGroups\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a).org.apache.custos.sharing.service.Status\"/\x82\xd3\xe4\x93\x02)*\'/sharing-management/v1.0.0/groups/share\x12\xa4\x01\n\ruserHasAccess\x12\x31.org.apache.custos.sharing.service.SharingRequest\x1a).org.apache.custos.sharing.service.Status\"5\x82\xd3\xe4\x93\x02/\x12-/sharing-management/v1.0.0/entity/user/accessB\x08P\x01Z\x04./pbb\x06proto3'
,
dependencies=[SharingService__pb2.DESCRIPTOR,google_dot_api_dot_annotations__pb2.DESCRIPTOR,])
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
DESCRIPTOR._options = None
_SHARINGMANAGEMENTSERVICE = _descriptor.ServiceDescriptor(
name='SharingManagementService',
full_name='org.apache.custos.sharing.management.service.SharingManagementService',
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=133,
serialized_end=4589,
methods=[
_descriptor.MethodDescriptor(
name='createEntityType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.createEntityType',
index=0,
containing_service=None,
input_type=SharingService__pb2._ENTITYTYPEREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002(\"&/sharing-management/v1.0.0/entity/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='updateEntityType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.updateEntityType',
index=1,
containing_service=None,
input_type=SharingService__pb2._ENTITYTYPEREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002(\032&/sharing-management/v1.0.0/entity/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='deleteEntityType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.deleteEntityType',
index=2,
containing_service=None,
input_type=SharingService__pb2._ENTITYTYPEREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002(*&/sharing-management/v1.0.0/entity/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getEntityType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getEntityType',
index=3,
containing_service=None,
input_type=SharingService__pb2._ENTITYTYPEREQUEST,
output_type=SharingService__pb2._ENTITYTYPE,
serialized_options=b'\202\323\344\223\002(\022&/sharing-management/v1.0.0/entity/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getEntityTypes',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getEntityTypes',
index=4,
containing_service=None,
input_type=SharingService__pb2._SEARCHREQUEST,
output_type=SharingService__pb2._ENTITYTYPES,
serialized_options=b'\202\323\344\223\002)\022\'/sharing-management/v1.0.0/entity/types',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='createPermissionType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.createPermissionType',
index=5,
containing_service=None,
input_type=SharingService__pb2._PERMISSIONTYPEREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002,\"*/sharing-management/v1.0.0/permission/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='updatePermissionType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.updatePermissionType',
index=6,
containing_service=None,
input_type=SharingService__pb2._PERMISSIONTYPEREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002,\032*/sharing-management/v1.0.0/permission/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='deletePermissionType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.deletePermissionType',
index=7,
containing_service=None,
input_type=SharingService__pb2._PERMISSIONTYPEREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002,**/sharing-management/v1.0.0/permission/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getPermissionType',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getPermissionType',
index=8,
containing_service=None,
input_type=SharingService__pb2._PERMISSIONTYPEREQUEST,
output_type=SharingService__pb2._PERMISSIONTYPE,
serialized_options=b'\202\323\344\223\002,\022*/sharing-management/v1.0.0/permission/type',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getPermissionTypes',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getPermissionTypes',
index=9,
containing_service=None,
input_type=SharingService__pb2._SEARCHREQUEST,
output_type=SharingService__pb2._PERMISSIONTYPES,
serialized_options=b'\202\323\344\223\002-\022+/sharing-management/v1.0.0/permission/types',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='createEntity',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.createEntity',
index=10,
containing_service=None,
input_type=SharingService__pb2._ENTITYREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002#\"!/sharing-management/v1.0.0/entity',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='updateEntity',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.updateEntity',
index=11,
containing_service=None,
input_type=SharingService__pb2._ENTITYREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002#\032!/sharing-management/v1.0.0/entity',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='isEntityExists',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.isEntityExists',
index=12,
containing_service=None,
input_type=SharingService__pb2._ENTITYREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002-\022+/sharing-management/v1.0.0/entity/existence',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getEntity',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getEntity',
index=13,
containing_service=None,
input_type=SharingService__pb2._ENTITYREQUEST,
output_type=SharingService__pb2._ENTITY,
serialized_options=b'\202\323\344\223\002#\022!/sharing-management/v1.0.0/entity',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='deleteEntity',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.deleteEntity',
index=14,
containing_service=None,
input_type=SharingService__pb2._ENTITYREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002#*!/sharing-management/v1.0.0/entity',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='searchEntities',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.searchEntities',
index=15,
containing_service=None,
input_type=SharingService__pb2._SEARCHREQUEST,
output_type=SharingService__pb2._ENTITIES,
serialized_options=b'\202\323\344\223\002%\"#/sharing-management/v1.0.0/entities',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getListOfSharedUsers',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getListOfSharedUsers',
index=16,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._SHAREDOWNERS,
serialized_options=b'\202\323\344\223\002(\022&/sharing-management/v1.0.0/users/share',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getListOfDirectlySharedUsers',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getListOfDirectlySharedUsers',
index=17,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._SHAREDOWNERS,
serialized_options=b'\202\323\344\223\002/\022-/sharing-management/v1.0.0/users/share/direct',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getListOfSharedGroups',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getListOfSharedGroups',
index=18,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._SHAREDOWNERS,
serialized_options=b'\202\323\344\223\002)\022\'/sharing-management/v1.0.0/groups/share',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getListOfDirectlySharedGroups',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getListOfDirectlySharedGroups',
index=19,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._SHAREDOWNERS,
serialized_options=b'\202\323\344\223\0020\022./sharing-management/v1.0.0/groups/share/direct',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='getAllDirectSharings',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.getAllDirectSharings',
index=20,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._GETALLDIRECTSHARINGSRESPONSE,
serialized_options=b'\202\323\344\223\002)\022\'/sharing-management/v1.0.0/share/direct',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='shareEntityWithUsers',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.shareEntityWithUsers',
index=21,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002(\"&/sharing-management/v1.0.0/users/share',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='shareEntityWithGroups',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.shareEntityWithGroups',
index=22,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002)\"\'/sharing-management/v1.0.0/groups/share',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='revokeEntitySharingFromUsers',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.revokeEntitySharingFromUsers',
index=23,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002(*&/sharing-management/v1.0.0/users/share',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='revokeEntitySharingFromGroups',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.revokeEntitySharingFromGroups',
index=24,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002)*\'/sharing-management/v1.0.0/groups/share',
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='userHasAccess',
full_name='org.apache.custos.sharing.management.service.SharingManagementService.userHasAccess',
index=25,
containing_service=None,
input_type=SharingService__pb2._SHARINGREQUEST,
output_type=SharingService__pb2._STATUS,
serialized_options=b'\202\323\344\223\002/\022-/sharing-management/v1.0.0/entity/user/access',
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_SHARINGMANAGEMENTSERVICE)
DESCRIPTOR.services_by_name['SharingManagementService'] = _SHARINGMANAGEMENTSERVICE
# @@protoc_insertion_point(module_scope)
| 60.930091 | 5,739 | 0.797665 | 2,503 | 20,046 | 6.187375 | 0.118658 | 0.047072 | 0.078453 | 0.115064 | 0.758895 | 0.752954 | 0.742429 | 0.728224 | 0.722154 | 0.582747 | 0 | 0.069573 | 0.074329 | 20,046 | 328 | 5,740 | 61.115854 | 0.765036 | 0.049436 | 0 | 0.525773 | 0 | 0.072165 | 0.407956 | 0.390299 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.020619 | 0 | 0.020619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
d5fe1e32cec7c507f7dc604cfeaa179ea52dbfaa | 58 | py | Python | python/non_buildable_2/number_returns/tests/__init__.py | nagi49000/tutorial-memory-refs-and-folder-structures | bede74884fc96d89b9cfdd45fba3c69b3f9445c1 | [
"MIT"
] | null | null | null | python/non_buildable_2/number_returns/tests/__init__.py | nagi49000/tutorial-memory-refs-and-folder-structures | bede74884fc96d89b9cfdd45fba3c69b3f9445c1 | [
"MIT"
] | null | null | null | python/non_buildable_2/number_returns/tests/__init__.py | nagi49000/tutorial-memory-refs-and-folder-structures | bede74884fc96d89b9cfdd45fba3c69b3f9445c1 | [
"MIT"
] | null | null | null | import os
import sys
sys.path.append(os.path.join(".."))
| 11.6 | 35 | 0.689655 | 10 | 58 | 4 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 58 | 4 | 36 | 14.5 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0.034483 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
e682e55956574b100d3eeafe40cafc87507dbb94 | 228 | py | Python | altered/py23compat.py | Plexical/altered.states | 40c18306bfa8d0a3f0ccbc942a887969074beff0 | [
"Apache-2.0"
] | 1 | 2021-08-28T11:55:41.000Z | 2021-08-28T11:55:41.000Z | altered/py23compat.py | Plexical/altered.states | 40c18306bfa8d0a3f0ccbc942a887969074beff0 | [
"Apache-2.0"
] | 3 | 2015-02-04T13:09:17.000Z | 2021-09-13T11:34:45.000Z | altered/py23compat.py | Plexical/altered.states | 40c18306bfa8d0a3f0ccbc942a887969074beff0 | [
"Apache-2.0"
] | null | null | null | def strio():
"""
This was difficult to get right in doctests when porting to Python 3.
"""
try:
from StringIO import StringIO
except ImportError:
from io import StringIO
return StringIO()
| 22.8 | 73 | 0.631579 | 28 | 228 | 5.142857 | 0.785714 | 0.194444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006329 | 0.307018 | 228 | 9 | 74 | 25.333333 | 0.905063 | 0.302632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.5 | 0 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
e6b27dec1df017de0b766536228accd63d6da1c4 | 193 | py | Python | Machine_Learning_0/ChatBot_0/chatbot_source/chatbot_source.py | CyborgVillager/Ai_Development_0 | 0027bd12060c0988c86fe0b4da5f9cd6f84451bd | [
"MIT"
] | null | null | null | Machine_Learning_0/ChatBot_0/chatbot_source/chatbot_source.py | CyborgVillager/Ai_Development_0 | 0027bd12060c0988c86fe0b4da5f9cd6f84451bd | [
"MIT"
] | null | null | null | Machine_Learning_0/ChatBot_0/chatbot_source/chatbot_source.py | CyborgVillager/Ai_Development_0 | 0027bd12060c0988c86fe0b4da5f9cd6f84451bd | [
"MIT"
] | null | null | null | import nltk
import warnings
from tkinter import *
import time
import tkinter.messagebox
from Ai_Development_Learning.Machine_Learning_0.ChatBot_0.bot import chat
import pyttsx3
import threading | 24.125 | 73 | 0.875648 | 28 | 193 | 5.857143 | 0.607143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017241 | 0.098446 | 193 | 8 | 74 | 24.125 | 0.925287 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
e6bd78667120bb586b60abd59a76f0e35aa6bcd6 | 99 | py | Python | Python_Files/murach/book_apps/ch13/infinite_recursion.py | Interloper2448/BCGPortfolio | c4c160a835c64c8d099d44c0995197f806ccc824 | [
"MIT"
] | null | null | null | Python_Files/murach/book_apps/ch13/infinite_recursion.py | Interloper2448/BCGPortfolio | c4c160a835c64c8d099d44c0995197f806ccc824 | [
"MIT"
] | null | null | null | Python_Files/murach/book_apps/ch13/infinite_recursion.py | Interloper2448/BCGPortfolio | c4c160a835c64c8d099d44c0995197f806ccc824 | [
"MIT"
] | null | null | null | def display_message():
print("Press Ctrl+C to stop!")
display_message()
display_message()
| 16.5 | 34 | 0.69697 | 13 | 99 | 5.076923 | 0.692308 | 0.636364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171717 | 99 | 5 | 35 | 19.8 | 0.804878 | 0 | 0 | 0.5 | 0 | 0 | 0.212121 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |