hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
b07cdff1056abd3ab4c36a97794c67b1bf8f82e7 | 71 | py | Python | tests/__init__.py | Avery246813579/Python-Rainbow-Chain | 97a29dbf3dc54b0e96c9df7b5106f488c627ab48 | [
"MIT"
] | 3 | 2017-12-04T18:17:38.000Z | 2021-01-15T19:21:21.000Z | tests/__init__.py | Avery246813579/Python-Rainbow-Chain | 97a29dbf3dc54b0e96c9df7b5106f488c627ab48 | [
"MIT"
] | null | null | null | tests/__init__.py | Avery246813579/Python-Rainbow-Chain | 97a29dbf3dc54b0e96c9df7b5106f488c627ab48 | [
"MIT"
] | 1 | 2017-11-06T18:43:54.000Z | 2017-11-06T18:43:54.000Z | import Dictogram_Tests
import Histogram_Tests
import MarkovModel_Tests
| 17.75 | 24 | 0.915493 | 9 | 71 | 6.888889 | 0.555556 | 0.354839 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084507 | 71 | 3 | 25 | 23.666667 | 0.953846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b0c416079c0482f5eefbf82925437e9a5561622b | 166 | py | Python | common/tmalibrary/tmalibrary/probes/__init__.py | Jodao/tma-framework | 23132a2ea99d75be4fc56b6074e37ce819c419c0 | [
"Apache-2.0"
] | 1 | 2021-01-13T10:01:07.000Z | 2021-01-13T10:01:07.000Z | common/tmalibrary/tmalibrary/probes/__init__.py | Jodao/tma-framework | 23132a2ea99d75be4fc56b6074e37ce819c419c0 | [
"Apache-2.0"
] | 4 | 2021-03-08T16:00:07.000Z | 2021-03-11T10:38:08.000Z | common/tmalibrary/tmalibrary/probes/__init__.py | Jodao/tma-framework | 23132a2ea99d75be4fc56b6074e37ce819c419c0 | [
"Apache-2.0"
] | 6 | 2018-06-02T17:57:30.000Z | 2021-07-22T16:11:09.000Z | from .communication import Communication
from .data import Data
from .message import Message
from .message import ComplexEncoder
from .observation import Observation
| 27.666667 | 40 | 0.849398 | 20 | 166 | 7.05 | 0.35 | 0.156028 | 0.241135 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120482 | 166 | 5 | 41 | 33.2 | 0.965753 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9ff1713156e875cec44529366d658acb4e2a3652 | 32 | py | Python | src/hepmc/surrogate/__init__.py | mathisgerdes/monte-carlo-integration | 533d13eeb538fec46f8d5ed00e780153b68ba7d9 | [
"MIT"
] | 2 | 2018-11-15T03:01:03.000Z | 2020-02-25T16:54:02.000Z | src/hepmc/surrogate/__init__.py | mathisgerdes/monte-carlo-integration | 533d13eeb538fec46f8d5ed00e780153b68ba7d9 | [
"MIT"
] | null | null | null | src/hepmc/surrogate/__init__.py | mathisgerdes/monte-carlo-integration | 533d13eeb538fec46f8d5ed00e780153b68ba7d9 | [
"MIT"
] | 1 | 2021-04-15T09:02:00.000Z | 2021-04-15T09:02:00.000Z | from .extreme_learning import *
| 16 | 31 | 0.8125 | 4 | 32 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b0396c71e8037368c55335c1f7eaf47fbb3a6e51 | 115 | py | Python | pytrie/trie/__init__.py | irahorecka/barcode-levenshtein-search | ae9a22ffd66e19105d98beea0db091d209d70d2e | [
"MIT"
] | null | null | null | pytrie/trie/__init__.py | irahorecka/barcode-levenshtein-search | ae9a22ffd66e19105d98beea0db091d209d70d2e | [
"MIT"
] | null | null | null | pytrie/trie/__init__.py | irahorecka/barcode-levenshtein-search | ae9a22ffd66e19105d98beea0db091d209d70d2e | [
"MIT"
] | null | null | null | from pytrie.trie.concurrency import map_processes
from pytrie.trie.trie import TrieNode, search, search_concurrent
| 38.333333 | 64 | 0.86087 | 16 | 115 | 6.0625 | 0.625 | 0.206186 | 0.28866 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 115 | 2 | 65 | 57.5 | 0.92381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
b046134aecda13b1700d93cb501433220bc924d3 | 36 | py | Python | thumbulator/bareBench/python/__init__.py | impedimentToProgress/ratchet | 4a0f4994ee7e16819849fc0a1a1dbb04921a25fd | [
"MIT"
] | 4 | 2018-12-31T04:46:13.000Z | 2021-02-04T15:11:03.000Z | thumbulator/bareBench/python/__init__.py | impedimentToProgress/ratchet | 4a0f4994ee7e16819849fc0a1a1dbb04921a25fd | [
"MIT"
] | null | null | null | thumbulator/bareBench/python/__init__.py | impedimentToProgress/ratchet | 4a0f4994ee7e16819849fc0a1a1dbb04921a25fd | [
"MIT"
] | 3 | 2017-10-28T16:15:33.000Z | 2021-11-16T05:11:43.000Z | from commands import *
import tests
| 12 | 22 | 0.805556 | 5 | 36 | 5.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 36 | 2 | 23 | 18 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b049dd4423d4e6830835ef4c97f12631b04c3536 | 109 | py | Python | srllib/qtgui/__init__.py | craneworks/srl-python-lib | 83b24b9dc406f4481d2b5fad814e2eff932cb04f | [
"MIT"
] | null | null | null | srllib/qtgui/__init__.py | craneworks/srl-python-lib | 83b24b9dc406f4481d2b5fad814e2eff932cb04f | [
"MIT"
] | null | null | null | srllib/qtgui/__init__.py | craneworks/srl-python-lib | 83b24b9dc406f4481d2b5fad814e2eff932cb04f | [
"MIT"
] | null | null | null | """ Utilities for use with PyQt4 """
from _application import *
from _common import *
from _signal import *
| 18.166667 | 36 | 0.733945 | 14 | 109 | 5.5 | 0.714286 | 0.25974 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011236 | 0.183486 | 109 | 5 | 37 | 21.8 | 0.853933 | 0.256881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
c67de60c5869c70b4294ace23a0e35bb38246d5b | 20,221 | py | Python | gym_extensions/continuous/mujoco/__init__.py | nicofirst1/gym-extensions | 0a5bb74a248484d2a7e5d2cb552e247612650f04 | [
"MIT"
] | null | null | null | gym_extensions/continuous/mujoco/__init__.py | nicofirst1/gym-extensions | 0a5bb74a248484d2a7e5d2cb552e247612650f04 | [
"MIT"
] | null | null | null | gym_extensions/continuous/mujoco/__init__.py | nicofirst1/gym-extensions | 0a5bb74a248484d2a7e5d2cb552e247612650f04 | [
"MIT"
] | 1 | 2021-12-01T09:31:48.000Z | 2021-12-01T09:31:48.000Z | import gym
import os
import gym.envs.mujoco
custom_envs = {
# Pusher modifications
"PusherMovingGoal-v1":
dict(path='gym_extensions.continuous.mujoco.modified_arm:PusherMovingGoalEnv',
max_episode_steps=100,
reward_threshold=0.0,
kwargs=dict()),
# Pusher modifications
"PusherLeftSide-v1":
dict(path='gym_extensions.continuous.mujoco.modified_arm:PusherLeftSide',
max_episode_steps=100,
reward_threshold=0.0,
kwargs=dict()),
"PusherFullRange-v1":
dict(path='gym_extensions.continuous.mujoco.modified_arm:PusherFullRange',
max_episode_steps=100,
reward_threshold=0.0,
kwargs=dict()),
# Striker
"StrikerMovingStart-v1":
dict(path='gym_extensions.continuous.mujoco.modified_arm:StrikerMovingStartStateEnv',
max_episode_steps=100,
reward_threshold=0.0,
kwargs=dict()),
# modified gravity - Hopper
"AntGravityMars-v1":
dict(path='gym_extensions.continuous.mujoco.modified_ant:AntGravityEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(gravity=-3.711)),
"AntGravityHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_ant:AntGravityEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(gravity=-4.905)),
"AntGravityOneAndHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_ant:AntGravityEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(gravity=-14.715)),
"HopperGravityHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperGravityEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(gravity=-4.905)),
"HopperGravityThreeQuarters-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperGravityEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(gravity=-7.3575)),
"HopperGravityOneAndHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperGravityEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(gravity=-14.715)),
"HopperGravityOneAndQuarter-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperGravityEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(gravity=-12.2625)),
"Walker2dGravityHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-4.905)),
"Walker2dGravityThreeQuarters-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-7.3575)),
"Walker2dGravityOneAndHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-14.715)),
"Walker2dGravityOneAndQuarter-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-12.2625)),
"HalfCheetahGravityHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahGravityEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(gravity=-4.905)),
"HalfCheetahGravityThreeQuarters-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahGravityEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(gravity=-7.3575)),
"HalfCheetahGravityOneAndHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahGravityEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(gravity=-14.715)),
"HalfCheetahGravityOneAndQuarter-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahGravityEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(gravity=-12.2625)),
"HumanoidGravityHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-4.905)),
"HumanoidGravityThreeQuarters-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-7.3575)),
"HumanoidGravityOneAndHalf-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-14.715)),
"HumanoidGravityOneAndQuarter-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidGravityEnv',
max_episode_steps=1000,
kwargs=dict(gravity=-12.2625)),
### Environment with walls
"AntMaze-v1":
dict(path='gym_extensions.continuous.mujoco.modified_ant:AntMaze',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict()),
"HopperStairs-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperStairs',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict()),
"HopperSimpleWall-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperSimpleWallEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict()),
"HopperWithSensor-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperWithSensorEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(model_path=os.path.dirname(gym.envs.mujoco.__file__) + "/assets/hopper.xml")),
### Walker
"Walker2dWall-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dWallEnv',
max_episode_steps=1000,
kwargs=dict()),
"Walker2dWithSensor-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dWithSensorEnv',
max_episode_steps=1000,
kwargs=dict(model_path=os.path.dirname(gym.envs.mujoco.__file__) + "/assets/walker2d.xml")),
### HalfCheetah
"HalfCheetahWall-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahWallEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict()),
"HalfCheetahWithSensor-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahWithSensorEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(model_path=os.path.dirname(gym.envs.mujoco.__file__) + "/assets/half_cheetah.xml")),
### Humanoid
"HumanoidWall-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidWallEnv',
max_episode_steps=1000,
kwargs=dict()),
"HumanoidWithSensor-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidWithSensorEnv',
max_episode_steps=1000,
kwargs=dict(model_path=os.path.dirname(gym.envs.mujoco.__file__) + "/assets/humanoid.xml")),
"HumanoidStandupWithSensor-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidStandupWithSensorEnv',
max_episode_steps=1000,
kwargs=dict(model_path=os.path.dirname(gym.envs.mujoco.__file__) + "/assets/humanoidstandup.xml")),
"HumanoidStandupAndRunWall-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidStandupAndRunWallEnv',
max_episode_steps=1000,
kwargs=dict()),
"HumanoidStandupAndRunWithSensor-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidStandupAndRunEnvWithSensor',
max_episode_steps=1000,
kwargs=dict(model_path=os.path.dirname(gym.envs.mujoco.__file__) + "/assets/humanoidstandup.xml")),
"HumanoidStandupAndRun-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidStandupAndRunEnv',
max_episode_steps=1000,
kwargs=dict()),
# Modified body parts - Hopper
"HopperBigTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["torso_geom"], size_scale=1.25)),
"HopperBigThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["thigh_geom"], size_scale=1.25)),
"HopperBigLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["leg_geom"], size_scale=1.25)),
"HopperBigFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["foot_geom"], size_scale=1.25)),
"HopperSmallTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["torso_geom"], size_scale=.75)),
"HopperSmallThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["thigh_geom"], size_scale=.75)),
"HopperSmallLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["leg_geom"], size_scale=.75)),
"HopperSmallFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_hopper:HopperModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=3800.0,
kwargs=dict(body_parts=["foot_geom"], size_scale=.75)),
# Modified body parts - Walker
"Walker2dBigTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["torso_geom"], size_scale=1.25)),
"Walker2dBigThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["thigh_geom", "thigh_left_geom"], size_scale=1.25)),
"Walker2dBigLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["leg_geom", "leg_left_geom"], size_scale=1.25)),
"Walker2dBigFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["foot_geom", "foot_left_geom"], size_scale=1.25)),
"Walker2dSmallTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["torso_geom"], size_scale=.75)),
"Walker2dSmallThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["thigh_geom", "thigh_left_geom"], size_scale=.75)),
"Walker2dSmallLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["leg_geom", "leg_left_geom"], size_scale=.75)),
"Walker2dSmallFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_walker2d:Walker2dModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["foot_geom", "foot_left_geom"], size_scale=.75)),
# Modified body parts - HalfCheetah
"HalfCheetahBigTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["torso"], size_scale=1.25)),
"HalfCheetahBigThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["fthigh", "bthigh"], size_scale=1.25)),
"HalfCheetahBigLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["fshin", "bshin"], size_scale=1.25)),
"HalfCheetahBigFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["ffoot", "bfoot"], size_scale=1.25)),
"HalfCheetahSmallTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["torso"], size_scale=.75)),
"HalfCheetahSmallThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["fthigh", "bthigh"], size_scale=.75)),
"HalfCheetahSmallLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["fshin", "bshin"], size_scale=.75)),
"HalfCheetahSmallFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["ffoot", "bfoot"], size_scale=.75)),
"HalfCheetahSmallHead-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["head"], size_scale=.75)),
"HalfCheetahBigHead-v1":
dict(path='gym_extensions.continuous.mujoco.modified_half_cheetah:HalfCheetahModifiedBodyPartSizeEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict(body_parts=["head"], size_scale=1.25)),
# Modified body parts - Humanoid
"HumanoidBigTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["torso1", "uwaist", "lwaist"], size_scale=1.25)),
"HumanoidBigThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["right_thigh1", "left_thigh1", "butt"], size_scale=1.25)),
"HumanoidBigLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["right_shin1", "left_shin1"], size_scale=1.25)),
"HumanoidBigFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["left_foot", "right_foot"], size_scale=1.25)),
"HumanoidSmallTorso-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["torso1", "uwaist", "lwaist"], size_scale=.75)),
"HumanoidSmallThigh-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["right_thigh1", "left_thigh1", "butt"], size_scale=.75)),
"HumanoidSmallLeg-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["right_shin1", "left_shin1"], size_scale=.75)),
"HumanoidSmallFoot-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["left_foot", "right_foot"], size_scale=.75)),
"HumanoidSmallHead-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["head"], size_scale=.75)),
"HumanoidBigHead-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["head"], size_scale=1.25)),
"HumanoidSmallArm-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["right_uarm1", "right_larm", "left_uarm1", "left_larm"], size_scale=.75)),
"HumanoidBigArm-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["right_uarm1", "right_larm", "left_uarm1", "left_larm"], size_scale=1.25)),
"HumanoidSmallHand-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["left_hand", "right_hand"], size_scale=.75)),
"HumanoidBigHand-v1":
dict(path='gym_extensions.continuous.mujoco.modified_humanoid:HumanoidModifiedBodyPartSizeEnv',
max_episode_steps=1000,
kwargs=dict(body_parts=["left_hand", "right_hand"], size_scale=1.25)),
### Food Env
"AntSingleFood-v1":
dict(path='gym_extensions.continuous.mujoco.modified_ant:AntSingleFoodEnv',
max_episode_steps=1000,
reward_threshold=4800.0,
kwargs=dict()),
}
def register_custom_envs():
for key, value in custom_envs.items():
arg_dict = dict(id=key,
entry_point=value["path"],
max_episode_steps=value["max_episode_steps"],
kwargs=value["kwargs"])
if "reward_threshold" in value:
arg_dict["reward_threshold"] = value["reward_threshold"]
gym.envs.register(**arg_dict)
register_custom_envs()
| 50.300995 | 112 | 0.678206 | 2,068 | 20,221 | 6.366538 | 0.097195 | 0.060763 | 0.091144 | 0.077017 | 0.818776 | 0.817788 | 0.80594 | 0.803053 | 0.80275 | 0.73067 | 0 | 0.051391 | 0.207062 | 20,221 | 401 | 113 | 50.426434 | 0.769739 | 0.012759 | 0 | 0.594005 | 0 | 0 | 0.422238 | 0.348413 | 0 | 0 | 0 | 0 | 0 | 1 | 0.002725 | false | 0 | 0.008174 | 0 | 0.010899 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c691cf410fb11ef9eeb668e6c013db32d424d306 | 232 | py | Python | src/tokenizer/ErrorToken.py | aboveyou00/cc | 816a2fb8e53723e8f2c72a8e5d9f443057017594 | [
"MIT"
] | null | null | null | src/tokenizer/ErrorToken.py | aboveyou00/cc | 816a2fb8e53723e8f2c72a8e5d9f443057017594 | [
"MIT"
] | null | null | null | src/tokenizer/ErrorToken.py | aboveyou00/cc | 816a2fb8e53723e8f2c72a8e5d9f443057017594 | [
"MIT"
] | null | null | null | from tokenizer.Token import *
class ErrorToken(Token):
def __init__(self, linen, start, orig):
super().__init__(linen, start, orig)
self.err = orig
def __str__(self):
return 'ERROR:' + self.err
| 23.2 | 44 | 0.616379 | 28 | 232 | 4.678571 | 0.607143 | 0.152672 | 0.21374 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.262931 | 232 | 9 | 45 | 25.777778 | 0.766082 | 0 | 0 | 0 | 0 | 0 | 0.025862 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0.142857 | 0.714286 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
c6a00dc36f583235c0398f2ffaa65a8aa963d592 | 4,791 | py | Python | beatbrain/cli/convert.py | Emrys-Hong/BeatBrain | 68159d9cc46d85e73afdc5aa5341c45158dc1b28 | [
"MIT"
] | 1 | 2020-03-28T21:18:50.000Z | 2020-03-28T21:18:50.000Z | beatbrain/cli/convert.py | Emrys-Hong/BeatBrain | 68159d9cc46d85e73afdc5aa5341c45158dc1b28 | [
"MIT"
] | null | null | null | beatbrain/cli/convert.py | Emrys-Hong/BeatBrain | 68159d9cc46d85e73afdc5aa5341c45158dc1b28 | [
"MIT"
] | null | null | null | import click
from .. import settings, utils
@click.group(invoke_without_command=True, short_help="Data Conversion Utilities")
@click.pass_context
def convert(ctx):
click.echo(
click.style(
"------------------------\n"
"BeatBrain Data Converter\n"
"------------------------\n",
fg="green",
bold=True,
)
)
if ctx.invoked_subcommand is None:
click.echo(ctx.get_help())
@convert.command(
name="numpy", short_help="Convert audio files or TIFF images to numpy arrays"
)
@click.argument("path")
@click.argument("output")
@click.option(
"--sr",
help="Rate at which to resample audio",
default=settings.SAMPLE_RATE,
show_default=True,
)
@click.option(
"--offset",
help="Audio start timestamp (seconds)",
default=settings.AUDIO_OFFSET,
show_default=True,
)
@click.option(
"--duration",
help="Audio duration (seconds)",
default=settings.AUDIO_DURATION,
type=float,
show_default=True,
)
@click.option(
"--n_fft",
help="Size of FFT window to use",
default=settings.N_FFT,
show_default=True,
)
@click.option(
"--hop_length",
help="Short-time Fourier Transform hop length",
default=settings.HOP_LENGTH,
show_default=True,
)
@click.option(
"--n_mels",
help="Number of frequency bins to use",
default=settings.N_MELS,
show_default=True,
)
@click.option(
"--chunk_size",
help="Number of frames per spectrogram chunk",
default=settings.CHUNK_SIZE,
show_default=True,
)
@click.option(
"--flip",
    help="Whether to flip images vertically",
default=settings.IMAGE_FLIP,
show_default=True,
)
@click.option(
"--truncate/--pad",
help="Whether to truncate or pad the last chunk",
default=True,
show_default=True,
)
@click.option(
"--skip",
help="Number of samples to skip. Useful when restarting a failed job.",
default=0,
show_default=True,
)
def to_numpy(path, output, **kwargs):
return utils.convert_to_numpy(path, output, **kwargs)
@convert.command(name="image", short_help="Convert audio or .npz files to TIFF images")
@click.argument("path")
@click.argument("output")
@click.option(
"--sr",
help="Rate at which to resample audio",
default=settings.SAMPLE_RATE,
show_default=True,
)
@click.option(
"--offset",
help="Audio start timestamp (seconds)",
default=settings.AUDIO_OFFSET,
show_default=True,
)
@click.option(
"--duration",
help="Audio duration (seconds)",
default=settings.AUDIO_DURATION,
type=float,
show_default=True,
)
@click.option(
"--n_fft",
help="Size of FFT window to use",
default=settings.N_FFT,
show_default=True,
)
@click.option(
"--hop_length",
help="Short-time Fourier Transform hop length",
default=settings.HOP_LENGTH,
show_default=True,
)
@click.option(
"--chunk_size",
help="Number of frames per spectrogram chunk",
default=settings.CHUNK_SIZE,
show_default=True,
)
@click.option(
"--truncate/--pad",
help="Whether to truncate or pad the last chunk",
default=True,
show_default=True,
)
@click.option(
"--flip",
    help="Whether to flip images vertically",
default=settings.IMAGE_FLIP,
show_default=True,
)
@click.option(
"--skip",
help="Number of data samples to skip. Useful when restarting a failed job.",
default=0,
show_default=True,
)
def to_image(path, output, **kwargs):
return utils.convert_to_image(path, output, **kwargs)
@convert.command(
name="audio", short_help="Convert .npz files or TIFF images to audio files"
)
@click.argument("path")
@click.argument("output")
@click.option(
"--sr",
help="Rate at which to resample audio",
default=settings.SAMPLE_RATE,
show_default=True,
)
@click.option(
"--n_fft",
help="Size of FFT window to use",
default=settings.N_FFT,
show_default=True,
)
@click.option(
"--hop_length",
help="Short-time Fourier Transform hop length",
default=settings.HOP_LENGTH,
show_default=True,
)
@click.option(
"--offset",
help="Start point (in seconds) of reconstructed audio",
default=settings.AUDIO_OFFSET,
show_default=True,
)
@click.option(
"--duration",
help="Maximum seconds of audio to convert",
default=settings.AUDIO_DURATION,
type=float,
show_default=True,
)
@click.option(
"--flip",
    help="Whether to flip images vertically",
default=settings.IMAGE_FLIP,
show_default=True,
)
@click.option(
"--skip",
help="Number of samples to skip. Useful when restarting a failed job.",
default=0,
show_default=True,
)
def to_audio(path, output, **kwargs):
return utils.convert_to_audio(path, output, **kwargs)
| 23.485294 | 87 | 0.657274 | 609 | 4,791 | 5.049261 | 0.170772 | 0.100163 | 0.126829 | 0.149594 | 0.844553 | 0.810407 | 0.787967 | 0.749594 | 0.749594 | 0.749594 | 0 | 0.000779 | 0.196201 | 4,791 | 203 | 88 | 23.600985 | 0.797715 | 0 | 0 | 0.695876 | 0 | 0 | 0.307869 | 0.010854 | 0 | 0 | 0 | 0 | 0 | 1 | 0.020619 | false | 0.005155 | 0.010309 | 0.015464 | 0.046392 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c6bdffae33e5268398bba0e4d88c0a0a2a9b48d7 | 33 | py | Python | sabueso/cards/protein/__init__.py | dprada/sabueso | 14843cf3522b5b89db5b61c1541a7015f114dd53 | [
"MIT"
] | null | null | null | sabueso/cards/protein/__init__.py | dprada/sabueso | 14843cf3522b5b89db5b61c1541a7015f114dd53 | [
"MIT"
] | 2 | 2022-01-31T21:22:17.000Z | 2022-02-04T20:20:12.000Z | sabueso/cards/protein/__init__.py | dprada/sabueso | 14843cf3522b5b89db5b61c1541a7015f114dd53 | [
"MIT"
] | 1 | 2021-07-20T15:01:14.000Z | 2021-07-20T15:01:14.000Z | from .protein import ProteinCard
| 16.5 | 32 | 0.848485 | 4 | 33 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 33 | 1 | 33 | 33 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
05b522c1056b1c4bd4c0289ab245698d754e9b86 | 137 | py | Python | opm/convertDoc/text2html.py | Open-Prose-Metrics/open_prose_metrics_app-core | 9df65edfe9ee9af0a0731c3f2e21ea25bced250c | [
"MIT"
] | null | null | null | opm/convertDoc/text2html.py | Open-Prose-Metrics/open_prose_metrics_app-core | 9df65edfe9ee9af0a0731c3f2e21ea25bced250c | [
"MIT"
] | 4 | 2021-04-30T21:38:10.000Z | 2022-01-13T03:32:33.000Z | opm/convertDoc/text2html.py | Open-Prose-Metrics/open_prose_metrics_app-core | 9df65edfe9ee9af0a0731c3f2e21ea25bced250c | [
"MIT"
] | 1 | 2021-03-21T14:08:28.000Z | 2021-03-21T14:08:28.000Z | # -*- coding: utf-8 -*-
def htmlify(ucodetext):
return ucodetext.encode('ascii', 'xmlcharrefreplace')
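The helper above relies on Python's built-in `xmlcharrefreplace` error handler, and note that `str.encode` returns `bytes`, so callers typically want to decode back to `str`. A standalone sketch of the same idea (stdlib only, no project imports assumed):

```python
# xmlcharrefreplace substitutes a numeric character reference (&#NNN;)
# for every character that cannot be represented in the target
# encoding; plain ASCII passes through untouched.
encoded = "café ünïcode".encode("ascii", "xmlcharrefreplace")
html = encoded.decode("ascii")  # decode back to str for embedding in HTML
```

Decoding with the same codec is safe here because the replacement references are themselves pure ASCII.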
| 22.833333 | 57 | 0.715328 | 15 | 137 | 6.533333 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016949 | 0.138686 | 137 | 5 | 58 | 27.4 | 0.813559 | 0.153285 | 0 | 0 | 0 | 0 | 0.192982 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
05cf3a05d9bd9dd9af9d19cf955c59946e1b09d2 | 35 | py | Python | pywren_ibm_cloud/compute/backends/ibm_cf/__init__.py | class-euproject/lithops | acf381817673c29db0e9e143001029357890a39b | [
"Apache-2.0"
] | 1 | 2020-08-04T08:16:31.000Z | 2020-08-04T08:16:31.000Z | pywren_ibm_cloud/compute/backends/ibm_cf/__init__.py | class-euproject/lithops | acf381817673c29db0e9e143001029357890a39b | [
"Apache-2.0"
] | null | null | null | pywren_ibm_cloud/compute/backends/ibm_cf/__init__.py | class-euproject/lithops | acf381817673c29db0e9e143001029357890a39b | [
"Apache-2.0"
] | null | null | null | from .ibm_cf import ComputeBackend
| 17.5 | 34 | 0.857143 | 5 | 35 | 5.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 35 | 1 | 35 | 35 | 0.935484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
af391a659fce2e750f66c7206851d45e5965e4da | 13,394 | py | Python | lindbladmpo/examples/simulation_building/operators_library.py | qiskit-community/lindbladmpo | e5d07fc2ce226b19e831d06334e33afb8d131e33 | [
"Apache-2.0"
] | 4 | 2022-03-14T02:47:16.000Z | 2022-03-24T11:20:04.000Z | lindbladmpo/examples/simulation_building/operators_library.py | qiskit-community/lindbladmpo | e5d07fc2ce226b19e831d06334e33afb8d131e33 | [
"Apache-2.0"
] | 1 | 2022-03-24T11:36:35.000Z | 2022-03-24T11:36:35.000Z | lindbladmpo/examples/simulation_building/operators_library.py | qiskit-community/lindbladmpo | e5d07fc2ce226b19e831d06334e33afb8d131e33 | [
"Apache-2.0"
] | null | null | null | # This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
#
# Any modifications or derivative works of this code must retain this
# copyright notice, and modified files need to carry a notice indicating
# that they have been altered from the originals.
from .operators import DynamicalOperator, Id, Zero
from typing import Any
import numpy as np
"""
These are the predefined operators available for creating dynamical system simulations.
Currently implemented are spin operators and truncated harmonic oscillator operators,
in addition to some more general operators, all realizing numpy matrices by default.
The classes ``Id`` and ``Zero`` are implemented separately, in the file operators.py
together with the base class ``DynamicalOperator``.
Currently supported:
Identity, Zero, and a standard basis Projector in any dimension.
Spin-1/2 (qubit) operators: x, y, z, sigma^(+/-) and six projectors on the state with +/- eigenvalues
along the axes.
Oscillator canonical operators, ladder operators, number and number^2 operators, in any dimension.
"""
class Projector(DynamicalOperator):
"""A dynamical operator that builds a numpy projector matrix in the standard basis."""
def __init__(self, system_id="", row=0, col=0):
self._row = row
self._col = col
super().__init__(system_id, "proj" + str(row) + "_" + str(col))
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
result = np.zeros((dim, dim), complex)
row = self._row
col = self._col
if row < dim and col < dim:
result[row, col] = 1
return result
raise Exception(
            f"A projector with row = {row} and column = {col} "
f"is incompatible with matrix generation of dimension {dim}."
)
class Sx(DynamicalOperator):
"""A dynamical operator that builds a numpy Pauli x matrix."""
def __init__(self, system_id=""):
super().__init__(system_id, "x")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "x":
return np.asarray([[0, 1], [1, 0]], complex)
super().get_operator_matrix(dim)
class Sy(DynamicalOperator):
"""A dynamical operator that builds a numpy Pauli y matrix."""
def __init__(self, system_id=""):
super().__init__(system_id, "y")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "y":
return np.asarray([[0, -1j], [1j, 0]], complex)
super().get_operator_matrix(dim)
class Sz(DynamicalOperator):
"""A dynamical operator that builds a numpy Pauli z matrix."""
def __init__(self, system_id=""):
super().__init__(system_id, "z")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "z":
return np.asarray([[1, 0], [0, -1]], complex)
super().get_operator_matrix(dim)
class Sp(DynamicalOperator):
"""A dynamical operator that builds a numpy Pauli ladder |1><0| matrix."""
def __init__(self, system_id=""):
super().__init__(system_id, "sp")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "sp":
return np.asarray([[0, 1], [0, 0]], complex)
super().get_operator_matrix(dim)
class Sm(DynamicalOperator):
"""A dynamical operator that builds a numpy Pauli ladder |0><1| matrix."""
def __init__(self, system_id=""):
super().__init__(system_id, "sm")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "sm":
return np.asarray([[0, 0], [1, 0]], complex)
super().get_operator_matrix(dim)
class PlusZ(DynamicalOperator):
"""A dynamical operator that builds a numpy density matrix for Up (|0><0|)."""
def __init__(self, system_id=""):
super().__init__(system_id, "0")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "0":
return np.asarray([[1, 0], [0, 0]], complex)
super().get_operator_matrix(dim)
class MinusZ(DynamicalOperator):
"""A dynamical operator that builds a numpy density matrix for Down (|1><1|)."""
def __init__(self, system_id=""):
super().__init__(system_id, "1")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "1":
return np.asarray([[0, 0], [0, 1]], complex)
super().get_operator_matrix(dim)
class PlusX(DynamicalOperator):
"""A dynamical operator that builds a numpy density matrix for plus state (|+><+|)."""
def __init__(self, system_id=""):
super().__init__(system_id, "+")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "+":
return np.asarray([[0.5, 0.5], [0.5, 0.5]], complex)
super().get_operator_matrix(dim)
class MinusX(DynamicalOperator):
"""A dynamical operator that builds a numpy density matrix for minus state (|-><-|)."""
def __init__(self, system_id=""):
super().__init__(system_id, "-")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "-":
return np.asarray([[0.5, -0.5], [-0.5, 0.5]], complex)
super().get_operator_matrix(dim)
class PlusY(DynamicalOperator):
"""A dynamical operator that builds a numpy density matrix for right y (|i><i|)."""
def __init__(self, system_id=""):
super().__init__(system_id, "r")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "r":
return np.asarray([[0.5, -0.5j], [0.5j, 0.5]], complex)
super().get_operator_matrix(dim)
class MinusY(DynamicalOperator):
"""A dynamical operator that builds a numpy density matrix for left y (|-i><-i|)."""
def __init__(self, system_id=""):
super().__init__(system_id, "l")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if dim == 2 and self.s_type == "l":
return np.asarray([[0.5, 0.5j], [-0.5j, 0.5]], complex)
super().get_operator_matrix(dim)
class On(DynamicalOperator):
    """A dynamical operator that builds a numpy matrix for the oscillator number operator."""
def __init__(self, system_id=""):
super().__init__(system_id, "n")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if self.s_type == "n":
return np.diag(np.asarray(range(dim), complex))
super().get_operator_matrix(dim)
class On2(DynamicalOperator):
    """A dynamical operator that builds a numpy matrix for the oscillator n^2 operator."""
def __init__(self, system_id=""):
super().__init__(system_id, "n2")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if self.s_type == "n2":
return np.diag(np.asarray(range(dim), complex) ** 2)
super().get_operator_matrix(dim)
class Oa(DynamicalOperator):
    """A dynamical operator that builds a numpy matrix for the oscillator annihilation (lowering) operator."""
def __init__(self, system_id=""):
super().__init__(system_id, "a")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if self.s_type == "a":
return np.diag(np.asarray(range(1, dim), complex) ** 0.5, 1)
super().get_operator_matrix(dim)
class Oa_(DynamicalOperator):
    """A dynamical operator that builds a numpy matrix for the oscillator creation (raising) operator."""
    def __init__(self, system_id=""):
        super().__init__(system_id, "a_")
    def get_operator_matrix(self, dim: int) -> Any:
        """Returns a matrix describing a realization of the operator specified in the parameters.
        Args:
            dim: The physical dimension of the matrix to generate.
        """
        if self.s_type == "a_":
            return np.diag(np.asarray(range(1, dim), complex) ** 0.5, -1)
        super().get_operator_matrix(dim)
class Oq(DynamicalOperator):
    """A dynamical operator that builds a numpy matrix for the oscillator position quadrature operator q."""
def __init__(self, system_id=""):
super().__init__(system_id, "q")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if self.s_type == "q":
return (0.5**0.5) * (
np.diag(np.asarray(range(1, dim), complex) ** 0.5, 1)
+ np.diag(np.asarray(range(1, dim), complex) ** 0.5, -1)
)
super().get_operator_matrix(dim)
class Op(DynamicalOperator):
    """A dynamical operator that builds a numpy matrix for the oscillator momentum quadrature operator p."""
def __init__(self, system_id=""):
super().__init__(system_id, "p")
def get_operator_matrix(self, dim: int) -> Any:
"""Returns a matrix describing a realization of the operator specified in the parameters.
Args:
dim: The physical dimension of the matrix to generate.
"""
if self.s_type == "p":
return (
-1j
* (0.5**0.5)
* (
np.diag(np.asarray(range(1, dim), complex) ** 0.5, 1)
- np.diag(np.asarray(range(1, dim), complex) ** 0.5, -1)
)
)
super().get_operator_matrix(dim)
def get_operator_from_label(s_op: str, system_id=""):
s_op = s_op.lower()
if s_op == "i":
return Id(system_id)
if s_op == "zero":
return Zero(system_id)
elif s_op == "x":
return Sx(system_id)
elif s_op == "y":
return Sy(system_id)
elif s_op == "z":
return Sz(system_id)
elif s_op == "sp":
return Sp(system_id)
elif s_op == "sm":
return Sm(system_id)
elif s_op == "+z":
return PlusZ(system_id)
elif s_op == "-z":
return MinusZ(system_id)
elif s_op == "+y":
return PlusY(system_id)
elif s_op == "-y":
return MinusY(system_id)
elif s_op == "+x":
return PlusX(system_id)
elif s_op == "-x":
return MinusX(system_id)
else:
raise Exception(f"Unsupported operator label: {s_op}.")
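The ladder-operator matrices built by `Oa`, `Oa_`, and `On` above satisfy the usual truncated-oscillator identities. A self-contained numpy check, rebuilding the same matrices directly rather than importing this module:

```python
import numpy as np

dim = 5
# Annihilation operator: sqrt(1..dim-1) on the superdiagonal (as in Oa).
a = np.diag(np.asarray(range(1, dim), complex) ** 0.5, 1)
# Creation operator: the same entries on the subdiagonal (as in Oa_).
a_dag = np.diag(np.asarray(range(1, dim), complex) ** 0.5, -1)
# Number operator n = a_dag @ a: diagonal entries 0..dim-1 (as in On).
n = a_dag @ a
# The canonical commutator [a, a_dag] = I holds everywhere except the
# last diagonal entry, where truncation to finite dimension breaks it.
comm = a @ a_dag - a_dag @ a
```

The broken last entry (`1 - dim` instead of `1`) is the standard price of truncating the oscillator Hilbert space, which is worth keeping in mind when choosing `dim` for a simulation.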
| 35.062827 | 101 | 0.616246 | 1,791 | 13,394 | 4.437186 | 0.106644 | 0.050333 | 0.074871 | 0.079275 | 0.797156 | 0.782559 | 0.778407 | 0.749717 | 0.729458 | 0.679628 | 0 | 0.013344 | 0.26706 | 13,394 | 381 | 102 | 35.154856 | 0.79617 | 0.345603 | 0 | 0.325843 | 0 | 0 | 0.028437 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.207865 | false | 0 | 0.016854 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
af51d65e7c2e66c8dc0df7d576fd5103c57baf80 | 63 | py | Python | models/__init__.py | gloriatao/mcgproject | d085d9bc978bd086eb4dc9d28c7821eed401be70 | [
"MIT"
] | 1 | 2022-01-22T00:59:24.000Z | 2022-01-22T00:59:24.000Z | models/__init__.py | gloriatao/mcgproject | d085d9bc978bd086eb4dc9d28c7821eed401be70 | [
"MIT"
] | null | null | null | models/__init__.py | gloriatao/mcgproject | d085d9bc978bd086eb4dc9d28c7821eed401be70 | [
"MIT"
] | null | null | null | from models.mcg_bert import mcg_bert, Criterion3, Postprocess3
| 31.5 | 62 | 0.857143 | 9 | 63 | 5.777778 | 0.777778 | 0.269231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035088 | 0.095238 | 63 | 1 | 63 | 63 | 0.877193 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
af62b3c87f650057abe53cab96a88b0b592493ff | 66 | py | Python | survae/utils/__init__.py | alisiahkoohi/survae_flows | e1747b05524c7ab540a211ed360ab3e67bc3e96d | [
"MIT"
] | 262 | 2020-07-05T20:57:44.000Z | 2022-03-28T02:24:43.000Z | survae/utils/__init__.py | alisiahkoohi/survae_flows | e1747b05524c7ab540a211ed360ab3e67bc3e96d | [
"MIT"
] | 17 | 2020-08-15T05:43:34.000Z | 2022-01-31T12:24:21.000Z | survae/utils/__init__.py | alisiahkoohi/survae_flows | e1747b05524c7ab540a211ed360ab3e67bc3e96d | [
"MIT"
] | 35 | 2020-08-24T06:55:37.000Z | 2022-02-11T05:17:58.000Z | from .tensors import *
from .context import *
from .loss import *
| 16.5 | 22 | 0.727273 | 9 | 66 | 5.333333 | 0.555556 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 66 | 3 | 23 | 22 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
afcbd252cffb07ecc2ff364de090eca321e85c30 | 38,031 | py | Python | instances/passenger_demand/pas-20210421-2109-int18e/24.py | LHcau/scheduling-shared-passenger-and-freight-transport-on-a-fixed-infrastructure | bba1e6af5bc8d9deaa2dc3b83f6fe9ddf15d2a11 | [
"BSD-3-Clause"
] | null | null | null | instances/passenger_demand/pas-20210421-2109-int18e/24.py | LHcau/scheduling-shared-passenger-and-freight-transport-on-a-fixed-infrastructure | bba1e6af5bc8d9deaa2dc3b83f6fe9ddf15d2a11 | [
"BSD-3-Clause"
] | null | null | null | instances/passenger_demand/pas-20210421-2109-int18e/24.py | LHcau/scheduling-shared-passenger-and-freight-transport-on-a-fixed-infrastructure | bba1e6af5bc8d9deaa2dc3b83f6fe9ddf15d2a11 | [
"BSD-3-Clause"
] | null | null | null |
"""
PASSENGERS
"""
numPassengers = 4167
passenger_arriving = (
(5, 13, 4, 5, 2, 0, 8, 8, 4, 4, 4, 0), # 0
(4, 9, 5, 5, 1, 0, 13, 12, 11, 8, 3, 0), # 1
(8, 8, 9, 5, 3, 0, 11, 6, 7, 8, 4, 0), # 2
(5, 10, 7, 3, 3, 0, 11, 11, 8, 3, 4, 0), # 3
(10, 10, 5, 7, 2, 0, 11, 18, 7, 5, 2, 0), # 4
(9, 12, 12, 7, 4, 0, 9, 18, 7, 5, 1, 0), # 5
(4, 9, 8, 8, 3, 0, 10, 17, 6, 4, 1, 0), # 6
(3, 13, 6, 5, 1, 0, 8, 8, 3, 4, 3, 0), # 7
(6, 20, 9, 12, 2, 0, 9, 10, 5, 1, 2, 0), # 8
(5, 9, 11, 10, 1, 0, 11, 8, 3, 7, 2, 0), # 9
(5, 20, 7, 4, 4, 0, 9, 14, 8, 2, 3, 0), # 10
(4, 12, 10, 5, 4, 0, 10, 4, 4, 10, 2, 0), # 11
(5, 16, 7, 6, 3, 0, 7, 13, 4, 5, 2, 0), # 12
(4, 7, 7, 2, 3, 0, 6, 5, 7, 4, 3, 0), # 13
(9, 9, 9, 7, 2, 0, 8, 9, 5, 7, 3, 0), # 14
(8, 9, 16, 2, 3, 0, 3, 10, 8, 4, 3, 0), # 15
(2, 12, 10, 4, 1, 0, 8, 11, 5, 7, 5, 0), # 16
(2, 13, 5, 3, 3, 0, 10, 11, 4, 11, 4, 0), # 17
(3, 9, 10, 3, 1, 0, 11, 9, 5, 4, 2, 0), # 18
(8, 17, 12, 4, 1, 0, 10, 9, 8, 9, 3, 0), # 19
(5, 9, 8, 3, 4, 0, 10, 12, 6, 4, 5, 0), # 20
(7, 10, 6, 3, 2, 0, 13, 11, 5, 5, 5, 0), # 21
(6, 8, 7, 2, 3, 0, 14, 9, 2, 11, 1, 0), # 22
(6, 11, 10, 9, 3, 0, 9, 9, 5, 8, 4, 0), # 23
(8, 14, 12, 5, 6, 0, 9, 9, 7, 4, 1, 0), # 24
(6, 15, 7, 6, 3, 0, 6, 14, 5, 5, 1, 0), # 25
(8, 9, 11, 12, 4, 0, 15, 7, 1, 9, 2, 0), # 26
(8, 8, 11, 6, 2, 0, 5, 15, 10, 14, 2, 0), # 27
(9, 16, 12, 4, 10, 0, 9, 10, 5, 5, 3, 0), # 28
(5, 18, 3, 3, 2, 0, 8, 12, 11, 6, 3, 0), # 29
(6, 10, 7, 5, 2, 0, 10, 9, 10, 6, 4, 0), # 30
(7, 17, 11, 2, 3, 0, 12, 10, 7, 9, 4, 0), # 31
(5, 14, 13, 6, 4, 0, 6, 13, 5, 8, 3, 0), # 32
(6, 15, 12, 3, 2, 0, 7, 5, 4, 7, 2, 0), # 33
(9, 14, 4, 7, 3, 0, 7, 4, 5, 4, 1, 0), # 34
(3, 16, 4, 5, 4, 0, 12, 15, 8, 6, 2, 0), # 35
(5, 9, 9, 2, 5, 0, 5, 8, 9, 13, 6, 0), # 36
(3, 15, 11, 6, 2, 0, 4, 13, 9, 6, 2, 0), # 37
(9, 10, 7, 6, 4, 0, 10, 13, 13, 5, 3, 0), # 38
(3, 14, 11, 7, 6, 0, 11, 9, 8, 5, 0, 0), # 39
(2, 14, 11, 5, 4, 0, 10, 8, 4, 7, 3, 0), # 40
(8, 20, 5, 8, 1, 0, 4, 8, 13, 7, 2, 0), # 41
(9, 10, 12, 8, 5, 0, 9, 13, 12, 6, 1, 0), # 42
(2, 12, 11, 4, 3, 0, 7, 10, 5, 8, 3, 0), # 43
(6, 13, 8, 1, 2, 0, 4, 10, 12, 2, 0, 0), # 44
(4, 10, 8, 7, 4, 0, 4, 20, 6, 9, 1, 0), # 45
(9, 15, 10, 3, 3, 0, 5, 7, 8, 7, 1, 0), # 46
(4, 11, 12, 4, 4, 0, 12, 14, 3, 4, 4, 0), # 47
(7, 10, 8, 7, 6, 0, 8, 13, 3, 9, 2, 0), # 48
(7, 12, 6, 10, 2, 0, 5, 14, 7, 9, 2, 0), # 49
(6, 17, 12, 7, 3, 0, 9, 15, 2, 7, 3, 0), # 50
(5, 14, 10, 9, 3, 0, 9, 8, 5, 5, 3, 0), # 51
(3, 14, 6, 8, 4, 0, 8, 8, 10, 3, 3, 0), # 52
(1, 11, 14, 6, 4, 0, 10, 10, 10, 8, 5, 0), # 53
(5, 18, 9, 4, 4, 0, 7, 9, 4, 3, 1, 0), # 54
(5, 16, 12, 6, 1, 0, 10, 14, 4, 6, 7, 0), # 55
(7, 12, 7, 1, 5, 0, 12, 8, 4, 5, 2, 0), # 56
(5, 18, 12, 8, 0, 0, 6, 12, 8, 4, 1, 0), # 57
(7, 9, 10, 10, 1, 0, 8, 14, 10, 9, 1, 0), # 58
(0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0), # 59
)
station_arriving_intensity = (
(4.769372805092186, 12.233629261363635, 14.389624839331619, 11.405298913043477, 12.857451923076923, 8.562228260869567), # 0
(4.81413961808604, 12.369674877683082, 14.46734796754499, 11.46881589673913, 12.953819711538461, 8.559309850543478), # 1
(4.8583952589991215, 12.503702525252525, 14.54322622107969, 11.530934782608696, 13.048153846153847, 8.556302173913043), # 2
(4.902102161984196, 12.635567578125, 14.617204169344474, 11.591602581521737, 13.14036778846154, 8.553205638586958), # 3
(4.94522276119403, 12.765125410353535, 14.689226381748071, 11.650766304347826, 13.230375, 8.550020652173911), # 4
(4.987719490781387, 12.892231395991162, 14.759237427699228, 11.708372961956522, 13.318088942307691, 8.546747622282608), # 5
(5.029554784899035, 13.01674090909091, 14.827181876606687, 11.764369565217393, 13.403423076923078, 8.54338695652174), # 6
(5.0706910776997365, 13.138509323705808, 14.893004297879177, 11.818703125, 13.486290865384618, 8.5399390625), # 7
(5.1110908033362605, 13.257392013888888, 14.956649260925452, 11.871320652173912, 13.56660576923077, 8.536404347826087), # 8
(5.1507163959613695, 13.373244353693181, 15.018061335154243, 11.922169157608696, 13.644281249999999, 8.532783220108696), # 9
(5.1895302897278315, 13.485921717171717, 15.077185089974291, 11.971195652173915, 13.719230769230771, 8.529076086956522), # 10
(5.227494918788412, 13.595279478377526, 15.133965094794343, 12.018347146739131, 13.791367788461539, 8.525283355978262), # 11
(5.2645727172958745, 13.701173011363636, 15.188345919023137, 12.063570652173912, 13.860605769230768, 8.521405434782608), # 12
(5.3007261194029835, 13.803457690183082, 15.240272132069407, 12.106813179347826, 13.926858173076925, 8.51744273097826), # 13
(5.335917559262511, 13.90198888888889, 15.289688303341899, 12.148021739130433, 13.99003846153846, 8.513395652173912), # 14
(5.370109471027217, 13.996621981534089, 15.336539002249355, 12.187143342391304, 14.050060096153846, 8.509264605978261), # 15
(5.403264288849868, 14.087212342171718, 15.380768798200515, 12.224124999999999, 14.10683653846154, 8.50505), # 16
(5.4353444468832315, 14.173615344854797, 15.422322260604112, 12.258913722826087, 14.16028125, 8.500752241847827), # 17
(5.46631237928007, 14.255686363636363, 15.461143958868895, 12.291456521739132, 14.210307692307696, 8.496371739130435), # 18
(5.496130520193152, 14.333280772569443, 15.4971784624036, 12.321700407608695, 14.256829326923079, 8.491908899456522), # 19
(5.524761303775241, 14.40625394570707, 15.530370340616965, 12.349592391304348, 14.299759615384616, 8.487364130434782), # 20
(5.552167164179106, 14.47446125710227, 15.56066416291774, 12.375079483695652, 14.339012019230768, 8.482737839673913), # 21
(5.578310535557506, 14.537758080808082, 15.588004498714653, 12.398108695652175, 14.374499999999998, 8.47803043478261), # 22
(5.603153852063214, 14.595999790877526, 15.612335917416454, 12.418627038043478, 14.40613701923077, 8.473242323369567), # 23
(5.62665954784899, 14.649041761363636, 15.633602988431875, 12.43658152173913, 14.433836538461538, 8.468373913043479), # 24
(5.648790057067603, 14.696739366319445, 15.651750281169667, 12.451919157608696, 14.457512019230768, 8.463425611413044), # 25
(5.669507813871817, 14.738947979797977, 15.66672236503856, 12.464586956521739, 14.477076923076922, 8.458397826086957), # 26
(5.688775252414398, 14.77552297585227, 15.6784638094473, 12.474531929347828, 14.492444711538463, 8.453290964673915), # 27
(5.7065548068481124, 14.806319728535353, 15.68691918380463, 12.481701086956523, 14.503528846153845, 8.448105434782608), # 28
(5.722808911325724, 14.831193611900254, 15.69203305751928, 12.486041440217392, 14.510242788461538, 8.44284164402174), # 29
(5.7375, 14.85, 15.69375, 12.4875, 14.512500000000001, 8.4375), # 30
(5.751246651214834, 14.865621839488634, 15.692462907608693, 12.487236580882353, 14.511678590425532, 8.430077267616193), # 31
(5.7646965153452685, 14.881037215909092, 15.68863804347826, 12.486451470588234, 14.509231914893617, 8.418644565217393), # 32
(5.777855634590792, 14.896244211647728, 15.682330027173915, 12.485152389705883, 14.50518630319149, 8.403313830584706), # 33
(5.790730051150895, 14.91124090909091, 15.67359347826087, 12.483347058823531, 14.499568085106382, 8.38419700149925), # 34
(5.803325807225064, 14.926025390624996, 15.662483016304348, 12.481043198529411, 14.492403590425532, 8.361406015742128), # 35
(5.815648945012788, 14.940595738636366, 15.649053260869564, 12.478248529411767, 14.48371914893617, 8.335052811094453), # 36
(5.8277055067135555, 14.954950035511365, 15.63335883152174, 12.474970772058823, 14.47354109042553, 8.305249325337332), # 37
(5.839501534526853, 14.969086363636364, 15.615454347826088, 12.471217647058824, 14.461895744680852, 8.272107496251873), # 38
(5.851043070652174, 14.983002805397728, 15.595394429347825, 12.466996875000001, 14.44880944148936, 8.23573926161919), # 39
(5.862336157289003, 14.99669744318182, 15.573233695652176, 12.462316176470589, 14.434308510638296, 8.196256559220389), # 40
(5.873386836636828, 15.010168359374997, 15.549026766304348, 12.457183272058824, 14.418419281914893, 8.153771326836583), # 41
(5.88420115089514, 15.023413636363639, 15.522828260869566, 12.451605882352942, 14.401168085106384, 8.108395502248875), # 42
(5.894785142263428, 15.03643135653409, 15.494692798913043, 12.445591727941178, 14.38258125, 8.060241023238381), # 43
(5.905144852941176, 15.049219602272727, 15.464675, 12.439148529411764, 14.36268510638298, 8.009419827586207), # 44
(5.915286325127877, 15.061776455965909, 15.432829483695656, 12.43228400735294, 14.341505984042554, 7.956043853073464), # 45
(5.925215601023019, 15.074100000000003, 15.39921086956522, 12.425005882352941, 14.319070212765958, 7.90022503748126), # 46
(5.934938722826087, 15.086188316761364, 15.363873777173913, 12.417321874999999, 14.295404122340427, 7.842075318590705), # 47
(5.944461732736574, 15.098039488636365, 15.326872826086957, 12.409239705882353, 14.27053404255319, 7.7817066341829095), # 48
(5.953790672953963, 15.10965159801136, 15.288262635869566, 12.400767095588236, 14.24448630319149, 7.71923092203898), # 49
(5.96293158567775, 15.121022727272724, 15.248097826086958, 12.391911764705883, 14.217287234042553, 7.65476011994003), # 50
(5.971890513107417, 15.132150958806818, 15.206433016304347, 12.38268143382353, 14.188963164893616, 7.588406165667167), # 51
(5.980673497442456, 15.143034375, 15.163322826086954, 12.373083823529411, 14.159540425531915, 7.5202809970015), # 52
(5.989286580882353, 15.153671058238638, 15.118821875, 12.363126654411765, 14.129045345744682, 7.450496551724138), # 53
(5.9977358056266, 15.164059090909088, 15.072984782608694, 12.352817647058824, 14.09750425531915, 7.379164767616192), # 54
(6.00602721387468, 15.174196555397728, 15.02586616847826, 12.342164522058825, 14.064943484042553, 7.306397582458771), # 55
(6.014166847826087, 15.184081534090907, 14.977520652173913, 12.331175, 14.031389361702129, 7.232306934032984), # 56
(6.022160749680308, 15.193712109375003, 14.92800285326087, 12.319856801470587, 13.996868218085105, 7.15700476011994), # 57
(6.030014961636829, 15.203086363636363, 14.877367391304347, 12.308217647058825, 13.961406382978723, 7.0806029985007495), # 58
(0.0, 0.0, 0.0, 0.0, 0.0, 0.0), # 59
)
passenger_arriving_acc = (
(5, 13, 4, 5, 2, 0, 8, 8, 4, 4, 4, 0), # 0
(9, 22, 9, 10, 3, 0, 21, 20, 15, 12, 7, 0), # 1
(17, 30, 18, 15, 6, 0, 32, 26, 22, 20, 11, 0), # 2
(22, 40, 25, 18, 9, 0, 43, 37, 30, 23, 15, 0), # 3
(32, 50, 30, 25, 11, 0, 54, 55, 37, 28, 17, 0), # 4
(41, 62, 42, 32, 15, 0, 63, 73, 44, 33, 18, 0), # 5
(45, 71, 50, 40, 18, 0, 73, 90, 50, 37, 19, 0), # 6
(48, 84, 56, 45, 19, 0, 81, 98, 53, 41, 22, 0), # 7
(54, 104, 65, 57, 21, 0, 90, 108, 58, 42, 24, 0), # 8
(59, 113, 76, 67, 22, 0, 101, 116, 61, 49, 26, 0), # 9
(64, 133, 83, 71, 26, 0, 110, 130, 69, 51, 29, 0), # 10
(68, 145, 93, 76, 30, 0, 120, 134, 73, 61, 31, 0), # 11
(73, 161, 100, 82, 33, 0, 127, 147, 77, 66, 33, 0), # 12
(77, 168, 107, 84, 36, 0, 133, 152, 84, 70, 36, 0), # 13
(86, 177, 116, 91, 38, 0, 141, 161, 89, 77, 39, 0), # 14
(94, 186, 132, 93, 41, 0, 144, 171, 97, 81, 42, 0), # 15
(96, 198, 142, 97, 42, 0, 152, 182, 102, 88, 47, 0), # 16
(98, 211, 147, 100, 45, 0, 162, 193, 106, 99, 51, 0), # 17
(101, 220, 157, 103, 46, 0, 173, 202, 111, 103, 53, 0), # 18
(109, 237, 169, 107, 47, 0, 183, 211, 119, 112, 56, 0), # 19
(114, 246, 177, 110, 51, 0, 193, 223, 125, 116, 61, 0), # 20
(121, 256, 183, 113, 53, 0, 206, 234, 130, 121, 66, 0), # 21
(127, 264, 190, 115, 56, 0, 220, 243, 132, 132, 67, 0), # 22
(133, 275, 200, 124, 59, 0, 229, 252, 137, 140, 71, 0), # 23
(141, 289, 212, 129, 65, 0, 238, 261, 144, 144, 72, 0), # 24
(147, 304, 219, 135, 68, 0, 244, 275, 149, 149, 73, 0), # 25
(155, 313, 230, 147, 72, 0, 259, 282, 150, 158, 75, 0), # 26
(163, 321, 241, 153, 74, 0, 264, 297, 160, 172, 77, 0), # 27
(172, 337, 253, 157, 84, 0, 273, 307, 165, 177, 80, 0), # 28
(177, 355, 256, 160, 86, 0, 281, 319, 176, 183, 83, 0), # 29
(183, 365, 263, 165, 88, 0, 291, 328, 186, 189, 87, 0), # 30
(190, 382, 274, 167, 91, 0, 303, 338, 193, 198, 91, 0), # 31
(195, 396, 287, 173, 95, 0, 309, 351, 198, 206, 94, 0), # 32
(201, 411, 299, 176, 97, 0, 316, 356, 202, 213, 96, 0), # 33
(210, 425, 303, 183, 100, 0, 323, 360, 207, 217, 97, 0), # 34
(213, 441, 307, 188, 104, 0, 335, 375, 215, 223, 99, 0), # 35
(218, 450, 316, 190, 109, 0, 340, 383, 224, 236, 105, 0), # 36
(221, 465, 327, 196, 111, 0, 344, 396, 233, 242, 107, 0), # 37
(230, 475, 334, 202, 115, 0, 354, 409, 246, 247, 110, 0), # 38
(233, 489, 345, 209, 121, 0, 365, 418, 254, 252, 110, 0), # 39
(235, 503, 356, 214, 125, 0, 375, 426, 258, 259, 113, 0), # 40
(243, 523, 361, 222, 126, 0, 379, 434, 271, 266, 115, 0), # 41
(252, 533, 373, 230, 131, 0, 388, 447, 283, 272, 116, 0), # 42
(254, 545, 384, 234, 134, 0, 395, 457, 288, 280, 119, 0), # 43
(260, 558, 392, 235, 136, 0, 399, 467, 300, 282, 119, 0), # 44
(264, 568, 400, 242, 140, 0, 403, 487, 306, 291, 120, 0), # 45
(273, 583, 410, 245, 143, 0, 408, 494, 314, 298, 121, 0), # 46
(277, 594, 422, 249, 147, 0, 420, 508, 317, 302, 125, 0), # 47
(284, 604, 430, 256, 153, 0, 428, 521, 320, 311, 127, 0), # 48
(291, 616, 436, 266, 155, 0, 433, 535, 327, 320, 129, 0), # 49
(297, 633, 448, 273, 158, 0, 442, 550, 329, 327, 132, 0), # 50
(302, 647, 458, 282, 161, 0, 451, 558, 334, 332, 135, 0), # 51
(305, 661, 464, 290, 165, 0, 459, 566, 344, 335, 138, 0), # 52
(306, 672, 478, 296, 169, 0, 469, 576, 354, 343, 143, 0), # 53
(311, 690, 487, 300, 173, 0, 476, 585, 358, 346, 144, 0), # 54
(316, 706, 499, 306, 174, 0, 486, 599, 362, 352, 151, 0), # 55
(323, 718, 506, 307, 179, 0, 498, 607, 366, 357, 153, 0), # 56
(328, 736, 518, 315, 179, 0, 504, 619, 374, 361, 154, 0), # 57
(335, 745, 528, 325, 180, 0, 512, 633, 384, 370, 155, 0), # 58
(335, 745, 528, 325, 180, 0, 512, 633, 384, 370, 155, 0), # 59
)
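The rows of `passenger_arriving_acc` appear to be cumulative arrival counts per time step (one column per stop), so per-step arrivals can be recovered by differencing consecutive rows — a minimal sketch, assuming that interpretation:

```python
# Sketch (assumed semantics): row t of passenger_arriving_acc holds the
# cumulative arrivals per column up to step t, so the arrivals that occur
# during step t are the element-wise difference of rows t and t-1.
def arrivals_in_step(acc, t):
    """Return the arrivals that occurred during step t."""
    if t == 0:
        return acc[0]
    return tuple(a - b for a, b in zip(acc[t], acc[t - 1]))

# First two rows of the table above:
acc = (
    (5, 13, 4, 5, 2, 0, 8, 8, 4, 4, 4, 0),
    (9, 22, 9, 10, 3, 0, 21, 20, 15, 12, 7, 0),
)
```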
passenger_arriving_rate = (
(4.769372805092186, 9.786903409090908, 8.63377490359897, 4.56211956521739, 2.5714903846153843, 0.0, 8.562228260869567, 10.285961538461537, 6.843179347826086, 5.755849935732647, 2.446725852272727, 0.0), # 0
(4.81413961808604, 9.895739902146465, 8.680408780526994, 4.587526358695651, 2.5907639423076922, 0.0, 8.559309850543478, 10.363055769230769, 6.881289538043478, 5.786939187017995, 2.4739349755366162, 0.0), # 1
(4.8583952589991215, 10.00296202020202, 8.725935732647814, 4.612373913043478, 2.609630769230769, 0.0, 8.556302173913043, 10.438523076923076, 6.918560869565217, 5.817290488431875, 2.500740505050505, 0.0), # 2
(4.902102161984196, 10.1084540625, 8.770322501606683, 4.636641032608694, 2.628073557692308, 0.0, 8.553205638586958, 10.512294230769232, 6.954961548913042, 5.846881667737789, 2.527113515625, 0.0), # 3
(4.94522276119403, 10.212100328282828, 8.813535829048842, 4.66030652173913, 2.6460749999999997, 0.0, 8.550020652173911, 10.584299999999999, 6.990459782608696, 5.875690552699228, 2.553025082070707, 0.0), # 4
(4.987719490781387, 10.313785116792928, 8.855542456619537, 4.6833491847826085, 2.663617788461538, 0.0, 8.546747622282608, 10.654471153846153, 7.025023777173913, 5.90369497107969, 2.578446279198232, 0.0), # 5
(5.029554784899035, 10.413392727272727, 8.896309125964011, 4.705747826086957, 2.680684615384615, 0.0, 8.54338695652174, 10.72273846153846, 7.058621739130436, 5.930872750642674, 2.603348181818182, 0.0), # 6
(5.0706910776997365, 10.510807458964646, 8.935802578727506, 4.72748125, 2.697258173076923, 0.0, 8.5399390625, 10.789032692307693, 7.0912218750000005, 5.95720171915167, 2.6277018647411614, 0.0), # 7
(5.1110908033362605, 10.60591361111111, 8.97398955655527, 4.7485282608695645, 2.7133211538461537, 0.0, 8.536404347826087, 10.853284615384615, 7.122792391304347, 5.982659704370181, 2.6514784027777774, 0.0), # 8
(5.1507163959613695, 10.698595482954543, 9.010836801092546, 4.768867663043478, 2.7288562499999993, 0.0, 8.532783220108696, 10.915424999999997, 7.153301494565217, 6.007224534061697, 2.6746488707386358, 0.0), # 9
(5.1895302897278315, 10.788737373737373, 9.046311053984574, 4.7884782608695655, 2.743846153846154, 0.0, 8.529076086956522, 10.975384615384616, 7.182717391304348, 6.030874035989716, 2.697184343434343, 0.0), # 10
(5.227494918788412, 10.87622358270202, 9.080379056876605, 4.807338858695652, 2.7582735576923074, 0.0, 8.525283355978262, 11.03309423076923, 7.2110082880434785, 6.053586037917737, 2.719055895675505, 0.0), # 11
(5.2645727172958745, 10.960938409090907, 9.113007551413881, 4.825428260869565, 2.7721211538461534, 0.0, 8.521405434782608, 11.088484615384614, 7.238142391304347, 6.0753383676092545, 2.740234602272727, 0.0), # 12
(5.3007261194029835, 11.042766152146465, 9.144163279241644, 4.8427252717391305, 2.7853716346153847, 0.0, 8.51744273097826, 11.141486538461539, 7.264087907608696, 6.096108852827762, 2.760691538036616, 0.0), # 13
(5.335917559262511, 11.121591111111112, 9.173812982005138, 4.859208695652173, 2.7980076923076918, 0.0, 8.513395652173912, 11.192030769230767, 7.288813043478259, 6.115875321336759, 2.780397777777778, 0.0), # 14
(5.370109471027217, 11.19729758522727, 9.201923401349612, 4.874857336956521, 2.810012019230769, 0.0, 8.509264605978261, 11.240048076923076, 7.312286005434782, 6.134615600899742, 2.7993243963068175, 0.0), # 15
(5.403264288849868, 11.269769873737372, 9.228461278920308, 4.88965, 2.8213673076923076, 0.0, 8.50505, 11.28546923076923, 7.334474999999999, 6.152307519280206, 2.817442468434343, 0.0), # 16
(5.4353444468832315, 11.338892275883836, 9.253393356362468, 4.903565489130434, 2.83205625, 0.0, 8.500752241847827, 11.328225, 7.3553482336956515, 6.168928904241644, 2.834723068970959, 0.0), # 17
(5.46631237928007, 11.40454909090909, 9.276686375321336, 4.916582608695652, 2.842061538461539, 0.0, 8.496371739130435, 11.368246153846156, 7.374873913043479, 6.184457583547558, 2.8511372727272724, 0.0), # 18
(5.496130520193152, 11.466624618055553, 9.298307077442159, 4.928680163043477, 2.8513658653846155, 0.0, 8.491908899456522, 11.405463461538462, 7.393020244565217, 6.198871384961439, 2.866656154513888, 0.0), # 19
(5.524761303775241, 11.525003156565655, 9.318222204370178, 4.939836956521739, 2.859951923076923, 0.0, 8.487364130434782, 11.439807692307692, 7.409755434782609, 6.212148136246785, 2.8812507891414136, 0.0), # 20
(5.552167164179106, 11.579569005681815, 9.336398497750643, 4.95003179347826, 2.8678024038461536, 0.0, 8.482737839673913, 11.471209615384614, 7.425047690217391, 6.224265665167096, 2.894892251420454, 0.0), # 21
(5.578310535557506, 11.630206464646465, 9.352802699228791, 4.95924347826087, 2.8748999999999993, 0.0, 8.47803043478261, 11.499599999999997, 7.438865217391305, 6.235201799485861, 2.907551616161616, 0.0), # 22
(5.603153852063214, 11.67679983270202, 9.367401550449872, 4.967450815217391, 2.8812274038461534, 0.0, 8.473242323369567, 11.524909615384614, 7.451176222826087, 6.244934366966581, 2.919199958175505, 0.0), # 23
(5.62665954784899, 11.719233409090908, 9.380161793059125, 4.974632608695652, 2.8867673076923075, 0.0, 8.468373913043479, 11.54706923076923, 7.461948913043478, 6.25344119537275, 2.929808352272727, 0.0), # 24
(5.648790057067603, 11.757391493055556, 9.391050168701799, 4.980767663043478, 2.8915024038461534, 0.0, 8.463425611413044, 11.566009615384614, 7.471151494565217, 6.260700112467866, 2.939347873263889, 0.0), # 25
(5.669507813871817, 11.79115838383838, 9.400033419023135, 4.985834782608695, 2.8954153846153843, 0.0, 8.458397826086957, 11.581661538461537, 7.478752173913043, 6.266688946015424, 2.947789595959595, 0.0), # 26
(5.688775252414398, 11.820418380681815, 9.40707828566838, 4.989812771739131, 2.8984889423076923, 0.0, 8.453290964673915, 11.593955769230769, 7.484719157608696, 6.271385523778919, 2.9551045951704538, 0.0), # 27
(5.7065548068481124, 11.84505578282828, 9.412151510282778, 4.992680434782609, 2.9007057692307687, 0.0, 8.448105434782608, 11.602823076923075, 7.489020652173913, 6.274767673521851, 2.96126394570707, 0.0), # 28
(5.722808911325724, 11.864954889520202, 9.415219834511568, 4.994416576086956, 2.902048557692307, 0.0, 8.44284164402174, 11.608194230769229, 7.491624864130435, 6.276813223007712, 2.9662387223800506, 0.0), # 29
(5.7375, 11.879999999999999, 9.41625, 4.995, 2.9025, 0.0, 8.4375, 11.61, 7.4925, 6.277499999999999, 2.9699999999999998, 0.0), # 30
(5.751246651214834, 11.892497471590906, 9.415477744565216, 4.994894632352941, 2.9023357180851064, 0.0, 8.430077267616193, 11.609342872340426, 7.492341948529411, 6.276985163043476, 2.9731243678977264, 0.0), # 31
(5.7646965153452685, 11.904829772727274, 9.413182826086956, 4.994580588235293, 2.901846382978723, 0.0, 8.418644565217393, 11.607385531914892, 7.49187088235294, 6.275455217391303, 2.9762074431818184, 0.0), # 32
(5.777855634590792, 11.916995369318181, 9.40939801630435, 4.994060955882353, 2.9010372606382977, 0.0, 8.403313830584706, 11.60414904255319, 7.491091433823529, 6.272932010869566, 2.9792488423295453, 0.0), # 33
(5.790730051150895, 11.928992727272727, 9.40415608695652, 4.993338823529412, 2.899913617021276, 0.0, 8.38419700149925, 11.599654468085104, 7.490008235294118, 6.269437391304347, 2.9822481818181816, 0.0), # 34
(5.803325807225064, 11.940820312499996, 9.39748980978261, 4.9924172794117645, 2.898480718085106, 0.0, 8.361406015742128, 11.593922872340425, 7.488625919117647, 6.264993206521739, 2.985205078124999, 0.0), # 35
(5.815648945012788, 11.952476590909091, 9.389431956521738, 4.9912994117647065, 2.896743829787234, 0.0, 8.335052811094453, 11.586975319148936, 7.486949117647059, 6.259621304347825, 2.988119147727273, 0.0), # 36
(5.8277055067135555, 11.96396002840909, 9.380015298913044, 4.989988308823529, 2.8947082180851056, 0.0, 8.305249325337332, 11.578832872340422, 7.484982463235293, 6.253343532608695, 2.9909900071022726, 0.0), # 37
(5.839501534526853, 11.97526909090909, 9.369272608695653, 4.988487058823529, 2.89237914893617, 0.0, 8.272107496251873, 11.56951659574468, 7.4827305882352935, 6.246181739130434, 2.9938172727272727, 0.0), # 38
(5.851043070652174, 11.986402244318182, 9.357236657608695, 4.98679875, 2.8897618882978717, 0.0, 8.23573926161919, 11.559047553191487, 7.480198125, 6.23815777173913, 2.9966005610795454, 0.0), # 39
(5.862336157289003, 11.997357954545455, 9.343940217391305, 4.984926470588235, 2.886861702127659, 0.0, 8.196256559220389, 11.547446808510635, 7.477389705882353, 6.22929347826087, 2.999339488636364, 0.0), # 40
(5.873386836636828, 12.008134687499997, 9.329416059782607, 4.982873308823529, 2.8836838563829783, 0.0, 8.153771326836583, 11.534735425531913, 7.474309963235294, 6.219610706521738, 3.002033671874999, 0.0), # 41
(5.88420115089514, 12.01873090909091, 9.31369695652174, 4.980642352941176, 2.880233617021277, 0.0, 8.108395502248875, 11.520934468085107, 7.4709635294117644, 6.209131304347826, 3.0046827272727277, 0.0), # 42
(5.894785142263428, 12.02914508522727, 9.296815679347825, 4.978236691176471, 2.8765162499999994, 0.0, 8.060241023238381, 11.506064999999998, 7.467355036764706, 6.1978771195652165, 3.0072862713068176, 0.0), # 43
(5.905144852941176, 12.03937568181818, 9.278805, 4.975659411764705, 2.8725370212765955, 0.0, 8.009419827586207, 11.490148085106382, 7.4634891176470575, 6.1858699999999995, 3.009843920454545, 0.0), # 44
(5.915286325127877, 12.049421164772726, 9.259697690217394, 4.972913602941176, 2.8683011968085106, 0.0, 7.956043853073464, 11.473204787234042, 7.459370404411764, 6.1731317934782615, 3.0123552911931815, 0.0), # 45
(5.925215601023019, 12.059280000000001, 9.239526521739132, 4.970002352941176, 2.8638140425531913, 0.0, 7.90022503748126, 11.455256170212765, 7.455003529411765, 6.159684347826087, 3.0148200000000003, 0.0), # 46
(5.934938722826087, 12.06895065340909, 9.218324266304347, 4.966928749999999, 2.859080824468085, 0.0, 7.842075318590705, 11.43632329787234, 7.450393124999999, 6.145549510869564, 3.0172376633522724, 0.0), # 47
(5.944461732736574, 12.07843159090909, 9.196123695652174, 4.9636958823529405, 2.854106808510638, 0.0, 7.7817066341829095, 11.416427234042551, 7.445543823529412, 6.130749130434782, 3.0196078977272727, 0.0), # 48
(5.953790672953963, 12.087721278409088, 9.17295758152174, 4.960306838235294, 2.8488972606382976, 0.0, 7.71923092203898, 11.39558904255319, 7.4404602573529415, 6.115305054347826, 3.021930319602272, 0.0), # 49
(5.96293158567775, 12.096818181818177, 9.148858695652175, 4.956764705882353, 2.8434574468085105, 0.0, 7.65476011994003, 11.373829787234042, 7.43514705882353, 6.099239130434783, 3.0242045454545443, 0.0), # 50
(5.971890513107417, 12.105720767045453, 9.123859809782608, 4.953072573529411, 2.837792632978723, 0.0, 7.588406165667167, 11.351170531914892, 7.429608860294118, 6.082573206521738, 3.026430191761363, 0.0), # 51
(5.980673497442456, 12.114427499999998, 9.097993695652173, 4.949233529411764, 2.8319080851063827, 0.0, 7.5202809970015, 11.32763234042553, 7.4238502941176465, 6.065329130434781, 3.0286068749999995, 0.0), # 52
(5.989286580882353, 12.122936846590909, 9.071293125, 4.945250661764706, 2.8258090691489364, 0.0, 7.450496551724138, 11.303236276595745, 7.417875992647058, 6.04752875, 3.030734211647727, 0.0), # 53
(5.9977358056266, 12.13124727272727, 9.043790869565216, 4.941127058823529, 2.8195008510638297, 0.0, 7.379164767616192, 11.278003404255319, 7.411690588235294, 6.0291939130434775, 3.0328118181818176, 0.0), # 54
(6.00602721387468, 12.139357244318182, 9.015519701086955, 4.93686580882353, 2.8129886968085103, 0.0, 7.306397582458771, 11.251954787234041, 7.405298713235295, 6.010346467391304, 3.0348393110795455, 0.0), # 55
(6.014166847826087, 12.147265227272724, 8.986512391304348, 4.9324699999999995, 2.8062778723404254, 0.0, 7.232306934032984, 11.225111489361701, 7.398705, 5.991008260869565, 3.036816306818181, 0.0), # 56
(6.022160749680308, 12.154969687500001, 8.95680171195652, 4.927942720588234, 2.7993736436170207, 0.0, 7.15700476011994, 11.197494574468083, 7.391914080882352, 5.9712011413043475, 3.0387424218750003, 0.0), # 57
(6.030014961636829, 12.16246909090909, 8.926420434782608, 4.923287058823529, 2.792281276595744, 0.0, 7.0806029985007495, 11.169125106382976, 7.384930588235295, 5.950946956521738, 3.0406172727272724, 0.0), # 58
(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), # 59
)
passenger_allighting_rate = (
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 0
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 1
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 2
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 3
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 4
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 5
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 6
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 7
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 8
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 9
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 10
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 11
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 12
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 13
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 14
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 15
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 16
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 17
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 18
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 19
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 20
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 21
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 22
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 23
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 24
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 25
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 26
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 27
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 28
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 29
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 30
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 31
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 32
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 33
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 34
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 35
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 36
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 37
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 38
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 39
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 40
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 41
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 42
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 43
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 44
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 45
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 46
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 47
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 48
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 49
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 50
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 51
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 52
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 53
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 54
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 55
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 56
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 57
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 58
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 59
)
"""
parameters for reproducibility. More information: https://numpy.org/doc/stable/reference/random/parallel.html
"""
# initial entropy
entropy = 258194110137029475889902652135037600173
# index for seed sequence child
child_seed_index = (
1, # 0
23, # 1
)
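Following the NumPy parallel-random documentation linked above, the stored `entropy` seeds a root `SeedSequence` and each entry of `child_seed_index` selects one spawned child stream — a sketch of the assumed usage:

```python
import numpy as np

# Sketch (assumed usage): rebuild the per-stream generators from the
# stored entropy and child indices. Spawning from a fresh root with the
# same entropy is deterministic, so the same index always reproduces
# the same stream.
entropy = 258194110137029475889902652135037600173
child_seed_index = (1, 23)

root = np.random.SeedSequence(entropy)
children = root.spawn(max(child_seed_index) + 1)
rngs = [np.random.default_rng(children[i]) for i in child_seed_index]
```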

# tweets.py (guzey/save_tweets)
tweets = ["https://twitter.com/GateOfHeavens/status/246739124193751041",
"https://twitter.com/abigailb/status/360482662491754496",
"https://twitter.com/suleimenov/status/366952592200314880"
]

# tools/console/plugins/plugin_run/__init__.py (rh101/engine-x)
from .project_run import CCPluginRun

# gpsfluxlite/__init__.py (iotfablab/gpsfluxlite)
from .gpsfluxlite import main
bb972d51d85aa77bbf590d3be3a1b7df23b26c03 | 27 | py | Python | bugzoo/cli/__init__.py | pdreiter/BugZoo | e164ee67ff8bd3addfcc87b5e38ff1774992196b | [
"MIT"
] | 53 | 2017-12-02T03:22:06.000Z | 2022-03-10T22:20:52.000Z | bugzoo/cli/__init__.py | pdreiter/BugZoo | e164ee67ff8bd3addfcc87b5e38ff1774992196b | [
"MIT"
] | 145 | 2017-11-29T23:23:06.000Z | 2020-09-17T22:17:44.000Z | bugzoo/cli/__init__.py | pdreiter/BugZoo | e164ee67ff8bd3addfcc87b5e38ff1774992196b | [
"MIT"
] | 8 | 2018-06-26T17:58:49.000Z | 2021-09-07T14:03:41.000Z | from .app import BugZooCLI

# calculator/domain/reward.py (Remmeauth/block-producer-calculator-back)
"""
Provide implementation of the reward.
"""
from calculator.domain.block import (
BlockProducer,
BlockReward,
)
from calculator.domain.economy import Economy
class BlockProducerReward:
"""
Implements block producer reward.
"""
def __init__(self, economy: Economy, block_reward: BlockReward, block_producer: BlockProducer):
"""
Constructor.
"""
self.economy = economy
self.block_reward = block_reward
self.block_producer = block_producer
def get(self):
"""
Get a block producer reward.
"""
sum_of_stakes = self.economy.all_block_producers_stakes + self.block_producer.stake
return (self.block_producer.stake * self.block_reward.get() / sum_of_stakes) * \
self.economy.block_producers_reward_coefficient
def get_from_pool(self):
"""
Get a block producer reward from rewards pool.
"""
sum_of_stakes = self.economy.all_block_producers_stakes + self.block_producer.stake
return (self.block_producer.stake * (self.economy.to_rewards_pool/12/self.economy.blocks_per_month)
/ sum_of_stakes) * \
self.economy.block_producers_reward_coefficient
class ActiveBlockProducerReward:
"""
Implements an active block producer reward.
"""
def __init__(self, economy: Economy, block_reward: BlockReward, block_producer: BlockProducer):
"""
Constructor.
"""
self.economy = economy
self.block_reward = block_reward
self.block_producer = block_producer
def get(self):
"""
Get an active block producer reward.
"""
sum_of_votes = self.economy.active_block_producers_votes + self.block_producer.votes
return (self.block_producer.votes * self.block_reward.get() / sum_of_votes) * \
self.economy.active_block_producers_reward_coefficient
def get_from_pool(self):
"""
Get an active block producer reward from rewards pool.
"""
sum_of_votes = self.economy.active_block_producers_votes + self.block_producer.votes
return (self.block_producer.votes * (self.economy.to_rewards_pool/12/self.economy.blocks_per_month)
/ sum_of_votes) * \
self.economy.active_block_producers_reward_coefficient
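The stake-weighted formula implemented by `get` above can be checked in isolation. The helper below is an illustrative sketch only; the function name and the numbers are not from the project:

```python
def stake_weighted_reward(stake, block_reward, other_stakes, coefficient):
    """Reward share proportional to stake, scaled by a reward coefficient."""
    total_stake = other_stakes + stake
    return (stake * block_reward / total_stake) * coefficient

# Illustrative: 100 staked out of 1,000 total, block reward 10, coefficient 0.5
print(stake_weighted_reward(100, 10.0, 900, 0.5))  # → 0.5
```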
| 31.546667 | 107 | 0.668216 | 266 | 2,366 | 5.62782 | 0.154135 | 0.173681 | 0.11356 | 0.04008 | 0.865731 | 0.860387 | 0.827655 | 0.804275 | 0.758851 | 0.712091 | 0 | 0.002242 | 0.245985 | 2,366 | 74 | 108 | 31.972973 | 0.836883 | 0.1306 | 0 | 0.606061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.060606 | 0 | 0.424242 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bbb9c0f528b1ee827a0d8f94596a83de2ff9582c | 1,827 | py | Python | applications/physbam/physbam-lib/Scripts/Archives/pd/common/CONNECT.py | schinmayee/nimbus | 170cd15e24a7a88243a6ea80aabadc0fc0e6e177 | [
"BSD-3-Clause"
] | 20 | 2017-07-03T19:09:09.000Z | 2021-09-10T02:53:56.000Z | applications/physbam/physbam-lib/Scripts/Archives/pd/common/CONNECT.py | schinmayee/nimbus | 170cd15e24a7a88243a6ea80aabadc0fc0e6e177 | [
"BSD-3-Clause"
] | null | null | null | applications/physbam/physbam-lib/Scripts/Archives/pd/common/CONNECT.py | schinmayee/nimbus | 170cd15e24a7a88243a6ea80aabadc0fc0e6e177 | [
"BSD-3-Clause"
] | 9 | 2017-09-17T02:05:06.000Z | 2020-01-31T00:12:01.000Z | from pd.common import SOCKET
from pd.common import CONFIG
import sys
def host_client():
print "HOST CLIENT IS OBSOLETE"
sys.exit(1)
return SOCKET.CLIENT(CONFIG.pdhosts_server_host,CONFIG.pdhosts_server_port,
(CONFIG.client_private_key_file,CONFIG.client_certificate_file,CONFIG.ca_certificate_file))
def send_client(timeout=0):
return SOCKET.CLIENT(CONFIG.pdsend_server_host,CONFIG.pdsend_server_port,None,timeout)
#return SOCKET.CLIENT(CONFIG.pdsend_server_host,CONFIG.pdsend_server_port,(CONFIG.client_private_key_file,CONFIG.client_certificate_file,CONFIG.ca_certificate_file),timeout)
# (CONFIG.client_private_key_file,CONFIG.client_certificate_file,CONFIG.ca_certificate_file))
def sim_client():
print "SIM CLIENT IS OBSOLETE"
sys.exit(1)
return SOCKET.CLIENT(CONFIG.pdsim_server_host,CONFIG.pdsim_server_port,
(CONFIG.client_private_key_file,CONFIG.client_certificate_file,CONFIG.ca_certificate_file))
def disk_client(volume):
print "DISK CLIENT IS CURRENTLY OFFLINE"
sys.exit(1)
volume_servers={"vol0":"solverh1","vol1":"solverh1","vol2":"solverh2","vol3":"solverh2"}
return SOCKET.CLIENT(volume_servers[volume],CONFIG.pddisk_server_port,
(CONFIG.client_private_key_file,CONFIG.client_certificate_file,CONFIG.ca_certificate_file))
def mon_client(timeout=0):
return SOCKET.CLIENT(CONFIG.pdmon_server_host,CONFIG.pdmon_server_port,None,timeout)
#return SOCKET.CLIENT(CONFIG.pdmon_server_host,CONFIG.pdmon_server_port,(CONFIG.client_private_key_file,CONFIG.client_certificate_file,CONFIG.ca_certificate_file),timeout)
# return SOCKET.CLIENT(CONFIG.pdmon_server_host,CONFIG.pdmon_server_port,
# (CONFIG.client_private_key_file,CONFIG.client_certificate_file,CONFIG.ca_certificate_file),timeout)
| 49.378378 | 177 | 0.792556 | 250 | 1,827 | 5.448 | 0.18 | 0.123348 | 0.105727 | 0.123348 | 0.767254 | 0.767254 | 0.767254 | 0.746696 | 0.735683 | 0.735683 | 0 | 0.008 | 0.110564 | 1,827 | 36 | 178 | 50.75 | 0.830154 | 0.340449 | 0 | 0.272727 | 0 | 0 | 0.104428 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.090909 | null | null | 0.136364 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bbc19906189c277df43bd9ac12743a0a5b6660ca | 560 | py | Python | sourced/ml/models/__init__.py | vmarkovtsev/ml | 22699b2f44901b84507d15e732003955024e6755 | [
"Apache-2.0"
] | 122 | 2017-11-15T15:19:19.000Z | 2022-03-23T13:36:34.000Z | sourced/ml/models/__init__.py | vmarkovtsev/ml | 22699b2f44901b84507d15e732003955024e6755 | [
"Apache-2.0"
] | 176 | 2017-11-14T18:11:21.000Z | 2019-05-16T04:12:31.000Z | sourced/ml/models/__init__.py | vmarkovtsev/ml | 22699b2f44901b84507d15e732003955024e6755 | [
"Apache-2.0"
] | 58 | 2017-11-14T18:07:08.000Z | 2021-01-28T11:41:21.000Z | # flake8: noqa
from sourced.ml.models.bow import BOW
from sourced.ml.models.coocc import Cooccurrences
from sourced.ml.models.df import DocumentFrequencies
from sourced.ml.models.ordered_df import OrderedDocumentFrequencies
from sourced.ml.models.id2vec import Id2Vec
from sourced.ml.models.tensorflow import TensorFlowModel
from sourced.ml.models.topics import Topics
from sourced.ml.models.quant import QuantizationLevels
from sourced.ml.models.model_converters.merge_df import MergeDocFreq
from sourced.ml.models.model_converters.merge_bow import MergeBOW
| 43.076923 | 68 | 0.860714 | 79 | 560 | 6.037975 | 0.316456 | 0.230608 | 0.272537 | 0.398323 | 0.163522 | 0.163522 | 0.163522 | 0 | 0 | 0 | 0 | 0.005814 | 0.078571 | 560 | 12 | 69 | 46.666667 | 0.918605 | 0.021429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a530360d05ae2e0a216ec0f8e2e1ba1934479b92 | 18,274 | py | Python | twiker/modules/twiker_cli.py | Twiker-Bot/twiker | ae39c4527d886913a137f1eea23cef467071022b | [
"MIT"
] | 2 | 2021-09-23T18:24:12.000Z | 2021-09-23T21:14:21.000Z | twiker/modules/twiker_cli.py | Twiker-Bot/twiker | ae39c4527d886913a137f1eea23cef467071022b | [
"MIT"
] | 1 | 2021-09-26T15:12:52.000Z | 2021-09-27T13:36:00.000Z | twiker/modules/twiker_cli.py | Twiker-Bot/twiker | ae39c4527d886913a137f1eea23cef467071022b | [
"MIT"
] | 1 | 2021-09-27T05:46:19.000Z | 2021-09-27T05:46:19.000Z | # -*- coding: utf-8 -*-
"""
MIT License
Copyright (c) 2021 The Knight All rights reserved.
==========================
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
"""
import argparse
import configparser
from twiker.modules.engine import Engine
class CLI:
"""
@author : Twiker team aka TheKnight
    This is the CLI client for the Twitter bot; you can run it manually to test the bot.
    It requires the following arguments:
-config: the config file to use
-tweet: the tweet to send
-reply: the reply to send
-follow: the user to follow
-unfollow: the user to unfollow
-like: the tweet to like
-unlike: the tweet to unlike
-retweet: the tweet to retweet
    You can control the bot with this CLI version or use it as an API for your bot
This is the command line version
"""
def __init__(self, engine):
self.config = configparser.ConfigParser()
engines = Engine("config.ini", verbose=False)
self.bot = engines
def init__cli(self):
"""
init the client
"""
self.parser = argparse.ArgumentParser(description="A cli client for managing twitter action and a Twitter Bot")
self.parser.add_argument("-t","--tweet",
help="tweet a msg required -m", action="store_true")
self.parser.add_argument("-n", "--name",
help="name of the user", type=str, dest="name")
self.parser.add_argument("-m", "--message",
help="message to be used for dm,reply,tweet", type=str,
dest="message")
self.parser.add_argument("-i", "--image",
help="media link or path to be attached", type=str, dest="image")
self.parser.add_argument("-p", "--profile",
help="profile to be attached", type=str)
self.parser.add_argument("-c", "--config",
help="set config file or new config file", action="store_true")
self.parser.add_argument("-d", "--dm",
help="send a dm to a user", action="store_true")
# show tweet on console latest tweet
self.parser.add_argument("-s", "--show",
help="show latest tweet on console", action="store_true")
# list followers
self.parser.add_argument("-l", "--list",
help="list followers,following", type=str, dest="list")
# user
self.parser.add_argument("-u", "--user",
help="user for other action that required user", type=str, dest="user")
# show tweet on hash tag
self.parser.add_argument("-hg", "--hashtag",
help="show latest tweet on hash tag", type=str, dest="hashtag")
# like or retweet on a hash tag
self.parser.add_argument("-lk", "--like",
help="like a tweet", action="store_true")
self.parser.add_argument("-rt", "--retweet",
help="retweet a tweet", action="store_true")
self.parser.add_argument("-f", "--follow",
help="follow a user", action="store_true")
self.parser.add_argument("-uf", "--unfollow",
help="unfollow a user", action="store_true")
self.parser.add_argument("-r", "--search",
help="search for a user", type=str, dest="search")
self.parser.add_argument("-ra", "--retweet_all",
help="retweet all tweet on a hashtag or of a user required "
"hashtag or user",
action="store_true")
self.parser.add_argument("-la", "--like_all",
help="like all tweets on a hashtag or of a user required user or hashtag",
action="store_true")
args = self.parser.parse_args()
msg = args.message
name = args.name
list_type = args.list
image = args.image
profile = args.profile
hashtag = args.hashtag
user_search = args.search
user = args.user
if args.config:
"""
if the -c flag is used, the config file is set to the new file
else the config file will be read from the default location that is config.ini
"""
# ask for consumer key
c_key = input("Consumer Key: ")
# ask for consumer secret
c_secret = input("Consumer Secret: ")
# ask for access token
a_token = input("Access Token: ")
# ask for access token secret
a_secret = input("Access Token Secret: ")
# ask for the name of the user
name = input("Name of the user: ")
self.config.add_section('Twitter')
self.config.set('twitter', 'consumer_key', str(c_key))
self.config.set('twitter', 'consumer_secret', str(c_secret))
self.config.set('twitter', 'access_token', str(a_token))
self.config.set('twitter', 'access_token_secret', str(a_secret))
with open('config.ini', 'w') as configfile:
self.config.write(configfile)
exit()
elif args.tweet:
"""
tweet a message
"""
if not msg:
msg = input("Message: ")
try:
self.bot.tweet(msg, media=image)
except Exception as e:
print(e)
exit()
elif args.dm:
"""
send a dm to a user
"""
if not user and not msg:
user = input("User: ")
msg = input("Message: ")
self.bot.send_dm(user, msg)
exit()
elif list_type:
"""
list followers,following
"""
if list_type == "followers":
self.bot.get_followers()
elif list_type == "following":
self.bot.get_following()
else:
print("Invalid list type")
exit()
elif args.show:
"""
show latest tweet on console
"""
if hashtag:
self.bot.get_hashtag_tweets(hashtag)
self.bot.get_tweets(count=1000)
exit()
elif args.like_all:
"""
like a tweet
"""
if hashtag:
self.bot.like_all_tweets_on_hashtag(hashtag)
elif user:
self.bot.like_all_tweets(user)
else:
print("Invalid user or hashtag")
self.bot.like_hashtag(hashtag)
exit()
elif args.retweet:
"""
retweet a tweet
"""
if hashtag:
self.bot.retweet_all_tweets(hashtag)
elif user:
self.bot.retweet_all_tweets(user)
else:
print("Invalid user or hashtag")
exit()
elif args.follow:
"""
follow a user
"""
if hashtag:
self.bot.follow_all_users(hashtag)
self.bot.follow(name)
exit()
elif args.unfollow:
"""
unfollow a user
"""
self.bot.unfollow(name)
exit()
elif user_search:
"""
search for a user
"""
try:
try:
self.bot.get_user__(user_search)
except Exception as e:
self.bot.search_user(user_search)
print("User found")
except Exception as e:
print(e)
exit()
elif args.retweet_all:
"""
retweet all tweet on a hashtag or of a user required hashtag or user
"""
if hashtag:
self.bot.retweet_on_hashtag(hashtag)
elif user:
self.bot.retweet_all(username=user)
else:
print("Invalid arguments")
exit()
elif args.like_all:
"""
like all tweets on a hashtag or of a user required user or hashtag
"""
if hashtag:
self.bot.like_hashtag(hashtag)
elif user:
self.bot.like_all(username=user)
else:
print("Invalid arguments")
exit()
| 31.891798 | 121 | 0.488563 | 1,968 | 18,274 | 4.444614 | 0.122967 | 0.032811 | 0.053504 | 0.08643 | 0.816966 | 0.810792 | 0.810792 | 0.810792 | 0.810792 | 0.810792 | 0 | 0.001195 | 0.404454 | 18,274 | 572 | 122 | 31.947552 | 0.802536 | 0.454526 | 0 | 0.311258 | 0 | 0 | 0.140407 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.013245 | false | 0 | 0.019868 | 0 | 0.039735 | 0.05298 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3c4da2d5f1537dffc6cc06e6bd787f8beab5bec3 | 127 | py | Python | src/wishlist/admin.py | lgandersen/bornhack-website | fbda2b4b53dc2cb266d1d7c13ba0aad59d9079df | [
"BSD-3-Clause"
] | 7 | 2017-04-14T15:28:29.000Z | 2021-09-10T09:45:38.000Z | src/wishlist/admin.py | lgandersen/bornhack-website | fbda2b4b53dc2cb266d1d7c13ba0aad59d9079df | [
"BSD-3-Clause"
] | 799 | 2016-04-28T09:31:50.000Z | 2022-03-29T09:05:02.000Z | src/wishlist/admin.py | lgandersen/bornhack-website | fbda2b4b53dc2cb266d1d7c13ba0aad59d9079df | [
"BSD-3-Clause"
] | 35 | 2016-04-28T09:23:53.000Z | 2021-05-02T12:36:01.000Z | from django.contrib import admin
from .models import Wish
@admin.register(Wish)
class WishAdmin(admin.ModelAdmin):
pass
| 14.111111 | 34 | 0.771654 | 17 | 127 | 5.764706 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149606 | 127 | 8 | 35 | 15.875 | 0.907407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
3c52ae10dd2dc20c4fdb75058864b2b50453cb3e | 2,286 | py | Python | seviceLayer/core/ServiceProvider.py | arashmjr/ClubHouseFollowers | bcb4d020879bb0f0887dc4bc285793f294a670f7 | [
"MIT"
] | null | null | null | seviceLayer/core/ServiceProvider.py | arashmjr/ClubHouseFollowers | bcb4d020879bb0f0887dc4bc285793f294a670f7 | [
"MIT"
] | null | null | null | seviceLayer/core/ServiceProvider.py | arashmjr/ClubHouseFollowers | bcb4d020879bb0f0887dc4bc285793f294a670f7 | [
"MIT"
] | null | null | null | from repository.core.RepositoryProvider import RepositoryProvider
from seviceLayer.SaveUserService import SaveUserService
from seviceLayer.SuggestionService import SuggestionService
from seviceLayer.SaveOrderService import SaveOrderService
from seviceLayer.UserFollowService import UserFollowService
from seviceLayer.Managers.AuthorizationManager import AuthorizationManager
from seviceLayer.PackagesService import PackagesService
from seviceLayer.RegisterService import RegisterService
from seviceLayer.LoginService import LoginService
from seviceLayer.ResetPasswordService import ResetPasswordService
class ServiceProvider:
repository_provider: RepositoryProvider
auth: AuthorizationManager
def __init__(self):
self.repository_provider = RepositoryProvider()
self.auth = AuthorizationManager()
def make_authorization_manager(self):
return self.auth
def make_register_service(self):
return RegisterService(self.repository_provider.make_authorization(), self.auth)
def make_login_service(self):
return LoginService(self.repository_provider.make_authorization(), self.auth)
def make_reset_password_service(self):
return ResetPasswordService(self.repository_provider.make_authorization(), self.auth)
def make_save_user_service(self):
return SaveUserService(self.repository_provider.make_user_profile(), self.auth)
def make_get_suggestions_service(self):
return SuggestionService(self.repository_provider.submit_orders(), self.repository_provider.make_user_follows(),
self.repository_provider.make_user_profile(), self.auth)
def make_save_orders_service(self):
return SaveOrderService(self.repository_provider.submit_orders(), self.repository_provider.make_user_profile(),
self.auth)
def make_user_follow_service(self):
return UserFollowService(self.repository_provider.make_user_follows(), self.repository_provider.make_user_profile(),
self.auth)
def make_get_packages_service(self):
return PackagesService(self.repository_provider.make_user_profile(), self.repository_provider.make_packages(),
self.auth)
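The provider pattern used above — one object owning the repository layer and handing out pre-wired service instances — can be sketched minimally (all names below are illustrative, not the project's API):

```python
class Repo:
    """Stand-in for a repository: the data-access dependency."""
    def get(self, key):
        return f"value-for-{key}"

class Service:
    """A service that receives its repository via the constructor."""
    def __init__(self, repo):
        self.repo = repo
    def fetch(self, key):
        return self.repo.get(key)

class Provider:
    def __init__(self):
        self._repo = Repo()          # shared dependency, built once
    def make_service(self):
        return Service(self._repo)   # every service shares the same repo

p = Provider()
print(p.make_service().fetch("user"))  # → value-for-user
```

The benefit is that callers never construct repositories themselves, so swapping the data layer touches only the provider.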
| 41.563636 | 124 | 0.769029 | 227 | 2,286 | 7.46696 | 0.185022 | 0.159292 | 0.181711 | 0.168732 | 0.334513 | 0.329794 | 0.329794 | 0.305605 | 0.305605 | 0.21003 | 0 | 0 | 0.168416 | 2,286 | 54 | 125 | 42.333333 | 0.891636 | 0 | 0 | 0.078947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.263158 | false | 0.078947 | 0.263158 | 0.236842 | 0.842105 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 6 |
3c678c1a21caee1448e5684d284e43860d4ff783 | 3,332 | py | Python | ml-prediction/Gene.py | Bin-Chen-Lab/covid19_sex | b593ed877b4868278c4546280d05572d0d6addb9 | [
"MIT"
] | null | null | null | ml-prediction/Gene.py | Bin-Chen-Lab/covid19_sex | b593ed877b4868278c4546280d05572d0d6addb9 | [
"MIT"
] | null | null | null | ml-prediction/Gene.py | Bin-Chen-Lab/covid19_sex | b593ed877b4868278c4546280d05572d0d6addb9 | [
"MIT"
] | null | null | null | from __future__ import print_function
import os
import os.path
import numpy as np
import sys
if sys.version_info[0] == 2:
import cPickle as pickle
else:
import pickle
import torch.utils.data as data
import random
import pickle as pkl
class Gene(data.Dataset):
# define train(test)_data, train(test)_label
# define __getitem__ to return typical data point you want.
def __init__(self, root, dataset, fold='0', train=True):
self.train = train # training set or test set
self.fold = fold
fn = root + dataset + '_task_2/fold_' + str(fold) + '.pkl'
with open(fn, 'rb') as f:
d = pkl.load(f)
x_train, y_train_g, y_train_t, x_test, y_test_g, y_test_t = d[0], d[1], d[2], d[3], d[4], d[5]
# now load the picked numpy arrays
self.train_data, self.train_labels1, self.train_labels2 = x_train, y_train_g, y_train_t
self.test_data, self.test_labels1, self.test_labels2 = x_test, y_test_g, y_test_t
# up or down sampling of train data (hold for now)
def __getitem__(self, index):
"""
Args:
index (int): Index
Returns:
tuple: (feature, target) where target is index of the target class.
"""
if self.train:
ft, target1, target2 = self.train_data[index], self.train_labels1[index], self.train_labels2[index]
else:
ft, target1, target2 = self.test_data[index], self.test_labels1[index], self.test_labels2[index]
return ft, target1, target2, index
def __len__(self):
if self.train:
return len(self.train_data)
else:
return len(self.test_data)
class Gene3(data.Dataset):
# define train(test)_data, train(test)_label
# define __getitem__ to return typical data point you want.
def __init__(self, root, dataset, fold='0', train=True):
self.train = train # training set or test set
self.fold = fold
fn = root + dataset + '_task_3/fold_' + str(fold) + '.pkl'
with open(fn, 'rb') as f:
d = pkl.load(f)
x_train, y_train_g, y_train_t, y_train_a, x_test, y_test_g, y_test_t, y_test_a = d[0], d[1], d[2], d[3], d[4], d[5], d[6], d[7]
# now load the picked numpy arrays
self.train_data, self.train_labels1, self.train_labels2, self.train_labels3 = x_train, y_train_g, y_train_t, y_train_a
self.test_data, self.test_labels1, self.test_labels2, self.test_labels3 = x_test, y_test_g, y_test_t, y_test_a
# up or down sampling of train data (hold for now)
def __getitem__(self, index):
"""
Args:
index (int): Index
Returns:
tuple: (feature, target) where target is index of the target class.
"""
if self.train:
ft, target1, target2, target3 = self.train_data[index], self.train_labels1[index], self.train_labels2[index], self.train_labels3[index]
else:
ft, target1, target2, target3 = self.test_data[index], self.test_labels1[index], self.test_labels2[index], self.test_labels3[index]
return ft, target1, target2, target3, index
def __len__(self):
if self.train:
return len(self.train_data)
else:
return len(self.test_data)
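The `__getitem__`/`__len__` contract that `Gene` and `Gene3` implement is the standard map-style dataset interface; a minimal framework-free sketch of the same pattern (hypothetical names) is:

```python
class PairDataset:
    """Minimal map-style dataset: index -> (feature, label) pair."""
    def __init__(self, features, labels):
        assert len(features) == len(labels)
        self.features, self.labels = features, labels

    def __getitem__(self, index):
        return self.features[index], self.labels[index]

    def __len__(self):
        return len(self.features)

ds = PairDataset([[0.1], [0.2]], [0, 1])
print(len(ds), ds[1])  # → 2 ([0.2], 1)
```

A `torch.utils.data.DataLoader` only needs these two methods to batch and shuffle the samples.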
| 32.349515 | 147 | 0.626351 | 500 | 3,332 | 3.918 | 0.19 | 0.101072 | 0.039816 | 0.024502 | 0.858601 | 0.803471 | 0.803471 | 0.803471 | 0.775906 | 0.733027 | 0 | 0.022932 | 0.267107 | 3,332 | 102 | 148 | 32.666667 | 0.779279 | 0.191477 | 0 | 0.5 | 0 | 0 | 0.015498 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.185185 | 0 | 0.444444 | 0.018519 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
3c73cf9708156bc9266c33ac1d05a7cc4853c928 | 188 | py | Python | src/biopsykit/signals/ecg/__init__.py | Zwitscherle/BioPsyKit | 7200c5f1be75c20f53e1eb4c991aca1c89e3dd88 | [
"MIT"
] | 10 | 2020-11-05T13:34:55.000Z | 2022-03-11T16:20:10.000Z | src/biopsykit/signals/ecg/__init__.py | Zwitscherle/BioPsyKit | 7200c5f1be75c20f53e1eb4c991aca1c89e3dd88 | [
"MIT"
] | 14 | 2021-03-11T14:43:52.000Z | 2022-03-10T19:44:57.000Z | src/biopsykit/signals/ecg/__init__.py | Zwitscherle/BioPsyKit | 7200c5f1be75c20f53e1eb4c991aca1c89e3dd88 | [
"MIT"
] | 3 | 2021-09-13T13:14:38.000Z | 2022-02-19T09:13:25.000Z | """Module for ECG data analysis and visualization."""
from biopsykit.signals.ecg import plotting
from biopsykit.signals.ecg.ecg import EcgProcessor
__all__ = ["EcgProcessor", "plotting"]
| 31.333333 | 53 | 0.787234 | 23 | 188 | 6.26087 | 0.608696 | 0.180556 | 0.277778 | 0.319444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106383 | 188 | 5 | 54 | 37.6 | 0.857143 | 0.25 | 0 | 0 | 0 | 0 | 0.148148 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
3c8f77503aab9d09a54763330f52144bcf52fd05 | 26 | py | Python | gevent_viz/__init__.py | sdiehl/gevent_viz | b3854036e273826a3f48c1f253c8655d86a5a6a1 | [
"Apache-2.0"
] | 1 | 2020-03-31T09:41:27.000Z | 2020-03-31T09:41:27.000Z | gevent_viz/__init__.py | sdiehl/gevent_viz | b3854036e273826a3f48c1f253c8655d86a5a6a1 | [
"Apache-2.0"
] | null | null | null | gevent_viz/__init__.py | sdiehl/gevent_viz | b3854036e273826a3f48c1f253c8655d86a5a6a1 | [
"Apache-2.0"
] | null | null | null | from profiler import main
| 13 | 25 | 0.846154 | 4 | 26 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 26 | 1 | 26 | 26 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b1c09a18c87efd5a15af408a693514641f082cae | 8,177 | py | Python | grinpy/invariants/disparity.py | somacdivad/grinpy | 597f9109b84f1c1aa8c8dd2ac5b572a05ba474de | [
"BSD-3-Clause"
] | 12 | 2019-08-27T11:04:09.000Z | 2022-03-03T07:38:42.000Z | grinpy/invariants/disparity.py | somacdivad/grinpy | 597f9109b84f1c1aa8c8dd2ac5b572a05ba474de | [
"BSD-3-Clause"
] | 18 | 2017-12-03T20:20:11.000Z | 2019-07-07T18:04:54.000Z | grinpy/invariants/disparity.py | somacdivad/grinpy | 597f9109b84f1c1aa8c8dd2ac5b572a05ba474de | [
"BSD-3-Clause"
] | 5 | 2017-11-28T22:43:05.000Z | 2021-07-02T08:48:43.000Z | # -*- coding: utf-8 -*-
# Copyright (C) 2017-2019 by
# David Amos <somacdivad@gmail.com>
# Randy Davila <davilar@uhd.edu>
# BSD license.
#
# Authors: David Amos <somacdivad@gmail.com>
# Randy Davila <davilar@uhd.edu>
"""Functions for computing disparity related invariants.
"""
from grinpy import nodes, number_of_nodes
from grinpy.functions.degree import closed_neighborhood_degree_list, neighborhood_degree_list
__all__ = [
"vertex_disparity",
"closed_vertex_disparity",
"disparity_sequence",
"closed_disparity_sequence",
"CW_disparity",
"closed_CW_disparity",
"inverse_disparity",
"closed_inverse_disparity",
"average_vertex_disparity",
"average_closed_vertex_disparity",
"k_disparity",
"closed_k_disparity",
"irregularity",
]
def vertex_disparity(G, v):
"""Return number of distinct degrees of neighbors of v.
Parameters
----------
G : NetworkX graph
An undirected graph.
v : node
A node in G.
Returns
-------
int
The number of distinct degrees of neighbors of v.
See Also
--------
closed_vertex_disparity
"""
if v not in nodes(G):
        raise ValueError("v must be a node in G")
return len(neighborhood_degree_list(G, v))
def closed_vertex_disparity(G, v):
"""Return number of distinct degrees of nodes in the closed neighborhood
of v.
Parameters
----------
G : NetworkX graph
An undirected graph.
v : node
A node in G.
Returns
-------
int
The number of distinct degrees of nodes in the closed neighborhood
of v.
See Also
--------
vertex_disparity
"""
if v not in nodes(G):
        raise ValueError("v must be a node in G")
return len(closed_neighborhood_degree_list(G, v))
def disparity_sequence(G):
"""Return the sequence of disparities of each node in the graph.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
list
The sequence of disparities of each node in the graph.
See Also
--------
closed_disparity_sequence, vertex_disparity
"""
return [vertex_disparity(G, v) for v in nodes(G)]
def closed_disparity_sequence(G):
"""Return the sequence of closed disparities of each node in the graph.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
list
The sequence of closed disparities of each node in the graph.
See Also
--------
closed_vertex_disparity, disparity_sequence
"""
return [closed_vertex_disparity(G, v) for v in nodes(G)]
def CW_disparity(G):
r"""Return the Caro-Wei disparity of the graph.
The *Caro-Wei disparity* of a graph is defined as:
.. math::
\sum_{v \in V(G)}\frac{1}{1 + disp(v)}
where *V(G)* is the set of nodes of *G* and *disp(v)* is the disparity of
the vertex v.
This invariant is inspired by the Caro-Wei bound for the independence number
of a graph, hence the name.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
float
The Caro-Wei disparity of the graph.
See Also
--------
closed_CW_disparity, closed_inverse_disparity, inverse_disparity
"""
return sum(1 / (1 + x) for x in disparity_sequence(G))
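As a self-contained sketch of the formula above (using a plain adjacency-dict graph instead of the grinpy/NetworkX API, so the function names here are illustrative only):

```python
# Minimal sketch of the Caro-Wei disparity on the path b - a - c,
# assuming a bare adjacency-dict graph rather than a NetworkX graph.
G = {"a": ["b", "c"], "b": ["a"], "c": ["a"]}

def vertex_disparity(G, v):
    # number of distinct degrees among the neighbors of v
    return len({len(G[u]) for u in G[v]})

def cw_disparity(G):
    # sum of 1 / (1 + disp(v)) over all vertices
    return sum(1 / (1 + vertex_disparity(G, v)) for v in G)

print(cw_disparity(G))  # every vertex has disparity 1, so 3 * 1/2 = 1.5
```
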
def closed_CW_disparity(G):
r"""Return the closed Caro-Wei disparity of the graph.
The *closed Caro-Wei disparity* of a graph is defined as:
.. math::
\sum_{v \in V(G)}\frac{1}{1 + cdisp(v)}
where *V(G)* is the set of nodes of *G* and *cdisp(v)* is the closed
disparity of the vertex v.
This invariant is inspired by the Caro-Wei bound for the independence number
of a graph, hence the name.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
float
The closed Caro-Wei disparity of the graph.
See Also
--------
CW_disparity, closed_inverse_disparity, inverse_disparity
"""
return sum(1 / (1 + x) for x in closed_disparity_sequence(G))
def inverse_disparity(G):
r"""Return the inverse disparity of the graph.
The *inverse disparity* of a graph is defined as:
.. math::
\sum_{v \in V(G)}\frac{1}{disp(v)}
where *V(G)* is the set of nodes of *G* and *disp(v)* is the disparity
of the vertex v.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
float
The inverse disparity of the graph.
See Also
--------
CW_disparity, closed_CW_disparity, closed_inverse_disparity
"""
return sum(1 / x for x in disparity_sequence(G))
def closed_inverse_disparity(G):
r"""Return the closed inverse disparity of the graph.
The *closed inverse disparity* of a graph is defined as:
.. math::
\sum_{v \in V(G)}\frac{1}{cdisp(v)}
where *V(G)* is the set of nodes of *G* and *cdisp(v)* is the closed
disparity of the vertex v.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
float
The closed inverse disparity of the graph.
See Also
--------
CW_disparity, closed_CW_disparity, inverse_disparity
"""
return sum(1 / x for x in closed_disparity_sequence(G))
def average_vertex_disparity(G):
"""Return the average vertex disparity of the graph.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
int
The average vertex disparity of the graph.
See Also
--------
average_closed_vertex_disparity, vertex_disparity
"""
D = disparity_sequence(G)
return sum(D) / len(D)
def average_closed_vertex_disparity(G):
"""Return the average closed vertex disparity of the graph.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
int
The average closed vertex disparity of the graph.
See Also
--------
average_vertex_disparity, closed_vertex_disparity
"""
D = closed_disparity_sequence(G)
return sum(D) / len(D)
def k_disparity(G, k):
r"""Return the k-disparity of the graph.
The *k-disparity* of a graph is defined as:
.. math::
        \frac{2}{k(k+1)}\sum_{i=0}^{k-1}(k-i)d_i
where *k* is a positive integer and *d_i* is the i-th element in the
disparity sequence, ordered in weakly decreasing order.
Parameters
----------
G : NetworkX graph
        An undirected graph.
    k : int
        A positive integer.
Returns
-------
float
The k-disparity of the graph.
See Also
--------
closed_k_disparity
"""
D = disparity_sequence(G)
D.sort(reverse=True)
s = sum((k - i) * D[i] for i in range(k))
return (2 * s) / (k * (k + 1))
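A worked sketch of the weighting above, re-implemented over a bare disparity sequence (the function name is illustrative only): the k largest disparities, in weakly decreasing order, are weighted by k, k-1, ..., 1 and normalized by k(k+1)/2.

```python
# Standalone sketch of the k-disparity weighting over a plain sequence.
def k_disparity_of_sequence(seq, k):
    d = sorted(seq, reverse=True)          # weakly decreasing order
    s = sum((k - i) * d[i] for i in range(k))
    return (2 * s) / (k * (k + 1))

# disparity sequence [2, 1, 1] with k = 2: s = 2*2 + 1*1 = 5, so 10/6
assert abs(k_disparity_of_sequence([2, 1, 1], 2) - 5 / 3) < 1e-9
```
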
def closed_k_disparity(G, k):
r"""Return the closed k-disparity of the graph.
The *closed k-disparity* of a graph is defined as:
.. math::
\frac{2}{k(k+1)}\sum_{i=0}^{k-1}(k-i)d_i
where *k* is a positive integer and *d_i* is the i-th element in the
closed disparity sequence, ordered in weakly decreasing order.
Parameters
----------
G : NetworkX graph
        An undirected graph.
    k : int
        A positive integer.
Returns
-------
float
The closed k-disparity of the graph.
See Also
--------
k_disparity
"""
D = closed_disparity_sequence(G)
D.sort(reverse=True)
s = sum((k - i) * D[i] for i in range(k))
return (2 * s) / (k * (k + 1))
def irregularity(G):
r"""Return the irregularity measure of the graph.
The *irregularity* of an *n*-vertex graph is defined as:
.. math::
        \frac{2}{n(n+1)}\sum_{i=0}^{n-1}(n-i)d_i
where *d_i* is the i-th element in the closed disparity sequence, ordered
in weakly decreasing order.
Parameters
----------
G : NetworkX graph
An undirected graph.
Returns
-------
float
The irregularity of the graph.
See Also
--------
    closed_k_disparity, k_disparity
"""
return closed_k_disparity(G, number_of_nodes(G))
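As a sanity check on the definition, a regular graph has irregularity exactly 1: every closed disparity is 1, so the closed n-disparity telescopes to (2/(n(n+1))) * (n + (n-1) + ... + 1) = 1. A minimal standalone sketch over a bare sequence (the function name is illustrative only):

```python
# Standalone re-implementation of the closed k-disparity over a sequence.
def closed_k_disparity_of_sequence(seq, k):
    d = sorted(seq, reverse=True)
    s = sum((k - i) * d[i] for i in range(k))
    return (2 * s) / (k * (k + 1))

# e.g. the 4-cycle: all four closed disparities equal 1
n = 4
assert closed_k_disparity_of_sequence([1] * n, n) == 1.0
```
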
| 21.805333 | 93 | 0.606457 | 1,115 | 8,177 | 4.333632 | 0.103139 | 0.059189 | 0.057947 | 0.062914 | 0.874379 | 0.840646 | 0.780215 | 0.703642 | 0.663079 | 0.632864 | 0 | 0.005888 | 0.273083 | 8,177 | 374 | 94 | 21.863636 | 0.807032 | 0.644858 | 0 | 0.258065 | 0 | 0 | 0.124502 | 0.063247 | 0 | 0 | 0 | 0 | 0 | 1 | 0.209677 | false | 0 | 0.032258 | 0 | 0.451613 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b1d4eaf9cd021211dbbe985eaac3a7e5ef80d7b6 | 61 | py | Python | srcipts/test/1.py | GerasimovRM/Where-I-Am | 58f6f0d1533421890f199dacabe523a447486b9f | [
"MIT"
] | null | null | null | srcipts/test/1.py | GerasimovRM/Where-I-Am | 58f6f0d1533421890f199dacabe523a447486b9f | [
"MIT"
] | null | null | null | srcipts/test/1.py | GerasimovRM/Where-I-Am | 58f6f0d1533421890f199dacabe523a447486b9f | [
"MIT"
] | null | null | null | from models import db_session
from models.user import User
| 12.2 | 29 | 0.819672 | 10 | 61 | 4.9 | 0.6 | 0.408163 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163934 | 61 | 4 | 30 | 15.25 | 0.960784 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5901cca66ee6da5fe9b3c270d056040c30e041b9 | 43 | py | Python | 02/00/lookup.py | pylangstudy/201709 | 53d868786d7327a83bfa7f4149549c6f9855a6c6 | [
"CC0-1.0"
] | null | null | null | 02/00/lookup.py | pylangstudy/201709 | 53d868786d7327a83bfa7f4149549c6f9855a6c6 | [
"CC0-1.0"
] | 32 | 2017-09-01T00:52:17.000Z | 2017-10-01T00:30:02.000Z | 02/00/lookup.py | pylangstudy/201709 | 53d868786d7327a83bfa7f4149549c6f9855a6c6 | [
"CC0-1.0"
] | null | null | null | import codecs
print(codecs.lookup('utf'))
| 10.75 | 27 | 0.744186 | 6 | 43 | 5.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 43 | 3 | 28 | 14.333333 | 0.820513 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
592a58e78abb16eb01f8e3779e0bad9296c60a13 | 88 | py | Python | covidprognosis/__init__.py | OAfzal/CovidPrognosis | 1b5b4d0d712b8fee94279df02a634b61d6d5a3a0 | [
"MIT"
] | 151 | 2021-01-13T19:50:19.000Z | 2022-03-30T07:16:10.000Z | covidprognosis/__init__.py | OAfzal/CovidPrognosis | 1b5b4d0d712b8fee94279df02a634b61d6d5a3a0 | [
"MIT"
] | 14 | 2021-01-29T15:11:07.000Z | 2022-01-28T04:34:03.000Z | covidprognosis/__init__.py | OAfzal/CovidPrognosis | 1b5b4d0d712b8fee94279df02a634b61d6d5a3a0 | [
"MIT"
] | 35 | 2021-01-15T21:21:50.000Z | 2022-01-17T06:17:04.000Z | import covidprognosis.data
import covidprognosis.models
import covidprognosis.plmodules
| 22 | 31 | 0.897727 | 9 | 88 | 8.777778 | 0.555556 | 0.759494 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068182 | 88 | 3 | 32 | 29.333333 | 0.963415 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a7172cd70067bd762fb24b59583f8702198d4031 | 281 | py | Python | pyinstaller_config/hook-pygsp.py | damianfraszczak/rpasdt | 57039dea485f89d97aafc39ea52f595e35633826 | [
"MIT"
] | 2 | 2021-09-10T07:43:07.000Z | 2021-09-10T07:44:54.000Z | pyinstaller_config/hook-pygsp.py | damianfraszczak/rpasdt | 57039dea485f89d97aafc39ea52f595e35633826 | [
"MIT"
] | null | null | null | pyinstaller_config/hook-pygsp.py | damianfraszczak/rpasdt | 57039dea485f89d97aafc39ea52f595e35633826 | [
"MIT"
] | 2 | 2022-01-17T14:47:49.000Z | 2022-02-14T10:28:45.000Z | from PyInstaller.utils.hooks import collect_submodules
hiddenimports = collect_submodules("pygsp")
hiddenimports.extend(collect_submodules("pygsp.graphs"))
hiddenimports.extend(collect_submodules("pygsp.filters"))
hiddenimports.extend(collect_submodules("pygsp.graphs.nngraphs"))
| 40.142857 | 65 | 0.846975 | 30 | 281 | 7.766667 | 0.433333 | 0.364807 | 0.377682 | 0.463519 | 0.579399 | 0.403433 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039146 | 281 | 6 | 66 | 46.833333 | 0.862963 | 0 | 0 | 0 | 0 | 0 | 0.181495 | 0.074733 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
599bea54f5b5b6f1f3735b613f4309a327cbb75c | 130 | py | Python | dockerEE/element/__init__.py | ynaka81/dockerEE | bcbf01bfa9149d9bedf8d6105a050fb8cd0d47a2 | [
"MIT"
] | null | null | null | dockerEE/element/__init__.py | ynaka81/dockerEE | bcbf01bfa9149d9bedf8d6105a050fb8cd0d47a2 | [
"MIT"
] | 1 | 2015-09-24T22:12:13.000Z | 2015-09-25T13:02:21.000Z | dockerEE/element/__init__.py | ynaka81/dockerEE | bcbf01bfa9149d9bedf8d6105a050fb8cd0d47a2 | [
"MIT"
] | null | null | null | ## @package element
# environment emulation elements package
#
# The elements of environment emulation
from server import Server
| 18.571429 | 40 | 0.8 | 15 | 130 | 6.933333 | 0.666667 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 130 | 6 | 41 | 21.666667 | 0.945455 | 0.715385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
59ac19e0de75bcaece78f40472a72d1a22aed483 | 26,209 | py | Python | src/cmudict_parser/SentenceToIPA.tests.py | stefantaubert/cmudict-parser | 8f5d1b191a41929f1ce8c7acf391c23c08d2be15 | [
"MIT"
] | null | null | null | src/cmudict_parser/SentenceToIPA.tests.py | stefantaubert/cmudict-parser | 8f5d1b191a41929f1ce8c7acf391c23c08d2be15 | [
"MIT"
] | 14 | 2020-12-01T08:45:16.000Z | 2021-06-01T08:00:39.000Z | src/cmudict_parser/SentenceToIPA.tests.py | stefantaubert/cmudict-parser | 8f5d1b191a41929f1ce8c7acf391c23c08d2be15 | [
"MIT"
] | null | null | null | import unittest
from cmudict_parser.SentenceToIPA import (
big_letters_to_ipa,
extract_punctuation_after_word_except_hyphen_or_apostrophe,
extract_punctuation_before_word,
find_combination_of_certain_length_in_dict, get_ipa_of_word_in_sentence,
get_ipa_of_word_with_punctuation,
get_ipa_of_word_without_punctuation_or_unknown_words,
get_ipa_of_words_with_hyphen, ipa_of_punctuation_and_words_combined,
recombine_word, replace_unknown_with_is_string, sentence_to_ipa,
strip_apos, strip_apos_at_beginning_and_end_if_they_do_not_belong_to_word,
value_depending_on_is_alphabetic_value_in_punctuation_after_word,
word_and_hyphen_before_or_after, word_is_really_upper, word_with_apo)
class UnitTests(unittest.TestCase):
def __init__(self, methodName: str) -> None:
super().__init__(methodName)
# region big_letters_to_ipa
def test_big_letters_to_ipa__only_big_letters__returns_combination_of_values(self):
input_dict = {"A": "a", "P": "x", "R": "y", "S": "z"}
res = big_letters_to_ipa(input_dict, "PRS")
self.assertEqual("xyz", res)
def test_big_letters_to_ipa__empty_string__returns_empty_string(self):
input_dict = {"A": "a", "P": "x", "R": "y", "S": "z"}
res = big_letters_to_ipa(input_dict, "")
self.assertEqual("", res)
# endregion
# region word_is_really_upper
def test_word_is_really_upper__word_with_number__returns_false(self):
res = word_is_really_upper("PRS1")
self.assertEqual(False, res)
def test_word_is_really_upper__word_with_small_letters__returns_false(self):
res = word_is_really_upper("PRs")
self.assertEqual(False, res)
def test_word_is_really_upper__only_big_letters__returns_true(self):
res = word_is_really_upper("PRS")
self.assertEqual(True, res)
# endregion
# region get_ipa_of_word_without_punctuation_or_unknown_words
def test_get_ipa_of_word_without_punctuation_or_unknown_words__word_in_dict__returns_value(self):
input_dict = {"PRS": "abc", "P": "x", "R": "y", "S": "z"}
res = get_ipa_of_word_without_punctuation_or_unknown_words(
input_dict, "PRS", replace_unknown_with="_")
self.assertEqual("abc", res)
def test_get_ipa_of_word_without_punctuation_or_unknown_words__word_not_in_dict_with_only_upper_letters__returns_combination_of_values(self):
input_dict = {"PSR": "abc", "P": "x", "R": "y", "S": "z"}
res = get_ipa_of_word_without_punctuation_or_unknown_words(
input_dict, "PRS", replace_unknown_with="_")
self.assertEqual("xyz", res)
def test_get_ipa_of_word_without_punctuation_or_unknown_words__word_not_in_dict_replace_unknown_with_None__returns_word(self):
input_dict = {"PSR": "abc", "P": "x", "R": "y", "S": "z"}
res = get_ipa_of_word_without_punctuation_or_unknown_words(
input_dict, "prs", replace_unknown_with=None)
self.assertEqual("prs", res)
def test_get_ipa_of_word_without_punctuation_or_unknown_words__word_not_in_dict_replace_unknown_with_underline__returns_word(self):
input_dict = {"PSR": "abc", "P": "x", "R": "y", "S": "z"}
res = get_ipa_of_word_without_punctuation_or_unknown_words(
input_dict, "prs", replace_unknown_with="_")
self.assertEqual("___", res)
def test_get_ipa_of_word_without_punctuation_or_unknown_words__replace_unknown_with_string_with_more_than_one_char_and_word_not_in_dict__throws_exception(self):
self.assertRaises(ValueError, replace_unknown_with_is_string, "prs", replace_unknown_with="123")
def test_get_ipa_of_word_without_punctuation_or_unknown_words__word_not_in_dict_replace_unknown_with_costum_func__returns_word(self):
input_dict = {"PSR": "abc", "P": "x", "R": "y", "S": "z"}
res = get_ipa_of_word_without_punctuation_or_unknown_words(
input_dict, "prs", replace_unknown_with=lambda x: x + "123")
self.assertEqual("prs123", res)
# endregion
# region recombine_word
def test_recombine_word__startpos_is_zero_endpos_is_one__returns_first_word(self):
parts = ["cat", "o", "nine", "tails"]
res = recombine_word(parts, 0, 1)
self.assertEqual("cat", res)
def test_recombine_word__everything_except_first_and_last_word__returns_words_in_the_middle_connected_with_hyphens(self):
parts = ["cat", "o", "nine", "tails"]
res = recombine_word(parts, 1, len(parts) - 1)
self.assertEqual("o-nine", res)
def test_recombine_word__everything_except_first_word__returns_last_three_words_connected_with_hyphens(self):
parts = ["cat", "o", "nine", "tails"]
res = recombine_word(parts, 1, len(parts))
self.assertEqual("o-nine-tails", res)
# endregion
# region word_and_hyphen_before_or_after
# should return hyphen
def test_word_and_hyphen_before_or_after__startpos_is_zero_endpos_is_one__returns_first_word_and_hyphen(self):
parts = ["cat", "o", "nine", "tails"]
res = word_and_hyphen_before_or_after(parts, 0, 1)
self.assertEqual("cat", res[0])
self.assertEqual("-", res[1])
def test_word_and_hyphen_before_or_after__everything_except_first_and_last_word__returns_words_in_the_middle_connected_with_hyphens_and_hyphen(self):
parts = ["cat", "o", "nine", "tails"]
res = word_and_hyphen_before_or_after(parts, 1, len(parts) - 1)
self.assertEqual("o-nine", res[0])
self.assertEqual("-", res[1])
def test_word_and_hyphen_before_or_after__everything_except_first_word__returns_last_three_words_connected_with_hyphens_and_hyphen(self):
parts = ["cat", "o", "nine", "tails"]
res = word_and_hyphen_before_or_after(parts, 1, len(parts))
self.assertEqual("o-nine-tails", res[0])
self.assertEqual("-", res[1])
# should not return hyphen
def test_word_and_hyphen_before_or_after__endpos_is_zero__returns_empty_word_and_no_hyphen(self):
parts = ["cat", "o", "nine", "tails"]
res = word_and_hyphen_before_or_after(parts, 1, 0)
self.assertEqual("", res[0])
self.assertEqual("", res[1])
def test_word_and_hyphen_before_or_after__startpos_is_length_of_list__returns_empty_word_and_no_hyphen(self):
parts = ["cat", "o", "nine", "tails"]
res = word_and_hyphen_before_or_after(parts, len(parts), 0)
self.assertEqual("", res[0])
self.assertEqual("", res[1])
# endregion
# region find_combination_of_certain_length_in_dict
def test_find_combination_of_certain_length_in_dict__length_too_long__returns_none(self):
parts = ["to", "cat", "o", "nine", "tails", "to"]
input_dict = {"CAT-O-NINE-TAILS": "xyz", "TO": "a"}
res = find_combination_of_certain_length_in_dict(input_dict, parts, 5, "_")
self.assertIsNone(res)
def test_find_combination_of_certain_length_in_dict__right_length_start_in_middle__returns_combination_of_values(self):
parts = ["to", "cat", "o", "nine", "tails", "to"]
input_dict = {"CAT-O-NINE-TAILS": "xyz", "TO": "a"}
res = find_combination_of_certain_length_in_dict(input_dict, parts, 4, "_")
self.assertEqual("a-xyz-a", res)
def test_find_combination_of_certain_length_in_dict__right_length_start_at_beginning__returns_combination_of_values(self):
parts = ["cat", "o", "nine", "tails", "to"]
input_dict = {"CAT-O-NINE-TAILS": "xyz", "TO": "a"}
res = find_combination_of_certain_length_in_dict(input_dict, parts, 4, "_")
self.assertEqual("xyz-a", res)
def test_find_combination_of_certain_length_in_dict__right_length_combination_reaches_end__returns_combination_of_values(self):
parts = ["to", "cat", "o", "nine", "tails"]
input_dict = {"CAT-O-NINE-TAILS": "xyz", "TO": "a"}
res = find_combination_of_certain_length_in_dict(input_dict, parts, 4, "_")
self.assertEqual("a-xyz", res)
def test_find_combination_of_certain_length_in_dict__only_single_words__returns_combination_of_values(self):
parts = ["to", "no", "so"]
input_dict = {"NO": "x", "SO": "y", "TO": "z"}
res = find_combination_of_certain_length_in_dict(input_dict, parts, 1, "_")
self.assertEqual("z-x-y", res)
# endregion
# region get_ipa_of_words_with_hyphen
def test_get_ipa_of_words_with_hyphen__three_words__returns_combination_of_values(self):
input_word = "to-cat-o-nine-tails-to"
input_dict = {"CAT-O-NINE-TAILS": "xyz", "TO": "a"}
res = get_ipa_of_words_with_hyphen(input_dict, input_word, "_")
self.assertEqual("a-xyz-a", res)
def test_get_ipa_of_words_with_hyphen__two_words_with_longer_one_at_beginning__returns_combination_of_values(self):
input_word = "cat-o-nine-tails-to"
input_dict = {"CAT-O-NINE-TAILS": "xyz", "TO": "a"}
res = get_ipa_of_words_with_hyphen(input_dict, input_word, "_")
self.assertEqual("xyz-a", res)
def test_get_ipa_of_words_with_hyphen__two_words_with_longer_one_at_end__returns_combination_of_values(self):
input_word = "to-cat-o-nine-tails"
input_dict = {"CAT-O-NINE-TAILS": "xyz", "TO": "a"}
res = get_ipa_of_words_with_hyphen(input_dict, input_word, "_")
self.assertEqual("a-xyz", res)
def test_get_ipa_of_words_with_hyphen__only_single_words__returns_combination_of_values(self):
input_word = "to-no-so"
input_dict = {"NO": "x", "SO": "y", "TO": "z"}
res = get_ipa_of_words_with_hyphen(input_dict, input_word, "_")
self.assertEqual("z-x-y", res)
# endregion
# region value_depending_on_is_alphabetic_value_in_punctuation_after_word
def test_value_depending_on_is_alphabetic_value_in_punctuation_after_word__word_with_alphabetic_values_in_punctuation_after_word_which_are_not_in_dict__returns_input_ipa_and_first_char_of_punctuation_after_word_and_underlines_for_rest(self):
input_dict = {"A": "e", "B": "f", "C": "g"}
punctuation_before_word = ""
ipa_of_word_without_punctuation = "abc"
punctuation_after_word = "#abc"
res = value_depending_on_is_alphabetic_value_in_punctuation_after_word(
input_dict, punctuation_before_word, ipa_of_word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("abc#___", res)
def test_value_depending_on_is_alphabetic_value_in_punctuation_after_word__word_with_alphabetic_values_in_punctuation_after_word_which_are_in_dict__returns_input_ipa_and_first_char_of_punctuation_after_word_and_ipa_of_upper_letters(self):
input_dict = {"A": "e", "B": "f", "C": "g"}
punctuation_before_word = ""
ipa_of_word_without_punctuation = "abc"
punctuation_after_word = "#ABC"
res = value_depending_on_is_alphabetic_value_in_punctuation_after_word(
input_dict, punctuation_before_word, ipa_of_word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("abc#efg", res)
def test_value_depending_on_is_alphabetic_value_in_punctuation_after_word__word_without_alphabetic_values_in_punctuation_after_word__returns_input_ipa_and_keeps_punctuation_after_word_as_they_are(self):
input_dict = {"A": "e", "B": "f", "C": "d"}
punctuation_before_word = ""
ipa_of_word_without_punctuation = "abc"
punctuation_after_word = "#!'-'"
res = value_depending_on_is_alphabetic_value_in_punctuation_after_word(
input_dict, punctuation_before_word, ipa_of_word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("abc#!'-'", res)
# endregion
# region word_with_apo
def test_word_with_apo__no_apo_or_hyphen_at_end__returns_word__empty_char__and_word_with_apo_at_beginning_or_end(self):
input_word = "stones"
res = word_with_apo(input_word)
self.assertEqual(5, len(res))
self.assertEqual(input_word, res[0])
self.assertEqual("", res[1])
self.assertEqual("'" + input_word, res[2])
self.assertEqual(input_word + "'", res[3])
self.assertEqual("'" + input_word + "'", res[4])
def test_word_with_apo__apo_at_end__returns_word_without_apo_at_end__apo__and_word_with_apo_at_beginning_or_end(self):
input_word = "stones'"
res = word_with_apo(input_word)
self.assertEqual(5, len(res))
self.assertEqual("stones", res[0])
self.assertEqual("'", res[1])
self.assertEqual("'stones", res[2])
self.assertEqual(input_word, res[3])
self.assertEqual("'" + input_word, res[4])
def test_word_with_apo__apo_at_beginning__returns_word_with_apo_at_beginning__empty_string__word_with_one_more_apo_at_beginning__and_word_with_apo_at_end(self):
input_word = "'stones"
res = word_with_apo(input_word)
self.assertEqual(5, len(res))
self.assertEqual("'stones", res[0])
self.assertEqual("", res[1])
self.assertEqual("'" + input_word, res[2])
self.assertEqual(input_word + "'", res[3])
self.assertEqual("'" + input_word + "'", res[4])
# endregion
# region ipa_of_punctuation_and_words_combined
def test_ipa_of_punctuation_and_words_combined__word_with_apo_at_beginning__returns_value_of_this_word_plus_punctutations_at_beginning_and_end_but_not_the_apo_that_belongs_to_word(self):
input_dict = {"ALLO": "a", "'ALLO": "b"}
punctuation_before_word = "$-''"
word_without_punctuation = "allo"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-'b+*", res)
def test_ipa_of_punctuation_and_words_combined__last_char_of_punctuation_before_word_is_apo_but_word_not_in_dict___returns_underlines_instead_of_word_and_keeps_punctuation(self):
input_dict = {"ALLO": "a", "'ALLO": "b"}
punctuation_before_word = "$-''"
word_without_punctuation = "bllo"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-''____+*", res)
def test_ipa_of_punctuation_and_words_combined__word_with_apo_at_end_is_in_dict___returns_value_of_this_word_plus_punctutations_at_beginning_and_end(self):
input_dict = {"ALLO": "a", "'ALLO": "b", "ALLO'": "c"}
punctuation_before_word = "$-"
word_without_punctuation = "allo'"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-c+*", res)
def test_ipa_of_punctuation_and_words_combined__word_has_apo_at_end_but_is_in_dict___returns_value_of_this_word_plus_punctutations_at_beginning_and_end(self):
input_dict = {"ALLO": "a", "'ALLO": "b"}
punctuation_before_word = "$-"
word_without_punctuation = "allo'"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-a'+*", res)
def test_ipa_of_punctuation_and_words_combined__word_with_hyphen_and_is_in_dict___returns_value_of_this_word_plus_punctutations_at_beginning_and_end(self):
input_dict = {"AL-LO": "a", "AL": "b", "LO": "c"}
punctuation_before_word = "$-"
word_without_punctuation = "al-lo"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-a+*", res)
def test_ipa_of_punctuation_and_words_combined__word_with_hyphen_and_is_not_in_dict_but_its_parts_are___returns_value_of_the_word_parts_connected_with_hyphen_plus_punctutations_at_beginning_and_end(self):
input_dict = {"ALLO": "a", "AL": "b", "LO": "c"}
punctuation_before_word = "$-"
word_without_punctuation = "al-lo"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-b-c+*", res)
def test_ipa_of_punctuation_and_words_combined__word_with_hyphen_and_is_not_in_dict_but_one_of_its_parts_are___returns_value_of_this_part_and_underlines_for_the_part_not_in_dict_connected_with_hyphen_plus_punctutations_at_beginning_and_end(self):
input_dict = {"ALLO": "a", "AL": "b", "O": "c"}
punctuation_before_word = "$-"
word_without_punctuation = "al-lo"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-b-__+*", res)
def test_ipa_of_punctuation_and_words_combined__normal_word_without_hyphen_or_apo___returns_value_of_word_plus_punctutations_at_beginning_and_end(self):
input_dict = {"ALLO": "a", "AL": "b", "O": "c"}
punctuation_before_word = "$-"
word_without_punctuation = "allo"
punctuation_after_word = "+*"
res = ipa_of_punctuation_and_words_combined(
input_dict, punctuation_before_word, word_without_punctuation, punctuation_after_word, "_")
self.assertEqual("$-a+*", res)
# endregion
# region extract_punctuation_before_word
def test_extract_punctuation_before_word__no_punctuation_before_word__returns_input_and_empty_string(self):
input_word = "allo#'!"
res = extract_punctuation_before_word(input_word)
self.assertEqual(input_word, res[0])
self.assertEqual("", res[1])
def test_extract_punctuation_before_word__punctuation_before_word__returns_punctuation_before_word_and_rest(self):
input_word = "&!allo#'!"
res = extract_punctuation_before_word(input_word)
self.assertEqual("allo#'!", res[0])
self.assertEqual("&!", res[1])
# endregion
# region extract_punctuation_after_word_except_hyphen_or_apostrophe
def test_extract_punctuation_after_word_except_hyphen_or_apostrophe__no_punctuation_after_word__returns_input_and_empty_string(self):
input_word = "allo"
res = extract_punctuation_after_word_except_hyphen_or_apostrophe(input_word)
self.assertEqual(input_word, res[0])
self.assertEqual("", res[1])
def test_extract_punctuation_after_word_except_hyphen_or_apostrophe__apostrophe_after_word__returns_word_with_hyphen__and__remaining_punctuation(self):
input_word = "allo'!"
res = extract_punctuation_after_word_except_hyphen_or_apostrophe(input_word)
self.assertEqual("allo'", res[0])
self.assertEqual("!", res[1])
def test_extract_punctuation_after_word_except_hyphen_or_apostrophe__punctuation_after_word_but_not_hyphen_or_apostrophe__returns_word__and__punctuation(self):
input_word = "allo#!"
res = extract_punctuation_after_word_except_hyphen_or_apostrophe(input_word)
self.assertEqual("allo", res[0])
self.assertEqual("#!", res[1])
# endregion
# region get_ipa_of_word_with_punctuation
def test_get_ipa_of_word_with_punctuation__word_with_hyphen_that_belongs_to_word__returns_value(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "A-B"
res = get_ipa_of_word_with_punctuation(input_dict, input_word, "_")
self.assertEqual("ab", res)
def test_get_ipa_of_word_with_punctuation__word_with_apo_in_the_middle_that_is_not_in_dict__returns_underlines(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "A'B"
res = get_ipa_of_word_with_punctuation(input_dict, input_word, "_")
self.assertEqual("___", res)
def test_get_ipa_of_word_with_punctuation__word_with_apo_at_beginning_and_hyphen_that_belong_to_word__returns_value(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "'A-B"
res = get_ipa_of_word_with_punctuation(input_dict, input_word, "_")
self.assertEqual("g", res)
def test_get_ipa_of_word_with_punctuation__word_with_apo_atend_and_hyphen_that_belong_to_word__returns_value(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "A-B'"
res = get_ipa_of_word_with_punctuation(input_dict, input_word, "_")
self.assertEqual("h", res)
def test_get_ipa_of_word_with_punctuation__word_with_apos_at_beginning_and_end_and_hyphen_that_belong_to_word__returns_value(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "'A-B'"
res = get_ipa_of_word_with_punctuation(input_dict, input_word, "_")
self.assertEqual("i", res)
def test_get_ipa_of_word_with_punctuation__word_without_hyphen_or_apo_but_with_hash_and_new_line__returns_hash_value_and_new_line(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "#A\n"
res = get_ipa_of_word_with_punctuation(input_dict, input_word, "_")
self.assertEqual("#c\n", res)
def test_get_ipa_of_word_with_punctuation__two_words_separated_by_punctuation_but_not_by_space_and_punctuation_at_beginning_and_end__returns_values_of_the_word_and_keeps_punctuation(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "#A#B#"
res = get_ipa_of_word_with_punctuation(input_dict, input_word, "_")
self.assertEqual("#c#d#", res)
# endregion
# region get_ipa_of_word_in_sentence
def test_get_ipa_of_word_in_sentence__word_without_punctuation__returns_value(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "A"
res = get_ipa_of_word_in_sentence(input_dict, input_word, "_")
self.assertEqual("c", res)
def test_get_ipa_of_word_in_sentence__word_with_punctuation_that_belongs_to_word__returns_value(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "'A"
res = get_ipa_of_word_in_sentence(input_dict, input_word, "_")
self.assertEqual("e", res)
def test_get_ipa_of_word_in_sentence__word_with_punctuation_that_belongs_not_to_word__returns_value_and_punctuation(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "-'A-"
res = get_ipa_of_word_in_sentence(input_dict, input_word, "_")
self.assertEqual("-e-", res)
# endregion
# region sentence_to_ipa
def test_sentence_to_ipa__word_with_punctuation__return_combination_of_values(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "'A-B#'"
res = sentence_to_ipa(input_dict, input_word, "_", use_caching=False)
self.assertEqual("g#'", res)
def test_sentence_to_ipa__sentence_with_existing_words__return_combination_of_values(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "A B 'A A-B 'A-B' 'A-B#' A"
res = sentence_to_ipa(input_dict, input_word, "_", use_caching=False)
self.assertEqual("c d e ab i g#' c", res)
def test_sentence_to_ipa__sentence_with_words_not_in_dict_but_one_of_them_is_only_in_capital_letters__return_underlines_and_values_for_letters(self):
input_dict = {"A-B": "ab", "A": "c", "B": "d", "'A": "e",
"B'": "f", "'A-B": "g", "A-B'": "h", "'A-B'": "i"}
input_word = "abc BA"
res = sentence_to_ipa(input_dict, input_word, "_", use_caching=False)
self.assertEqual("___ dc", res)
def test_sentence_to_ipa__without_caching__executes_custom_func(self):
input_dict = {}
input_word = "x"
res = [sentence_to_ipa(
dict=input_dict,
sentence=input_word,
replace_unknown_with=lambda _: str(i),
use_caching=False
) for i in range(2)]
self.assertEqual(["0", "1"], res)
def test_sentence_to_ipa__with_caching__executes_custom_func_only_once(self):
input_dict = {}
input_word = "x"
res = [sentence_to_ipa(
dict=input_dict,
sentence=input_word,
replace_unknown_with=lambda _: str(i),
use_caching=True
) for i in range(2)]
self.assertEqual(["0", "0"], res)
# endregion
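The two caching tests above hinge on memoisation: with `use_caching=True` the `replace_unknown_with` callback runs only for the first lookup of a word, and later lookups reuse the stored result. A minimal sketch of that behaviour (illustrative only, `lookup_unknown` and `_cache` are hypothetical names, not the library's actual implementation):

```python
# Cache replacements computed for unknown words so the callback fires at
# most once per word when caching is enabled.
_cache = {}

def lookup_unknown(word, replace_unknown_with, use_caching):
    if use_caching and word in _cache:
        return _cache[word]            # reuse the first computed result
    result = replace_unknown_with(word)
    if use_caching:
        _cache[word] = result
    return result
```

Under this sketch, the two calls in the cached test would both yield `"0"`, while the uncached test yields `"0"` and `"1"`.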
# region strip_apos
def test_strip_apos__word_with_apo_at_beginning(self):
input_word = "'''stones'"
res = strip_apos(input_word, 0)
self.assertEqual(2, len(res))
self.assertEqual("stones'", res[0])
self.assertEqual("'''", res[1])
def test_strip_apos__word_with_apo_at_end(self):
input_word = "'''stones'"
res = strip_apos(input_word, -1)
self.assertEqual(2, len(res))
self.assertEqual("'''stones", res[0])
self.assertEqual("'", res[1])
# endregion
# region strip_apos_at_beginning_and_end_if_they_do_not_belong_to_word
def test_strip_apos__word_with_apo_at_beginning_which_belongs_to_word(self):
input_word = "'''Allo'"
input_dict = {"'ALLO": "a", "ALLO": "b"}
res = strip_apos_at_beginning_and_end_if_they_do_not_belong_to_word(input_dict, input_word)
self.assertEqual(3, len(res))
self.assertEqual("'Allo", res[0])
self.assertEqual("''", res[1])
self.assertEqual("'", res[2])
def test_strip_apos__word_with_apo_at_end_which_belongs_to_word(self):
input_word = "'''stones''"
input_dict = {"STONES'": "a", "STONES": "b"}
res = strip_apos_at_beginning_and_end_if_they_do_not_belong_to_word(input_dict, input_word)
self.assertEqual(3, len(res))
self.assertEqual("stones'", res[0])
self.assertEqual("'''", res[1])
self.assertEqual("'", res[2])
def test_strip_apos__word_to_which_no_apos_belong(self):
input_word = "'''stones''"
input_dict = {"STONES": "b"}
res = strip_apos_at_beginning_and_end_if_they_do_not_belong_to_word(input_dict, input_word)
self.assertEqual(3, len(res))
self.assertEqual("stones", res[0])
self.assertEqual("'''", res[1])
self.assertEqual("''", res[2])
# endregion
if __name__ == '__main__':
suite = unittest.TestLoader().loadTestsFromTestCase(UnitTests)
unittest.TextTestRunner(verbosity=2).run(suite)
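The `strip_apos` tests above pin down a small contract: index `0` peels apostrophes off the front of a word, index `-1` off the end, and the result is the stripped word plus the removed apostrophes. A reference implementation consistent with those assertions (an assumption for illustration, not the project's actual code):

```python
def strip_apos(word, index):
    """Strip apostrophes from one end of word; index 0 = front, -1 = back."""
    if index == 0:
        stripped = word.lstrip("'")
        # The removed prefix is whatever lstrip took off the front.
        return stripped, word[:len(word) - len(stripped)]
    stripped = word.rstrip("'")
    # The removed suffix is whatever rstrip took off the end.
    return stripped, word[len(stripped):]
```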

# === File: roi/__init__.py (repo: Tianxiaomo/ROI, license: Apache-2.0) ===
from .layers import FrozenBatchNorm2d, get_norm, NaiveSyncBatchNorm
from .layers import DeformConv, ModulatedDeformConv
from .layers import paste_masks_in_image
from .layers import batched_nms, batched_nms_rotated, nms, nms_rotated
from .layers import ROIAlign, roi_align
from .layers import ROIAlignRotated, roi_align_rotated
from .layers import ShapeSpec
from .layers import BatchNorm2d, Conv2d, ConvTranspose2d, cat, interpolate
from .layers import ROIAlignRotatedPadding, roi_align_rotated_padding

# === File: tests/autodiff_test.py (repo: wffpy/tinyflow-1, license: MIT) ===
from tinyflow import autodiff as ad
import numpy as np
def test_identity():
x2 = ad.Variable(name="x2")
y = x2
grad_x2, = ad.gradients(y, [x2])
executor = ad.Executor([y, grad_x2])
x2_val = 2 * np.ones(3)
y_val, grad_x2_val = executor.run(feed_dict={x2: x2_val})
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, x2_val)
assert np.array_equal(grad_x2_val, np.ones_like(x2_val))
def test_add_by_const():
x2 = ad.Variable(name="x2")
y = 5 + x2
grad_x2, = ad.gradients(y, [x2])
executor = ad.Executor([y, grad_x2])
x2_val = 2 * np.ones(3)
y_val, grad_x2_val = executor.run(feed_dict={x2: x2_val})
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, x2_val + 5)
assert np.array_equal(grad_x2_val, np.ones_like(x2_val))
def test_mul_by_const():
x2 = ad.Variable(name="x2")
y = 5 * x2
grad_x2, = ad.gradients(y, [x2])
executor = ad.Executor([y, grad_x2])
x2_val = 2 * np.ones(3)
y_val, grad_x2_val = executor.run(feed_dict={x2: x2_val})
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, x2_val * 5)
assert np.array_equal(grad_x2_val, np.ones_like(x2_val) * 5)
def test_add_two_vars():
x2 = ad.Variable(name="x2")
x3 = ad.Variable(name="x3")
y = x2 + x3
grad_x2, grad_x3 = ad.gradients(y, [x2, x3])
executor = ad.Executor([y, grad_x2, grad_x3])
x2_val = 2 * np.ones(3)
x3_val = 3 * np.ones(3)
y_val, grad_x2_val, grad_x3_val = executor.run(feed_dict={x2: x2_val, x3: x3_val})
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, x2_val + x3_val)
assert np.array_equal(grad_x2_val, np.ones_like(x2_val))
assert np.array_equal(grad_x3_val, np.ones_like(x3_val))
def test_mul_two_vars():
x2 = ad.Variable(name="x2")
x3 = ad.Variable(name="x3")
y = x2 * x3
grad_x2, grad_x3 = ad.gradients(y, [x2, x3])
executor = ad.Executor([y, grad_x2, grad_x3])
x2_val = 2 * np.ones(3)
x3_val = 3 * np.ones(3)
y_val, grad_x2_val, grad_x3_val = executor.run(feed_dict={x2: x2_val, x3: x3_val})
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, x2_val * x3_val)
assert np.array_equal(grad_x2_val, x3_val)
assert np.array_equal(grad_x3_val, x2_val)
def test_add_mul_mix_1():
x1 = ad.Variable(name="x1")
x2 = ad.Variable(name="x2")
x3 = ad.Variable(name="x3")
y = x1 + x2 * x3 * x1
grad_x1, grad_x2, grad_x3 = ad.gradients(y, [x1, x2, x3])
executor = ad.Executor([y, grad_x1, grad_x2, grad_x3])
x1_val = 1 * np.ones(3)
x2_val = 2 * np.ones(3)
x3_val = 3 * np.ones(3)
y_val, grad_x1_val, grad_x2_val, grad_x3_val = executor.run(feed_dict={x1: x1_val, x2: x2_val, x3: x3_val})
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, x1_val + x2_val * x3_val)
assert np.array_equal(grad_x1_val, np.ones_like(x1_val) + x2_val * x3_val)
assert np.array_equal(grad_x2_val, x3_val * x1_val)
assert np.array_equal(grad_x3_val, x2_val * x1_val)
def test_add_mul_mix_2():
x1 = ad.Variable(name="x1")
x2 = ad.Variable(name="x2")
x3 = ad.Variable(name="x3")
x4 = ad.Variable(name="x4")
y = x1 + x2 * x3 * x4
grad_x1, grad_x2, grad_x3, grad_x4 = ad.gradients(y, [x1, x2, x3, x4])
executor = ad.Executor([y, grad_x1, grad_x2, grad_x3, grad_x4])
x1_val = 1 * np.ones(3)
x2_val = 2 * np.ones(3)
x3_val = 3 * np.ones(3)
x4_val = 4 * np.ones(3)
y_val, grad_x1_val, grad_x2_val, grad_x3_val, grad_x4_val = executor.run(
feed_dict={x1: x1_val, x2: x2_val, x3: x3_val, x4: x4_val}
)
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, x1_val + x2_val * x3_val * x4_val)
assert np.array_equal(grad_x1_val, np.ones_like(x1_val))
assert np.array_equal(grad_x2_val, x3_val * x4_val)
assert np.array_equal(grad_x3_val, x2_val * x4_val)
assert np.array_equal(grad_x4_val, x2_val * x3_val)
def test_add_mul_mix_3():
x2 = ad.Variable(name="x2")
x3 = ad.Variable(name="x3")
z = x2 * x2 + x2 + x3 + 3
y = z * z + x3
grad_x2, grad_x3 = ad.gradients(y, [x2, x3])
executor = ad.Executor([y, grad_x2, grad_x3])
x2_val = 2 * np.ones(3)
x3_val = 3 * np.ones(3)
y_val, grad_x2_val, grad_x3_val = executor.run(feed_dict={x2: x2_val, x3: x3_val})
z_val = x2_val * x2_val + x2_val + x3_val + 3
expected_yval = z_val * z_val + x3_val
expected_grad_x2_val = 2 * (x2_val * x2_val + x2_val + x3_val + 3) * (2 * x2_val + 1)
expected_grad_x3_val = 2 * (x2_val * x2_val + x2_val + x3_val + 3) + 1
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, expected_yval)
assert np.array_equal(grad_x2_val, expected_grad_x2_val)
assert np.array_equal(grad_x3_val, expected_grad_x3_val)
def test_grad_of_grad():
x2 = ad.Variable(name="x2")
x3 = ad.Variable(name="x3")
y = x2 * x2 + x2 * x3
grad_x2, grad_x3 = ad.gradients(y, [x2, x3])
grad_x2_x2, grad_x2_x3 = ad.gradients(grad_x2, [x2, x3])
executor = ad.Executor([y, grad_x2, grad_x3, grad_x2_x2, grad_x2_x3])
x2_val = 2 * np.ones(3)
x3_val = 3 * np.ones(3)
y_val, grad_x2_val, grad_x3_val, grad_x2_x2_val, grad_x2_x3_val = executor.run(
feed_dict={x2: x2_val, x3: x3_val}
)
expected_yval = x2_val * x2_val + x2_val * x3_val
expected_grad_x2_val = 2 * x2_val + x3_val
expected_grad_x3_val = x2_val
expected_grad_x2_x2_val = 2 * np.ones_like(x2_val)
expected_grad_x2_x3_val = 1 * np.ones_like(x2_val)
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, expected_yval)
assert np.array_equal(grad_x2_val, expected_grad_x2_val)
assert np.array_equal(grad_x3_val, expected_grad_x3_val)
assert np.array_equal(grad_x2_x2_val, expected_grad_x2_x2_val)
assert np.array_equal(grad_x2_x3_val, expected_grad_x2_x3_val)
def test_matmul_two_vars():
x2 = ad.Variable(name="x2")
x3 = ad.Variable(name="x3")
y = ad.matmul_op(x2, x3)
grad_x2, grad_x3 = ad.gradients(y, [x2, x3])
executor = ad.Executor([y, grad_x2, grad_x3])
x2_val = np.array([[1, 2], [3, 4], [5, 6]]) # 3x2
x3_val = np.array([[7, 8, 9], [10, 11, 12]]) # 2x3
y_val, grad_x2_val, grad_x3_val = executor.run(feed_dict={x2: x2_val, x3: x3_val})
expected_yval = np.matmul(x2_val, x3_val)
expected_grad_x2_val = np.matmul(np.ones_like(expected_yval), np.transpose(x3_val))
expected_grad_x3_val = np.matmul(np.transpose(x2_val), np.ones_like(expected_yval))
assert isinstance(y, ad.Node)
assert np.array_equal(y_val, expected_yval)
assert np.array_equal(grad_x2_val, expected_grad_x2_val)
assert np.array_equal(grad_x3_val, expected_grad_x3_val)
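    # test_matmul_two_vars relies on the matmul gradient identities
    # dL/dA = G @ B.T and dL/dB = A.T @ G, where G is the upstream gradient.
    # A standalone finite-difference sanity check of the first identity
    # (plain numpy, independent of tinyflow):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
G = np.ones((2, 2))                 # upstream gradient of sum(A @ B)

analytic = G @ B.T                  # claimed d(sum(A @ B))/dA

eps = 1e-6
numeric = np.zeros_like(A)
for i in range(2):
    for j in range(2):
        Ap = A.copy()
        Ap[i, j] += eps             # perturb one entry of A
        numeric[i, j] = ((Ap @ B).sum() - (A @ B).sum()) / eps
# numeric agrees with analytic up to O(eps)
```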
def test_exp():
x1 = ad.Variable("x1")
x2 = ad.exp_op(x1)
x3 = x2 + 1
x4 = x2 * x3
x1_grad, = ad.gradients(x4, [x1])
executor = ad.Executor([x4])
x1_val = 1
x4_val, x1_grad = executor.run(feed_dict={x1: x1_val})
print(x4_val)
print(x1_grad)
def test_exp_grad():
x = ad.Variable("x")
y = ad.exp_op(x)
x_grad, = ad.gradients(y, [x])
executor = ad.Executor([y, x_grad])
x_val = 1
y_val, x_grad_val = executor.run(feed_dict={x: x_val})
print(y_val)
print(x_grad_val)
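The tests in this file exercise reverse-mode automatic differentiation: record local partial derivatives on the forward pass, then push an upstream gradient back through them. The core idea fits in a few lines (a toy sketch; this `Var` class is illustrative only and unrelated to tinyflow's `ad.Variable`):

```python
class Var:
    """Toy reverse-mode autodiff node (scalars only)."""
    def __init__(self, val):
        self.val = val
        self.grad = 0.0
        self._parents = []          # (parent, local_partial) pairs

    def __add__(self, other):
        out = Var(self.val + other.val)
        out._parents = [(self, 1.0), (other, 1.0)]
        return out

    def __mul__(self, other):
        out = Var(self.val * other.val)
        out._parents = [(self, other.val), (other, self.val)]
        return out

    def backward(self, seed=1.0):
        # Accumulate the upstream gradient, then apply the chain rule.
        self.grad += seed
        for parent, local in self._parents:
            parent.backward(seed * local)

x, y = Var(2.0), Var(3.0)
z = x * y + x                       # z = x*y + x = 8
z.backward()
# dz/dx = y + 1 = 4,  dz/dy = x = 2
```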
def test_lr():
W = ad.Variable(name="W")
b = ad.Variable(name="b")
X = ad.Variable(name="X")
y_ = ad.Variable(name="y_")
z = ad.matmul_op(X, W) + b
loss = ad.sigmoidcrossentropy_op(z, y_)
    grad_W, grad_b = ad.gradients(loss, [W, b])

# === File: es_cal/gcal/__init__.py (repo: dli-invest/earnings-ipos-stock-calendar, license: BSL-1.0) ===
from es_cal.gcal.main import get_service, make_event_in_gcal
from es_cal.gcal.utils import decode_json

# === File: model/__init__.py (repo: Antonioxv/zi2zi-Pytorch-Implmentation, license: Apache-2.0) ===
from .model_torch import Zi2Zi

# === File: NER/loss/__init__.py (repo: xueshang-liulp/diaKG-code, license: Apache-2.0) ===
from .dice_loss import DiceLoss

# === File: Python3/ProgramacaoFuncional/PrimeiroExemplo.py (repo: arthursiq5/programacao-progressiva, license: MIT) ===
def apply_twice(func, arg):
return func(func(arg))
def add_five(x):
return x + 5
print(apply_twice(add_five, 10))

# === File: services/sidecar/tests/integration/mock/osparc_python_sample.py (repo: colinRawlings/osparc-simcore, license: MIT) ===
print("Hello from the Python!")

# === File: features_fixer/__init__.py (repo: LudwikBielczynski/features_fixer, license: MIT) ===
from .features_fixer import FeaturesFixer

# === File: sample/controller/commands.py (repo: dsuarezgarcia/freecomet, license: MIT) ===
# -*- encoding: utf-8 -*-
'''
The commands module.
'''
# General imports
import copy
# Custom imports
import sample.model.utils as utils
from sample.model.canvas_model import CanvasModel, SelectedDelimiterPoint
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# Command #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class Command(object):
'''
The Command abstract class. Specific Commands inherit from this class.
'''
''' Initialization method. '''
def __init__(self, controller):
# Protected Attributes
self._controller = controller
self._data = None
# Private Attributes
self.__is_unsaved_project = controller.get_is_unsaved_project()
self.__string = ""
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Methods #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
''' Execute Command method. '''
def execute(self, *args):
raise NotImplementedError("This method must be implemented.")
''' Undo Command method. '''
def undo(self, *args):
raise NotImplementedError("This method must be implemented.")
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_is_unsaved_project(self):
return self.__is_unsaved_project
def set_is_unsaved_project(self, is_unsaved_project):
self.__is_unsaved_project = is_unsaved_project
def get_data(self):
return self._data
def set_data(self, data):
self._data = data
def get_string(self):
return self.__string
def set_string(self, string):
self.__string = string
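Commands like the ones below expose only the `execute()`/`undo()` contract defined above; an invoker typically drives them from undo/redo stacks. A minimal sketch of that pattern (illustrative only; `CommandStack` is a hypothetical name, the project's actual invoker lives elsewhere):

```python
class CommandStack:
    """Toy invoker sketch for Command objects with execute()/undo()."""
    def __init__(self):
        self._undo_stack = []
        self._redo_stack = []

    def run(self, command):
        command.execute()
        self._undo_stack.append(command)
        self._redo_stack.clear()    # a fresh action invalidates redo history

    def undo(self):
        if self._undo_stack:
            command = self._undo_stack.pop()
            command.undo()
            self._redo_stack.append(command)

    def redo(self):
        if self._redo_stack:
            command = self._redo_stack.pop()
            command.execute()
            self._undo_stack.append(command)
```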
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# AddSamplesCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class AddSamplesCommand(Command):
'''
The AddSamplesCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
new_data = []
# Redo execution
while len(self._data) > 0:
(sample, parameters) = self._data.pop()
# Add sample
self._controller.add_sample(sample, parameters)
new_data.append(sample.get_id())
# Save data
self._data = new_data
''' Command.undo() behaviour. '''
def undo(self):
new_data = []
# Undo execution
for sample_id in self._data:
# Delete sample
(sample_copy, parameters, _) = self._controller.\
delete_sample(sample_id)
new_data.append((sample_copy, parameters))
new_data.reverse()
# Save data
self._data = new_data
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# DeleteSampleCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class DeleteSampleCommand(Command):
'''
The DeleteSampleCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Retrieve data
sample_id = self._data
# Delete sample
(sample_copy, parameters, pos) = self._controller.delete_sample(
sample_id)
# Save data
self._data = (sample_copy, parameters, pos)
''' Command.undo() behaviour. '''
def undo(self):
# Retrieve data
(sample, parameters, pos) = self._data
# Add sample
self._controller.add_sample(sample, parameters, pos)
# Save data
self._data = sample.get_id()
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# RenameSampleCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class RenameSampleCommand(Command):
'''
The RenameSampleCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
self.__rename()
''' Command.undo() behaviour. '''
def undo(self):
self.__rename()
''' Renames Sample's name with given ID. '''
def __rename(self):
# Retrieve data
(sample_id, sample_name) = self._data
# Rename Sample
previous_name = self._controller.rename_sample(
sample_id, sample_name)
# Save data
self._data = (sample_id, previous_name)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# AddCometCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class AddCometCommand(Command):
'''
The AddCometCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Activate Sample
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasSelectionState
self._controller.canvas_transition_to_selection_state()
# Add comet
self._controller.add_comet(
self._data.get_sample_id(),
self._data.get_comet_copy(),
self._data.get_pos()
)
# Remove the Tail CanvasContour from the CanvasContour dictionary
if self._data.get_tail_canvas_contour() is not None:
del CanvasModel.get_instance().get_tail_contour_dict()[
self._data.get_tail_canvas_contour().get_id()]
# Remove the Head CanvasContour from the CanvasContour dictionary
del CanvasModel.get_instance().get_head_contour_dict()[
self._data.get_head_canvas_contour().get_id()]
''' Command.undo() behaviour. '''
def undo(self):
# Delete Comet
(comet_copy, pos) = self._controller.delete_comet(
self._data.get_sample_id(),
self._data.get_comet_id()
)
# Activate Sample
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasEditingState
self._controller.canvas_transition_to_editing_state()
# Set analyzed flag
self._controller.set_sample_analyzed_flag(
self._data.get_sample_id(), self._data.get_analyzed_flag())
current_scale_ratio = self._controller.get_sample_zoom_value(
self._data.get_sample_id())
scale = self._data.get_scale_ratio() != current_scale_ratio
# Add the previous Tail CanvasContour (before the comet was built)
if self._data.get_tail_canvas_contour() is not None:
# If scaling is needed
if scale:
utils.scale_canvas_contour(
self._data.get_tail_canvas_contour(),
current_scale_ratio / self._data.get_scale_ratio()
)
CanvasModel.get_instance().get_tail_contour_dict()[
self._data.get_tail_canvas_contour().get_id()] = \
copy.deepcopy(self._data.get_tail_canvas_contour())
# If scaling is needed
if scale:
utils.scale_canvas_contour(
self._data.get_head_canvas_contour(),
current_scale_ratio / self._data.get_scale_ratio()
)
# Add the previous Head CanvasContour (before the comet was built)
CanvasModel.get_instance().get_head_contour_dict()[
self._data.get_head_canvas_contour().get_id()] = \
copy.deepcopy(self._data.get_head_canvas_contour())
# Save data
self._data.set_comet_copy(comet_copy)
self._data.set_pos(pos)
self._data.set_scale_ratio(current_scale_ratio)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# AddCometCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class AddCometCommandData(object):
'''
The AddCometCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, comet_id, analyzed_flag,
tail_canvas_contour, head_canvas_contour, scale_ratio):
self.__sample_id = sample_id
self.__comet_id = comet_id
self.__comet_copy = None
self.__pos = None
self.__analyzed_flag = analyzed_flag
self.__tail_canvas_contour = tail_canvas_contour
self.__head_canvas_contour = head_canvas_contour
self.__scale_ratio = scale_ratio
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_comet_id(self):
return self.__comet_id
def set_comet_id(self, comet_id):
self.__comet_id = comet_id
def get_comet_copy(self):
return self.__comet_copy
def set_comet_copy(self, comet_copy):
self.__comet_copy = comet_copy
def get_pos(self):
return self.__pos
def set_pos(self, pos):
self.__pos = pos
def get_analyzed_flag(self):
return self.__analyzed_flag
def set_analyzed_flag(self, analyzed_flag):
self.__analyzed_flag = analyzed_flag
def get_tail_canvas_contour(self):
return self.__tail_canvas_contour
def set_tail_canvas_contour(self, tail_canvas_contour):
self.__tail_canvas_contour = tail_canvas_contour
def get_head_canvas_contour(self):
return self.__head_canvas_contour
def set_head_canvas_contour(self, head_canvas_contour):
self.__head_canvas_contour = head_canvas_contour
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# DeleteCometCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class DeleteCometCommand(Command):
'''
The DeleteCometCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Retrieve data
(sample_id, comet_id) = self._data
# Activate Sample
if self._controller.get_active_sample_id() != sample_id:
self._controller.activate_sample(sample_id)
# Transition to CanvasSelectionState
self._controller.canvas_transition_to_selection_state()
# Delete comet
(comet_copy, pos) = self._controller.delete_comet(
sample_id, comet_id)
# Save data
self._data = (sample_id, comet_copy, pos)
''' Command.undo() behaviour. '''
def undo(self):
# Retrieve data
(sample_id, comet, pos) = self._data
# Activate Sample
if self._controller.get_active_sample_id() != sample_id:
self._controller.activate_sample(sample_id)
# Transition to CanvasSelectionState
self._controller.canvas_transition_to_selection_state()
# Add comet
self._controller.add_comet(sample_id, comet, pos)
# Select comet
self._controller.select_comet(sample_id, comet.get_id())
# Save data
self._data = (sample_id, comet.get_id())
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# RemoveCometTailCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class RemoveCometTailCommand(Command):
'''
The RemoveCometTailCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Retrieve data
(sample_id, comet_id) = self._data
# Activate Sample
if self._controller.get_active_sample_id() != sample_id:
self._controller.activate_sample(sample_id)
# Transition to CanvasSelectionState
self._controller.canvas_transition_to_selection_state()
# Remove comet tail
comet_contour = self._controller.remove_comet_tail(
sample_id, comet_id)
# Select comet
self._controller.select_comet(sample_id, comet_id)
# Save data
self._data = (sample_id, comet_id, comet_contour)
''' Command.undo() behaviour. '''
def undo(self):
# Retrieve data
(sample_id, comet_id, comet_contour) = self._data
# Activate Sample
if self._controller.get_active_sample_id() != sample_id:
self._controller.activate_sample(sample_id)
# Transition to CanvasSelectionState
self._controller.canvas_transition_to_selection_state()
# Add comet tail
self._controller.add_comet_tail(
sample_id, comet_id, comet_contour)
# Select comet
self._controller.select_comet(sample_id, comet_id)
# Save data
self._data = (sample_id, comet_id)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# AnalyzeSamplesCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class AnalyzeSamplesCommand(Command):
'''
The AnalyzeSamplesCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
self.__update_samples_comet_list()
''' Command.undo() behaviour. '''
def undo(self):
self.__update_samples_comet_list()
''' Replaces the comet lists of the analyzed Samples. '''
def __update_samples_comet_list(self):
# Replace sample's comet lists
self._data = self._controller.update_samples_comet_list(self._data)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# FlipSampleImageCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class FlipSampleImageCommand(Command):
'''
The FlipSampleImageCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
self.__flip_sample_image()
''' Command.undo() behaviour. '''
def undo(self):
self.__flip_sample_image()
''' Flips the image of the Sample with the given ID. '''
def __flip_sample_image(self):
# Retrieve data
sample_id = self._data
# Activate Sample
if self._controller.get_active_sample_id() != sample_id:
self._controller.activate_sample(sample_id)
# Transition to CanvasSelectionState
self._controller.canvas_transition_to_selection_state()
# Flip Sample's image
self._controller.flip_sample_image(sample_id)
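FlipSampleImageCommand (and InvertSampleImageCommand below) can route `execute()` and `undo()` through the same private helper because flipping and inverting are involutions: applying the operation twice restores the original image. A minimal sketch of that property, assuming an image represented as a nested list of pixel values:

```python
def flip_horizontal(image):
    '''Mirror each row; applying it twice is the identity.'''
    return [row[::-1] for row in image]

image = [[1, 2],
         [3, 4]]
once = flip_horizontal(image)          # [[2, 1], [4, 3]]
twice = flip_horizontal(once)          # back to the original
```

This is why neither command needs to store a "before" copy of the image: undo is just a second execute.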
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# InvertSampleImageCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class InvertSampleImageCommand(Command):
'''
The InvertSampleImageCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
self.__invert_sample_image()
''' Command.undo() behaviour. '''
def undo(self):
self.__invert_sample_image()
''' Inverts the image of the Sample with the given ID. '''
def __invert_sample_image(self):
# Retrieve data
sample_id = self._data
# Activate Sample
if self._controller.get_active_sample_id() != sample_id:
self._controller.activate_sample(sample_id)
# Transition to CanvasSelectionState
self._controller.canvas_transition_to_selection_state()
# Invert Sample's image
self._controller.invert_sample_image(sample_id)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# EditCometContoursCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class EditCometContoursCommand(Command):
'''
The EditCometContoursCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Select Comet
self._controller.select_comet(
self._data.get_sample_id(), self._data.get_comet_id())
# Scale the CanvasContours dicts if needed
current_scale_ratio = self._controller.get_sample_zoom_value(
self._data.get_sample_id())
if self._data.get_scale_ratio() != current_scale_ratio:
utils.scale_canvas_contour_dict(
self._data.get_tail_canvas_contour_dict(),
current_scale_ratio / self._data.get_scale_ratio()
)
utils.scale_canvas_contour_dict(
self._data.get_head_canvas_contour_dict(),
current_scale_ratio / self._data.get_scale_ratio()
)
self._data.set_scale_ratio(current_scale_ratio)
# Set the Comet as being edited
self._controller.start_comet_being_edited(
self._data.get_sample_id(), self._data.get_comet_id(),
copy.deepcopy(self._data.get_tail_canvas_contour_dict()),
copy.deepcopy(self._data.get_head_canvas_contour_dict())
)
''' Command.undo() behaviour. '''
def undo(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Set Comet as not being edited
self._controller.quit_comet_being_edited()
# Select Comet
self._controller.select_comet(
self._data.get_sample_id(), self._data.get_comet_id())
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# EditCometContoursCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class EditCometContoursCommandData(object):
'''
The EditCometContoursCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, comet_id, tail_canvas_contour_dict,
head_canvas_contour_dict, scale_ratio):
self.__sample_id = sample_id
self.__comet_id = comet_id
self.__tail_canvas_contour_dict = tail_canvas_contour_dict
self.__head_canvas_contour_dict = head_canvas_contour_dict
self.__scale_ratio = scale_ratio
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_comet_id(self):
return self.__comet_id
def set_comet_id(self, comet_id):
self.__comet_id = comet_id
def get_tail_canvas_contour_dict(self):
return self.__tail_canvas_contour_dict
def set_tail_canvas_contour_dict(self, tail_canvas_contour_dict):
self.__tail_canvas_contour_dict = tail_canvas_contour_dict
def get_head_canvas_contour_dict(self):
return self.__head_canvas_contour_dict
def set_head_canvas_contour_dict(self, head_canvas_contour_dict):
self.__head_canvas_contour_dict = head_canvas_contour_dict
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
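EditCometContoursCommand and its cancel/update counterparts all rescale the stored canvas contours by `current_scale_ratio / previous_scale_ratio` before reuse, so coordinates captured at one zoom level stay valid at another. A minimal sketch of that ratio arithmetic (hypothetical `scale_points` helper, assuming a contour is a list of `(x, y)` tuples):

```python
def scale_points(points, ratio):
    '''Rescale canvas coordinates by the given zoom ratio.'''
    return [(x * ratio, y * ratio) for (x, y) in points]

# Contour captured at zoom 0.5, replayed while the canvas is at zoom 2.0:
captured = [(10.0, 20.0), (30.0, 40.0)]
replayed = scale_points(captured, 2.0 / 0.5)   # ratio = current / previous
```

Storing the ratio at capture time (the `scale_ratio` attribute above) is what makes the division well-defined on replay, however many zoom changes happened in between.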
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# CancelEditCometContoursCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class CancelEditCometContoursCommand(Command):
'''
The CancelEditCometContoursCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Quit Comet being edited
self._controller.quit_comet_being_edited()
''' Command.undo() behaviour. '''
def undo(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
current_scale_ratio = self._controller.get_sample_zoom_value(
self._data.get_sample_id())
# Scale the CanvasContour dictionaries if needed
if self._data.get_scale_ratio() != current_scale_ratio:
utils.scale_canvas_contour_dict(
self._data.get_tail_canvas_contour_dict(),
current_scale_ratio / self._data.get_scale_ratio()
)
utils.scale_canvas_contour_dict(
self._data.get_head_canvas_contour_dict(),
current_scale_ratio / self._data.get_scale_ratio()
)
# Update the scale_ratio attribute
self._data.set_scale_ratio(current_scale_ratio)
# Start Comet being edited
self._controller.start_comet_being_edited(
self._data.get_sample_id(), self._data.get_comet_id(),
copy.deepcopy(self._data.get_tail_canvas_contour_dict()),
copy.deepcopy(self._data.get_head_canvas_contour_dict())
)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# CancelEditCometContoursCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class CancelEditCometContoursCommandData(object):
'''
The CancelEditCometContoursCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, comet_id, tail_canvas_contour_dict,
head_canvas_contour_dict, scale_ratio):
self.__sample_id = sample_id
self.__comet_id = comet_id
self.__tail_canvas_contour_dict = tail_canvas_contour_dict
self.__head_canvas_contour_dict = head_canvas_contour_dict
self.__scale_ratio = scale_ratio
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_comet_id(self):
return self.__comet_id
def set_comet_id(self, comet_id):
self.__comet_id = comet_id
def get_tail_canvas_contour_dict(self):
return self.__tail_canvas_contour_dict
def set_tail_canvas_contour_dict(self, tail_canvas_contour_dict):
self.__tail_canvas_contour_dict = tail_canvas_contour_dict
def get_head_canvas_contour_dict(self):
return self.__head_canvas_contour_dict
def set_head_canvas_contour_dict(self, head_canvas_contour_dict):
self.__head_canvas_contour_dict = head_canvas_contour_dict
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# UpdateCometContoursCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class UpdateCometContoursCommand(Command):
'''
The UpdateCometContoursCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Quit Comet being edited
self._controller.quit_comet_being_edited()
# Update Comet contours
self._controller.update_comet_contours(
self._data.get_sample_id(), self._data.get_comet_id(),
self._data.get_opencv_tail_contour(),
self._data.get_opencv_head_contour()
)
''' Command.undo() behaviour. '''
def undo(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
current_scale_ratio = self._controller.get_sample_zoom_value(
self._data.get_sample_id())
# Scale the CanvasContour dictionaries if needed
if self._data.get_scale_ratio() != current_scale_ratio:
utils.scale_canvas_contour_dict(
self._data.get_tail_canvas_contour_dict(),
current_scale_ratio / self._data.get_scale_ratio()
)
utils.scale_canvas_contour_dict(
self._data.get_head_canvas_contour_dict(),
current_scale_ratio / self._data.get_scale_ratio()
)
# Update the scale_ratio attribute
self._data.set_scale_ratio(current_scale_ratio)
# Start Comet being edited
self._controller.start_comet_being_edited(
self._data.get_sample_id(), self._data.get_comet_id(),
copy.deepcopy(self._data.get_tail_canvas_contour_dict()),
copy.deepcopy(self._data.get_head_canvas_contour_dict())
)
# Update Comet contours
self._controller.update_comet_contours(
self._data.get_sample_id(), self._data.get_comet_id(),
self._data.get_opencv_tail_contour(),
self._data.get_opencv_head_contour()
)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# UpdateCometContoursCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class UpdateCometContoursCommandData(object):
'''
The UpdateCometContoursCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, comet_id, opencv_tail_contour,
opencv_head_contour, tail_canvas_contour_dict,
head_canvas_contour_dict, scale_ratio):
self.__sample_id = sample_id
self.__comet_id = comet_id
self.__opencv_tail_contour = opencv_tail_contour
self.__opencv_head_contour = opencv_head_contour
self.__tail_canvas_contour_dict = tail_canvas_contour_dict
self.__head_canvas_contour_dict = head_canvas_contour_dict
self.__scale_ratio = scale_ratio
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_comet_id(self):
return self.__comet_id
def set_comet_id(self, comet_id):
self.__comet_id = comet_id
def get_opencv_tail_contour(self):
return self.__opencv_tail_contour
def set_opencv_tail_contour(self, opencv_tail_contour):
self.__opencv_tail_contour = opencv_tail_contour
def get_opencv_head_contour(self):
return self.__opencv_head_contour
def set_opencv_head_contour(self, opencv_head_contour):
self.__opencv_head_contour = opencv_head_contour
def get_tail_canvas_contour_dict(self):
return self.__tail_canvas_contour_dict
def set_tail_canvas_contour_dict(self, tail_canvas_contour_dict):
self.__tail_canvas_contour_dict = tail_canvas_contour_dict
def get_head_canvas_contour_dict(self):
return self.__head_canvas_contour_dict
def set_head_canvas_contour_dict(self, head_canvas_contour_dict):
self.__head_canvas_contour_dict = head_canvas_contour_dict
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# CreateDelimiterPointCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class CreateDelimiterPointCommand(Command):
'''
The CreateDelimiterPointCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasEditingState
self._controller.canvas_transition_to_editing_state()
# Recalculate DelimiterPoint coordinates based on previous and
# current zoom values
current_scale_ratio = self._controller.get_sample_zoom_value(
self._data.get_sample_id())
if self._data.get_scale_ratio() != current_scale_ratio:
# Scale and set coordinates
self._data.set_coordinates(
utils.scale_point(
self._data.get_coordinates(),
current_scale_ratio / self._data.get_scale_ratio()
)
)
self._data.set_scale_ratio(current_scale_ratio)
# 'Create DelimiterPoint' use case
if self._data.get_root_delimiter_point_id() is None:
roommate_delimiter_point = None
if self._data.get_roommate() is not None:
roommate_delimiter_point = self._data.get_roommate().\
get_delimiter_point()
# Create DelimiterPoint
delimiter_point = self._controller.create_delimiter_point(
self._data.get_builder(),
self._data.get_coordinates(),
self._data.get_delimiter_point_id(),
self._data.get_canvas_contour_id(),
roommate_delimiter_point
)
# 'Create and connect DelimiterPoint' use case
else:
root_delimiter_point = CanvasModel.get_instance().get_delimiter_point(
self._data.get_root_delimiter_point_id(),
self._data.get_delimiter_point_type(),
self._data.get_canvas_contour_id()
)
roommate_delimiter_point = None
if self._data.get_roommate() is not None:
roommate_delimiter_point = self._data.get_roommate().\
get_delimiter_point()
# Create and connect DelimiterPoint
delimiter_point = self._controller.create_and_connect_delimiter_point(
self._data.get_builder(),
root_delimiter_point,
self._data.get_coordinates(),
self._data.get_delimiter_point_id(),
roommate_delimiter_point
)
# Update Canvas
self._controller.get_view().get_main_window().get_canvas().update()
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(True)
''' Command.undo() behaviour. '''
def undo(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasEditingState
self._controller.canvas_transition_to_editing_state()
# Delete DelimiterPoint
selected_delimiter_point = SelectedDelimiterPoint(
self._data.get_delimiter_point_id(),
self._data.get_delimiter_point_type(),
self._data.get_canvas_contour_id()
)
selected_delimiter_point.set_origin(self._data.get_coordinates())
self._controller.delete_delimiter_points([selected_delimiter_point])
# Update Canvas
self._controller.get_view().get_main_window().get_canvas().update()
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(
self._data.get_comet_being_edited_has_changed())
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# CreateDelimiterPointCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class CreateDelimiterPointCommandData(object):
'''
The CreateDelimiterPointCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, delimiter_point_id, delimiter_point_type,
canvas_contour_id, roommate, coordinates, builder, scale_ratio):
self.__sample_id = sample_id
self.__delimiter_point_id = delimiter_point_id
self.__delimiter_point_type = delimiter_point_type
self.__canvas_contour_id = canvas_contour_id
self.__roommate = roommate
self.__coordinates = coordinates
self.__builder = builder
self.__scale_ratio = scale_ratio
self.__root_delimiter_point_id = None
self.__comet_being_edited_has_changed = None
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_delimiter_point_id(self):
return self.__delimiter_point_id
def set_delimiter_point_id(self, delimiter_point_id):
self.__delimiter_point_id = delimiter_point_id
def get_delimiter_point_type(self):
return self.__delimiter_point_type
def set_delimiter_point_type(self, delimiter_point_type):
self.__delimiter_point_type = delimiter_point_type
def get_canvas_contour_id(self):
return self.__canvas_contour_id
def set_canvas_contour_id(self, canvas_contour_id):
self.__canvas_contour_id = canvas_contour_id
def get_coordinates(self):
return self.__coordinates
def set_coordinates(self, coordinates):
self.__coordinates = coordinates
def get_builder(self):
return self.__builder
def set_builder(self, builder):
self.__builder = builder
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
def get_root_delimiter_point_id(self):
return self.__root_delimiter_point_id
def set_root_delimiter_point_id(self, root_delimiter_point_id):
self.__root_delimiter_point_id = root_delimiter_point_id
def get_roommate(self):
return self.__roommate
def set_roommate(self, roommate):
self.__roommate = roommate
def get_comet_being_edited_has_changed(self):
return self.__comet_being_edited_has_changed
def set_comet_being_edited_has_changed(self, comet_being_edited_has_changed):
self.__comet_being_edited_has_changed = comet_being_edited_has_changed
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# DeleteDelimiterPointsCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class DeleteDelimiterPointsCommand(Command):
'''
The DeleteDelimiterPointsCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to EditingSelectionState
self._controller.canvas_transition_to_editing_selection_state()
# Prepare the list of SelectedDelimiterPoints
selected_delimiter_point_list = []
for (canvas_contour_id, (deleted_delimiter_point_data_list, _)) in \
        self._data.get_deleted_delimiter_point_data_dict().items():
for deleted_delimiter_point_data in deleted_delimiter_point_data_list:
selected_delimiter_point_list.append(
SelectedDelimiterPoint(
deleted_delimiter_point_data.get_delimiter_point_id(),
deleted_delimiter_point_data.get_builder().POINT_TYPE,
canvas_contour_id
)
)
# Delete the DelimiterPoints
self._controller.delete_delimiter_points(selected_delimiter_point_list)
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(True)
''' Command.undo() behaviour. '''
def undo(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to EditingSelectionState
self._controller.canvas_transition_to_editing_selection_state()
# Recalculate DelimiterPoint coordinates based on previous and
# current zoom values
current_scale_ratio = self._controller.get_sample_zoom_value(
self._data.get_sample_id())
recalculate_coordinates = self._data.get_scale_ratio() != current_scale_ratio
# Add the DelimiterPoints that were removed
for (canvas_contour_id, (deleted_delimiter_point_data_list, closed)) in \
        self._data.get_deleted_delimiter_point_data_dict().items():
for deleted_delimiter_point_data in deleted_delimiter_point_data_list:
if recalculate_coordinates:
# Scale and set coordinates
deleted_delimiter_point_data.set_coordinates(
utils.scale_point(
deleted_delimiter_point_data.get_coordinates(),
current_scale_ratio / self._data.get_scale_ratio()
)
)
roommate_delimiter_point = None
if deleted_delimiter_point_data.get_roommate() is not None:
roommate_delimiter_point = deleted_delimiter_point_data.\
get_roommate().get_delimiter_point()
# Add DelimiterPoint
self._controller.create_delimiter_point(
deleted_delimiter_point_data.get_builder(),
deleted_delimiter_point_data.get_coordinates(),
deleted_delimiter_point_data.get_delimiter_point_id(),
canvas_contour_id,
roommate_delimiter_point
)
# Set the CanvasContour 'closed' value
CanvasModel.get_instance().get_canvas_contour(
deleted_delimiter_point_data.get_builder().POINT_TYPE,
canvas_contour_id
).set_closed(closed)
# Connect the DelimiterPoints as they were connected before
for (canvas_contour_id, (deleted_delimiter_point_data_list, _)) in \
        self._data.get_deleted_delimiter_point_data_dict().items():
for deleted_delimiter_point_data in deleted_delimiter_point_data_list:
# Source DelimiterPoint
src_delimiter_point = CanvasModel.get_instance().get_delimiter_point(
deleted_delimiter_point_data.get_delimiter_point_id(),
deleted_delimiter_point_data.get_builder().POINT_TYPE,
canvas_contour_id
)
for neighbor_id in deleted_delimiter_point_data.get_neighbor_id_list():
# Destination DelimiterPoint
dst_delimiter_point = CanvasModel.get_instance().get_delimiter_point(
neighbor_id,
deleted_delimiter_point_data.get_builder().POINT_TYPE,
canvas_contour_id
)
if src_delimiter_point is not None and dst_delimiter_point is not None:
# Connect DelimiterPoints
self._controller.connect_delimiter_points(
deleted_delimiter_point_data.get_builder(),
src_delimiter_point,
dst_delimiter_point,
canvas_contour_id
)
if recalculate_coordinates:
self._data.set_scale_ratio(current_scale_ratio)
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(
self._data.get_comet_being_edited_has_changed())
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# DeleteDelimiterPointsCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class DeleteDelimiterPointsCommandData(object):
'''
The DeleteDelimiterPointsCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, deleted_delimiter_point_data_dict,
scale_ratio):
self.__sample_id = sample_id
self.__deleted_delimiter_point_data_dict = deleted_delimiter_point_data_dict
self.__scale_ratio = scale_ratio
self.__comet_being_edited_has_changed = None
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_deleted_delimiter_point_data_dict(self):
return self.__deleted_delimiter_point_data_dict
def set_deleted_delimiter_point_data_dict(self,
deleted_delimiter_point_data_dict):
self.__deleted_delimiter_point_data_dict = \
deleted_delimiter_point_data_dict
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
def get_comet_being_edited_has_changed(self):
return self.__comet_being_edited_has_changed
def set_comet_being_edited_has_changed(self, comet_being_edited_has_changed):
self.__comet_being_edited_has_changed = comet_being_edited_has_changed
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# DeletedDelimiterPointData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class DeletedDelimiterPointData(object):
'''
The DeletedDelimiterPointData class.
'''
''' Initialization method. '''
def __init__(self, delimiter_point_id, coordinates, neighbor_id_list,
roommate, builder):
self.__delimiter_point_id = delimiter_point_id
self.__coordinates = coordinates
self.__neighbor_id_list = neighbor_id_list
self.__roommate = roommate
self.__builder = builder
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_delimiter_point_id(self):
return self.__delimiter_point_id
def set_delimiter_point_id(self, delimiter_point_id):
self.__delimiter_point_id = delimiter_point_id
def get_coordinates(self):
return self.__coordinates
def set_coordinates(self, coordinates):
self.__coordinates = coordinates
def get_neighbor_id_list(self):
return self.__neighbor_id_list
def set_neighbor_id_list(self, neighbor_id_list):
self.__neighbor_id_list = neighbor_id_list
def get_roommate(self):
return self.__roommate
def set_roommate(self, roommate):
self.__roommate = roommate
def get_builder(self):
return self.__builder
def set_builder(self, builder):
self.__builder = builder
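DeleteDelimiterPointsCommand.undo() restores the deleted points in two passes: first every point is recreated, then neighbors are re-linked, so a connection is never attempted before both endpoints exist again. A standalone sketch of that two-pass restore over a toy adjacency map (hypothetical names, sets instead of DelimiterPoint objects):

```python
def restore(graph, deleted):
    '''Two-pass undo: recreate nodes, then reconnect saved edges.

    graph:   {node_id: set of neighbor ids} (the surviving state)
    deleted: {node_id: list of neighbor ids} captured at delete time
    '''
    # Pass 1: recreate every deleted node first.
    for node_id in deleted:
        graph.setdefault(node_id, set())
    # Pass 2: reconnect only the pairs whose both ends now exist.
    for node_id, neighbors in deleted.items():
        for other in neighbors:
            if other in graph:
                graph[node_id].add(other)
                graph[other].add(node_id)
    return graph
```

Collapsing this into a single pass would drop edges between two deleted points, because the second endpoint would not exist yet when the first one tries to reconnect.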
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# ConnectDelimiterPointsCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class ConnectDelimiterPointsCommand(Command):
'''
The ConnectDelimiterPointsCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasEditingState
self._controller.canvas_transition_to_editing_state()
src_delimiter_point = CanvasModel.get_instance().get_delimiter_point(
self._data.get_src_delimiter_point_id(),
self._data.get_builder().POINT_TYPE,
self._data.get_previous_src_delimiter_point_canvas_contour_id()
)
dst_delimiter_point = CanvasModel.get_instance().get_delimiter_point(
self._data.get_dst_delimiter_point_id(),
self._data.get_builder().POINT_TYPE,
self._data.get_previous_dst_delimiter_point_canvas_contour_id()
)
# Connect DelimiterPoints
self._controller.connect_delimiter_points(
self._data.get_builder(),
src_delimiter_point, dst_delimiter_point,
self._data.get_new_canvas_contour_id()
)
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(True)
''' Command.undo() behaviour. '''
def undo(self):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasEditingState
self._controller.canvas_transition_to_editing_state()
src_delimiter_point = CanvasModel.get_instance().get_delimiter_point(
self._data.get_src_delimiter_point_id(),
self._data.get_builder().POINT_TYPE,
self._data.get_new_canvas_contour_id()
)
dst_delimiter_point = CanvasModel.get_instance().get_delimiter_point(
self._data.get_dst_delimiter_point_id(),
self._data.get_builder().POINT_TYPE,
self._data.get_new_canvas_contour_id()
)
# Disconnect DelimiterPoints
self._controller.disconnect_delimiter_points(
self._data.get_builder(),
src_delimiter_point, dst_delimiter_point,
self._data.get_previous_src_delimiter_point_canvas_contour_id(),
self._data.get_previous_dst_delimiter_point_canvas_contour_id()
)
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(
self._data.get_comet_being_edited_has_changed())
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# ConnectDelimiterPointsCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class ConnectDelimiterPointsCommandData(object):
'''
The ConnectDelimiterPointsCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, builder, src_delimiter_point_id,
dst_delimiter_point_id, previous_src_delimiter_point_canvas_contour_id,
previous_dst_delimiter_point_canvas_contour_id, new_canvas_contour_id):
self.__sample_id = sample_id
self.__builder = builder
self.__src_delimiter_point_id = src_delimiter_point_id
self.__dst_delimiter_point_id = dst_delimiter_point_id
self.__previous_src_delimiter_point_canvas_contour_id = \
previous_src_delimiter_point_canvas_contour_id
self.__previous_dst_delimiter_point_canvas_contour_id = \
previous_dst_delimiter_point_canvas_contour_id
self.__new_canvas_contour_id = new_canvas_contour_id
self.__comet_being_edited_has_changed = None
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_builder(self):
return self.__builder
def set_builder(self, builder):
self.__builder = builder
def get_src_delimiter_point_id(self):
return self.__src_delimiter_point_id
def set_src_delimiter_point_id(self, src_delimiter_point_id):
self.__src_delimiter_point_id = src_delimiter_point_id
def get_dst_delimiter_point_id(self):
return self.__dst_delimiter_point_id
def set_dst_delimiter_point_id(self, dst_delimiter_point_id):
self.__dst_delimiter_point_id = dst_delimiter_point_id
def get_previous_src_delimiter_point_canvas_contour_id(self):
return self.__previous_src_delimiter_point_canvas_contour_id
def set_previous_src_delimiter_point_canvas_contour_id(self,
previous_src_delimiter_point_canvas_contour_id):
self.__previous_src_delimiter_point_canvas_contour_id = \
previous_src_delimiter_point_canvas_contour_id
def get_previous_dst_delimiter_point_canvas_contour_id(self):
return self.__previous_dst_delimiter_point_canvas_contour_id
def set_previous_dst_delimiter_point_canvas_contour_id(self,
previous_dst_delimiter_point_canvas_contour_id):
self.__previous_dst_delimiter_point_canvas_contour_id = \
previous_dst_delimiter_point_canvas_contour_id
def get_new_canvas_contour_id(self):
return self.__new_canvas_contour_id
def set_new_canvas_contour_id(self, new_canvas_contour_id):
self.__new_canvas_contour_id = new_canvas_contour_id
def get_comet_being_edited_has_changed(self):
return self.__comet_being_edited_has_changed
def set_comet_being_edited_has_changed(self, comet_being_edited_has_changed):
self.__comet_being_edited_has_changed = comet_being_edited_has_changed
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# CloseCanvasContourCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class CloseCanvasContourCommand(Command):
'''
CloseCanvasContourCommand command. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
self.__replace_delimiter_point_dict(True)
''' Command.undo() behaviour. '''
def undo(self):
self.__replace_delimiter_point_dict(
self._data.get_comet_being_edited_has_changed())
'''
    Replaces the DelimiterPoint dict of the CanvasContour with the given ID.
'''
def __replace_delimiter_point_dict(self, has_changed):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasEditingState
self._controller.canvas_transition_to_editing_state()
# Scale CanvasContour if needed
current_scale_ratio = self._controller.get_sample_zoom_value(
self._data.get_sample_id())
if self._data.get_scale_ratio() != current_scale_ratio:
utils.scale_canvas_contour(
self._data.get_canvas_contour(),
current_scale_ratio/self._data.get_scale_ratio()
)
canvas_contour_copy = self._data.get_builder().get_contour_dict()[
self._data.get_canvas_contour().get_id()]
self._data.get_builder().get_contour_dict()[
self._data.get_canvas_contour().get_id()] = \
copy.deepcopy(self._data.get_canvas_contour())
self._data.set_scale_ratio(current_scale_ratio)
self._data.set_canvas_contour(canvas_contour_copy)
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(
has_changed)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# CloseCanvasContourCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class CloseCanvasContourCommandData(object):
'''
The CloseCanvasContourCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, builder, canvas_contour, scale_ratio):
self.__sample_id = sample_id
self.__builder = builder
self.__canvas_contour = canvas_contour
self.__scale_ratio = scale_ratio
self.__comet_being_edited_has_changed = None
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_builder(self):
return self.__builder
def set_builder(self, builder):
self.__builder = builder
def get_canvas_contour(self):
return self.__canvas_contour
def set_canvas_contour(self, canvas_contour):
self.__canvas_contour = canvas_contour
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
def get_comet_being_edited_has_changed(self):
return self.__comet_being_edited_has_changed
def set_comet_being_edited_has_changed(self, comet_being_edited_has_changed):
self.__comet_being_edited_has_changed = comet_being_edited_has_changed
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# MoveDelimiterPointsCommand #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class MoveDelimiterPointsCommand(Command):
'''
The MoveDelimiterPointsCommand class. Extends Command.
'''
''' Initialization method. '''
def __init__(self, controller):
super().__init__(controller)
''' Command.execute() behaviour. '''
def execute(self):
self.__move_delimiter_points(True)
''' Command.undo() behaviour. '''
def undo(self):
self.__move_delimiter_points(
self._data.get_comet_being_edited_has_changed())
    ''' Moves the DelimiterPoints back to their origin coordinates. '''
def __move_delimiter_points(self, has_changed):
# Activate Sample if needed
if self._controller.get_active_sample_id() != self._data.get_sample_id():
self._controller.activate_sample(self._data.get_sample_id())
# Transition to CanvasEditingState
self._controller.canvas_transition_to_editing_state()
# Move points to origin coordinates
self._controller.move_delimiter_points_to_origin(
self._data.get_delimiter_point_selection(),
self._data.get_scale_ratio()
)
# Set new scale ratio
        self._data.set_scale_ratio(
            self._controller.get_sample_zoom_value(self._data.get_sample_id()))
if self._data.get_comet_being_edited_has_changed() is not None:
self._controller.set_comet_being_edited_has_changed(
has_changed)
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# #
# MoveDelimiterPointsCommandData #
# #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
class MoveDelimiterPointsCommandData(object):
'''
    The MoveDelimiterPointsCommandData class.
'''
''' Initialization method. '''
def __init__(self, sample_id, delimiter_point_selection, scale_ratio):
self.__sample_id = sample_id
self.__delimiter_point_selection = delimiter_point_selection
self.__scale_ratio = scale_ratio
self.__comet_being_edited_has_changed = None
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
# Getters & Setters #
# ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ #
def get_sample_id(self):
return self.__sample_id
def set_sample_id(self, sample_id):
self.__sample_id = sample_id
def get_delimiter_point_selection(self):
return self.__delimiter_point_selection
def set_delimiter_point_selection(self, delimiter_point_selection):
self.__delimiter_point_selection = delimiter_point_selection
def get_scale_ratio(self):
return self.__scale_ratio
def set_scale_ratio(self, scale_ratio):
self.__scale_ratio = scale_ratio
def get_comet_being_edited_has_changed(self):
return self.__comet_being_edited_has_changed
def set_comet_being_edited_has_changed(self, comet_being_edited_has_changed):
self.__comet_being_edited_has_changed = comet_being_edited_has_changed
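The Command classes above each pair an execute()/undo() implementation with a plain CommandData holder. As a hedged, self-contained sketch (the class and method names below are illustrative, not from this codebase), this is the kind of undo/redo stack that typically drives such Command/CommandData pairs:

```python
# Minimal sketch of an undo/redo stack for Command objects (hypothetical
# names; the real controller wiring in this codebase may differ).
class Command:
    def execute(self): ...
    def undo(self): ...

class CommandStack:
    def __init__(self):
        self._undo, self._redo = [], []

    def run(self, command):
        command.execute()
        self._undo.append(command)
        self._redo.clear()  # a new action invalidates the redo history

    def undo(self):
        if self._undo:
            command = self._undo.pop()
            command.undo()
            self._redo.append(command)

    def redo(self):
        if self._redo:
            command = self._redo.pop()
            command.execute()
            self._undo.append(command)
```

Each concrete command, like MoveDelimiterPointsCommand above, re-applies or reverts its stored CommandData when the stack calls it.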
| 37.029286 | 139 | 0.516573 | 6,034 | 69,541 | 5.416639 | 0.036957 | 0.050177 | 0.055868 | 0.032554 | 0.829244 | 0.78745 | 0.748837 | 0.710745 | 0.67969 | 0.634561 | 0 | 0.000046 | 0.375304 | 69,541 | 1,878 | 140 | 37.029286 | 0.752313 | 0.263744 | 0 | 0.67664 | 0 | 0 | 0.00134 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.227848 | false | 0 | 0.003452 | 0.073648 | 0.337169 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
05862edd87b9eebed296000298c34c77c58674df | 513 | py | Python | src/command/help.py | Heisenberk/Calendar | 79db33ac5a1dac5dcdab44b0032352440b0cdb6b | [
"MIT"
] | null | null | null | src/command/help.py | Heisenberk/Calendar | 79db33ac5a1dac5dcdab44b0032352440b0cdb6b | [
"MIT"
] | null | null | null | src/command/help.py | Heisenberk/Calendar | 79db33ac5a1dac5dcdab44b0032352440b0cdb6b | [
"MIT"
] | null | null | null |
# Function to print help
def printHelp():
print("\033[1m- \033[32mcalendar -h/--help \033[0m\033[39m : for help")
print("\033[1m- \033[32mcalendar -a/--add \"Title\" \"Content\" YYYY/MM/DD \033[0m\033[39m : to add a new event")
print("\033[1m- \033[32mcalendar -v/--view \033[0m\033[39m : to see all events")
print("\033[1m- \033[32mcalendar -d/--delete \"Title\" YYYY/MM/DD\033[0m\033[39m : to delete an event")
print("\033[1m- \033[32mcalendar -c/--clean \033[0m\033[39m : to clean events before today")
| 51.3 | 114 | 0.662768 | 90 | 513 | 3.777778 | 0.377778 | 0.117647 | 0.147059 | 0.191176 | 0.582353 | 0.294118 | 0.123529 | 0.123529 | 0 | 0 | 0 | 0.2 | 0.122807 | 513 | 9 | 115 | 57 | 0.555556 | 0.042885 | 0 | 0 | 0 | 0.5 | 0.796715 | 0.051335 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0 | 0 | 0.166667 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
e972ade3e5948c8e8a82028f5bb59aeeeadf8675 | 46 | py | Python | traclinks/__init__.py | kzhamaji/TracLinksPlugin | 8a7e7765859c3f192d5feb2b6602c6a5150ffc7c | [
"BSD-3-Clause"
] | null | null | null | traclinks/__init__.py | kzhamaji/TracLinksPlugin | 8a7e7765859c3f192d5feb2b6602c6a5150ffc7c | [
"BSD-3-Clause"
] | null | null | null | traclinks/__init__.py | kzhamaji/TracLinksPlugin | 8a7e7765859c3f192d5feb2b6602c6a5150ffc7c | [
"BSD-3-Clause"
] | null | null | null | from traclinks import * # @UnresolvedImport
| 23 | 45 | 0.76087 | 4 | 46 | 8.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 46 | 1 | 46 | 46 | 0.921053 | 0.369565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e98d4dc9736b02f08e5857f99b17b4a6f67f216c | 29 | py | Python | hello_world.py | codesworth/profiles-rest-api | a9d35c6bdd747d5bb4781ff8914db3f569101bbc | [
"MIT"
] | null | null | null | hello_world.py | codesworth/profiles-rest-api | a9d35c6bdd747d5bb4781ff8914db3f569101bbc | [
"MIT"
] | null | null | null | hello_world.py | codesworth/profiles-rest-api | a9d35c6bdd747d5bb4781ff8914db3f569101bbc | [
"MIT"
] | null | null | null | print("Hello Backedn Wordl")
| 14.5 | 28 | 0.758621 | 4 | 29 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 29 | 1 | 29 | 29 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0.655172 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
e9c13755ad8403e8d0786111f855b6b3f7010cfd | 73 | py | Python | experiments/thompson_abstraction_optimisation_experiments.py | act65/mdps | 59f35467baa83b953ccdac5290acfcc31f33fd28 | [
"MIT"
] | null | null | null | experiments/thompson_abstraction_optimisation_experiments.py | act65/mdps | 59f35467baa83b953ccdac5290acfcc31f33fd28 | [
"MIT"
] | null | null | null | experiments/thompson_abstraction_optimisation_experiments.py | act65/mdps | 59f35467baa83b953ccdac5290acfcc31f33fd28 | [
"MIT"
] | null | null | null | import numpy as np
import numpy.random as rnd
import mdp.utils as utils
| 14.6 | 26 | 0.794521 | 14 | 73 | 4.142857 | 0.571429 | 0.37931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178082 | 73 | 4 | 27 | 18.25 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
2f0206896eb2154b3270ecdb9ddbdc2a3737f8f3 | 166 | py | Python | admin_views/templatetags/admin_views_extras.py | massover/django-admin-views | cf8097f88dab72a4b8664addacdf9fa4c1649565 | [
"MIT"
] | null | null | null | admin_views/templatetags/admin_views_extras.py | massover/django-admin-views | cf8097f88dab72a4b8664addacdf9fa4c1649565 | [
"MIT"
] | null | null | null | admin_views/templatetags/admin_views_extras.py | massover/django-admin-views | cf8097f88dab72a4b8664addacdf9fa4c1649565 | [
"MIT"
] | null | null | null | from django import template
from admin_views import router
register = template.Library()
@register.simple_tag
def get_admin_views():
return router.admin_views
| 16.6 | 30 | 0.801205 | 23 | 166 | 5.565217 | 0.608696 | 0.234375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138554 | 166 | 9 | 31 | 18.444444 | 0.895105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0.166667 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
f99a58ff1006d53cf4b91f2209efdfca6800fcc2 | 2,674 | py | Python | epytope/Data/pssms/smm/mat/B_07_02_11.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 7 | 2021-02-01T18:11:28.000Z | 2022-01-31T19:14:07.000Z | epytope/Data/pssms/smm/mat/B_07_02_11.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 22 | 2021-01-02T15:25:23.000Z | 2022-03-14T11:32:53.000Z | epytope/Data/pssms/smm/mat/B_07_02_11.py | christopher-mohr/epytope | 8ac9fe52c0b263bdb03235a5a6dffcb72012a4fd | [
"BSD-3-Clause"
] | 4 | 2021-05-28T08:50:38.000Z | 2022-03-14T11:45:32.000Z | B_07_02_11 = {0: {'A': 0.004, 'C': 0.0, 'E': -0.172, 'D': 0.0, 'G': -0.596, 'F': 0.067, 'I': 0.089, 'H': 0.335, 'K': -0.039, 'M': 0.232, 'L': -0.046, 'N': 0.294, 'Q': 0.429, 'P': 0.09, 'S': -0.407, 'R': -0.168, 'T': 0.009, 'W': 0.0, 'V': -0.001, 'Y': -0.12}, 1: {'A': -0.244, 'C': 0.054, 'E': 0.037, 'D': 0.379, 'G': 0.0, 'F': 0.018, 'I': 0.16, 'H': 0.0, 'K': 0.0, 'M': 0.035, 'L': 0.2, 'N': 0.0, 'Q': 0.011, 'P': -1.106, 'S': 0.204, 'R': -0.025, 'T': 0.07, 'W': 0.0, 'V': -0.044, 'Y': 0.251}, 2: {'A': -0.0, 'C': 0.0, 'E': 0.0, 'D': 0.0, 'G': -0.0, 'F': -0.0, 'I': 0.0, 'H': 0.0, 'K': -0.0, 'M': -0.0, 'L': 0.0, 'N': 0.0, 'Q': 0.0, 'P': -0.0, 'S': 0.0, 'R': -0.0, 'T': -0.0, 'W': 0.0, 'V': -0.0, 'Y': -0.0}, 3: {'A': -0.143, 'C': 0.028, 'E': 0.022, 'D': 0.068, 'G': 0.03, 'F': 0.038, 'I': -0.101, 'H': -0.016, 'K': -0.003, 'M': -0.015, 'L': -0.022, 'N': 0.002, 'Q': -0.003, 'P': 0.051, 'S': 0.092, 'R': -0.149, 'T': 0.03, 'W': -0.003, 'V': 0.011, 'Y': 0.081}, 4: {'A': -0.0, 'C': 0.0, 'E': 0.001, 'D': 0.0, 'G': -0.002, 'F': 0.002, 'I': 0.0, 'H': 0.0, 'K': -0.002, 'M': -0.0, 'L': 0.001, 'N': -0.0, 'Q': 0.001, 'P': -0.001, 'S': 0.003, 'R': 0.0, 'T': -0.002, 'W': 0.0, 'V': 0.0, 'Y': -0.0}, 5: {'A': 0.0, 'C': 0.001, 'E': 0.0, 'D': -0.0, 'G': 0.0, 'F': 0.0, 'I': -0.001, 'H': 0.0, 'K': 0.001, 'M': 0.0, 'L': -0.002, 'N': 0.0, 'Q': 0.001, 'P': 0.001, 'S': -0.001, 'R': 0.001, 'T': -0.001, 'W': 0.0, 'V': 0.0, 'Y': -0.0}, 6: {'A': 0.074, 'C': 0.046, 'E': 0.167, 'D': -0.091, 'G': 0.018, 'F': 0.094, 'I': 0.037, 'H': -0.091, 'K': 0.0, 'M': -0.264, 'L': -0.057, 'N': -0.008, 'Q': 0.018, 'P': 0.09, 'S': 0.006, 'R': -0.266, 'T': 0.114, 'W': 0.001, 'V': 0.124, 'Y': -0.011}, 7: {'A': 0.022, 'C': 0.0, 'E': 0.006, 'D': 0.0, 'G': 0.052, 'F': 0.18, 'I': 0.171, 'H': 0.157, 'K': 0.03, 'M': 0.045, 'L': -0.033, 'N': 0.035, 'Q': 0.152, 'P': -0.336, 'S': -0.331, 'R': 0.013, 'T': -0.233, 'W': 0.001, 'V': -0.098, 'Y': 0.166}, 8: {'A': -0.159, 'C': 
-0.98, 'E': 0.327, 'D': 0.057, 'G': 0.125, 'F': -0.218, 'I': -0.654, 'H': 0.0, 'K': 0.086, 'M': -0.117, 'L': -0.112, 'N': 0.171, 'Q': 0.871, 'P': -0.44, 'S': 0.301, 'R': 0.091, 'T': 0.185, 'W': 0.069, 'V': 0.247, 'Y': 0.151}, 9: {'A': -0.117, 'C': -0.011, 'E': 0.405, 'D': 0.005, 'G': 0.084, 'F': 0.012, 'I': 0.041, 'H': 0.028, 'K': -0.008, 'M': 0.123, 'L': -0.169, 'N': -0.085, 'Q': 0.309, 'P': -0.305, 'S': 0.004, 'R': -0.082, 'T': -0.38, 'W': 0.0, 'V': 0.168, 'Y': -0.023}, 10: {'A': -0.038, 'C': 0.0, 'E': 0.0, 'D': 0.0, 'G': 0.0, 'F': 0.022, 'I': -0.086, 'H': 0.0, 'K': 0.027, 'M': 0.006, 'L': -0.022, 'N': 0.0, 'Q': -0.007, 'P': 0.001, 'S': -0.01, 'R': 0.046, 'T': -0.028, 'W': -0.008, 'V': 0.044, 'Y': 0.053}, -1: {'con': 4.59281}} | 2,674 | 2,674 | 0.372102 | 679 | 2,674 | 1.460972 | 0.204713 | 0.122984 | 0.018145 | 0.024194 | 0.305444 | 0.139113 | 0.139113 | 0.139113 | 0.102823 | 0.075605 | 0 | 0.346102 | 0.170157 | 2,674 | 1 | 2,674 | 2,674 | 0.100946 | 0 | 0 | 0 | 0 | 0 | 0.083364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f9dd90cb923c1317d55870c0d443e5ddc0a7e5ed | 2,761 | py | Python | libsaas/services/twilio/queues.py | MidtownFellowship/libsaas | 541bb731b996b08ede1d91a235cb82895765c38a | [
"MIT"
] | 155 | 2015-01-27T15:17:59.000Z | 2022-02-20T00:14:08.000Z | libsaas/services/twilio/queues.py | MidtownFellowship/libsaas | 541bb731b996b08ede1d91a235cb82895765c38a | [
"MIT"
] | 14 | 2015-01-12T08:22:37.000Z | 2021-06-16T19:49:31.000Z | libsaas/services/twilio/queues.py | MidtownFellowship/libsaas | 541bb731b996b08ede1d91a235cb82895765c38a | [
"MIT"
] | 43 | 2015-01-28T22:41:45.000Z | 2021-09-21T04:44:26.000Z | from libsaas import http, parsers
from libsaas.services import base
from libsaas.services.twilio import resource
class MembersBase(resource.TwilioResource):
path = 'Members'
def create(self, *args, **kwargs):
raise base.MethodNotSupported()
def delete(self, *args, **kwargs):
raise base.MethodNotSupported()
class Member(MembersBase):
pass
class Members(MembersBase):
@base.apimethod
def get(self, Page=None, PageSize=None, AfterSid=None):
"""
Fetch the list of members for a conference.
:var Page: The current page number. Zero-indexed, so the first page
is 0.
:vartype Page: int
:var PageSize: How many resources to return in each list page.
The default is 50, and the maximum is 1000.
:vartype PageSize: int
:var AfterSid: The last Sid returned in the previous page, used to
avoid listing duplicated resources if new ones are created while
paging.
:vartype AfterSid: str
"""
params = resource.get_params(None, locals())
request = http.Request('GET', self.get_url(), params)
return request, parsers.parse_json
def update(self, *args, **kwargs):
raise base.MethodNotSupported()
class QueuesBase(resource.TwilioResource):
path = 'Queues'
class Queue(QueuesBase):
def create(self, *args, **kwargs):
raise base.MethodNotSupported()
@base.resource(Members)
def members(self):
"""
Return the list of members in this queue.
"""
return Members(self)
@base.resource(Member)
def member(self, sid):
"""
Return a member in this queue.
"""
return Member(self, sid)
class Queues(QueuesBase):
@base.apimethod
def get(self, Page=None, PageSize=None, AfterSid=None):
"""
Fetch the list of conferences of an account.
:var Page: The current page number. Zero-indexed, so the first page
is 0.
:vartype Page: int
:var PageSize: How many resources to return in each list page.
The default is 50, and the maximum is 1000.
:vartype PageSize: int
:var AfterSid: The last Sid returned in the previous page, used to
avoid listing duplicated resources if new ones are created while
paging.
:vartype AfterSid: str
"""
params = resource.get_params(None, locals())
request = http.Request('GET', self.get_url(), params)
return request, parsers.parse_json
def update(self, *args, **kwargs):
raise base.MethodNotSupported()
def delete(self, *args, **kwargs):
raise base.MethodNotSupported()
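The `resource.get_params(None, locals())` idiom in the get() methods above presumably gathers the caller's non-None keyword arguments into the query-parameter dict. A hypothetical re-implementation (not the real libsaas helper, which may differ) that illustrates the idea:

```python
# Hypothetical re-implementation of the resource.get_params(None, locals())
# idiom used in get() above; the real libsaas helper may behave differently.
def get_params(param_names, local_vars):
    # With no explicit name list, take every local except `self`.
    if param_names is None:
        param_names = [n for n in local_vars if n != 'self']
    # Keep only the arguments the caller actually supplied.
    return {n: local_vars[n] for n in param_names if local_vars[n] is not None}

def get(Page=None, PageSize=None, AfterSid=None):
    return get_params(None, locals())

print(get(Page=0, PageSize=50))  # {'Page': 0, 'PageSize': 50}
```

Note that a falsy-but-set value such as `Page=0` survives the filter, because the check is `is not None` rather than truthiness.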
| 25.803738 | 76 | 0.625498 | 331 | 2,761 | 5.199396 | 0.271903 | 0.027891 | 0.048809 | 0.066241 | 0.723998 | 0.723998 | 0.723998 | 0.718187 | 0.683905 | 0.683905 | 0 | 0.007089 | 0.284679 | 2,761 | 106 | 77 | 26.04717 | 0.864304 | 0.351684 | 0 | 0.55 | 0 | 0 | 0.012362 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.025 | 0.075 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
fb00b5ac09f3915f2cf2549396c3e16cf41eb938 | 66 | py | Python | malaya/train/model/__init__.py | ahmed3991/malaya | d90be6d5b2a1393a3f8b8b1ffa8ae676cdaa083c | [
"MIT"
] | 1 | 2021-03-19T22:42:34.000Z | 2021-03-19T22:42:34.000Z | malaya/train/model/__init__.py | ahmed3991/malaya | d90be6d5b2a1393a3f8b8b1ffa8ae676cdaa083c | [
"MIT"
] | null | null | null | malaya/train/model/__init__.py | ahmed3991/malaya | d90be6d5b2a1393a3f8b8b1ffa8ae676cdaa083c | [
"MIT"
] | null | null | null | from . import alxlnet
from . import bigbird
from . import pegasus
| 16.5 | 21 | 0.772727 | 9 | 66 | 5.666667 | 0.555556 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 66 | 3 | 22 | 22 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fb291e564221004ba101cd499fb05a03f358e1df | 126 | py | Python | docker_charon/__init__.py | gabrieldemarmiesse/charon | 8ea4680a653ef334b937dca92c0c8ca91e9ef686 | [
"MIT"
] | 12 | 2021-11-17T19:38:52.000Z | 2022-01-30T14:06:00.000Z | docker_charon/__init__.py | gabrieldemarmiesse/charon | 8ea4680a653ef334b937dca92c0c8ca91e9ef686 | [
"MIT"
] | null | null | null | docker_charon/__init__.py | gabrieldemarmiesse/charon | 8ea4680a653ef334b937dca92c0c8ca91e9ef686 | [
"MIT"
] | null | null | null | from docker_charon.decoder import BlobNotFound, ManifestNotFound, push_payload
from docker_charon.encoder import make_payload
| 42 | 78 | 0.888889 | 16 | 126 | 6.75 | 0.6875 | 0.185185 | 0.296296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079365 | 126 | 2 | 79 | 63 | 0.931034 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
34a0e9d1827a2bea8178a75eb6bd90425b4e0ec5 | 96 | py | Python | 0x0B-python-input_output/0-main.py | gogomillan/holbertonschool-higher_level_programming | 1549ffc4fdc284271684321ff6edd882a314193a | [
"MIT"
] | null | null | null | 0x0B-python-input_output/0-main.py | gogomillan/holbertonschool-higher_level_programming | 1549ffc4fdc284271684321ff6edd882a314193a | [
"MIT"
] | null | null | null | 0x0B-python-input_output/0-main.py | gogomillan/holbertonschool-higher_level_programming | 1549ffc4fdc284271684321ff6edd882a314193a | [
"MIT"
] | 1 | 2020-09-25T17:54:36.000Z | 2020-09-25T17:54:36.000Z | #!/usr/bin/python3
read_file = __import__('0-read_file').read_file
read_file("my_file_0.txt")
| 16 | 47 | 0.75 | 17 | 96 | 3.647059 | 0.529412 | 0.516129 | 0.387097 | 0.516129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033708 | 0.072917 | 96 | 5 | 48 | 19.2 | 0.662921 | 0.177083 | 0 | 0 | 0 | 0 | 0.311688 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
34ee0673655f0ba1f48e8b26bc029d4c0361777d | 37 | py | Python | HCmd/waypoints/__init__.py | sarahswinton/asdp4_hornet | a509168d7bca9070d96baaac3b60d623755aa692 | [
"Unlicense"
] | null | null | null | HCmd/waypoints/__init__.py | sarahswinton/asdp4_hornet | a509168d7bca9070d96baaac3b60d623755aa692 | [
"Unlicense"
] | null | null | null | HCmd/waypoints/__init__.py | sarahswinton/asdp4_hornet | a509168d7bca9070d96baaac3b60d623755aa692 | [
"Unlicense"
] | null | null | null | from .waypoints import WaypointsCmd
| 18.5 | 36 | 0.837838 | 4 | 37 | 7.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 37 | 1 | 37 | 37 | 0.96875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9b7a5eb1d23eb62b14e575c74761eda8a2d21eaf | 86 | py | Python | app/main/__init__.py | Burence1/times-bueno | a8bb92199d3b0a934835e076862dffcc6b6178af | [
"MIT"
] | null | null | null | app/main/__init__.py | Burence1/times-bueno | a8bb92199d3b0a934835e076862dffcc6b6178af | [
"MIT"
] | null | null | null | app/main/__init__.py | Burence1/times-bueno | a8bb92199d3b0a934835e076862dffcc6b6178af | [
"MIT"
] | null | null | null | from flask import Blueprint
main = Blueprint('main', __name__)

from . import views, errors
9b893c0cffc6c6c6235a1ea7a84a027dc4dab949 | 147 | py | Python | Codefights/arcade/python-arcade/level-7/47.Frequency-Analysis/Python/solution1.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | 7 | 2017-09-20T16:40:39.000Z | 2021-08-31T18:15:08.000Z | Codefights/arcade/python-arcade/level-7/47.Frequency-Analysis/Python/solution1.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | Codefights/arcade/python-arcade/level-7/47.Frequency-Analysis/Python/solution1.py | RevansChen/online-judge | ad1b07fee7bd3c49418becccda904e17505f3018 | [
"MIT"
] | null | null | null | # Python3
# Restricted modification area
from collections import Counter
def frequencyAnalysis(encryptedText):
return Counter(encryptedText).most_common(1)[0][0]
| 18.375 | 54 | 0.782313 | 17 | 147 | 6.705882 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031008 | 0.122449 | 147 | 7 | 55 | 21 | 0.852713 | 0.102041 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
9bb49453ae3d6c2d08d14fc350173597a677bc87 | 133 | py | Python | piws_api/__init__.py | rustprooflabs/piws | fa26464257a2da598878eb85af29439dc253abcb | [
"MIT"
] | 1 | 2018-05-03T01:59:55.000Z | 2018-05-03T01:59:55.000Z | piws_api/__init__.py | rustprooflabs/piws | fa26464257a2da598878eb85af29439dc253abcb | [
"MIT"
] | null | null | null | piws_api/__init__.py | rustprooflabs/piws | fa26464257a2da598878eb85af29439dc253abcb | [
"MIT"
] | null | null | null | """PiWS API sends data to TYG API service."""
from piws_api import config
from piws_api.send_data import run
LOGGER = config.LOGGER
| 22.166667 | 45 | 0.774436 | 23 | 133 | 4.347826 | 0.565217 | 0.21 | 0.22 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150376 | 133 | 5 | 46 | 26.6 | 0.884956 | 0.293233 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
32e4fcf7129014879b657bc2a4f68bb6450dad43 | 198 | py | Python | models/pawn.py | otaviohrg/chess-game | 3cbec7f18419c28eb13176b889950d04c06c3187 | [
"MIT"
] | 1 | 2021-11-27T19:13:51.000Z | 2021-11-27T19:13:51.000Z | models/pawn.py | otaviohrg/chess-game | 3cbec7f18419c28eb13176b889950d04c06c3187 | [
"MIT"
] | null | null | null | models/pawn.py | otaviohrg/chess-game | 3cbec7f18419c28eb13176b889950d04c06c3187 | [
"MIT"
] | null | null | null | from .piece import Piece
class Pawn(Piece):
def __init__(self, color, x, y):
super().__init__(color, x, y)
self.symbol = 'p'
def generate_moves(self, board):
        return []  # placeholder: pawn move generation not yet implemented
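A hedged sketch of what generate_moves might compute for a pawn. Every convention below is an assumption, not from the original: the board is an 8x8 list of lists with None for empty squares, white moves toward smaller row indices, and captures, en passant, and promotion are omitted.

```python
# Sketch of pawn move generation under assumed conventions (8x8
# list-of-lists board, None = empty square, white moves toward row 0).
def pawn_moves(color, x, y, board):
    direction = -1 if color == 'white' else 1
    start_row = 6 if color == 'white' else 1
    moves = []
    ny = y + direction
    if 0 <= ny < 8 and board[ny][x] is None:
        moves.append((x, ny))
        ny2 = y + 2 * direction
        # the double step is only available from the starting rank
        if y == start_row and board[ny2][x] is None:
            moves.append((x, ny2))
    return moves

empty = [[None] * 8 for _ in range(8)]
print(pawn_moves('white', 4, 6, empty))  # [(4, 5), (4, 4)]
```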
32e9066e134d7d541e3840870e9b577bf0d1af3b | 319 | py | Python | src/assignments/main_assignment2.py | acc-cosc-1336/cosc-1336-spring-2018-EricScotty | 80c0249a583dc178cfc7bb95b851d7f3240dc3e9 | [
"MIT"
] | null | null | null | src/assignments/main_assignment2.py | acc-cosc-1336/cosc-1336-spring-2018-EricScotty | 80c0249a583dc178cfc7bb95b851d7f3240dc3e9 | [
"MIT"
] | null | null | null | src/assignments/main_assignment2.py | acc-cosc-1336/cosc-1336-spring-2018-EricScotty | 80c0249a583dc178cfc7bb95b851d7f3240dc3e9 | [
"MIT"
] | null | null | null | from assignment2 import faculty_evaluation_result
'''Write code to call the faculty_evaluation_result function with data of your choice'''
print(faculty_evaluation_result(80, 30, 50, 70, 80, 90))
print(faculty_evaluation_result(80, 80, 50, 80, 80, 90))
print(faculty_evaluation_result(0, 30, 50, 70, 80, 90))
| 45.571429 | 89 | 0.758621 | 50 | 319 | 4.64 | 0.48 | 0.366379 | 0.49569 | 0.362069 | 0.482759 | 0.275862 | 0 | 0 | 0 | 0 | 0 | 0.130909 | 0.137931 | 319 | 6 | 90 | 53.166667 | 0.712727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0.75 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
32eb97a5befd33a6ee9488e286e8c65672f4b212 | 34 | py | Python | quantipy/core/tools/dp/__init__.py | avilaton/quantipy | 6ce4e5bfb22c6520164d8884fe6f83240e9baa21 | [
"MIT"
] | 67 | 2015-07-29T18:39:46.000Z | 2022-01-10T12:32:26.000Z | quantipy/core/tools/dp/__init__.py | avilaton/quantipy | 6ce4e5bfb22c6520164d8884fe6f83240e9baa21 | [
"MIT"
] | 1,052 | 2015-07-10T15:14:17.000Z | 2021-11-14T11:14:58.000Z | quantipy/core/tools/dp/__init__.py | avilaton/quantipy | 6ce4e5bfb22c6520164d8884fe6f83240e9baa21 | [
"MIT"
] | 15 | 2016-04-06T14:40:08.000Z | 2020-08-12T18:36:30.000Z | import io
import prep
import query | 11.333333 | 12 | 0.852941 | 6 | 34 | 4.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147059 | 34 | 3 | 12 | 11.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
32ef0453a2ab437cb859c2c31c6716b009916947 | 13,502 | py | Python | tests/backends/test_dbmodel.py | MattToast/SmartSim | 4bd5e231445abd9b888561930db859062708678a | [
"BSD-2-Clause"
] | null | null | null | tests/backends/test_dbmodel.py | MattToast/SmartSim | 4bd5e231445abd9b888561930db859062708678a | [
"BSD-2-Clause"
] | null | null | null | tests/backends/test_dbmodel.py | MattToast/SmartSim | 4bd5e231445abd9b888561930db859062708678a | [
"BSD-2-Clause"
] | null | null | null | import sys
import pytest
import smartsim
from smartsim import Experiment, status
from smartsim._core.utils import installed_redisai_backends
from smartsim.error.errors import SSUnsupportedError
should_run = True
try:
import tensorflow.keras as keras
from tensorflow.keras.layers import Conv2D, Input
except ImportError:
should_run = False
should_run &= "tensorflow" in installed_redisai_backends()
class Net(keras.Model):
def __init__(self):
super(Net, self).__init__(name="cnn")
self.conv = Conv2D(1, 3, 1)
def call(self, x):
y = self.conv(x)
return y
def save_tf_cnn(path, file_name):
"""Create a Keras CNN for testing purposes"""
from smartsim.ml.tf import freeze_model
n = Net()
input_shape = (3, 3, 1)
n.build(input_shape=(None, *input_shape))
inputs = Input(input_shape)
outputs = n(inputs)
model = keras.Model(inputs=inputs, outputs=outputs, name=n.name)
return freeze_model(model, path, file_name)
def create_tf_cnn():
"""Create a Keras CNN for testing purposes"""
from smartsim.ml.tf import serialize_model
n = Net()
input_shape = (3, 3, 1)
inputs = Input(input_shape)
outputs = n(inputs)
model = keras.Model(inputs=inputs, outputs=outputs, name=n.name)
return serialize_model(model)
@pytest.mark.skipif(not should_run, reason="Test needs TF to run")
def test_db_model(fileutils, wlmutils):
"""Test DB Models on remote DB"""
exp_name = "test-db-model"
# get test setup
test_dir = fileutils.make_test_dir()
sr_test_script = fileutils.get_test_conf_path("run_dbmodel_smartredis.py")
exp = Experiment(exp_name, exp_path=test_dir, launcher="local")
# create colocated model
run_settings = exp.create_run_settings(exe=sys.executable, exe_args=sr_test_script)
smartsim_model = exp.create_model("smartsim_model", run_settings)
smartsim_model.set_path(test_dir)
db = exp.create_database(port=wlmutils.get_test_port(), interface="lo")
exp.generate(db)
model, inputs, outputs = create_tf_cnn()
model_file2, inputs2, outputs2 = save_tf_cnn(test_dir, "model2.pb")
smartsim_model.add_ml_model(
"cnn",
"TF",
model=model,
device="CPU",
inputs=inputs,
outputs=outputs,
tag="test",
)
smartsim_model.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
tag="test",
)
for db_model in smartsim_model._db_models:
print(db_model)
# Assert we have added both models
assert len(smartsim_model._db_models) == 2
exp.start(db, smartsim_model, block=True)
statuses = exp.get_status(smartsim_model)
exp.stop(db)
assert all([stat == status.STATUS_COMPLETED for stat in statuses])
@pytest.mark.skipif(not should_run, reason="Test needs TF to run")
def test_db_model_ensemble(fileutils, wlmutils):
"""Test DBModels on remote DB, with an ensemble"""
exp_name = "test-db-model-ensemble"
# get test setup
test_dir = fileutils.make_test_dir()
sr_test_script = fileutils.get_test_conf_path("run_dbmodel_smartredis.py")
exp = Experiment(exp_name, exp_path=test_dir, launcher="local")
# create colocated model
run_settings = exp.create_run_settings(exe=sys.executable, exe_args=sr_test_script)
smartsim_ensemble = exp.create_ensemble(
"smartsim_model", run_settings=run_settings, replicas=2
)
smartsim_ensemble.set_path(test_dir)
smartsim_model = exp.create_model("smartsim_model", run_settings)
smartsim_model.set_path(test_dir)
db = exp.create_database(port=wlmutils.get_test_port(), interface="lo")
exp.generate(db)
model, inputs, outputs = create_tf_cnn()
model_file2, inputs2, outputs2 = save_tf_cnn(test_dir, "model2.pb")
smartsim_ensemble.add_ml_model(
"cnn", "TF", model=model, device="CPU", inputs=inputs, outputs=outputs
)
for entity in smartsim_ensemble:
entity.disable_key_prefixing()
entity.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
)
# Ensemble must add all available DBModels to new entity
smartsim_ensemble.add_model(smartsim_model)
smartsim_model.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
)
# Assert we have added one model to the ensemble
assert len(smartsim_ensemble._db_models) == 1
# Assert we have added two models to each entity
assert all([len(entity._db_models) == 2 for entity in smartsim_ensemble])
exp.start(db, smartsim_ensemble, block=True)
statuses = exp.get_status(smartsim_ensemble)
exp.stop(db)
assert all([stat == status.STATUS_COMPLETED for stat in statuses])
@pytest.mark.skipif(not should_run, reason="Test needs TF to run")
def test_colocated_db_model(fileutils, wlmutils):
"""Test DB Models on colocated DB"""
exp_name = "test-colocated-db-model"
exp = Experiment(exp_name, launcher="local")
# get test setup
test_dir = fileutils.make_test_dir()
sr_test_script = fileutils.get_test_conf_path("run_dbmodel_smartredis.py")
# create colocated model
colo_settings = exp.create_run_settings(exe=sys.executable, exe_args=sr_test_script)
colo_model = exp.create_model("colocated_model", colo_settings)
colo_model.set_path(test_dir)
colo_model.colocate_db(
port=wlmutils.get_test_port(), db_cpus=1, limit_app_cpus=False, debug=True, ifname="lo"
)
model_file, inputs, outputs = save_tf_cnn(test_dir, "model1.pb")
model_file2, inputs2, outputs2 = save_tf_cnn(test_dir, "model2.pb")
colo_model.add_ml_model(
"cnn", "TF", model_path=model_file, device="CPU", inputs=inputs, outputs=outputs
)
colo_model.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
)
# Assert we have added both models
assert len(colo_model._db_models) == 2
exp.start(colo_model, block=True)
statuses = exp.get_status(colo_model)
assert all([stat == status.STATUS_COMPLETED for stat in statuses])
@pytest.mark.skipif(not should_run, reason="Test needs TF to run")
def test_colocated_db_model_ensemble(fileutils, wlmutils):
"""Test DBModel on colocated ensembles, first colocating DB,
then adding DBModel.
"""
exp_name = "test-colocated-db-model-ensemble"
# get test setup
test_dir = fileutils.make_test_dir()
exp = Experiment(exp_name, launcher="local", exp_path=test_dir)
sr_test_script = fileutils.get_test_conf_path("run_dbmodel_smartredis.py")
# create colocated model
colo_settings = exp.create_run_settings(exe=sys.executable, exe_args=sr_test_script)
colo_ensemble = exp.create_ensemble(
"colocated_ens", run_settings=colo_settings, replicas=2
)
colo_ensemble.set_path(test_dir)
colo_model = exp.create_model("colocated_model", colo_settings)
colo_model.set_path(test_dir)
colo_model.colocate_db(
port=wlmutils.get_test_port(), db_cpus=1, limit_app_cpus=False, debug=True, ifname="lo"
)
model_file, inputs, outputs = save_tf_cnn(test_dir, "model1.pb")
model_file2, inputs2, outputs2 = save_tf_cnn(test_dir, "model2.pb")
for i, entity in enumerate(colo_ensemble):
entity.colocate_db(
port=wlmutils.get_test_port() + i, db_cpus=1, limit_app_cpus=False, debug=True, ifname="lo"
)
# Test that models added individually do not conflict with ensemble ones
entity.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
)
# Test adding a model from ensemble
colo_ensemble.add_ml_model(
"cnn",
"TF",
model_path=model_file,
device="CPU",
inputs=inputs,
outputs=outputs,
tag="test",
)
# Ensemble should add all available DBModels to new model
colo_ensemble.add_model(colo_model)
colo_model.colocate_db(
port=wlmutils.get_test_port() + len(colo_ensemble),
db_cpus=1,
limit_app_cpus=False,
debug=True,
ifname="lo",
)
colo_model.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
)
exp.start(colo_ensemble, block=True)
statuses = exp.get_status(colo_ensemble)
assert all([stat == status.STATUS_COMPLETED for stat in statuses])
@pytest.mark.skipif(not should_run, reason="Test needs TF to run")
def test_colocated_db_model_ensemble_reordered(fileutils, wlmutils):
"""Test DBModel on colocated ensembles, first adding the DBModel to the
ensemble, then colocating DB.
"""
exp_name = "test-colocated-db-model-ensemble-reordered"
# get test setup
test_dir = fileutils.make_test_dir()
exp = Experiment(exp_name, launcher="local", exp_path=test_dir)
sr_test_script = fileutils.get_test_conf_path("run_dbmodel_smartredis.py")
# create colocated model
colo_settings = exp.create_run_settings(exe=sys.executable, exe_args=sr_test_script)
colo_ensemble = exp.create_ensemble(
"colocated_ens", run_settings=colo_settings, replicas=2
)
colo_ensemble.set_path(test_dir)
colo_model = exp.create_model("colocated_model", colo_settings)
colo_model.set_path(test_dir)
model_file, inputs, outputs = save_tf_cnn(test_dir, "model1.pb")
model_file2, inputs2, outputs2 = save_tf_cnn(test_dir, "model2.pb")
# Test adding a model from ensemble
colo_ensemble.add_ml_model(
"cnn", "TF", model_path=model_file, device="CPU", inputs=inputs, outputs=outputs
)
for i, entity in enumerate(colo_ensemble):
entity.colocate_db(
wlmutils.get_test_port() + i, db_cpus=1, limit_app_cpus=False, debug=True, ifname="lo"
)
# Test that models added individually do not conflict with ensemble ones
entity.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
)
# Ensemble should add all available DBModels to new model
colo_ensemble.add_model(colo_model)
colo_model.colocate_db(
port=wlmutils.get_test_port() + len(colo_ensemble),
db_cpus=1,
limit_app_cpus=False,
debug=True,
ifname="lo",
)
colo_model.add_ml_model(
"cnn2",
"TF",
model_path=model_file2,
device="CPU",
inputs=inputs2,
outputs=outputs2,
)
exp.start(colo_ensemble, block=True)
statuses = exp.get_status(colo_ensemble)
assert all([stat == status.STATUS_COMPLETED for stat in statuses])
@pytest.mark.skipif(not should_run, reason="Test needs TF to run")
def test_colocated_db_model_errors(fileutils, wlmutils):
"""Test error when colocated db model has no file."""
exp_name = "test-colocated-db-model-error"
exp = Experiment(exp_name, launcher="local")
# get test setup
test_dir = fileutils.make_test_dir()
sr_test_script = fileutils.get_test_conf_path("run_dbmodel_smartredis.py")
# create colocated model
colo_settings = exp.create_run_settings(exe=sys.executable, exe_args=sr_test_script)
colo_model = exp.create_model("colocated_model", colo_settings)
colo_model.set_path(test_dir)
colo_model.colocate_db(
port=wlmutils.get_test_port(), db_cpus=1, limit_app_cpus=False, debug=True, ifname="lo"
)
model, inputs, outputs = create_tf_cnn()
with pytest.raises(SSUnsupportedError):
colo_model.add_ml_model(
"cnn", "TF", model=model, device="CPU", inputs=inputs, outputs=outputs
)
colo_ensemble = exp.create_ensemble(
"colocated_ens", run_settings=colo_settings, replicas=2
)
colo_ensemble.set_path(test_dir)
for i, entity in enumerate(colo_ensemble):
entity.colocate_db(
port=wlmutils.get_test_port() + i, db_cpus=1, limit_app_cpus=False, debug=True, ifname="lo"
)
with pytest.raises(SSUnsupportedError):
colo_ensemble.add_ml_model(
"cnn", "TF", model=model, device="CPU", inputs=inputs, outputs=outputs
)
# Check errors for reverse order of DBModel addition and DB colocation
# create colocated model
colo_settings2 = exp.create_run_settings(
exe=sys.executable, exe_args=sr_test_script
)
# Reverse order of DBModel and model
colo_ensemble2 = exp.create_ensemble(
"colocated_ens", run_settings=colo_settings2, replicas=2
)
colo_ensemble2.set_path(test_dir)
colo_ensemble2.add_ml_model(
"cnn", "TF", model=model, device="CPU", inputs=inputs, outputs=outputs
)
for i, entity in enumerate(colo_ensemble2):
with pytest.raises(SSUnsupportedError):
entity.colocate_db(
port=wlmutils.get_test_port() + i, db_cpus=1, limit_app_cpus=False, debug=True, ifname="lo"
)
with pytest.raises(SSUnsupportedError):
colo_ensemble.add_model(colo_model)
| 31.4 | 107 | 0.679603 | 1,811 | 13,502 | 4.795693 | 0.097736 | 0.02821 | 0.018423 | 0.017732 | 0.845366 | 0.827634 | 0.806448 | 0.782153 | 0.743235 | 0.743235 | 0 | 0.008532 | 0.218708 | 13,502 | 429 | 108 | 31.473193 | 0.814769 | 0.096282 | 0 | 0.672185 | 0 | 0 | 0.071794 | 0.024592 | 0 | 0 | 0 | 0 | 0.029801 | 1 | 0.033113 | false | 0 | 0.036424 | 0 | 0.082781 | 0.003311 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fd2c71cee9027b82dae4b4e7dd0dfd4a22c5935e | 112 | py | Python | py/corrLSS/__init__.py | gdhungana/corrLSS | 0b8f09bad42542160082ee5c9445ade681e00af1 | [
"MIT"
] | null | null | null | py/corrLSS/__init__.py | gdhungana/corrLSS | 0b8f09bad42542160082ee5c9445ade681e00af1 | [
"MIT"
] | null | null | null | py/corrLSS/__init__.py | gdhungana/corrLSS | 0b8f09bad42542160082ee5c9445ade681e00af1 | [
"MIT"
] | null | null | null | """
corrLSS
=========
"""
from __future__ import absolute_import, division, print_function, unicode_literals
| 12.444444 | 82 | 0.714286 | 11 | 112 | 6.636364 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 112 | 8 | 83 | 14 | 0.744898 | 0.151786 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 6 |
fd35938bb6575fcc71a073f491f7f11e98e60204 | 36 | py | Python | phantasm/parse/__init__.py | devonnuri/phantasm | 3083f26c20cfb3bb511b5c5ee2231e1535b23e62 | [
"MIT"
] | 1 | 2020-01-16T09:19:53.000Z | 2020-01-16T09:19:53.000Z | phantasm/parse/__init__.py | devonnuri/phantasm | 3083f26c20cfb3bb511b5c5ee2231e1535b23e62 | [
"MIT"
] | null | null | null | phantasm/parse/__init__.py | devonnuri/phantasm | 3083f26c20cfb3bb511b5c5ee2231e1535b23e62 | [
"MIT"
] | 1 | 2021-12-03T14:10:12.000Z | 2021-12-03T14:10:12.000Z | from phantasm.parse.parser import *
| 18 | 35 | 0.805556 | 5 | 36 | 5.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 36 | 1 | 36 | 36 | 0.90625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fd4050993e15055bd19581087105110109da58d6 | 35,581 | py | Python | src/unit_test/scripts/nas_outgoing_svcs_ut.py | open-switch/opx-nas-linux | 073b287c7c998b0dc16bc732fa37bbdddfd69d66 | [
"CC-BY-4.0"
] | 1 | 2017-12-28T16:57:02.000Z | 2017-12-28T16:57:02.000Z | src/unit_test/scripts/nas_outgoing_svcs_ut.py | open-switch/opx-nas-linux | 073b287c7c998b0dc16bc732fa37bbdddfd69d66 | [
"CC-BY-4.0"
] | 10 | 2017-08-07T22:43:34.000Z | 2021-06-09T13:34:01.000Z | src/unit_test/scripts/nas_outgoing_svcs_ut.py | open-switch/opx-nas-linux | 073b287c7c998b0dc16bc732fa37bbdddfd69d66 | [
"CC-BY-4.0"
] | 14 | 2017-01-05T19:18:42.000Z | 2020-03-06T10:01:04.000Z | #!/usr/bin/python
#
# Copyright (c) 2019 Dell Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
#
# THIS CODE IS PROVIDED ON AN *AS IS* BASIS, WITHOUT WARRANTIES OR
# CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING WITHOUT
# LIMITATION ANY IMPLIED WARRANTIES OR CONDITIONS OF TITLE, FITNESS
# FOR A PARTICULAR PURPOSE, MERCHANTABLITY OR NON-INFRINGEMENT.
#
# See the Apache Version 2.0 License for specific language governing
# permissions and limitations under the License.
#
import cps
import cps_object
import cps_utils
import sys
import subprocess
import argparse
import socket
import time
import binascii
"""
Example:
nas_outgoing_svcs_ut.py create -f ipv4 --dest-ip 1.1.1.1 -p tcp -d 41 --out-src-ip 8.8.8.8
nas_outgoing_svcs_ut.py delete -f ipv4 --dest-ip 1.1.1.1 -p tcp -d 41 --out-src-ip 8.8.8.8
nas_outgoing_svcs_ut.py info
nas_outgoing_svcs_ut.py create -f ipv4 --dest-ip 2.2.2.2 -p tcp -d 51 --out-src-ip 9.9.9.9
nas_outgoing_svcs_ut.py delete 1
nas_outgoing_svcs_ut.py create -n default -f ipv4 --dest-ip 3.3.3.3 -p tcp -d 61 --out-src-ip 8.1.1.1
nas_outgoing_svcs_ut.py delete -n default -f ipv4 --dest-ip 3.3.3.3 -p tcp -d 61 --out-src-ip 8.1.1.1
nas_outgoing_svcs_ut.py create -n management -f ipv4 --dest-ip 3.3.3.3 -p tcp -d 61 --out-src-ip 8.1.1.1
nas_outgoing_svcs_ut.py delete -n management -f ipv4 --dest-ip 3.3.3.3 -p tcp -d 61 --out-src-ip 8.1.1.1
"""
def parse_ip_mask(key, val):
ip_mask = val.split('/')
if len(ip_mask) < 2:
ip_addr = ip_mask[0]
else:
ip_addr, mask = ip_mask[:2]
for af in [socket.AF_INET, socket.AF_INET6]:
try:
ip_bin = socket.inet_pton(af, ip_addr)
return (af, binascii.hexlify(ip_bin))
except socket.error:
continue
return None
def parse_af(key, val):
if val.lower() == 'ipv4':
return socket.AF_INET
else:
return socket.AF_INET6
def parse_protocol(key, val):
if val.lower() == 'tcp':
return 1
elif val.lower() == 'udp':
return 2
elif val.lower() == 'icmp':
return 3
else:
return 4
arg_cps_attr_map = {
'rule_id': ('id', None),
'vrf_name': ('ni-name', None),
'addr_family': ('af', parse_af),
'protocol': ('protocol', parse_protocol),
'dst_port': ('public-port', None),
'dest_ip': (['af', 'public-ip'], parse_ip_mask),
'out_src_ip': (['af', 'outgoing-source-ip'], parse_ip_mask),
'private_port': ('private-port', None),
'private_ip': (['af', 'private-ip'], parse_ip_mask)
}
def exec_shell(cmd):
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
(out, err) = proc.communicate()
return out
def test_pre_req_cfg(clear = False, mgmt_ip = '10.11.70.22/8'):
#config test prerequisite - management vrf
mode = 'OPX'
ret = exec_shell('opx-show-version | grep \"OS_NAME.*Enterprise\"')
if ret:
mode = 'DoD'
if mode == 'DoD':
#configure the test prerequisites via CLI
if clear:
cmd_list = ['configure terminal',
'interface mgmt1/1/1',
'no ip address',
'exit',
'ip vrf management',
'no interface management',
'exit',
'no ip vrf management',
'interface mgmt1/1/1',
'ip address ' + mgmt_ip,
'end']
#configure data VRF test prerequisites via CLI
data_vrf_cmd_list = ['configure terminal',
'interface vlan 1201',
'no ip address',
'no ip vrf forwarding ',
'exit',
'no interface vlan 1201',
'no ip vrf test-vrf',
'exit',
'end']
else:
cmd_list = ['configure terminal',
'interface mgmt1/1/1',
'no ip address',
'no ipv6 address',
'exit',
'ip vrf management',
'interface management',
'exit',
'interface mgmt1/1/1',
'ip address ' + mgmt_ip,
'end']
#configure data VRF test prerequisites via CLI
data_vrf_cmd_list = ['configure terminal',
'ip vrf test-vrf',
'exit',
'interface vlan 1201',
'ip vrf forwarding test-vrf',
'ip address 121.121.121.1/24',
'end']
cfg_file = open('/tmp/test_pre_req', 'w')
for item in cmd_list:
print>>cfg_file, item
for item in data_vrf_cmd_list:
print>>cfg_file, item
cfg_file.close()
exec_shell('sudo -u admin clish --b /tmp/test_pre_req')
else:
print 'UT for BASE is not supported yet.'
parser = argparse.ArgumentParser(description = 'Tool for Outgoing IP service configuration')
parser.add_argument('operation', choices = ['create', 'delete', 'set', 'info', 'pre-cfg', 'run-test'])
parser.add_argument('rule_id', type = int, nargs = '?', help = 'Rule ID')
parser.add_argument('--clear', action = 'store_true', help = 'Cleanup pre-configuration for testing')
parser.add_argument('--mgmt-ip', help = 'Management IP address and mask for testing')
parser.add_argument('-n', '--vrf-name', default = 'default', help = 'VRF name')
parser.add_argument('-f', '--addr-family', choices = ['ipv4', 'ipv6'], help = 'Address family')
parser.add_argument('-p', '--protocol', choices = ['tcp', 'udp', 'icmp', 'all'], help = 'Protocol')
parser.add_argument('-d', '--dst-port', type = int, help = 'L4 destination port')
parser.add_argument('-i', '--seq-num', type = int, help = 'Sequence number')
parser.add_argument('-dip', '--dest-ip', help = 'Destination IP address')
parser.add_argument('-sip', '--out-src-ip', help = 'Outgoing Source IP address')
parser.add_argument('--private-ip', help = 'Private IP address')
parser.add_argument('--private-port', type = int, help = 'Private L4 destination port')
test_count = 0
def outgoing_svcs_test(is_negative_test = False, *test_args):
global parser
global test_count
if len(test_args) == 0:
args = vars(parser.parse_args())
else:
args = vars(parser.parse_args(test_args))
op = args['operation']
if op == 'pre-cfg':
#config test pre-req
clear = args['clear']
print 'Running test pre-configuration %s...' % ('cleanup ' if clear else '')
if 'mgmt_ip' in args and args['mgmt_ip'] is not None:
print 'Management IP: %s' % args['mgmt_ip']
test_pre_req_cfg(clear, args['mgmt_ip'])
else:
test_pre_req_cfg(clear)
time.sleep(20)
print 'Done with pre-configuration'
return True
if op != 'run-test':
print '*** Running %stest: %s ***' % ('negative ' if is_negative_test else '', ' '.join(test_args))
obj = cps_object.CPSObject('vrf-firewall/ns-outgoing-service')
for arg_name, arg_val in args.items():
if arg_name in arg_cps_attr_map and arg_val is not None:
attr_name, func = arg_cps_attr_map[arg_name]
if func is not None:
attr_val = func(arg_name, arg_val)
if type(attr_name) is list:
if attr_val is None:
raise RuntimeError('Failed to convert input %s' % arg_name)
if len(attr_name) != len(attr_val):
raise RuntimeError('Invalid argument %s' % arg_name)
for idx in range(len(attr_name)):
if attr_val[idx] is not None:
obj.add_attr(attr_name[idx], attr_val[idx])
else:
if attr_val is not None:
obj.add_attr(attr_name, attr_val)
else:
obj.add_attr(attr_name, arg_val)
if op == 'info':
ret_list = []
if cps.get([obj.get()], ret_list) == False:
raise RuntimeError('Failed to get object')
# exit on success for negative test
# when obj get returned empty list, return false
if not ret_list and is_negative_test is False:
print 'Info get failed'
raise RuntimeError('Failed to get object')
if ret_list and is_negative_test is True:
print 'Info get returned incorrect info'
raise RuntimeError('Failed to get correct object')
for ret_obj in ret_list:
cps_utils.print_obj(ret_obj)
elif op == 'run-test':
if run_test_outgoing_svcs() == False:
print 'UT failed'
sys.exit(1)
print 'UT success'
return True
else:
upd = (op, obj.get())
ret_val = cps_utils.CPSTransaction([upd]).commit()
# exit on success for negative test
if is_negative_test and ret_val != False:
raise RuntimeError('Operation %s should have failed but succeed' % op)
elif not is_negative_test and ret_val == False:
raise RuntimeError('Failed to %s object' % op)
if ret_val != False:
print 'Input object %s' % op
cps_utils.print_obj(ret_val[0])
test_count += 1
return True
def run_test_outgoing_svcs():
#print 'info before run-test'
#print '--------------------'
#outgoing_svcs_test(False, "info")
#print '--------------------'
try:
# test w/o any vrf name
outgoing_svcs_test(False, "create", "-f", "ipv4", "-dip", "1.1.1.1", "-p", "tcp", "-d", "41", "-sip", "8.8.8.8")
outgoing_svcs_test(False, "create", "-f", "ipv4", "-dip", "2.2.2.2", "-p", "tcp", "-d", "51", "-sip", "9.9.9.9")
outgoing_svcs_test(False, "info", "-f", "ipv4", "-dip", "1.1.1.1", "-p", "tcp", "-d", "41", "-sip", "8.8.8.8")
outgoing_svcs_test(False, "info", "-f", "ipv4", "-dip", "2.2.2.2", "-p", "tcp", "-d", "51", "-sip", "9.9.9.9")
outgoing_svcs_test(False, "delete", "-f", "ipv4", "-dip", "1.1.1.1", "-p", "tcp", "-d", "41", "-sip", "8.8.8.8")
outgoing_svcs_test(False, "delete", "-f", "ipv4", "-dip", "2.2.2.2", "-p", "tcp", "-d", "51", "-sip", "9.9.9.9")
outgoing_svcs_test(True, "info", "-f", "ipv4", "-dip", "1.1.1.1", "-p", "tcp", "-d", "41", "-sip", "8.8.8.8")
outgoing_svcs_test(True, "info", "-f", "ipv4", "-dip", "2.2.2.2", "-p", "tcp", "-d", "51", "-sip", "9.9.9.9")
# test for different protocol
outgoing_svcs_test(False, "create", "-f", "ipv4", "-dip", "1.1.2.1", "-p", "udp", "-d", "41", "-sip", "8.1.1.1")
outgoing_svcs_test(False, "info", "-f", "ipv4", "-dip", "1.1.2.1", "-p", "udp", "-d", "41", "-sip", "8.1.1.1")
outgoing_svcs_test(False, "delete", "-f", "ipv4", "-dip", "1.1.2.1", "-p", "udp", "-d", "41", "-sip", "8.1.1.1")
outgoing_svcs_test(True, "info", "-f", "ipv4", "-dip", "1.1.2.1", "-p", "udp", "-d", "41", "-sip", "8.1.1.1")
# test for config with vrf name
outgoing_svcs_test(False, "create", "-n", "default", "-f", "ipv4", "-dip", "1.1.4.1", "-p", "udp", "-d", "42", "-sip", "8.1.1.2")
outgoing_svcs_test(False, "create", "-n", "default", "-f", "ipv4", "-dip", "1.1.5.1", "-p", "tcp", "-d", "52", "-sip", "9.1.1.2")
outgoing_svcs_test(False, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.4.1", "-p", "udp", "-d", "42", "-sip", "8.1.1.2")
outgoing_svcs_test(False, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.5.1", "-p", "tcp", "-d", "52", "-sip", "9.1.1.2")
outgoing_svcs_test(False, "delete", "-n", "default", "-f", "ipv4", "-dip", "1.1.4.1", "-p", "udp", "-d", "42", "-sip", "8.1.1.2")
outgoing_svcs_test(False, "delete", "-n", "default", "-f", "ipv4", "-dip", "1.1.5.1", "-p", "tcp", "-d", "52", "-sip", "9.1.1.2")
outgoing_svcs_test(True, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.4.1", "-p", "udp", "-d", "42", "-sip", "8.1.1.2")
outgoing_svcs_test(True, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.5.1", "-p", "tcp", "-d", "52", "-sip", "9.1.1.2")
# test for config with management vrf name
outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.6.1", "-p", "udp", "-d", "43", "-sip", "8.1.1.3")
outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.7.1", "-p", "tcp", "-d", "53", "-sip", "9.1.1.3")
outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.6.1", "-p", "udp", "-d", "43", "-sip", "8.1.1.3")
outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.7.1", "-p", "tcp", "-d", "53", "-sip", "9.1.1.3")
outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.6.1", "-p", "udp", "-d", "43", "-sip", "8.1.1.3")
outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.7.1", "-p", "tcp", "-d", "53", "-sip", "9.1.1.3")
outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.6.1", "-p", "udp", "-d", "43", "-sip", "8.1.1.3")
outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.7.1", "-p", "tcp", "-d", "53", "-sip", "9.1.1.3")
# test for create w/o vrf name and validate for entry with 'default' vrf name
outgoing_svcs_test(False, "create", "-f", "ipv4", "-dip", "1.1.8.1", "-p", "tcp", "-d", "44", "-sip", "8.1.1.4")
outgoing_svcs_test(False, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.8.1", "-p", "tcp", "-d", "44", "-sip", "8.1.1.4")
outgoing_svcs_test(False, "delete", "-f", "ipv4", "-dip", "1.1.8.1", "-p", "tcp", "-d", "44", "-sip", "8.1.1.4")
outgoing_svcs_test(True, "info", "-f", "ipv4", "-dip", "1.1.8.1", "-p", "tcp", "-d", "44", "-sip", "8.1.1.4")
# test for create with 'default' vrf name and validate for entry w/o vrf name
outgoing_svcs_test(False, "create", "-n", "default", "-f", "ipv4", "-dip", "1.1.9.1", "-p", "tcp", "-d", "45", "-sip", "8.1.1.5")
outgoing_svcs_test(False, "info", "-f", "ipv4", "-dip", "1.1.9.1", "-p", "tcp", "-d", "45", "-sip", "8.1.1.5")
outgoing_svcs_test(False, "delete", "-f", "ipv4", "-dip", "1.1.9.1", "-p", "tcp", "-d", "45", "-sip", "8.1.1.5")
outgoing_svcs_test(True, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.9.1", "-p", "tcp", "-d", "45", "-sip", "8.1.1.5")
## Negative test
# Duplicate create (this rule should be created by application as default)
outgoing_svcs_test(False, "create", "-n", "default", "-f", "ipv4", "-dip", "1.1.10.1", "-p", "tcp", "-d", "46", "-sip", "8.1.1.6")
outgoing_svcs_test(True, "create", "-n", "default", "-f", "ipv4", "-dip", "1.1.10.1", "-p", "tcp", "-d", "46", "-sip", "8.1.1.6")
outgoing_svcs_test(False, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.10.1", "-p", "tcp", "-d", "46", "-sip", "8.1.1.6")
outgoing_svcs_test(False, "delete", "-n", "default", "-f", "ipv4", "-dip", "1.1.10.1", "-p", "tcp", "-d", "46", "-sip", "8.1.1.6")
# Delete non-existent rule
outgoing_svcs_test(True, "delete", "-n", "default", "-f", "ipv4", "-dip", "1.1.10.1", "-p", "tcp", "-d", "46", "-sip", "8.1.1.6")
## test for service binding rules
# test for service binding rules - outgoing service DNAT rules
outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "udp", "-d", "121")
outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "tcp", "-d", "122")
outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "udp", "-d", "121", "--private-ip", "127.100.100.1", "--private-port", "62000")
outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "tcp", "-d", "122", "--private-ip", "127.100.100.1", "--private-port", "62001")
outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "udp", "-d", "121")
outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "tcp", "-d", "122")
outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "udp", "-d", "121", "--private-ip", "127.100.100.1", "--private-port", "62000")
outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.11.1", "-p", "tcp", "-d", "122", "--private-ip", "127.100.100.1", "--private-port", "62001")
# test for service binding rule & SNAT rule
outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123")
outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123", "-sip", "8.1.1.7")
outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123", "--private-ip", "127.100.100.1", "--private-port", "62000")
outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123", "-sip", "8.1.1.7")
outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123")
outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123", "-sip", "8.1.1.7")
outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123")
outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.12.1", "-p", "udp", "-d", "123", "-sip", "8.1.1.7")
# test for service binding rule & SNAT rule and operations involving service binding rules
outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124")
        outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124", "-sip", "8.1.1.8")
        outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124", "--private-ip", "127.100.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124", "-sip", "8.1.1.8")
        outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124")
        # delete of service binding rule should not delete outgoing source ip rule
        outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124", "-sip", "8.1.1.8")
        outgoing_svcs_test(True, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124")
        outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124", "-sip", "8.1.1.8")
        outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124")
        outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.13.1", "-p", "tcp", "-d", "124", "-sip", "8.1.1.8")

        # test for service binding rule & delete operations
        outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.14.1", "-p", "udp", "-d", "131")
        outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.14.1", "-p", "udp", "-d", "131", "--private-ip", "127.100.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.14.1", "-p", "tcp", "-d", "132")
        outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.14.1", "-p", "tcp", "-d", "132", "--private-ip", "127.100.100.1", "--private-port", "62001")
        # delete previously created rules and check same private IP/port is allocated for the rule that is created after this
        outgoing_svcs_test(False, "delete", "1")
        outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.14.1", "-p", "udp", "-d", "131", "--private-ip", "127.100.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "create", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133")
        outgoing_svcs_test(False, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133", "--private-ip", "127.100.100.1", "--private-port", "62000")
        outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133", "--private-ip", "127.100.100.1", "--private-port", "62001")
        outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133", "--private-ip", "127.100.100.3", "--private-port", "62000")
        outgoing_svcs_test(False, "delete", "2")
        outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.14.1", "-p", "tcp", "-d", "132", "--private-ip", "127.100.100.1", "--private-port", "62001")
        # delete with invalid private IP, private port combination (negative test)
        outgoing_svcs_test(True, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133", "--private-ip", "127.100.100.1", "--private-port", "62001")
        outgoing_svcs_test(True, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133", "--private-ip", "127.100.100.3", "--private-port", "62000")
        # delete with valid private IP, private port combination
        outgoing_svcs_test(False, "delete", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133", "--private-ip", "127.100.100.1", "--private-port", "62000")
        outgoing_svcs_test(True, "info", "-n", "management", "-f", "ipv4", "-dip", "1.1.15.1", "-p", "tcp", "-d", "133", "--private-ip", "127.100.100.1", "--private-port", "62000")

        # test for data vrf SNAT rules
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.16.1", "-p", "udp", "-d", "144", "-sip", "9.1.1.7")
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.17.1", "-p", "udp", "-d", "145", "-sip", "9.1.1.8")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.16.1", "-p", "udp", "-d", "144", "-sip", "9.1.1.7")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.16.1", "-p", "udp", "-d", "144", "-sip", "9.1.1.7")
        outgoing_svcs_test(True, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.16.1", "-p", "udp", "-d", "144", "-sip", "9.1.1.7")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.16.1", "-p", "udp", "-d", "144", "-sip", "9.1.1.7")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.17.1", "-p", "udp", "-d", "145", "-sip", "9.1.1.8")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.17.1", "-p", "udp", "-d", "145", "-sip", "9.1.1.8")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.17.1", "-p", "udp", "-d", "145", "-sip", "9.1.1.8")

        # test for data vrf tcp protocol
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.18.1", "-p", "tcp", "-d", "41", "-sip", "11.1.1.1")
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "2.2.18.2", "-p", "tcp", "-d", "51", "-sip", "12.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.18.1", "-p", "tcp", "-d", "41", "-sip", "11.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "2.2.18.2", "-p", "tcp", "-d", "51", "-sip", "12.1.1.1")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.18.1", "-p", "tcp", "-d", "41", "-sip", "11.1.1.1")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "2.2.18.2", "-p", "tcp", "-d", "51", "-sip", "12.1.1.1")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.18.1", "-p", "tcp", "-d", "41", "-sip", "11.1.1.1")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "2.2.18.2", "-p", "tcp", "-d", "51", "-sip", "12.1.1.1")

        # test for data vrf different protocol
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.19.1", "-p", "udp", "-d", "41", "-sip", "13.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.19.1", "-p", "udp", "-d", "41", "-sip", "13.1.1.1")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.19.1", "-p", "udp", "-d", "41", "-sip", "13.1.1.1")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.19.1", "-p", "udp", "-d", "41", "-sip", "13.1.1.1")

        # Negative test for data vrf config with vrf name and delete/get w/o vrf-name
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.20.1", "-p", "udp", "-d", "42", "-sip", "14.1.1.1")
        outgoing_svcs_test(True, "info", "-n", "default", "-f", "ipv4", "-dip", "1.1.20.1", "-p", "udp", "-d", "42", "-sip", "14.1.1.1")
        outgoing_svcs_test(True, "info", "-f", "ipv4", "-dip", "1.1.20.1", "-p", "udp", "-d", "42", "-sip", "14.1.1.1")
        outgoing_svcs_test(True, "delete", "-n", "default", "-f", "ipv4", "-dip", "1.1.20.1", "-p", "udp", "-d", "42", "-sip", "14.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.20.1", "-p", "udp", "-d", "42", "-sip", "14.1.1.1")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.20.1", "-p", "udp", "-d", "42", "-sip", "14.1.1.1")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.20.1", "-p", "udp", "-d", "42", "-sip", "14.1.1.1")

        ## Negative test for data vrf
        # Duplicate create (this rule should be created by application as default)
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.21.1", "-p", "tcp", "-d", "46", "-sip", "15.1.1.1")
        outgoing_svcs_test(True, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.21.1", "-p", "tcp", "-d", "46", "-sip", "15.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.21.1", "-p", "tcp", "-d", "46", "-sip", "15.1.1.1")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.21.1", "-p", "tcp", "-d", "46", "-sip", "15.1.1.1")
        # Delete non-existent rule
        outgoing_svcs_test(True, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.21.1", "-p", "tcp", "-d", "46", "-sip", "15.1.1.1")

        ## test for data vrf service binding rules
        # test for data vrf service binding rules - outgoing service DNAT rules
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "udp", "-d", "121")
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "tcp", "-d", "122")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "udp", "-d", "121", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "tcp", "-d", "122", "--private-ip", "127.101.100.1", "--private-port", "62001")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "udp", "-d", "121")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "tcp", "-d", "122")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "udp", "-d", "121", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.22.1", "-p", "tcp", "-d", "122", "--private-ip", "127.101.100.1", "--private-port", "62001")

        # test for service binding rule & SNAT rule
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123")
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123", "-sip", "16.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123", "-sip", "16.1.1.1")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123", "-sip", "16.1.1.1")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.23.1", "-p", "udp", "-d", "123", "-sip", "16.1.1.1")

        # test for data vrf service binding rule & SNAT rule and operations involving service binding rules
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.24.1", "-p", "tcp", "-d", "124")
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.24.1", "-p", "tcp", "-d", "124", "-sip", "17.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.24.1", "-p", "tcp", "-d", "124", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.24.1", "-p", "tcp", "-d", "124", "-sip", "17.1.1.1")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.24.1", "-p", "tcp", "-d", "124")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.24.1", "-p", "tcp", "-d", "124", "-sip", "17.1.1.1")

        # data vrf test for delete of service binding rule should not delete outgoing source ip rule
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.25.1", "-p", "tcp", "-d", "124", "-sip", "18.1.1.1")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.25.1", "-p", "tcp", "-d", "124", "-sip", "18.1.1.1")
        outgoing_svcs_test(True, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.25.1", "-p", "tcp", "-d", "124")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.25.1", "-p", "tcp", "-d", "124", "-sip", "18.1.1.1")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.25.1", "-p", "tcp", "-d", "124")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.25.1", "-p", "tcp", "-d", "124", "-sip", "18.1.1.1")

        # test for data vrf service binding rule & delete operations
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "udp", "-d", "131")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "udp", "-d", "131", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "tcp", "-d", "132")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "tcp", "-d", "132", "--private-ip", "127.101.100.1", "--private-port", "62001")
        # delete previously created rules and check same private IP/port is allocated for the rule that is created after this
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "udp", "-d", "131")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "udp", "-d", "131", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(False, "create", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133")
        outgoing_svcs_test(False, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133", "--private-ip", "127.101.100.1", "--private-port", "62001")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133", "--private-ip", "127.101.100.3", "--private-port", "62000")
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "tcp", "-d", "132", "--private-ip", "127.101.100.1", "--private-port", "62001")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.26.1", "-p", "tcp", "-d", "132", "--private-ip", "127.101.100.1", "--private-port", "62001")
        # delete with invalid private IP, private port combination (negative test)
        outgoing_svcs_test(True, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133", "--private-ip", "127.101.100.1", "--private-port", "62001")
        outgoing_svcs_test(True, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133", "--private-ip", "127.101.100.3", "--private-port", "62000")
        # delete with valid private IP, private port combination
        outgoing_svcs_test(False, "delete", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133", "--private-ip", "127.101.100.1", "--private-port", "62000")
        outgoing_svcs_test(True, "info", "-n", "test-vrf", "-f", "ipv4", "-dip", "1.1.27.1", "-p", "tcp", "-d", "133", "--private-ip", "127.101.100.1", "--private-port", "62000")
    except RuntimeError as ex:
        print('UT failed: %s' % ex)
        return False
    finally:
        print('Finished tests: %d' % test_count)
        # print('info after run-test')
        # print('--------------------')
        # outgoing_svcs_test(False, "info")
        # print('--------------------')

    print('All UT finished')
    return True
if __name__ == '__main__':
    try:
        outgoing_svcs_test()
    except RuntimeError as ex:
        print('Failed: %s' % ex)
        sys.exit(1)
# --- actstream/tests/__init__.py (tcdent/django-activity-stream, BSD-3-Clause) ---
from .test_gfk import GFKManagerTestCase
from .test_zombies import ZombieTest
from .test_activity import ActivityTestCase
from .test_feeds import FeedsTestCase
from .test_views import ViewsTest
# --- tests/modules/pkg1/sub/__main__.py (AboudFayad/coveragepy, Apache-2.0) ---
# Used in the tests for run_python_module
import sys
print("pkg1.sub.__main__: passed %s" % sys.argv[1])
# --- alfred/views/__init__.py (Sefrwahed/Alfred, MIT) ---
from .main_widget import MainWidget
from .main_window import MainWindow
# --- src/django/api/checks.py (azavea/open-apparel-registry, MIT) ---
from watchman.decorators import check
from api.matching import GazetteerCache
@check
def _check_gazetteercache():
    GazetteerCache.get_latest()
    return {'ok': True}


def gazetteercache():
    return {'gazetteercache': _check_gazetteercache()}
# --- piwebapi_samples/Python/create_sandbox.py (osi-amit/OSI-Samples-PI-System, Apache-2.0) ---
""" This script creates and deletes a PI Web API Asset database, AF category,
    AF Template and AF Element, creating a sandbox used by the other methods.

    When creating the sandbox, the following order must be followed:
    create_database, create_category, create_template, create_element

    This python script requires some pre-requisites:
    1. A back-end server with PI Web API with CORS enabled.
"""
import json
import getpass
import requests
from requests.auth import HTTPBasicAuth
from requests_kerberos import HTTPKerberosAuth
OSI_AF_ATTRIBUTE_TAG = 'OSIPythonAttributeSampleTag'
OSI_AF_CATEGORY = 'OSIPythonCategory'
OSI_AF_DATABASE = 'OSIPythonDatabase'
OSI_AF_ELEMENT = 'OSIPythonElement'
OSI_AF_TEMPLATE = 'OSIPythonTemplate'
OSI_TAG = 'OSIPythonSampleTag'
OSI_TAG_SINUSOID = 'OSIPythonAttributeSinusoid'
OSI_TAG_SINUSOIDU = 'OSIPythonAttributeSinusoidU'
def call_headers(include_content_type):
    """ Create API call headers
    @param include_content_type boolean: flag determines whether or not the
    content-type header is included
    """
    if include_content_type is True:
        header = {
            'content-type': 'application/json',
            'X-Requested-With': 'XmlHttpRequest'
        }
    else:
        header = {
            'X-Requested-With': 'XmlHttpRequest'
        }

    return header
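A standalone sketch of the two header shapes `call_headers` builds. The helper below mirrors the logic rather than importing this module, so it can be run on its own; `build_headers` is a hypothetical name used only for illustration.

```python
def build_headers(include_content_type):
    # Mirrors call_headers above: the XmlHttpRequest header is always sent,
    # and 'content-type' is added only for requests that carry a JSON body.
    header = {'X-Requested-With': 'XmlHttpRequest'}
    if include_content_type:
        header['content-type'] = 'application/json'
    return header

print(build_headers(True))
print(build_headers(False))
```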
def call_security_method(security_method, user_name, user_password):
    """ Create API call security method
    @param security_method string: security method to use: basic or kerberos
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    """
    if security_method.lower() == 'basic':
        security_auth = HTTPBasicAuth(user_name, user_password)
    else:
        security_auth = HTTPKerberosAuth(mutual_authentication='REQUIRED',
                                         sanitize_mutual_error_response=False)

    return security_auth
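For the 'basic' branch, `HTTPBasicAuth` ultimately attaches an `Authorization: Basic base64(user:password)` header to each request. A stdlib-only sketch of that wire format, with hypothetical credentials (the kerberos branch needs `requests_kerberos` and a configured realm, so it is not shown):

```python
import base64

def basic_auth_header(user_name, user_password):
    # Encode 'user:password' exactly as HTTP Basic authentication requires
    token = base64.b64encode(f'{user_name}:{user_password}'.encode()).decode()
    return {'Authorization': 'Basic ' + token}

print(basic_auth_header('sample_user', 'sample_password'))
```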
def create_sandbox(piwebapi_url, asset_server, pi_server, user_name, user_password,
                   piwebapi_security_method):
    """ Create the sandbox. Calls methods to create the structure needed by the other calls.
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param pi_server string: Name of the PI Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    create_database(piwebapi_url, asset_server, user_name,
                    user_password, piwebapi_security_method)
    create_category(piwebapi_url, asset_server, user_name,
                    user_password, piwebapi_security_method)
    create_template(piwebapi_url, asset_server, pi_server,
                    user_name, user_password, piwebapi_security_method)
    create_element(piwebapi_url, asset_server, user_name,
                   user_password, piwebapi_security_method)

    delete_element(piwebapi_url, asset_server, user_name,
                   user_password, piwebapi_security_method)
    delete_template(piwebapi_url, asset_server, user_name,
                    user_password, piwebapi_security_method)
    delete_category(piwebapi_url, asset_server, user_name,
                    user_password, piwebapi_security_method)
    delete_database(piwebapi_url, asset_server, user_name,
                    user_password, piwebapi_security_method)
def create_database(piwebapi_url, asset_server, user_name, user_password, piwebapi_security_method):
    """ Create Python Web API Sample database
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Create Database')

    # create security method - basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get AF Server
    response = requests.get(piwebapi_url + '/assetservers?path=\\\\' + asset_server,
                            auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON Response
        data = json.loads(response.text)

        # Create the body for the request
        request_body = {
            'Name': OSI_AF_DATABASE,
            'Description': 'Database for Python Web API',
            'ExtendedProperties': {}
        }

        # Create a header
        header = call_headers(True)

        # Create the database
        response = requests.post(data['Links']['Self'] + '/assetdatabases',
                                 auth=security_method, verify=False,
                                 json=request_body, headers=header)

        if response.status_code == 201:
            print('Database {} created'.format(OSI_AF_DATABASE))
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code
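Every helper in this script follows the same pattern: GET the parent object, POST the child, then branch on 200/201 versus anything else. A self-contained sketch of that branch using a stub response object, so no server is required (`report_create` and `SimpleNamespace` stubs are illustrative, not part of this script):

```python
from types import SimpleNamespace

def report_create(response):
    # 201 Created means the POST succeeded; anything else is reported verbatim,
    # matching the print() branches used by create_database and friends.
    if response.status_code == 201:
        return 'created'
    return '{} {}'.format(response.status_code, response.reason)

print(report_create(SimpleNamespace(status_code=201, reason='Created')))
print(report_create(SimpleNamespace(status_code=409, reason='Conflict')))
```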
def create_category(piwebapi_url, asset_server, user_name, user_password, piwebapi_security_method):
    """ Create an AF Category
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Create Category')

    # create security method - basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get the database
    request_url = '{}/assetdatabases?path=\\\\{}\\{}'.format(
        piwebapi_url, asset_server, OSI_AF_DATABASE)
    response = requests.get(request_url, auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON Response
        data = json.loads(response.text)

        # Create the body for the request
        request_body = {
            'Name': OSI_AF_CATEGORY,
            'Description': '{} category'.format(OSI_AF_CATEGORY)
        }

        # Create a header
        header = call_headers(True)

        # Create the element category
        response = requests.post(data['Links']['Self'] + '/elementcategories',
                                 auth=security_method, verify=False,
                                 json=request_body, headers=header)

        if response.status_code == 201:
            print('Category {} created'.format(OSI_AF_CATEGORY))
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code
def create_template(piwebapi_url, asset_server, pi_server, user_name, user_password,
                    piwebapi_security_method):
    """ Create an AF template
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param pi_server string: Name of the PI Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Create Template')

    # create security method - basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get the database
    request_url = '{}/assetdatabases?path=\\\\{}\\{}'.format(
        piwebapi_url, asset_server, OSI_AF_DATABASE)
    response = requests.get(request_url, auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON Response
        data = json.loads(response.text)

        # Create the body for the request
        request_body = {
            'Name': OSI_AF_TEMPLATE,
            'Description': '{} Template'.format(OSI_AF_TEMPLATE),
            'CategoryNames': [OSI_AF_CATEGORY],
            'AllowElementToExtend': True
        }

        # Create a header
        header = call_headers(True)

        # Create the element template
        response = requests.post(data['Links']['Self'] + '/elementtemplates', auth=security_method,
                                 verify=False, json=request_body, headers=header)

        # If the template was created, add attributes
        if response.status_code == 201:
            print('Template {} created'.format(OSI_AF_TEMPLATE))

            # Get the newly created machine template
            request_url = '{}/elementtemplates?path=\\\\{}\\{}\\ElementTemplates[{}]'.format(
                piwebapi_url, asset_server, OSI_AF_DATABASE, OSI_AF_TEMPLATE)
            response = requests.get(
                request_url, auth=security_method, verify=False)
            data = json.loads(response.text)

            # Add template attributes
            response = requests.post(data['Links']['Self'] + '/attributetemplates',
                                     auth=security_method, verify=False,
                                     json={'Name': 'Active', 'Description': '',
                                           'IsConfigurationItem': True, 'Type': 'Boolean'},
                                     headers=header)
            response = requests.post(data['Links']['Self'] + '/attributetemplates',
                                     auth=security_method, verify=False,
                                     json={'Name': 'OS', 'Description': 'Operating System',
                                           'IsConfigurationItem': True, 'Type': 'String'},
                                     headers=header)
            response = requests.post(data['Links']['Self'] + '/attributetemplates',
                                     auth=security_method, verify=False,
                                     json={'Name': 'OSVersion',
                                           'Description': 'Operating System Version',
                                           'IsConfigurationItem': True, 'Type': 'String'},
                                     headers=header)
            response = requests.post(data['Links']['Self'] + '/attributetemplates',
                                     auth=security_method, verify=False,
                                     json={'Name': 'IPAddresses',
                                           'Description': 'A list of IP Addresses for all NIC',
                                           'IsConfigurationItem': True, 'Type': 'String'},
                                     headers=header)

            # Add Sinusoid U
            response = requests.post(data['Links']['Self'] + '/attributetemplates',
                                     auth=security_method, verify=False,
                                     json={'Name': OSI_TAG_SINUSOID,
                                           'Description': '', 'IsConfigurationItem': False,
                                           'Type': 'Double', 'DataReferencePlugIn': 'PI Point',
                                           'ConfigString': '\\\\' + pi_server + '\\SinusoidU'},
                                     headers=header)

            # Add Sinusoid
            response = requests.post(data['Links']['Self'] + '/attributetemplates',
                                     auth=security_method, verify=False,
                                     json={'Name': OSI_TAG_SINUSOIDU, 'Description': '',
                                           'IsConfigurationItem': False, 'Type': 'Double',
                                           'DataReferencePlugIn': 'PI Point',
                                           'ConfigString': '\\\\' + pi_server + '\\Sinusoid'},
                                     headers=header)

            # Add the sampleTag attribute
            response = requests.post(data['Links']['Self'] + '/attributetemplates',
                                     auth=security_method, verify=False,
                                     json={'Name': OSI_AF_ATTRIBUTE_TAG, 'Description': '',
                                           'IsConfigurationItem': False, 'Type': 'Double',
                                           'DataReferencePlugIn': 'PI Point',
                                           'ConfigString': '\\\\' + pi_server +
                                                           '\\%Element%_{};ReadOnly=False;'.format(OSI_TAG) +
                                                           'ptclassname=classic;pointtype=Float64;' +
                                                           'pointsource=webapi'},
                                     headers=header)

            if response.status_code == 201:
                print('Template {} created'.format(OSI_AF_TEMPLATE))
            else:
                print(response.status_code, response.reason, response.text)
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code
def create_element(piwebapi_url, asset_server, user_name, user_password, piwebapi_security_method):
    """ Create an AF element
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Create Element')

    # create security method - basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get the sample database
    request_url = '{}/assetdatabases?path=\\\\{}\\{}'.format(
        piwebapi_url, asset_server, OSI_AF_DATABASE)
    response = requests.get(request_url, auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON Response
        data = json.loads(response.text)

        # create a body for the request
        request_body = {
            'Name': OSI_AF_ELEMENT,
            'Description': '{} element'.format(OSI_AF_ELEMENT),
            'TemplateName': OSI_AF_TEMPLATE,
            'ExtendedProperties': {}
        }

        # Create a header that passes in json
        header = call_headers(True)

        # Create the element
        response = requests.post(data['Links']['Self'] + '/elements', auth=security_method,
                                 verify=False, json=request_body, headers=header)

        if response.status_code == 201:
            print('Equipment {} created'.format(OSI_AF_ELEMENT))

            # Get the newly created element
            request_url = '{}/elements?path=\\\\{}\\{}\\{}'.format(
                piwebapi_url, asset_server, OSI_AF_DATABASE, OSI_AF_ELEMENT)
            response = requests.get(
                request_url, auth=security_method, verify=False)
            data = json.loads(response.text)

            # Create the tags based on the template configuration
            response = requests.post(piwebapi_url + '/elements/' + data['WebId'] + '/config',
                                     auth=security_method, verify=False,
                                     json={'includeChildElements': True}, headers=header)
            print(json.dumps(json.loads(response.text), indent=4, sort_keys=True))
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code
def delete_element(piwebapi_url, asset_server, user_name, user_password, piwebapi_security_method):
    """ Delete an AF element
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Delete Element')

    # create security method - basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get the element
    request_url = '{}/elements?path=\\\\{}\\{}\\{}'.format(
        piwebapi_url, asset_server, OSI_AF_DATABASE, OSI_AF_ELEMENT)
    response = requests.get(request_url, auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON Response
        data = json.loads(response.text)

        # Create a header
        header = call_headers(False)

        # Delete the element
        response = requests.delete(data['Links']['Self'], auth=security_method,
                                   verify=False, headers=header)

        if response.status_code == 204:
            print('Element {} Deleted'.format(OSI_AF_ELEMENT))
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code

def delete_template(piwebapi_url, asset_server, user_name, user_password, piwebapi_security_method):
    """ Delete an AF template
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Delete Template')

    # Create the security method: basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get the element template
    request_url = '{}/elementtemplates?path=\\\\{}\\{}\\ElementTemplates[{}]'.format(
        piwebapi_url, asset_server, OSI_AF_DATABASE, OSI_AF_TEMPLATE)
    response = requests.get(request_url, auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON response
        data = json.loads(response.text)

        # Create a header
        header = call_headers(True)

        # Delete the element template
        request_url = '{}/elementtemplates/{}'.format(
            piwebapi_url, data['WebId'])
        response = requests.delete(
            request_url, auth=security_method, verify=False, headers=header)

        if response.status_code == 204:
            print('Template {} Deleted'.format(OSI_AF_TEMPLATE))
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code

def delete_category(piwebapi_url, asset_server, user_name, user_password, piwebapi_security_method):
    """ Delete an AF Category
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Delete Category')

    # Create the security method: basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get the element category
    request_url = '{}/elementcategories?path=\\\\{}\\{}\\CategoriesElement[{}]'.format(
        piwebapi_url, asset_server, OSI_AF_DATABASE, OSI_AF_CATEGORY)
    response = requests.get(request_url, auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON response
        data = json.loads(response.text)

        # Create a header
        header = call_headers(False)

        # Delete the element category
        response = requests.delete(data['Links']['Self'], auth=security_method,
                                   verify=False, headers=header)

        if response.status_code == 204:
            print('Category {} deleted.'.format(OSI_AF_CATEGORY))
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code

def delete_database(piwebapi_url, asset_server, user_name, user_password, piwebapi_security_method):
    """ Delete Python Web API Sample database
    @param piwebapi_url string: the URL of the PI Web API
    @param asset_server string: Name of the Asset Server
    @param user_name string: The user's credentials name
    @param user_password string: The user's credentials password
    @param piwebapi_security_method string: Security method: basic or kerberos
    """
    print('Delete Database')

    # Create the security method: basic or kerberos
    security_method = call_security_method(
        piwebapi_security_method, user_name, user_password)

    # Get the sample database
    request_url = '{}/assetdatabases?path=\\\\{}\\{}'.format(
        piwebapi_url, asset_server, OSI_AF_DATABASE)
    response = requests.get(request_url, auth=security_method, verify=False)

    # Only continue if the first request was successful
    if response.status_code == 200:
        # Deserialize the JSON response
        data = json.loads(response.text)

        # Create the header
        header = call_headers(True)

        # Delete the sample database
        response = requests.delete(piwebapi_url + '/assetdatabases/' + data['WebId'],
                                   auth=security_method, verify=False, headers=header)

        if response.status_code == 204:
            print('Database {} deleted.'.format(OSI_AF_DATABASE))
        else:
            print(response.status_code, response.reason, response.text)
    else:
        print(response.status_code, response.reason, response.text)

    return response.status_code

# Main method
def main():
    """ Main method. Receive user input and call the create_sandbox method """
    piwebapi_url = str(input('Enter the PI Web API url: '))
    af_server_name = str(input('Enter the Asset Server Name: '))
    pi_server_name = str(input('Enter the PI Server Name: '))
    piwebapi_user = str(input('Enter the user name: '))
    piwebapi_password = str(getpass.getpass('Enter the password: '))
    piwebapi_security_method = str(input('Enter the security method, Basic or Kerberos: '))
    piwebapi_security_method = piwebapi_security_method.lower()

    create_sandbox(piwebapi_url, af_server_name, pi_server_name, piwebapi_user, piwebapi_password,
                   piwebapi_security_method)


if __name__ == '__main__':
    main()
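The request URLs throughout this file embed AF paths of the form `\\AssetServer\Database`, which require doubled backslashes in Python string literals. A small standalone sketch of just that escaping, factored into a helper (the helper name and host URL are ours, not part of the sample):

```python
def af_database_path(asset_server, database):
    # AF paths look like \\SERVER\Database; each literal backslash
    # must be written as '\\' in Python source, hence '\\\\' for '\\'.
    return '\\\\{}\\{}'.format(asset_server, database)


path = af_database_path('MyAssetServer', 'MySampleDatabase')
url = '{}/assetdatabases?path={}'.format('https://host/piwebapi', path)
print(path)  # \\MyAssetServer\MySampleDatabase
```

The same pattern extends to deeper paths (database, element, template) by appending further `\\{}` segments, as the `format` calls above do inline.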


# stixcore/products/level1/scienceL1.py (nicHoch/STIXCore, BSD-3-Clause)
from collections import defaultdict
from stixcore.products.level0.scienceL0 import ScienceProduct
from stixcore.products.product import L1Mixin
from stixcore.time import SCETimeRange

__all__ = ['RawPixelData', 'CompressedPixelData', 'SummedPixelData', 'Visibility', 'Spectrogram',
           'Aspect']


class RawPixelData(ScienceProduct, L1Mixin):
    """Raw X-ray pixel counts: compression level 0. No aggregation.

    In level 1 format.
    """

    def __init__(self, *, service_type, service_subtype, ssid, control,
                 data, idb_versions=defaultdict(SCETimeRange), **kwargs):
        super().__init__(service_type=service_type, service_subtype=service_subtype,
                         ssid=ssid, control=control, data=data, idb_versions=idb_versions, **kwargs)
        self.name = 'xray-rpd'
        self.level = 'L1'

    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 20)


class CompressedPixelData(ScienceProduct, L1Mixin):
    """Aggregated (over time and/or energies) X-ray pixel counts: compression level 1.

    In level 1 format.
    """

    def __init__(self, *, service_type, service_subtype, ssid, control,
                 data, idb_versions=defaultdict(SCETimeRange), **kwargs):
        super().__init__(service_type=service_type, service_subtype=service_subtype,
                         ssid=ssid, control=control, data=data, idb_versions=idb_versions, **kwargs)
        self.name = 'xray-cpd'
        self.level = 'L1'

    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 21)


class SummedPixelData(ScienceProduct, L1Mixin):
    """Aggregated (over time and/or energies and pixel sets) X-ray pixel counts:
    compression level 2.

    In level 1 format.
    """

    def __init__(self, *, service_type, service_subtype, ssid, control,
                 data, idb_versions=defaultdict(SCETimeRange), **kwargs):
        super().__init__(service_type=service_type, service_subtype=service_subtype,
                         ssid=ssid, control=control, data=data, idb_versions=idb_versions, **kwargs)
        self.name = 'xray-scpd'
        self.level = 'L1'

    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 22)


class Visibility(ScienceProduct, L1Mixin):
    """X-ray visibilities, or compression level 3 data.

    In level 1 format.
    """

    def __init__(self, *, service_type, service_subtype, ssid, control,
                 data, idb_versions=defaultdict(SCETimeRange), **kwargs):
        super().__init__(service_type=service_type, service_subtype=service_subtype,
                         ssid=ssid, control=control, data=data, idb_versions=idb_versions, **kwargs)
        self.name = 'xray-vis'
        self.level = 'L1'

    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 23)


class Spectrogram(ScienceProduct, L1Mixin):
    """X-ray spectrogram, or compression level 2 data.

    In level 1 format.
    """

    def __init__(self, *, service_type, service_subtype, ssid, control,
                 data, idb_versions=defaultdict(SCETimeRange), **kwargs):
        super().__init__(service_type=service_type, service_subtype=service_subtype,
                         ssid=ssid, control=control, data=data, idb_versions=idb_versions, **kwargs)
        self.name = 'xray-spec'
        self.level = 'L1'

    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 24)


class Aspect(ScienceProduct, L1Mixin):
    """Bulk aspect data.

    In level 1 format.
    """

    def __init__(self, *, service_type, service_subtype, ssid, control,
                 data, idb_versions=defaultdict(SCETimeRange), **kwargs):
        super().__init__(service_type=service_type, service_subtype=service_subtype,
                         ssid=ssid, control=control, data=data, idb_versions=idb_versions, **kwargs)
        self.name = 'aspect-burst'
        self.level = 'L1'

    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 42)
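The `is_datasource_for` classmethods above all follow the same shape: each product class inspects a packet's service type, subtype, and SSID and claims it or not. A factory can then probe each class until one matches. A minimal sketch of that dispatch with toy classes (the real stixcore factory may differ):

```python
class ProductA:
    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 20)


class ProductB:
    @classmethod
    def is_datasource_for(cls, *, service_type, service_subtype, ssid, **kwargs):
        return (kwargs['level'] == 'L1' and service_type == 21
                and service_subtype == 6 and ssid == 21)


def find_product(classes, **keys):
    """Return the first class that claims the packet described by `keys`."""
    for cls in classes:
        if cls.is_datasource_for(**keys):
            return cls
    return None


match = find_product([ProductA, ProductB],
                     service_type=21, service_subtype=6, ssid=21, level='L1')
print(match.__name__)  # ProductB
```

Because each predicate is a classmethod, no instance is constructed until a class has claimed the data.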


# histstats.py (stephenchestnut/fbballer, MIT)
import loaddata


# threema_mm/__init__.py (Enproduktion/threema-mattermost, BSD-2-Clause-FreeBSD)
from flask import Flask
import threema_mm.settings

app = Flask(__name__)

# Imported after `app` exists so the view module can register its routes on it
import threema_mm.views  # noqa: E402,F401


# venv3/lib/python3.8/site-packages/punch/vcs_use_cases/release.py (paul-romeo/pytest-in-60-minutes, MIT)
from __future__ import print_function, absolute_import, division
from punch.vcs_use_cases import use_case


class VCSReleaseUseCase(use_case.VCSUseCase):
    pass


# hello.py (parichitran/py-hw, Apache-2.0)
print("hello folks222")


# router/__init__.py (sovaai/sova-bls-core, Apache-2.0)
from .router import MessageHandler


#! /usr/bin/env python
# tests/test_branch.py (cjeanner/git_wrapper, MIT)
"""Tests for GitCommit"""
from mock import Mock, patch
import git
import pytest
from git_wrapper.repo import GitRepo
from git_wrapper import exceptions
def test_on_head_only_all_new(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN on_head_only method is called with no upstream equivalent changes
    THEN a dictionary is returned containing the three sha1's and commits
    """
    repo = GitRepo('./', mock_repo)
    lines = '+ sha1 commit1\n+ sha2 commit2\n+ sha3 commit3'
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    expected = {'sha1': 'commit1', 'sha2': 'commit2', 'sha3': 'commit3'}
    assert expected == repo.branch.cherry_on_head_only('upstream', 'HEAD')


def test_on_head_only_with_mixed(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN on_head_only method is called with a mix of
    upstream equivalent and not equivalent changes
    THEN a dictionary is returned containing two sha1's and commits
    """
    repo = GitRepo('./', mock_repo)
    lines = '+ sha1 commit1\n- sha2 commit2\n+ sha3 commit3'
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    expected = {'sha1': 'commit1', 'sha3': 'commit3'}
    assert expected == repo.branch.cherry_on_head_only('upstream', 'HEAD')


def test_on_head_only_no_new(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN on_head_only method is called with only upstream equivalent changes
    THEN an empty dictionary is returned
    """
    repo = GitRepo('./', mock_repo)
    lines = '- sha1 commit1\n- sha2 commit2\n- sha3 commit3'
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    assert {} == repo.branch.cherry_on_head_only('upstream', 'HEAD')


def test_on_head_only_empty(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN on_head_only is called with no changes
    THEN an empty dictionary is returned
    """
    repo = GitRepo('./', mock_repo)
    lines = ''
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    assert {} == repo.branch.cherry_on_head_only('upstream', 'HEAD')

def test_all_equivalent_changes(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN equivalent is called with only equivalent upstream/downstream changes.
    THEN a dictionary is returned with all changes
    """
    repo = GitRepo('./', mock_repo)
    lines = '- sha1 commit1\n- sha2 commit2\n- sha3 commit3'
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    expected = {'sha1': 'commit1', 'sha2': 'commit2', 'sha3': 'commit3'}
    assert expected == repo.branch.cherry_equivalent('upstream', 'HEAD')


def test_equivalent_mixed_changes(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN equivalent is called with a mix of equivalent and HEAD-only changes.
    THEN a dictionary is returned with only the equivalent changes.
    """
    repo = GitRepo('./', mock_repo)
    lines = '+ sha1 commit1\n- sha2 commit2\n+ sha3 commit3'
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    expected = {'sha2': 'commit2'}
    assert expected == repo.branch.cherry_equivalent('upstream', 'HEAD')


def test_equivalent_downstream_only(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN equivalent is called with only HEAD changes.
    THEN an empty dictionary is returned.
    """
    repo = GitRepo('./', mock_repo)
    lines = '+ sha1 commit1\n+ sha2 commit2\n+ sha3 commit3'
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    assert {} == repo.branch.cherry_equivalent('upstream', 'HEAD')


def test_equivalent_no_changes(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN equivalent is called with no changes.
    THEN an empty dictionary is returned.
    """
    repo = GitRepo('./', mock_repo)
    lines = ''
    attrs = {'cherry.return_value': lines}
    mock_repo.git.configure_mock(**attrs)

    assert {} == repo.branch.cherry_equivalent('upstream', 'HEAD')
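The tests above drive the stub through `configure_mock`, where a dotted key sets the return value of a nested attribute in one call. The mechanism in isolation (the tests import from the `mock` package; the stdlib `unittest.mock` used here mirrors it):

```python
from unittest.mock import Mock

fake_git = Mock()
# The dotted key 'cherry.return_value' configures the nested `cherry`
# attribute's return value, just like mock_repo.git.configure_mock above.
fake_git.configure_mock(**{'cherry.return_value': '+ sha1 commit1\n- sha2 commit2'})

lines = fake_git.cherry().splitlines()
print(lines[0])  # + sha1 commit1
```

This is equivalent to `fake_git.cherry.return_value = ...`, but the dict form lets a test build its whole stub configuration as data.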

def test_rebase(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.rebase_to_hash is called with a valid branch name and a valid hash
    THEN git.checkout called
    AND git.rebase called
    """
    mock_repo.is_dirty.return_value = False
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        repo.branch.rebase_to_hash('test', '12345')

    assert repo.repo.git.checkout.called is True
    assert repo.repo.git.rebase.called is True


def test_rebase_dirty_repo(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.rebase_to_hash is called on a dirty repository
    THEN a DirtyRepositoryException is raised
    """
    mock_repo.is_dirty.return_value = True
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.DirtyRepositoryException):
            repo.branch.rebase_to_hash('test', '12345')
    assert mock_repo.is_dirty.called is True


def test_rebase_branch_not_found(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.rebase_to_hash is called with an invalid branch name
    THEN a ReferenceNotFoundException is raised
    AND the exception message contains branch
    """
    mock_repo.is_dirty.return_value = False
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        with pytest.raises(exceptions.ReferenceNotFoundException) as exc_info:
            mock_name_to_object.side_effect = git.exc.BadName()
            repo.branch.rebase_to_hash('doesNotExist', '12345')
    assert 'branch' in str(exc_info.value)


def test_rebase_hash_not_found(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.rebase_to_hash is called with a valid branch name and an invalid hash
    THEN a ReferenceNotFoundException is raised
    AND the exception message contains hash
    """
    mock_repo.is_dirty.return_value = False
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        with pytest.raises(exceptions.ReferenceNotFoundException) as exc_info:
            # First name_to_object call is to check the branch, let it succeed
            def side_effect(mock, ref):
                if ref != "branchA":
                    raise git.exc.BadName

            mock_name_to_object.side_effect = side_effect
            repo.branch.rebase_to_hash('branchA', '12345')
    assert 'hash' in str(exc_info.value)
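`test_rebase_hash_not_found` leans on `side_effect` accepting a callable, so a single patched function can succeed for one argument and raise for another. The mechanism in isolation (names and exception type here are illustrative, not from git_wrapper):

```python
from unittest.mock import Mock

def side_effect(ref):
    # Succeed for the known branch, fail for anything else
    if ref != 'branchA':
        raise ValueError('bad ref: {}'.format(ref))
    return 'object-for-branchA'

lookup = Mock(side_effect=side_effect)
print(lookup('branchA'))  # object-for-branchA
```

When `side_effect` is callable, the mock invokes it with the call's arguments and returns its result, which is why the test's function signature must match how the patched `name_to_object` is called.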

def test_rebase_error_during_checkout(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.rebase_to_hash is called with a valid branch name and a valid hash
    AND checkout fails with an exception
    THEN a CheckoutException is raised
    """
    mock_repo.is_dirty.return_value = False
    mock_repo.git.checkout.side_effect = git.GitCommandError('checkout', '')
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.CheckoutException):
            repo.branch.rebase_to_hash('branchA', '12345')


def test_rebase_error_during_rebase(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.rebase_to_hash is called with a valid branch name and a valid hash
    AND rebase fails with an exception
    THEN a RebaseException is raised
    """
    mock_repo.is_dirty.return_value = False
    mock_repo.git.rebase.side_effect = git.GitCommandError('rebase', '')
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.RebaseException):
            repo.branch.rebase_to_hash('branchA', '12345')


def test_abort_rebase(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.abort_rebase is called
    THEN git.rebase called
    """
    repo = GitRepo('./', mock_repo)
    repo.branch.abort_rebase()
    assert repo.repo.git.rebase.called is True


def test_abort_rebase_error(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN abort_rebase is called
    AND the abort fails with an exception
    THEN an AbortException is raised
    """
    mock_repo.git.rebase.side_effect = git.GitCommandError('rebase', '')
    repo = GitRepo('./', mock_repo)

    with pytest.raises(exceptions.AbortException):
        repo.branch.abort_rebase()

def test_apply_patch(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_patch is called with a valid branch_name and valid path
    THEN git.am is called
    """
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        repo.branch.apply_patch('test_branch', './requirements.txt')
    assert repo.git.am.called is True


def test_apply_patch_wrong_branch_name(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_patch is called with an invalid branch_name and valid path
    THEN ReferenceNotFoundException raised
    AND git.am not called
    """
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        mock_name_to_object.side_effect = git.exc.BadName()
        with pytest.raises(exceptions.ReferenceNotFoundException):
            repo.branch.apply_patch('invalid_branch', './requirements.txt')
    assert repo.git.am.called is False


def test_apply_patch_not_a_file(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_patch is called with a valid branch_name and invalid path
    THEN FileDoesntExistException raised
    AND git.am not called
    """
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.FileDoesntExistException):
            repo.branch.apply_patch('test_branch', './git_wrapper')
    assert repo.git.am.called is False


def test_apply_patch_checkout_error(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_patch is called with a valid branch name and a valid path
    AND checkout fails with an exception
    THEN a CheckoutException is raised
    AND git.am not called
    """
    mock_repo.git.checkout.side_effect = git.GitCommandError('checkout', '')
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.CheckoutException):
            repo.branch.apply_patch('test_branch', './requirements.txt')
    assert repo.git.am.called is False


def test_apply_patch_apply_error(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_patch is called with a valid branch name and a valid path
    AND git.am fails with an exception
    THEN a ChangeNotAppliedException is raised
    """
    mock_repo.git.am.side_effect = git.GitCommandError('am', '')
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.ChangeNotAppliedException):
            repo.branch.apply_patch('test_branch', './requirements.txt')

def test_apply_diff(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called with a valid branch_name and valid diff_path and valid message
    THEN index.commit is called
    """
    mock_repo.is_dirty.return_value = False
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        repo.branch.apply_diff('test_branch', './requirements.txt', 'message', True)

    assert repo.git.apply.called is True
    assert repo.git.commit.called is True


def test_apply_diff_on_invalid_branch(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called with an invalid branch_name and valid path
    THEN ReferenceNotFoundException raised
    AND git.apply not called
    """
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        mock_name_to_object.side_effect = git.exc.BadName()
        with pytest.raises(exceptions.ReferenceNotFoundException):
            repo.branch.apply_diff('invalid_branch', './requirements.txt', 'message')
    assert repo.git.apply.called is False


def test_apply_diff_on_dirty_workspace(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called on a dirty repository
    THEN a DirtyRepositoryException is raised
    AND git.apply not called
    """
    mock_repo.is_dirty.return_value = True
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.DirtyRepositoryException):
            repo.branch.apply_diff('test_branch', './requirements.txt', 'message')
    assert mock_repo.is_dirty.called is True
    assert repo.git.apply.called is False


def test_apply_diff_no_commit_message(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called with valid branch_name, valid diff_path and invalid message
    THEN CommitMessageMissingException raised
    AND index.commit not called
    """
    mock_repo.is_dirty.return_value = False
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.CommitMessageMissingException):
            repo.branch.apply_diff('test_branch', './requirements.txt', '')
    assert repo.git.commit.called is False


def test_apply_diff_not_a_file(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called with valid parameters
    THEN FileDoesntExistException raised
    AND git.apply not called
    """
    mock_repo.is_dirty.return_value = False
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.FileDoesntExistException):
            repo.branch.apply_diff('test_branch', 'doesntexist.txt', 'message')
    assert repo.git.apply.called is False

def test_apply_diff_checkout_error(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called with valid parameters
    AND checkout fails with an exception
    THEN a CheckoutException is raised
    AND index.commit not called
    """
    mock_repo.is_dirty.return_value = False
    mock_repo.git.checkout.side_effect = git.GitCommandError('checkout', '')
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.CheckoutException):
            repo.branch.apply_diff('invalid_branch', './requirements.txt', 'my message')
    assert repo.git.commit.called is False


def test_apply_diff_apply_fails(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called with a valid branch_name and valid diff_path and valid message
    AND git.apply fails with an exception
    THEN a ChangeNotAppliedException is raised
    """
    mock_repo.is_dirty.return_value = False
    mock_repo.git.apply.side_effect = git.GitCommandError('apply', '')
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.ChangeNotAppliedException):
            repo.branch.apply_diff('test_branch', './requirements.txt', 'message')
    assert repo.git.commit.called is False


def test_apply_diff_apply_nothing_to_commit(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN apply_diff is called with a valid branch_name and valid diff_path and valid message
    AND there are no diff changes
    THEN git.apply called
    AND index.commit not called
    """
    mock_repo.is_dirty.return_value = False
    repo = GitRepo('./', mock_repo)
    repo.git.diff.return_value = []

    with patch('git.repo.fun.name_to_object'):
        repo.branch.apply_diff('test_branch', './requirements.txt', 'message')

    assert repo.git.apply.called is True
    assert repo.git.commit.called is False
def test_abort_patch_apply(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN abort_patch_apply is called
    THEN git.am is called
    """
    repo = GitRepo('./', mock_repo)
    repo.branch.abort_patch_apply()
    assert repo.git.am.called is True


def test_abort_patch_apply_error(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN abort_patch_apply is called
    AND the abort fails with an exception
    THEN an AbortException is raised
    """
    mock_repo.git.am.side_effect = git.GitCommandError('abort_patch_apply', '')

    repo = GitRepo('./', mock_repo)
    with pytest.raises(exceptions.AbortException):
        repo.branch.abort_patch_apply()


def test_reverse_diff(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN reverse_diff is called with a valid diff_path
    THEN git.apply is called
    """
    repo = GitRepo('./', mock_repo)
    repo.branch.reverse_diff('./requirements.txt')
    assert repo.git.apply.called is True


def test_reverse_diff_diff_file_doesnt_exist(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN reverse_diff is called with an invalid diff_path
    THEN a FileDoesntExistException is raised
    AND git.apply is not called
    """
    repo = GitRepo('./', mock_repo)
    with pytest.raises(exceptions.FileDoesntExistException):
        repo.branch.reverse_diff('./thisdoesntexist')
    assert repo.git.apply.called is False


def test_reverse_diff_error(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN reverse_diff is called with a valid diff_path
    AND the reverse_diff fails with an exception
    THEN a RevertException is raised
    """
    mock_repo.git.apply.side_effect = git.GitCommandError('apply', '')

    repo = GitRepo('./', mock_repo)
    with pytest.raises(exceptions.RevertException):
        repo.branch.reverse_diff('./requirements.txt')
def test_log_diff(mock_repo, fake_commits):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN log_diff is called with two valid hashes
    THEN a list of log entries is returned
    """
    mock_repo.iter_commits.return_value = fake_commits
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        log_diff = repo.branch.log_diff('12345', '54321')

    assert len(log_diff) == 3
    assert log_diff[2] == (
        "commit 0020000000000000000\n"
        "Author: Test Author <testauthor@example.com>\n"
        "Date: Wed Dec 05 10:36:19 2018 \n\n"
        "This is a commit message (#2)\nWith some details."
    )


def test_short_log_diff(mock_repo, fake_commits):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN short_log_diff is called with two valid hashes
    THEN a list of log entries is returned
    """
    mock_repo.iter_commits.return_value = fake_commits
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        log_diff = repo.branch.short_log_diff('12345', '54321')

    assert len(log_diff) == 3
    assert log_diff == ["0000000 This is a commit message (#0)",
                        "0010000 This is a commit message (#1)",
                        "0020000 This is a commit message (#2)"]


def test_log_diff_with_pattern(mock_repo, fake_commits):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN log_diff is called with two valid hashes and a pattern
    THEN a list of log entries is returned
    """
    mock_repo.iter_commits.return_value = fake_commits
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        log_diff = repo.branch.log_diff('12345', '54321',
                                        pattern="$hash $author")

    assert log_diff == [
        "0000000000000000000 Test Author <testauthor@example.com>",
        "0010000000000000000 Test Author <testauthor@example.com>",
        "0020000000000000000 Test Author <testauthor@example.com>"
    ]


def test_log_diff_no_results(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN log_diff is called with two valid hashes
    AND there are no results
    THEN an empty list is returned
    """
    mock_repo.iter_commits.return_value = []
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object'):
        log_diff = repo.branch.log_diff('12345', '12345')

    assert log_diff == []


def test_log_diff_invalid_hash(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN log_diff is called with an invalid hash
    THEN a ReferenceNotFoundException is raised
    """
    repo = GitRepo('./', mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        mock_name_to_object.side_effect = git.exc.BadName()
        with pytest.raises(exceptions.ReferenceNotFoundException):
            repo.branch.log_diff('doesNotExist', '12345')

    assert mock_repo.iter_commits.called is False
def test_reset(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN hard_reset is called
    THEN repo.head.reset is called
    """
    mock_remote = Mock()
    mock_repo.remote.return_value = mock_remote

    repo = GitRepo(repo=mock_repo)
    with patch('git.repo.fun.name_to_object'):
        repo.branch.hard_reset()

    assert mock_remote.fetch.called is True  # sync is performed first
    assert mock_repo.head.reset.called is True  # then the reset


def test_reset_remote_reference_not_found(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN hard_reset is called
    AND the remote + branch reference doesn't exist
    THEN a ReferenceNotFoundException is raised
    """
    repo = GitRepo(repo=mock_repo)
    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        mock_name_to_object.side_effect = git.exc.BadName()
        with pytest.raises(exceptions.ReferenceNotFoundException):
            repo.branch.hard_reset(refresh=False, remote="doesntExist")

    assert mock_repo.head.reset.called is False


def test_reset_checkout_failure(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN hard_reset is called
    AND git.checkout fails
    THEN a CheckoutException is raised
    """
    mock_repo.git.checkout.side_effect = git.GitCommandError('checkout', '')

    repo = GitRepo(repo=mock_repo)
    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.CheckoutException):
            repo.branch.hard_reset(refresh=False)

    assert mock_repo.head.reset.called is False


def test_reset_reset_failure(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN hard_reset is called
    AND head.reset fails
    THEN a ResetException is raised
    """
    repo = GitRepo(repo=mock_repo)
    mock_repo.head.reset.side_effect = git.GitCommandError('reset', '')

    with patch('git.repo.fun.name_to_object'):
        with pytest.raises(exceptions.ResetException):
            repo.branch.hard_reset(refresh=False)
def test_local_branch_exists(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.exists is called with a valid branch and no remote
    THEN True is returned
    """
    repo = GitRepo(repo=mock_repo)
    mock_repo.branches = ["master", "test"]
    assert repo.branch.exists("test") is True


def test_local_branch_doesnt_exist(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.exists is called with an invalid branch and no remote
    THEN False is returned
    """
    repo = GitRepo(repo=mock_repo)
    mock_repo.branches = ["master", "test"]
    assert repo.branch.exists("another-test") is False


def test_branch_exists_with_invalid_remote(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.exists is called with a valid branch and an invalid remote
    THEN a RemoteException is raised
    """
    repo = GitRepo(repo=mock_repo)
    with pytest.raises(exceptions.RemoteException):
        repo.branch.exists("another", "doesntexist")


def test_remote_branch_exists(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.exists is called with a valid branch and a valid remote
    THEN True is returned
    """
    repo = GitRepo(repo=mock_repo)
    remote = Mock(spec=git.Remote)
    remote.configure_mock(name="testremote", refs=["testbranch"])
    mock_repo.remotes.extend([remote])

    assert repo.branch.exists("testbranch", "testremote") is True


def test_remote_branch_doesnt_exist(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.exists is called with an invalid branch and a valid remote
    THEN False is returned
    """
    repo = GitRepo(repo=mock_repo)
    remote = Mock(spec=git.Remote)
    remote.configure_mock(name="testremote", refs=[])
    mock_repo.remotes.extend([remote])

    assert repo.branch.exists("testbranch", "testremote") is False
def test_create_branch(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.create is called with a valid name and start_ref
    THEN git.branch is called
    """
    repo = GitRepo(repo=mock_repo)

    with patch('git.repo.fun.name_to_object'):
        assert repo.branch.create("test", "123456") is True

    repo.git.branch.assert_called_with("test", "123456")


def test_create_branch_with_bad_start_ref(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.create is called with a valid name and an invalid start_ref
    THEN a ReferenceNotFoundException is raised
    """
    repo = GitRepo(repo=mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        mock_name_to_object.side_effect = git.exc.BadName()
        with pytest.raises(exceptions.ReferenceNotFoundException):
            repo.branch.create("test", "badref")


def test_create_branch_already_exists(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.create is called with a valid name and start_ref
    AND the branch already exists
    THEN git.branch is not called
    """
    repo = GitRepo(repo=mock_repo)
    mock_repo.branches = ["test", "master"]

    with patch('git.repo.fun.name_to_object'):
        repo.branch.create("test", "123456")

    assert repo.git.branch.called is False


def test_create_branch_already_exists_and_reset_it(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.create is called with a valid name and start_ref
    AND the branch already exists and reset_if_exists is True
    THEN hard_reset_to_ref is called
    """
    repo = GitRepo(repo=mock_repo)
    mock_repo.branches = ["test", "master"]
    mock_hard_reset = Mock()
    repo.branch.hard_reset_to_ref = mock_hard_reset

    with patch('git.repo.fun.name_to_object'):
        repo.branch.create("test", "123456", True)

    assert mock_hard_reset.called is True
def test_remote_contains_branch_not_found(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.remote_contains is called with an invalid branch name
    THEN a ReferenceNotFoundException is raised
    AND the exception message contains 'branch'
    """
    repo = GitRepo(repo=mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        mock_name_to_object.side_effect = git.exc.BadName()
        with pytest.raises(exceptions.ReferenceNotFoundException) as exc_info:
            repo.branch.remote_contains('doesNotExist', '12345')

    assert 'branch' in str(exc_info.value)


def test_remote_contains_commit_not_found(mock_repo):
    """
    GIVEN GitRepo initialized with a path and repo
    WHEN branch.remote_contains is called with an invalid commit hash
    THEN a ReferenceNotFoundException is raised
    AND the exception message contains 'hash'
    """
    repo = GitRepo(repo=mock_repo)

    with patch('git.repo.fun.name_to_object') as mock_name_to_object:
        # The first name_to_object call checks the branch; let it succeed
        # and fail only for the commit hash.
        def side_effect(mock, ref):
            if ref != "origin/mybranch":
                raise git.exc.BadName

        mock_name_to_object.side_effect = side_effect
        with pytest.raises(exceptions.ReferenceNotFoundException) as exc_info:
            repo.branch.remote_contains('origin/mybranch', 'doesNotExist')

    assert 'hash' in str(exc_info.value)


def test_remote_contains_with_commit_present(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.remote_contains is called with a valid branch and hash
    AND git_repo.git.branch returns data
    THEN branch.remote_contains returns True
    """
    remote_branch = "origin/mybranch"
    mock_repo.git.branch.return_value = remote_branch

    repo = GitRepo(repo=mock_repo)
    with patch('git.repo.fun.name_to_object'):
        assert repo.branch.remote_contains(remote_branch, '12345') is True


def test_remote_contains_with_commit_absent(mock_repo):
    """
    GIVEN GitRepo is initialized with a path and repo
    WHEN branch.remote_contains is called with a valid branch and hash
    AND git_repo.git.branch returns an empty string
    THEN branch.remote_contains returns False
    """
    mock_repo.git.branch.return_value = ""

    repo = GitRepo(repo=mock_repo)
    with patch('git.repo.fun.name_to_object'):
        assert repo.branch.remote_contains("origin/mybranch", '12345') is False
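The tests above all consume `mock_repo` and `fake_commits` pytest fixtures that are defined outside this excerpt (typically in a `conftest.py`). As a hedged sketch of the minimum those fixtures would need to provide — the helper names `make_mock_repo` and `make_fake_commits` are illustrative, not from the original:

```python
# Hypothetical conftest.py sketch, built purely on unittest.mock so it has
# no GitPython dependency; in real use each helper would be wrapped with
# @pytest.fixture.
from unittest.mock import Mock


def make_mock_repo():
    """A Mock standing in for git.Repo: child attributes such as
    repo.git.apply or repo.head.reset are auto-created on access, so the
    tests can later inspect e.g. repo.git.apply.called."""
    repo = Mock()
    repo.branches = []                  # tests overwrite this per case
    repo.remotes = []                   # tests append fake remotes here
    repo.is_dirty.return_value = False  # default: clean working tree
    return repo


def make_fake_commits(count=3):
    """Mock commits shaped like the log_diff assertions expect."""
    commits = []
    for i in range(count):
        commit = Mock()
        commit.hexsha = f"{i:03d}" + "0" * 16  # e.g. '0020000000000000000'
        commit.message = f"This is a commit message (#{i})\nWith some details."
        commits.append(commit)
    return commits
```

Because attribute access on a `Mock` auto-creates recorded child mocks, assertions like `repo.git.apply.called is False` work without any further setup.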
# --- file: CodeAnalysis/SourceMeter_Interface/SourceMeter-8.2.0-x64-linux/Python/Tools/python/astroid/tests/testdata/python3/data/SSL1/__init__.py (ishtjot/susereumutep, Apache-2.0) ---
from .Connection1 import Connection
# --- file: tests/test_routes/test_pet.py (jordan-hamilton/petnet-feeder-service, MIT) ---
from fastapi.testclient import TestClient
SAMPLE_PET = {
    "name": "Minnie",
    "image": "base64data",
    "animal_type": "cat",
    "weight": 3628.74,
    "birthday": 1483164000000000,
    "activity_level": 5,
}


class MockPet:
    def __init__(self, device_hid: str):
        self.device_hid = device_hid


def test_list_pets(client: TestClient):
    response = client.get("/api/v1/pet")
    assert response.status_code == 200
    assert response.json() == []


def test_create_list_and_delete_pets(client: TestClient):
    response = client.get("/api/v1/pet/1")
    assert response.status_code == 404
    assert response.json() == {"detail": "No pet found with ID 1"}

    response = client.post("/api/v1/pet", json=SAMPLE_PET)
    assert response.status_code == 200
    assert response.json() == {"id": 1, **SAMPLE_PET, "device_hid": None}

    response = client.get("/api/v1/pet/1")
    assert response.status_code == 200
    assert response.json() == {"id": 1, **SAMPLE_PET, "device_hid": None}

    response = client.get("/api/v1/pet")
    assert response.status_code == 200
    assert len(response.json()) == 1

    response = client.delete("/api/v1/pet/1")
    assert response.status_code == 200

    response = client.get("/api/v1/pet/1")
    assert response.status_code == 404
    assert response.json() == {"detail": "No pet found with ID 1"}


def test_update_pet(client: TestClient):
    response = client.post("/api/v1/pet", json=SAMPLE_PET)
    pet_id = response.json()["id"]

    response = client.put(f"/api/v1/pet/{pet_id}", json={"name": "Bad Kitty!"})
    assert response.status_code == 200
    assert response.json()["name"] == "Bad Kitty!"
def test_set_and_get_pet_schedule(
    client: TestClient, with_registered_device: None, mocker
):
    from tests.test_database_models import SAMPLE_DEVICE_HID

    response = client.post("/api/v1/pet", json=SAMPLE_PET)
    pet_id = response.json()["id"]
    schedule_url = f"/api/v1/pet/{pet_id}/schedule"

    response = client.get(schedule_url)
    assert response.status_code == 200
    assert response.json() == {"events": []}

    meal_info = {"name": "Breakfast", "time": 3600, "portion": 0.0625}
    response = client.post(schedule_url, json=meal_info)
    assert response.status_code == 400
    assert response.json() == {
        "detail": "Can't schedule event on pet without assigned feeder!"
    }

    cmd = mocker.patch("feeder.api.routers.pet.router.client.send_cmd_schedule")
    client.put(f"/api/v1/pet/{pet_id}", json={"device_hid": SAMPLE_DEVICE_HID})
    response = client.post(schedule_url, json=meal_info)
    assert len(response.json()["events"]) == 1
    assert response.json()["events"][0] == {
        **meal_info,
        "event_id": 1,
        "enabled": True,
        "result": None,
    }
    cmd.assert_called_once()

    mocker.patch("feeder.api.routers.pet.get_pet", return_value=MockPet(""))
    response = client.post(schedule_url, json=meal_info)
    assert response.status_code == 400
    assert response.json() == {
        "detail": "Can't schedule event on pet without assigned feeder!"
    }

    mocker.patch(
        "feeder.api.routers.pet.get_pet", return_value=MockPet(SAMPLE_DEVICE_HID)
    )
    mocker.patch("feeder.api.routers.pet.KronosDevices.get", return_value=[])
    response = client.post(schedule_url, json=meal_info)
    assert response.status_code == 500
    assert response.json() == {"detail": "Assigned device doesn't exist!"}
def test_set_and_get_pet_schedule_after_feed(
    client: TestClient, with_sample_feed: None, mocker
):
    from tests.test_database_models import SAMPLE_DEVICE_HID

    response = client.post(
        "/api/v1/pet", json={**SAMPLE_PET, "device_hid": SAMPLE_DEVICE_HID}
    )
    pet_id = response.json()["id"]
    schedule_url = f"/api/v1/pet/{pet_id}/schedule"
    mocker.patch("feeder.api.routers.pet.router.client.send_cmd_schedule")

    meal_info = {"name": "Breakfast", "time": 0, "portion": 0.0625}
    response = client.post(schedule_url, json=meal_info)
    assert response.json()["events"][0]["result"] is None

    meal_info = {"time": 3600}
    response = client.put(f"{schedule_url}/1", json=meal_info)
    assert response.json()["events"][0]["result"]["device_hid"] == SAMPLE_DEVICE_HID

    mocker.patch("feeder.api.routers.pet.get_pet", return_value=MockPet(""))
    response = client.put(f"{schedule_url}/1", json=meal_info)
    assert response.status_code == 400
    assert response.json() == {
        "detail": "Can't update event on pet without assigned feeder!"
    }

    mocker.patch(
        "feeder.api.routers.pet.get_pet", return_value=MockPet(SAMPLE_DEVICE_HID)
    )
    mocker.patch("feeder.api.routers.pet.KronosDevices.get", return_value=[])
    response = client.put(f"{schedule_url}/1", json=meal_info)
    assert response.status_code == 500
    assert response.json() == {"detail": "Assigned device doesn't exist!"}
def test_delete_scheduled_event(
    client: TestClient, with_registered_device: None, mocker
):
    from tests.test_database_models import SAMPLE_DEVICE_HID

    response = client.post(
        "/api/v1/pet", json={**SAMPLE_PET, "device_hid": SAMPLE_DEVICE_HID}
    )
    pet_id = response.json()["id"]
    schedule_url = f"/api/v1/pet/{pet_id}/schedule"
    mocker.patch("feeder.api.routers.pet.router.client.send_cmd_schedule")

    meal_info = {"name": "Breakfast", "time": 0, "portion": 0.0625}
    response = client.post(schedule_url, json=meal_info)
    evtid = response.json()["events"][0]["event_id"]

    response = client.delete(f"{schedule_url}/{evtid}")
    assert response.status_code == 200
    assert response.json()["events"] == []

    mocker.patch("feeder.api.routers.pet.get_pet", return_value=MockPet(""))
    response = client.delete(f"{schedule_url}/{evtid}")
    assert response.status_code == 400
    assert response.json() == {
        "detail": "Can't update event on pet without assigned feeder!"
    }

    mocker.patch(
        "feeder.api.routers.pet.get_pet", return_value=MockPet(SAMPLE_DEVICE_HID)
    )
    mocker.patch("feeder.api.routers.pet.KronosDevices.get", return_value=[])
    response = client.delete(f"{schedule_url}/{evtid}")
    assert response.status_code == 500
    assert response.json() == {"detail": "Assigned device doesn't exist!"}
# --- file: pycle/bicycle-scrapes/epey-scrape/downLink6.py (fusuyfusuy/School-Projects, MIT) ---
from bs4 import BeautifulSoup
import os
import wget
from urllib.request import Request, urlopen
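Only the imports and the scraped `bicycles` list survive in this chunk; the download logic that presumably follows is truncated. As a hedged sketch of the typical pattern these imports suggest — the helper names `build_request` and `safe_filename` are illustrative assumptions, not from the original:

```python
# Hypothetical sketch of the download step implied by the imports above.
# Many sites reject urllib's default User-Agent, hence the browser-like
# header; the helper names here are assumptions, not the original code.
from urllib.request import Request


def build_request(url):
    """Prepare a request with a browser-like User-Agent header."""
    return Request(url, headers={"User-Agent": "Mozilla/5.0"})


def safe_filename(link):
    """Derive a local file name from a product URL, e.g.
    'https://www.epey.com/bisiklet/bisan-rx-9100.html' -> 'bisan-rx-9100.html'."""
    return link.rstrip("/").rsplit("/", 1)[-1]
```

Each entry's `link` below would then be opened with `urlopen(build_request(link))` and parsed with BeautifulSoup, or saved locally under `safe_filename(link)`.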
bicycles=[{'name': 'Bisan RX 9100 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bisan-rx-9100.html'}, {'name': 'Carraro CR-T World Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-cr-t-world.html'}, {'name': 'Bianchi Snap Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-snap-26.html'}, {'name': 'Kron XC100 27.5 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-xc100-27-5-hd.html'}, {'name': 'Vertech Aurora 27.5 Bisiklet', 'link': 'https://www.epey.com/bisiklet/vertech-aurora-27-5.html'}, {'name': 'Kron Fold 3.0 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-fold-3-0.html'}, {'name': 'Kron XC150 26 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-xc150-26-v.html'}, {'name': 'Bisan FX 3500 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bisan-fx-3500.html'}, {'name': 'Salcano NG650 27.5 MD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-ng650-md-27-5.html'}, {'name': 'Bisan CTS 5100 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bisan-cts-5100-26.html'}, {'name': 'Bianchi Aspid 49 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-aspid-49.html'}, {'name': 'Kron EFD100 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-efd100.html'}, {'name': 'Bianchi Travel 504 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-travel-504.html'}, {'name': 'Kron RF100 26 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-rf100-26-v.html'}, {'name': 'Kron RC4000 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-rc4000.html'}, {'name': 'Bianchi RCX 226 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-rcx-226-26.html'}, {'name': 'RKS T7 Bisiklet', 'link': 'https://www.epey.com/bisiklet/rks-t7.html'}, {'name': 'Corelli Atrox 2.0 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-atrox-2-0.html'}, {'name': 'Carraro Race 006 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-race-006.html'}, {'name': 'Corelli Desire Bisiklet', 'link': 
'https://www.epey.com/bisiklet/corelli-desire.html'}, {'name': 'Corelli Neon 1.1 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-neon-1-1.html'}, {'name': 'Kron CX100 Unisex Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-cx100-unisex.html'}, {'name': 'Bianchi Aspid 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-aspid-26.html'}, {'name': 'Salcano Üsküp 700 Man Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-uskup-700-man.html'}, {'name': 'Ümit 2037 Folding 2D Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2037-folding-2d.html'}, {'name': 'Kross Trans 4.0 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kross-trans-4-0.html'}, {'name': 'Gitane Gipsy Bisiklet', 'link': 'https://www.epey.com/bisiklet/gitane-gipsy.html'}, {'name': 'Salcano Lion 26 MD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-lion-26-md.html'}, {'name': 'Corelli Swing 3.2 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-swing-3-2.html'}, {'name': 'Corelli Voras 3 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-voras-3.html'}, {'name': 'Peugeot JM242 Bisiklet', 'link': 'https://www.epey.com/bisiklet/peugeot-jm242.html'}, {'name': 'Corelli Smile 16 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-smile-16.html'}, {'name': 'Carraro Buffalo 24 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-buffalo-24.html'}, {'name': 'Carraro Juliana 24 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-juliana-24.html'}, {'name': 'Ümit 1416 Hello Kitty Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-1416-hello-kitty.html'}, {'name': 'Ümit 2675 Spartan Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2675-spartan.html'}, {'name': 'Kron CX50 24 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-cx50-24-v.html'}, {'name': 'Corelli Oldtown L Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-oldtown-l.html'}, {'name': 'Corelli Snoop 2.0 Bisiklet', 'link': 
'https://www.epey.com/bisiklet/corelli-snoop-2-0.html'}, {'name': 'Corelli Neon 1.2 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-neon-1-2.html'}, {'name': 'Salcano Daisy 16 Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-daisy-16.html'}, {'name': 'Corelli Adonis 1.0 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-adonis-1-0.html'}, {'name': 'Kron R2 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-r2.html'}, {'name': 'Merida Scultura 300 Bisiklet', 'link': 'https://www.epey.com/bisiklet/merida-scultura-300.html'}, {'name': 'Bianchi Captain Kidd 14 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-captain-kidd.html'}, {'name': 'Ümit 2025 Winx Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2025-winx.html'}, {'name': 'Salcano City Wind 10 Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-city-wind-10.html'}, {'name': 'Salcano Bodrum 24 Lady Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-bodrum-24-lady.html'}, {'name': 'Bianchi Rally 20 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-rally-20.html'}, {'name': 'Ümit 2424 Redhawk 24 Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2424-redhawk-24.html'}, {'name': 'Ümit 1204 Transformers Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-1204-transformers.html'}, {'name': 'Ümit 1640 Stitch Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-1640-stitch.html'}, {'name': 'Mosso 29 Wildfire LTD HYD Bisiklet', 'link': 'https://www.epey.com/bisiklet/mosso-29-wildfire-ltd-hyd.html'}, {'name': 'Salcano 400 24 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-400-24-v.html'}, {'name': 'Salcano City Wind HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-city-wind-hd.html'}, {'name': 'Salcano NG333 27.5 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-ng333-27-5-hd.html'}, {'name': 'Carraro Daytona 727 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-daytona-727.html'}, {'name': 'Kron FD3000 
Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-fd3000.html'}, {'name': 'Kron FXC500 26 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-fxc500-26-hd.html'}, {'name': 'Vortex 5.0 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/vortex-5-0-26.html'}, {'name': 'Kron XC75 20 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-xc75-20.html'}, {'name': 'Kron Vortex 4.0 24 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-vortex-4-0-24-v.html'}, {'name': 'Kron Venüs 3.0 24 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-venus-3-0-24.html'}, {'name': 'Peugeot T17-28 Bisiklet', 'link': 'https://www.epey.com/bisiklet/peugeot-t17-28.html'}, {'name': 'Peugeot M14-26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/peugeot-m14-26.html'}, {'name': 'Bianchi Strada 104 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-strada-104.html'}, {'name': 'Carraro Modena Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-modena.html'}, {'name': 'Peugeot F12 Bisiklet', 'link': 'https://www.epey.com/bisiklet/peugeot-f12.html'}, {'name': 'Carraro Elite 705 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-elite-705.html'}, {'name': 'Carraro Force 900 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-force-900.html'}, {'name': 'Berg Ford Mustang GT Bisiklet', 'link': 'https://www.epey.com/bisiklet/berg-ford-mustang-gt.html'}, {'name': 'Berg Buzzy Fiat 500 Bisiklet', 'link': 'https://www.epey.com/bisiklet/berg-buzzy-fiat-500.html'}, {'name': 'Ümit 2027 30 Alanya Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2027-30-alanya.html'}, {'name': 'Ümit 2459 Albatros 2D Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2459-albatros-2d.html'}, {'name': 'Ümit 2646 Gigantus 2D Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2646-gigantus-2d.html'}, {'name': 'Ümit 2756 Accrue 2D Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2756-accrue-2d.html'}, {'name': 'Kron CX50 28 V Bisiklet', 'link': 
'https://www.epey.com/bisiklet/kron-cx50-28-v.html'}, {'name': 'Sedona 300L Bisiklet', 'link': 'https://www.epey.com/bisiklet/sedona-300l.html'}, {'name': 'Corelli Sandy 2.0 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-sandy-2-0.html'}, {'name': 'Corelli Swing 1.1 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-swing-1-1.html'}, {'name': 'Corelli Teton 2.0 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-teton-2-0.html'}, {'name': 'Bianchi Buffalo 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-buffalo-26.html'}, {'name': 'Carraro 704 Grande Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-704-grande.html'}, {'name': 'Ghost Kato 4.7 AL Bisiklet', 'link': 'https://www.epey.com/bisiklet/ghost-kato-4-7-al.html'}, {'name': 'Bianchi Infinito CV Chorus Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-infinito-cv-chorus.html'}, {'name': 'Salcano İmpetus 27.5 Deore Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-impetus-27-5-deore.html'}, {'name': 'Salcano NG450 29 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-ng450-29-v.html'}, {'name': 'Salcano NG750 24 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-ng750-24-hd.html'}, {'name': 'Salcano NG850 26 Man Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-ng850-26-man.html'}, {'name': 'Salcano XRS035 Tiagra Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-xrs035-tiagra.html'}, {'name': 'Salcano City Fun 50 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-city-fun-50-hd.html'}, {'name': 'Salcano Marina Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-marina.html'}, {'name': 'Soho Fitt 9.1 Bisiklet', 'link': 'https://www.epey.com/bisiklet/soho-fitt-9-1.html'}, {'name': 'Corelli March 2.0 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-march-2-0.html'}, {'name': 'Corelli Adonis 2.1 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-adonis-2-1.html'}, {'name': 'Corelli Jazz 1.1 
Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-jazz-1-1.html'}, {'name': 'Corelli Cyborg 1.2 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-cyborg-1-2.html'}, {'name': 'Corelli Scopri 1.2 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corelli-scopri-1-2.html'}, {'name': 'Orbis Reflex 20 Bisiklet', 'link': 'https://www.epey.com/bisiklet/orbis-reflex-20.html'}, {'name': 'Orbis Tweety 18 Bisiklet', 'link': 'https://www.epey.com/bisiklet/orbis-tweety-18.html'}, {'name': 'Orbis Buddy 14 Bisiklet', 'link': 'https://www.epey.com/bisiklet/orbis-buddy-14.html'}, {'name': 'Orbis Jungle Tiger 16 Bisiklet', 'link': 'https://www.epey.com/bisiklet/orbis-jungle-tiger-16.html'}, {'name': 'Orbis Viper 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/orbis-viper-26.html'}, {'name': 'Orbis Escape 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/orbis-escape-26.html'}, {'name': 'Tern Verge X20 Bisiklet', 'link': 'https://www.epey.com/bisiklet/tern-verge-x20.html'}, {'name': 'Dahon Mariner D8 Bisiklet', 'link': 'https://www.epey.com/bisiklet/dahon-mariner-d8.html'}, {'name': 'Kron XC100 20 Lady V Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-xc100-20-lady-v.html'}, {'name': 'Kron TX100 MD Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-tx100-city-md.html'}, {'name': 'Kron WSX450 27.5 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-wsx450-27-5-man-hd.html'}, {'name': 'Whistle Miwok 1721 Bisiklet', 'link': 'https://www.epey.com/bisiklet/whistle-miwok-1721.html'}, {'name': 'Mosso 24 WildFire V Girls Bisiklet', 'link': 'https://www.epey.com/bisiklet/mosso-24-wildfire-v-girls.html'}, {'name': 'Mosso Legarda 1721 MDM V Bisiklet', 'link': 'https://www.epey.com/bisiklet/mosso-legarda-1721-mdm-v.html'}, {'name': 'Mosso 771TB3 DMD Deore Bisiklet', 'link': 'https://www.epey.com/bisiklet/mosso-771tb3-dmd-deore.html'}, {'name': 'Mosso 790PRO 105 Bisiklet', 'link': 'https://www.epey.com/bisiklet/mosso-790pro-105.html'}, {'name': 'Mosso 
7581XC XT Bisiklet', 'link': 'https://www.epey.com/bisiklet/mosso-7581xc-xt.html'}, {'name': 'Merida Speeder 900 Bisiklet', 'link': 'https://www.epey.com/bisiklet/merida-speeder-900.html'}, {'name': 'Ghost Tacana 2 29 Bisiklet', 'link': 'https://www.epey.com/bisiklet/ghost-tacana-2-29.html'}, {'name': 'Ghost Lanao 2 27.5 Bisiklet', 'link': 'https://www.epey.com/bisiklet/ghost-lanao-2-27-5.html'}, {'name': 'Carraro Daytona 2927 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-daytona-2927.html'}, {'name': 'Carraro Daytona 2724 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-daytona-2724.html'}, {'name': 'Bisan XTY 5400 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/bisan-xty-5400-hd.html'}, {'name': 'Salcano Sarajevo 24 Lady Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-sarajevo-24-lady.html'}, {'name': 'Bianchi RCX 110 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-rcx-110.html'}, {'name': 'Bianchi Touring 313 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-touring-313.html'}, {'name': 'Salcano Nova 24 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-nova-24-v.html'}, {'name': 'Salcano NG450 26 HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-ng450-26-hd.html'}, {'name': 'Salcano City Sport 30 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-city-sport-30-v.html'}, {'name': 'Ümit 1649 Monster High Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-1649-monster-high.html'}, {'name': 'Cube Attain Race Disk Bisiklet', 'link': 'https://www.epey.com/bisiklet/cube-attain-race-disk.html'}, {'name': 'Arbike 2802 Bisiklet', 'link': 'https://www.epey.com/bisiklet/arbike-2802.html'}, {'name': 'Anatolia Amörtisörlü Bayan 24 Bisiklet', 'link': 'https://www.epey.com/bisiklet/anatolia-amortisorlu-bayan-24.html'}, {'name': 'Salcano NG650 29 MD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-ng650-md-29.html'}, {'name': 'Cube LTD Race 27.5 Bisiklet', 'link': 
'https://www.epey.com/bisiklet/cube-ltd-race-27-5.html'}, {'name': 'Cube Cross Pro Bisiklet', 'link': 'https://www.epey.com/bisiklet/cube-cross-pro.html'}, {'name': 'Cube Cross Race Cyclocross Bisiklet', 'link': 'https://www.epey.com/bisiklet/cube-cross-race-cyclocross.html'}, {'name': 'Cube Agree C:62 Pro Bisiklet', 'link': 'https://www.epey.com/bisiklet/cube-agree-c62-pro.html'}, {'name': 'Cube LTD Race 29 Bisiklet', 'link': 'https://www.epey.com/bisiklet/cube-ltd-race-29.html'}, {'name': 'Cube Elite C68 SL 29 Bisiklet', 'link': 'https://www.epey.com/bisiklet/cube-elite-c68-sl-29.html'}, {'name': 'Merida BIG.NINE 600 29 Bisiklet', 'link': 'https://www.epey.com/bisiklet/merida-big-nine-600-29.html'}, {'name': 'Merida BIG.SEVEN Team Issue 27.5 Bisiklet', 'link': 'https://www.epey.com/bisiklet/merida-big-seven-team-issue-27-5.html'}, {'name': 'Trek X-Caliber 7 27.5 Bisiklet', 'link': 'https://www.epey.com/bisiklet/trek-x-caliber-7-27-5.html'}, {'name': 'Cannondale Trail SL 3 29 Bisiklet', 'link': 'https://www.epey.com/bisiklet/cannondale-trail-sl-3-29.html'}, {'name': 'Corratec Shape Urban Lady 28 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corratec-shape-urban-lady-28.html'}, {'name': 'Corratec Cct Team Ultegra 11S 28 Bisiklet', 'link': 'https://www.epey.com/bisiklet/corratec-cct-team-ultegra-11s-28.html'}, {'name': 'Cannondale Supersix Evo HI Mod Ultegra 28 Bisiklet', 'link': 'https://www.epey.com/bisiklet/cannondale-supersix-evo-hi-mod-ultegra-28.html'}, {'name': 'Cannondale Alu Flash 2 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/cannondale-alu-flash-2-26.html'}, {'name': 'Cannondale Trail Womens 4 27.5 Bisiklet', 'link': 'https://www.epey.com/bisiklet/cannondale-trail-womens-4-27-5.html'}, {'name': 'Schwinn Fastback 3 Men 28 Bisiklet', 'link': 'https://www.epey.com/bisiklet/schwinn-fastback-3-men-28.html'}, {'name': 'Geotech Legend 2.0 Carbon 28 Bisiklet', 'link': 'https://www.epey.com/bisiklet/geotech-legend-2-0-carbon-28.html'}, {'name': 'Kron 
XC250 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-xc250-26.html'}, {'name': 'Kron TX150 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/kron-tx150-v.html'}, {'name': 'Salcano Igman 26 Deore Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-igman-deore-26.html'}, {'name': 'Salcano Astro 27.5 V Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-astro-v-27-5.html'}, {'name': 'Salcano XRS001 Red Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-xrs001-red.html'}, {'name': 'Salcano XRS060 Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-xrs060.html'}, {'name': 'Salcano Badboy 20 Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-badboy-20.html'}, {'name': 'Salcano Manhattan Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-manhattan.html'}, {'name': 'Salcano Assos 20 27.5 X1 Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-assos-20-x1-27-5.html'}, {'name': 'Salcano Istanbul 30 Lady HD Bisiklet', 'link': 'https://www.epey.com/bisiklet/salcano-istanbul-30-lady-hd.html'}, {'name': 'Scott Aspect 680 Bisiklet', 'link': 'https://www.epey.com/bisiklet/scott-aspect-680-26.html'}, {'name': 'Sedona 340 Bisiklet', 'link': 'https://www.epey.com/bisiklet/sedona-340.html'}, {'name': 'Bianchi Hotwheels 12 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-hotwheels-12.html'}, {'name': 'Bianchi Daisy Bayan 24 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-daisy-bayan-24.html'}, {'name': 'Bianchi ARX 629 29 inç Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-arx-629-29.html'}, {'name': 'Bianchi RCX 326 Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-rcx-326-26.html'}, {'name': 'Bianchi Luca Bisiklet', 'link': 'https://www.epey.com/bisiklet/bianchi-luca-24.html'}, {'name': 'Carraro Sportive 324 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-sportive-324.html'}, {'name': 'Carraro Country Three Nexus 26 Bisiklet', 'link': 
'https://www.epey.com/bisiklet/carraro-country-three-nexus-26.html'}, {'name': 'Carraro Big 2720 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-big-2720.html'}, {'name': 'Carraro Force 291 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-force-291.html'}, {'name': 'Sedona 870 27.5 Bisiklet', 'link': 'https://www.epey.com/bisiklet/sedona-870-27-5.html'}, {'name': 'Carraro Big 2920 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-big-2920.html'}, {'name': 'Ümit 2755 Kratos 2D Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2755-kratos-2d.html'}, {'name': 'Ümit 2831 Velocity Lady Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2831-velocity-lady.html'}, {'name': 'Ümit 2699 Wagen 26 Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2699-wagen-26.html'}, {'name': 'Ümit 2071 Superbomber 20 Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2071-superbomber-20.html'}, {'name': 'Ümit 2801 Taurus Lady 28 Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2801-taurus-lady-28.html'}, {'name': 'Ümit 2470 Angela 24 Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2470-angela-24.html'}, {'name': 'Ümit 2967 Mirage 2D Bisiklet', 'link': 'https://www.epey.com/bisiklet/umit-2967-mirage-2d.html'}, {'name': 'RKS TN25 Bisiklet', 'link': 'https://www.epey.com/bisiklet/rks-tn25.html'}, {'name': 'Carraro Kifuka 29 Bisiklet', 'link': 'https://www.epey.com/bisiklet/carraro-kifuka-29.html'}]
# assumes `from urllib.request import Request, urlopen` earlier in the file
for i in bicycles:
    url = i['link']
    try:
        req = Request(url, headers={'User-Agent': 'Mozilla/5.0'})
        webpage = urlopen(req).read()
    except OSError:  # URLError subclasses OSError; a bare except would also swallow KeyboardInterrupt
        print("err in " + i['link'])
    else:
        print("Downloaded " + i['name'] + " ", end="\r")
        fileName = i['name'].replace('/', '_')
        # the original called `f.close` without parentheses, so the file was never closed;
        # a context manager closes it reliably
        with open("./listItems/" + fileName + '.html', 'wb') as f:
            f.write(webpage)
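The loop above sanitizes only `/` before using a product name as a file name, which still breaks on other characters that are illegal on some file systems. A slightly more defensive helper could look like this (a sketch, not part of the original script — `safe_filename` and `target_path` are made-up names):

```python
import os
import re


def safe_filename(name: str) -> str:
    # Replace every character that is illegal in Windows/NTFS file names,
    # not just '/', and trim surrounding whitespace.
    return re.sub(r'[\\/:*?"<>|]', '_', name).strip()


def target_path(name: str, out_dir: str = "./listItems") -> str:
    # Build the output path the same way the scraper does
    return os.path.join(out_dir, safe_filename(name) + ".html")


print(target_path("Corelli Sandy 2.0 Bisiklet"))
```

The regex covers the usual offenders; anything stricter (e.g. hashing the name) trades readability of the output directory for robustness.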
| 902.227273 | 19,269 | 0.689052 | 3,043 | 19,849 | 4.494578 | 0.120605 | 0.158807 | 0.224976 | 0.264678 | 0.652702 | 0.589384 | 0.576808 | 0.554288 | 0.293485 | 0.100022 | 0 | 0.060522 | 0.078493 | 19,849 | 21 | 19,270 | 945.190476 | 0.687169 | 0 | 0 | 0 | 0 | 1.833333 | 0.814792 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.222222 | 0.111111 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c0a2435931178fc5a136929d0a58143a46d4ae03 | 157 | py | Python | jekoffice/views.py | Erbarbar/JekOffice | ef87f1e824a6420a040bf65e9acbc59bfbd51835 | [
"Apache-2.0"
] | null | null | null | jekoffice/views.py | Erbarbar/JekOffice | ef87f1e824a6420a040bf65e9acbc59bfbd51835 | [
"Apache-2.0"
] | 14 | 2020-02-12T00:42:56.000Z | 2022-03-11T23:50:49.000Z | jekoffice/views.py | Erbarbar/JekOffice | ef87f1e824a6420a040bf65e9acbc59bfbd51835 | [
"Apache-2.0"
] | 1 | 2019-06-23T18:04:44.000Z | 2019-06-23T18:04:44.000Z | from django.shortcuts import render
from django.http import HttpResponse
def home_view(request, *args, **kwargs):
    return render(request, 'home.html', {})
| 26.166667 | 42 | 0.751592 | 21 | 157 | 5.571429 | 0.714286 | 0.17094 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127389 | 157 | 5 | 43 | 31.4 | 0.854015 | 0 | 0 | 0 | 0 | 0 | 0.057325 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
238fc999ee09801cbcb0e11c2003f51acb96beb3 | 164 | py | Python | src/scraping/tasks/__init__.py | mstrechen/advanced-news-scraper | dc54a057eb7c14d0e390b82f6b308f5a924cb966 | [
"MIT"
] | null | null | null | src/scraping/tasks/__init__.py | mstrechen/advanced-news-scraper | dc54a057eb7c14d0e390b82f6b308f5a924cb966 | [
"MIT"
] | 3 | 2021-04-06T18:16:57.000Z | 2021-12-13T20:55:52.000Z | src/scraping/tasks/__init__.py | mstrechen/advanced-news-scraper | dc54a057eb7c14d0e390b82f6b308f5a924cb966 | [
"MIT"
] | null | null | null | # flake8: noqa
from .run_site_parsers import run_site_parsers_task
from .parse_article import parse_article_task
from .parse_news_list import parse_news_list_task
| 27.333333 | 51 | 0.871951 | 27 | 164 | 4.814815 | 0.444444 | 0.107692 | 0.215385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006757 | 0.097561 | 164 | 5 | 52 | 32.8 | 0.871622 | 0.073171 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
23958e663fc659f6ea5f836f5d56bbfcef82c9a2 | 329 | py | Python | cfn_ami_to_mapping/__init__.py | vbanakh/cfn-ami-to-mapping | 3946be1272a769b8791a4abcdb754822eb671a09 | [
"MIT"
] | 19 | 2019-09-04T06:53:21.000Z | 2019-09-20T08:49:02.000Z | cfn_ami_to_mapping/__init__.py | vbanakh/cfn-ami-to-mapping | 3946be1272a769b8791a4abcdb754822eb671a09 | [
"MIT"
] | 1 | 2021-02-11T11:46:32.000Z | 2021-02-11T11:46:32.000Z | cfn_ami_to_mapping/__init__.py | vbanakh/cfn-ami-to-mapping | 3946be1272a769b8791a4abcdb754822eb671a09 | [
"MIT"
] | 3 | 2019-11-03T10:20:39.000Z | 2021-11-09T17:56:42.000Z | from cfn_ami_to_mapping.get.get import Get
from cfn_ami_to_mapping.enrich.enrich import Enrich
from cfn_ami_to_mapping.transform.transform import Transform
from cfn_ami_to_mapping.generate.generate import Generate
from cfn_ami_to_mapping.validate import Validate
__all__ = ('Get', 'Enrich', 'Transform', 'Generate', 'Validate')
| 41.125 | 64 | 0.835866 | 50 | 329 | 5.12 | 0.24 | 0.136719 | 0.195313 | 0.234375 | 0.371094 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 329 | 7 | 65 | 47 | 0.850498 | 0 | 0 | 0 | 1 | 0 | 0.103343 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.833333 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
23a138cdf14b042b2d115642950b5ce684fe280d | 31 | py | Python | image_to_latex/data/__init__.py | niits/image-to-latex | 602b4263ac849dbdfb69988dde6df5936efd9c95 | [
"MIT"
] | 944 | 2021-07-02T07:27:01.000Z | 2022-03-31T11:16:29.000Z | image_to_latex/data/__init__.py | niits/image-to-latex | 602b4263ac849dbdfb69988dde6df5936efd9c95 | [
"MIT"
] | 15 | 2021-08-09T16:47:28.000Z | 2022-03-18T09:41:42.000Z | image_to_latex/data/__init__.py | niits/image-to-latex | 602b4263ac849dbdfb69988dde6df5936efd9c95 | [
"MIT"
] | 166 | 2021-08-05T11:44:11.000Z | 2022-03-30T10:55:58.000Z | from .im2latex import Im2Latex
| 15.5 | 30 | 0.83871 | 4 | 31 | 6.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0.129032 | 31 | 1 | 31 | 31 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
23a532902ebef892bcf99bcf0595f60b0095c683 | 137 | py | Python | mixed_tabs_and_spaces.py | sudoalgorithm/wtfPython | 310ba05449d1a0811ad98485f2ef7617e8c696b3 | [
"WTFPL"
] | 4 | 2018-12-28T19:56:44.000Z | 2021-11-14T19:11:10.000Z | mixed_tabs_and_spaces.py | charettes/wtfPython | ddbbe691860cbea5d2b46781d7217f0ea2797391 | [
"WTFPL"
] | 34 | 2019-11-11T10:56:17.000Z | 2021-08-02T08:28:09.000Z | mixed_tabs_and_spaces.py | charettes/wtfPython | ddbbe691860cbea5d2b46781d7217f0ea2797391 | [
"WTFPL"
] | 1 | 2019-06-17T18:39:54.000Z | 2019-06-17T18:39:54.000Z | def square(x):
    sum_so_far = 0
    for counter in range(x):
        sum_so_far = sum_so_far + x
	return sum_so_far  # indented with a tab on purpose: that is the gotcha this file demonstrates
print(square(10))
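This file is the wtfPython mixed-indentation gotcha: under Python 2, the tab before `return` expands so that the statement lands inside the loop, and `square(10)` returns 10 instead of 100. CPython 3 rejects such files outright with `TabError`, which can be checked without writing a file (a sketch of the behavior, assuming a CPython 3 interpreter):

```python
# The same layout as the file above: the loop body is indented with
# spaces, but `return` is indented with a single tab character.
source = (
    "def square(x):\n"
    "    sum_so_far = 0\n"
    "    for counter in range(x):\n"
    "        sum_so_far = sum_so_far + x\n"
    "\treturn sum_so_far\n"
)

try:
    compile(source, "<mixed>", "exec")
    outcome = "accepted"
except TabError as error:
    outcome = "TabError: " + str(error)

print(outcome)
```

The tokenizer compares indentation under two tab-width conventions; because the tab is "equal" to the eight spaces under one and "less" under the other, the indentation is ambiguous and the compile fails.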
| 17.125 | 35 | 0.649635 | 26 | 137 | 3.115385 | 0.538462 | 0.246914 | 0.395062 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029412 | 0.255474 | 137 | 7 | 36 | 19.571429 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.166667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
23e920d5484ad56704b2277ce962dbeb9fd5d8a5 | 140 | py | Python | challenge_0/python/walkingluo/src/hello_world.py | rchicoli/2017-challenges | 44f0b672e5dea34de1dde131b6df837d462f8e29 | [
"Apache-2.0"
] | 271 | 2017-01-01T22:58:36.000Z | 2021-11-28T23:05:29.000Z | challenge_0/python/walkingluo/src/hello_world.py | AakashOfficial/2017Challenges | a8f556f1d5b43c099a0394384c8bc2d826f9d287 | [
"Apache-2.0"
] | 283 | 2017-01-01T23:26:05.000Z | 2018-03-23T00:48:55.000Z | challenge_0/python/walkingluo/src/hello_world.py | AakashOfficial/2017Challenges | a8f556f1d5b43c099a0394384c8bc2d826f9d287 | [
"Apache-2.0"
] | 311 | 2017-01-01T22:59:23.000Z | 2021-09-23T00:29:12.000Z | #
# Print Hello World
#
def print_hello_world():
    print("Hello World")


if __name__ == '__main__':
    print_hello_world()
| 11.666667 | 27 | 0.607143 | 16 | 140 | 4.5625 | 0.4375 | 0.547945 | 0.821918 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.271429 | 140 | 11 | 28 | 12.727273 | 0.715686 | 0.121429 | 0 | 0 | 0 | 0 | 0.175926 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0 | 0 | 0.25 | 0.75 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
f1b0a1e0150ebdaafcabbc1df42f046e63821442 | 49 | py | Python | keycloak_admin_aio/_resources/users/by_id/role_mappings/__init__.py | V-Mann-Nick/keycloak-admin-aio | 83ac1af910e492a5864eb369aacfc0512e5c8c45 | [
"Apache-2.0"
] | 12 | 2021-11-08T18:03:09.000Z | 2022-03-17T16:34:06.000Z | keycloak_admin_aio/_resources/users/by_id/role_mappings/__init__.py | V-Mann-Nick/keycloak-admin-aio | 83ac1af910e492a5864eb369aacfc0512e5c8c45 | [
"Apache-2.0"
] | null | null | null | keycloak_admin_aio/_resources/users/by_id/role_mappings/__init__.py | V-Mann-Nick/keycloak-admin-aio | 83ac1af910e492a5864eb369aacfc0512e5c8c45 | [
"Apache-2.0"
] | 1 | 2021-11-14T13:55:30.000Z | 2021-11-14T13:55:30.000Z | from .role_mappings import UsersByIdRoleMappings
| 24.5 | 48 | 0.897959 | 5 | 49 | 8.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 49 | 1 | 49 | 49 | 0.955556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f1bf5d467676243b64c2aab381c69cd2eb18e6ff | 29 | py | Python | python-package/insightface/data/__init__.py | qaz734913414/insightface | 4101fe608ca1d38604a23d53f32314ce8a28fe79 | [
"MIT"
] | 12,377 | 2017-12-04T02:46:57.000Z | 2022-03-31T16:48:31.000Z | python-package/insightface/data/__init__.py | qaz734913414/insightface | 4101fe608ca1d38604a23d53f32314ce8a28fe79 | [
"MIT"
] | 1,851 | 2017-12-05T05:41:23.000Z | 2022-03-30T13:06:22.000Z | python-package/insightface/data/__init__.py | qaz734913414/insightface | 4101fe608ca1d38604a23d53f32314ce8a28fe79 | [
"MIT"
] | 4,198 | 2017-12-05T02:57:19.000Z | 2022-03-30T10:29:37.000Z | from .image import get_image
| 14.5 | 28 | 0.827586 | 5 | 29 | 4.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f1e7a20428f343dfdeb5ebf3f0d1036480772bab | 41 | py | Python | src/decomposition/__init__.py | frmunozz/irregular-bag-of-pattern | d1e118546aca89bff75036a06f74a76a2fdb1a82 | [
"MIT"
] | 1 | 2022-01-23T21:40:10.000Z | 2022-01-23T21:40:10.000Z | src/decomposition/__init__.py | frmunozz/irregular-bag-of-pattern | d1e118546aca89bff75036a06f74a76a2fdb1a82 | [
"MIT"
] | null | null | null | src/decomposition/__init__.py | frmunozz/irregular-bag-of-pattern | d1e118546aca89bff75036a06f74a76a2fdb1a82 | [
"MIT"
] | null | null | null | from .lsa import LSA
from .pca import PCA | 20.5 | 20 | 0.780488 | 8 | 41 | 4 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.170732 | 41 | 2 | 21 | 20.5 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7b146ced665dec2efc1941e7d08248e3ed3d4e79 | 609 | py | Python | bark_ml/environments/blueprints/__init__.py | mansoorcheema/bark-ml | 349c0039a5f54778d6b7aea7fd18e3e979efc3a3 | [
"MIT"
] | 3 | 2020-04-25T12:38:55.000Z | 2020-04-29T11:55:45.000Z | bark_ml/environments/blueprints/__init__.py | SebastianGra/bark-ml_MCTS_RL | 8334f141d02bdc012a0bc6ac00d679018e0f46f3 | [
"MIT"
] | 24 | 2020-05-05T13:53:17.000Z | 2020-08-13T15:58:51.000Z | bark_ml/environments/blueprints/__init__.py | SebastianGra/bark-ml_MCTS_RL | 8334f141d02bdc012a0bc6ac00d679018e0f46f3 | [
"MIT"
] | 1 | 2022-01-27T13:08:46.000Z | 2022-01-27T13:08:46.000Z | from bark_ml.environments.blueprints.blueprint import Blueprint
from bark_ml.environments.blueprints.highway.highway import ContinuousHighwayBlueprint
from bark_ml.environments.blueprints.highway.highway import DiscreteHighwayBlueprint
from bark_ml.environments.blueprints.merging.merging import ContinuousMergingBlueprint
from bark_ml.environments.blueprints.merging.merging import DiscreteMergingBlueprint
from bark_ml.environments.blueprints.intersection.intersection import ContinuousIntersectionBlueprint
from bark_ml.environments.blueprints.intersection.intersection import DiscreteIntersectionBlueprint | 87 | 101 | 0.91133 | 62 | 609 | 8.83871 | 0.258065 | 0.10219 | 0.127737 | 0.281022 | 0.664234 | 0.605839 | 0.605839 | 0.605839 | 0 | 0 | 0 | 0 | 0.044335 | 609 | 7 | 102 | 87 | 0.941581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 6 |
7b2d16a0677624a7bbdbead6f0dd0e4c148c3782 | 384 | py | Python | src/secml/ml/classifiers/gradients/tests/test_classes/__init__.py | zangobot/secml | 95a293e1201c24256eb7fe2f1d2125cd5f318c8c | [
"Apache-2.0"
] | 63 | 2020-04-20T16:31:16.000Z | 2022-03-29T01:05:35.000Z | src/secml/ml/classifiers/gradients/tests/test_classes/__init__.py | zangobot/secml | 95a293e1201c24256eb7fe2f1d2125cd5f318c8c | [
"Apache-2.0"
] | 5 | 2020-04-21T11:31:39.000Z | 2022-03-24T13:42:56.000Z | src/secml/ml/classifiers/gradients/tests/test_classes/__init__.py | zangobot/secml | 95a293e1201c24256eb7fe2f1d2125cd5f318c8c | [
"Apache-2.0"
] | 8 | 2020-04-21T09:16:42.000Z | 2022-02-23T16:28:43.000Z | from .c_classifier_gradient_test import CClassifierGradientTest
from .c_classifier_gradient_test_linear import CClassifierGradientTestLinear
from .c_classifier_gradient_test_ridge import CClassifierGradientTestRidge
from .c_classifier_gradient_test_logistic import \
CClassifierGradientTestLogisticRegression
from .c_classifier_gradient_test_svm import CClassifierGradientTestSVM
| 54.857143 | 76 | 0.919271 | 39 | 384 | 8.564103 | 0.384615 | 0.07485 | 0.224551 | 0.344311 | 0.404192 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065104 | 384 | 6 | 77 | 64 | 0.930362 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.833333 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9e2ef9c05111a8581addbc5953c2ff393851ac6d | 36 | py | Python | src/core/__init__.py | week-with-me/quiz-crawler | 9dbaa80e03018fb6908c48b35cd31930f8a04b9b | [
"MIT"
] | null | null | null | src/core/__init__.py | week-with-me/quiz-crawler | 9dbaa80e03018fb6908c48b35cd31930f8a04b9b | [
"MIT"
] | null | null | null | src/core/__init__.py | week-with-me/quiz-crawler | 9dbaa80e03018fb6908c48b35cd31930f8a04b9b | [
"MIT"
] | null | null | null | from src.core.config import settings | 36 | 36 | 0.861111 | 6 | 36 | 5.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.939394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9e771ec354094efac374ca6f6ee4234787577a44 | 105 | py | Python | verilog_cores/test/icebreaker/constraints.py | elmsfu/CSI2Rx | cd3a78d49266599238db0cd23115a07adfa599d7 | [
"MIT"
] | 1 | 2019-11-14T21:07:20.000Z | 2019-11-14T21:07:20.000Z | verilog_cores/test/icebreaker/constraints.py | elmsfu/CSI2Rx | cd3a78d49266599238db0cd23115a07adfa599d7 | [
"MIT"
] | null | null | null | verilog_cores/test/icebreaker/constraints.py | elmsfu/CSI2Rx | cd3a78d49266599238db0cd23115a07adfa599d7 | [
"MIT"
] | null | null | null | ctx.addClock("csi_rx_i.dphy_clk", 24)
ctx.addClock("video_clk", 24)
ctx.addClock("uart_i.sys_clk_i", 12)
| 26.25 | 37 | 0.752381 | 21 | 105 | 3.428571 | 0.571429 | 0.458333 | 0.222222 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060606 | 0.057143 | 105 | 3 | 38 | 35 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9ea860b6029a635a781f5d0eee31686468dc210c | 143 | py | Python | src/backend/api/handlers/error.py | guineawheek/ftc-data-take-2 | 337bff2077eadb3bd6bbebd153cbb6181c99516f | [
"MIT"
] | null | null | null | src/backend/api/handlers/error.py | guineawheek/ftc-data-take-2 | 337bff2077eadb3bd6bbebd153cbb6181c99516f | [
"MIT"
] | null | null | null | src/backend/api/handlers/error.py | guineawheek/ftc-data-take-2 | 337bff2077eadb3bd6bbebd153cbb6181c99516f | [
"MIT"
] | null | null | null | from typing import Tuple, Union
def handle_404(_e: Union[int, Exception]) -> Tuple[dict, int]:
    return {"Error": "Invalid endpoint"}, 404
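`handle_404` returns a `(body, status-code)` pair, the convention Flask-style frameworks use for error handlers. It can be exercised directly, without any web framework (a standalone sketch that restates the function above):

```python
from typing import Tuple, Union


def handle_404(_e: Union[int, Exception]) -> Tuple[dict, int]:
    # Body is a plain dict the framework would serialize; 404 is the HTTP status.
    return {"Error": "Invalid endpoint"}, 404


body, status = handle_404(404)
print(body, status)
```

In an actual Flask app this would typically be wired up with something like `app.register_error_handler(404, handle_404)`, but the tuple contract itself is framework-independent.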
| 23.833333 | 62 | 0.692308 | 20 | 143 | 4.85 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05 | 0.160839 | 143 | 5 | 63 | 28.6 | 0.758333 | 0 | 0 | 0 | 0 | 0 | 0.146853 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
7b4443ca05f29221daabe26cc41eba2cc9ed1548 | 42 | py | Python | kerlym/a3c/__init__.py | osh/ddqn | add2b253e0dcf1c9a05524a28eab066be8adb3b9 | [
"MIT"
] | 327 | 2016-05-17T16:25:52.000Z | 2021-11-16T03:01:21.000Z | kerlym/a3c/__init__.py | osh/ddqn | add2b253e0dcf1c9a05524a28eab066be8adb3b9 | [
"MIT"
] | 16 | 2016-06-02T22:57:21.000Z | 2017-04-03T14:05:15.000Z | kerlym/a3c/__init__.py | osh/ddqn | add2b253e0dcf1c9a05524a28eab066be8adb3b9 | [
"MIT"
] | 99 | 2016-05-17T17:33:51.000Z | 2021-12-15T13:17:39.000Z | import networks
import worker
import a3c
| 10.5 | 15 | 0.833333 | 6 | 42 | 5.833333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.166667 | 42 | 3 | 16 | 14 | 0.971429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7b45340e31025bd4f3d43a60c5eb5571f1cc9e3b | 1,118 | py | Python | dbmfinancas.py | vinerodrigues/sistema-loja_main | 15024e5f42ae446935986fbbf27dec470741e5d8 | [
"MIT"
] | null | null | null | dbmfinancas.py | vinerodrigues/sistema-loja_main | 15024e5f42ae446935986fbbf27dec470741e5d8 | [
"MIT"
] | null | null | null | dbmfinancas.py | vinerodrigues/sistema-loja_main | 15024e5f42ae446935986fbbf27dec470741e5d8 | [
"MIT"
] | null | null | null |
import semidbm
financas = semidbm.open("C:/sistema_loja-main/financas.db", "c")
a = financas.keys()
#print("Datas dentro de finanças",a)
#for i in a:
# del financas[i]
def add_financas(nome, valor, data, comentario):
##print(data)
if data.encode() in financas: #####Adicionando os dados no data base
aux_1 = financas[data]
aux_2 =aux_1.decode()+"§"+nome+valor+comentario
financas[data] = aux_2
#print("Dentro do in/segundo loop---",financas[data].decode())
else:
financas[data]=nome+valor+comentario
#print("Fora do in, no else primeiro loop ",financas[data])
#print(financas.keys())
def rem_financas(nome, valor, data, comentario):
##print(data)
if data.encode() in financas: #####Adicionando os dados no data base
aux_1 = financas[data]
aux_2 = "-"+aux_1.decode()+"§"+nome+valor+comentario
financas[data] = aux_2
##print("Dentro do in/segundo loop---",financas[data].decode())
a = financas.keys()
##print(a)
#for i in a:
# del financas[i]
else:
financas[data]=nome+valor+comentario
##print("Fora do in, no else primeiro loop ",financas[data])
##print(financas.keys())
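semidbm is a third-party package, but the append-with-separator pattern above works with the standard library's `dbm.dumb` backend too. A portable sketch against a throwaway database (`add_entry`, the path, and the sample records are made up for illustration; only the "§"-separator idea mirrors the code above):

```python
import dbm.dumb
import os
import tempfile

# Open a throwaway database in a temporary directory
db_path = os.path.join(tempfile.mkdtemp(), "financas")
financas = dbm.dumb.open(db_path, "c")


def add_entry(db, date, record):
    # Append with the same "§" separator used above when the key exists
    if date.encode() in db:
        db[date] = db[date].decode() + "§" + record
    else:
        db[date] = record


add_entry(financas, "2021-01-01", "rent500")
add_entry(financas, "2021-01-01", "food120")
stored = financas["2021-01-01"].decode()
print(stored)
financas.close()
```

Note that dbm stores bytes, so records are decoded on read; the original code's mix of `data.encode() in financas` and string indexing works because `dbm.dumb` encodes `str` keys as UTF-8.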
| 30.216216 | 69 | 0.686047 | 170 | 1,118 | 4.458824 | 0.264706 | 0.158311 | 0.079156 | 0.084433 | 0.831135 | 0.831135 | 0.831135 | 0.831135 | 0.778364 | 0.778364 | 0 | 0.008333 | 0.141324 | 1,118 | 36 | 70 | 31.055556 | 0.779167 | 0.41771 | 0 | 0.666667 | 0 | 0 | 0.058065 | 0.051613 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.055556 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
7b4f6bcc61cca19089315bf030b0117e3a279522 | 22,942 | py | Python | test/test_fenics.py | kursawe/forcedynamics | 61bd6d871bc9850bbd49757e7d483c224aeef15b | [
"BSD-3-Clause"
] | null | null | null | test/test_fenics.py | kursawe/forcedynamics | 61bd6d871bc9850bbd49757e7d483c224aeef15b | [
"BSD-3-Clause"
] | null | null | null | test/test_fenics.py | kursawe/forcedynamics | 61bd6d871bc9850bbd49757e7d483c224aeef15b | [
"BSD-3-Clause"
] | null | null | null | import unittest
import os
import os.path
import time
import sys
os.environ["OMP_NUM_THREADS"] = "1"
# import matplotlib as mpl
# mpl.use('Agg')
# mpl.rcParams['mathtext.default'] = 'regular'
import matplotlib.pyplot as plt
# font = {'size' : 10}
# plt.rc('font', **font)
import numpy as np
import fenics
import ufl
import celluloid
# make sure we find the right python module
sys.path.append(os.path.join(os.path.dirname(__file__),'..','src'))
class TestFenics(unittest.TestCase):
    def xest_first_tutorial(self):
        T = 2.0             # final time
        num_steps = 10      # number of time steps
        dt = T / num_steps  # time step size
        alpha = 3           # parameter alpha
        beta = 1.2          # parameter beta

        # Create mesh and define function space
        nx = ny = 8
        mesh = fenics.UnitSquareMesh(nx, ny)
        V = fenics.FunctionSpace(mesh, 'P', 1)

        # Define boundary condition
        u_D = fenics.Expression('1 + x[0]*x[0] + alpha*x[1]*x[1] + beta*t',
                                degree=2, alpha=alpha, beta=beta, t=0)

        def boundary(x, on_boundary):
            return on_boundary

        bc = fenics.DirichletBC(V, u_D, boundary)

        # Define initial value
        u_n = fenics.interpolate(u_D, V)  # u_n = project(u_D, V)

        # Define variational problem
        u = fenics.TrialFunction(V)
        v = fenics.TestFunction(V)
        f = fenics.Constant(beta - 2 - 2*alpha)
        F = u*v*fenics.dx + dt*fenics.dot(fenics.grad(u), fenics.grad(v))*fenics.dx - (u_n + dt*f)*v*fenics.dx
        a, L = fenics.lhs(F), fenics.rhs(F)

        # Time-stepping
        u = fenics.Function(V)
        t = 0
        vtkfile = fenics.File(os.path.join(os.path.dirname(__file__), 'output',
                                           'heat_constructed_solution', 'solution.pvd'))
        not_initialised = True
        for n in range(num_steps):
            # Update current time
            t += dt
            u_D.t = t

            # Compute solution
            fenics.solve(a == L, u, bc)

            # Plot the solution
            vtkfile << (u, t)
            fenics.plot(u)
            if not_initialised:
                animation_camera = celluloid.Camera(plt.gcf())
                not_initialised = False
            animation_camera.snap()

            # Compute error at vertices
            u_e = fenics.interpolate(u_D, V)
            error = np.abs(u_e.vector().get_local() - u.vector().get_local()).max()
            print('t = %.2f: error = %.3g' % (t, error))

            # Update previous solution
            u_n.assign(u)

        # Save the animation
        animation = animation_camera.animate()
        animation.save(os.path.join(os.path.dirname(__file__), 'output', 'heat_equation.mp4'))
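FEniCS is a heavy dependency, but the backward-Euler time-stepping pattern the test uses (solve one linear system per step, with the previous solution on the right-hand side) can be sketched without it on a 1-D finite-difference grid with plain NumPy. This illustrates only the scheme, not the finite-element solve; grid size, time step, and the sine initial condition are arbitrary choices:

```python
import numpy as np

# 1-D heat equation u_t = u_xx on [0, 1] with u = 0 at both ends.
n = 50                       # interior grid points
h = 1.0 / (n + 1)            # grid spacing
dt = 0.01                    # time step
x = np.linspace(h, 1 - h, n)
u = np.sin(np.pi * x)        # initial condition (first eigenmode)

# Backward Euler: (I - dt * L) u_new = u_old, with L the 1-D Laplacian
main = np.full(n, 1 + 2 * dt / h**2)
off = np.full(n - 1, -dt / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

for step in range(10):
    u = np.linalg.solve(A, u)

# The exact solution of this mode decays like exp(-pi**2 * t)
expected = np.sin(np.pi * x) * np.exp(-np.pi**2 * 10 * dt)
max_error = np.abs(u - expected).max()
print(max_error)
```

Backward Euler is unconditionally stable, which is why the FEniCS tutorials above can take comparatively large time steps; the price is one linear solve per step.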
    def xest_second_tutorial(self):
        T = 2.0             # final time
        num_steps = 50      # number of time steps
        dt = T / num_steps  # time step size

        # Create mesh and define function space
        nx = ny = 30
        mesh = fenics.RectangleMesh(fenics.Point(-2, -2), fenics.Point(2, 2), nx, ny)
        V = fenics.FunctionSpace(mesh, 'P', 1)

        # Define boundary condition
        def boundary(x, on_boundary):
            return on_boundary

        bc = fenics.DirichletBC(V, fenics.Constant(0), boundary)

        # Define initial value
        u_0 = fenics.Expression('exp(-a*pow(x[0], 2) - a*pow(x[1], 2))', degree=2, a=5)
        u_n = fenics.interpolate(u_0, V)

        # Define variational problem
        u = fenics.TrialFunction(V)
        v = fenics.TestFunction(V)
        f = fenics.Constant(0)
        F = u*v*fenics.dx + dt*fenics.dot(fenics.grad(u), fenics.grad(v))*fenics.dx - (u_n + dt*f)*v*fenics.dx
        a, L = fenics.lhs(F), fenics.rhs(F)

        # Create VTK file for saving solution
        vtkfile = fenics.File(os.path.join(os.path.dirname(__file__), 'output', 'heat_gaussian', 'solution.pvd'))

        # Time-stepping
        u = fenics.Function(V)
        t = 0
        not_initialised = True
        for n in range(num_steps):
            # Update current time
            t += dt

            # Compute solution
            fenics.solve(a == L, u, bc)

            # Save to file and plot solution
            vtkfile << (u, t)
            # Here we'll need to call tripcolor ourselves to get access to the color range
            fenics.plot(u)
            if not_initialised:
                # Create the camera on the first frame, mirroring the first tutorial;
                # without this, animation_camera below is undefined.
                animation_camera = celluloid.Camera(plt.gcf())
                not_initialised = False
            animation_camera.snap()
            u_n.assign(u)

        animation = animation_camera.animate()
        animation.save(os.path.join(os.path.dirname(__file__), 'output', 'heat_gaussian.mp4'))
    def xest_implement_2d_myosin(self):
        # Parameters
        total_time = 1.0
        number_of_time_steps = 100
        delta_t = total_time/number_of_time_steps
        nx = ny = 100
        domain_size = 1.0
        lambda_ = 5.0
        mu = 2.0
        gamma = 1.0
        eta_b = 0.0
        eta_s = 1.0
        k_b = 1.0
        k_u = 1.0
        # zeta_1 = -0.5
        zeta_1 = 0.0
        zeta_2 = 1.0
        mu_a = 1.0
        K_0 = 1.0
        K_1 = 0.0
        K_2 = 0.0
        K_3 = 0.0
        D = 0.25
        alpha = 3
        c = 0.1

        # Sub domain for Periodic boundary condition
        class PeriodicBoundary(fenics.SubDomain):
            # Left boundary is "target domain" G
            def inside(self, x, on_boundary):
                # return True if on left or bottom boundary AND NOT on one of the
                # two corners (0, 1) and (1, 0)
                return bool((fenics.near(x[0], 0) or fenics.near(x[1], 0)) and
                            (not ((fenics.near(x[0], 0) and fenics.near(x[1], 1)) or
                                  (fenics.near(x[0], 1) and fenics.near(x[1], 0)))) and on_boundary)

            def map(self, x, y):
                if fenics.near(x[0], 1) and fenics.near(x[1], 1):
                    y[0] = x[0] - 1.
                    y[1] = x[1] - 1.
                elif fenics.near(x[0], 1):
                    y[0] = x[0] - 1.
                    y[1] = x[1]
                else:  # near(x[1], 1)
                    y[0] = x[0]
                    y[1] = x[1] - 1.

        periodic_boundary_condition = PeriodicBoundary()
        # Set up finite elements
        mesh = fenics.RectangleMesh(fenics.Point(0, 0), fenics.Point(domain_size, domain_size), nx, ny)
        vector_element = fenics.VectorElement('P', fenics.triangle, 2, dim=2)
        single_element = fenics.FiniteElement('P', fenics.triangle, 2)
        mixed_element = fenics.MixedElement(vector_element, single_element)
        V = fenics.FunctionSpace(mesh, mixed_element, constrained_domain=periodic_boundary_condition)
        v, r = fenics.TestFunctions(V)
        full_trial_function = fenics.Function(V)
        u, rho = fenics.split(full_trial_function)
        full_trial_function_n = fenics.Function(V)
        u_n, rho_n = fenics.split(full_trial_function_n)

        # Define non-linear weak formulation
        def epsilon(u):
            return 0.5*(fenics.nabla_grad(u) + fenics.nabla_grad(u).T)  # i.e. sym(nabla_grad(u))

        def sigma_e(u):
            return lambda_*ufl.nabla_div(u)*fenics.Identity(2) + 2*mu*epsilon(u)

        def sigma_d(u):
            return eta_b*ufl.nabla_div(u)*fenics.Identity(2) + 2*eta_s*epsilon(u)

        # def sigma_a(u, rho):
        #     return (-zeta_1*rho/(1 + zeta_2*rho)*mu_a*fenics.Identity(2)*(K_0 + K_1*ufl.nabla_div(u) +
        #             K_2*ufl.nabla_div(u)*ufl.nabla_div(u) + K_3*ufl.nabla_div(u)*ufl.nabla_div(u)*ufl.nabla_div(u)))
        def sigma_a(u, rho):
            return -zeta_1*rho/(1 + zeta_2*rho)*mu_a*fenics.Identity(2)*(K_0)

        F = (gamma*fenics.dot(u, v)*fenics.dx - gamma*fenics.dot(u_n, v)*fenics.dx
             + fenics.inner(sigma_d(u), fenics.nabla_grad(v))*fenics.dx
             - fenics.inner(sigma_d(u_n), fenics.nabla_grad(v))*fenics.dx
             - delta_t*fenics.inner(sigma_e(u) + sigma_a(u, rho), fenics.nabla_grad(v))*fenics.dx
             + rho*r*fenics.dx - rho_n*r*fenics.dx
             + ufl.nabla_div(rho*u)*r*fenics.dx - ufl.nabla_div(rho*u_n)*r*fenics.dx
             - D*delta_t*fenics.dot(fenics.nabla_grad(rho), fenics.nabla_grad(r))*fenics.dx
             + delta_t*(-k_u*rho*fenics.exp(alpha*ufl.nabla_div(u)) + k_b*(1 - c*ufl.nabla_div(u)))*r*fenics.dx)
        # F = (gamma*fenics.dot(u, v)*fenics.dx - gamma*fenics.dot(u_n, v)*fenics.dx
        #      + fenics.inner(sigma_d(u), fenics.nabla_grad(v))*fenics.dx
        #      - fenics.inner(sigma_d(u_n), fenics.nabla_grad(v))*fenics.dx
        #      - delta_t*fenics.inner(sigma_e(u) + sigma_a(u, rho), fenics.nabla_grad(v))*fenics.dx
        #      + rho*r*fenics.dx - rho_n*r*fenics.dx
        #      + ufl.nabla_div(rho*u)*r*fenics.dx - ufl.nabla_div(rho*u_n)*r*fenics.dx
        #      - D*delta_t*fenics.dot(fenics.nabla_grad(rho), fenics.nabla_grad(r))*fenics.dx
        #      + delta_t*(-k_u*rho*fenics.exp(alpha*ufl.nabla_div(u)) + k_b*(1 - c*ufl.nabla_div(u))))
        vtkfile_rho = fenics.File(os.path.join(os.path.dirname(__file__), 'output', 'myosin_2d', 'solution_rho.pvd'))
        vtkfile_u = fenics.File(os.path.join(os.path.dirname(__file__), 'output', 'myosin_2d', 'solution_u.pvd'))
        # rho_0 = fenics.Expression(((('0.0'), ('0.0'), ('0.0')), ('sin(x[0])')), degree=1)
        # full_trial_function_n = fenics.project(rho_0, V)
        time = 0.0
        for time_index in range(number_of_time_steps):
            # Update current time
            time += delta_t
            # Compute solution
            fenics.solve(F == 0, full_trial_function)
            # Save to file and plot solution
            vis_u, vis_rho = full_trial_function.split()
            vtkfile_rho << (vis_rho, time)
            vtkfile_u << (vis_u, time)
            full_trial_function_n.assign(full_trial_function)
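The `map` logic of the 2D `PeriodicBoundary` above is easy to get wrong at the top-right corner, which must fold onto the origin rather than onto an edge. A pure-Python replica can be checked without FEniCS; the `periodic_map` helper and its tolerance are mine, mirroring the class's branches:

```python
# Replica of PeriodicBoundary.map for the unit square: points on the right/top
# edges (the "slave" boundary) are mapped back onto the left/bottom edges.
def periodic_map(x, tol=1e-12):
    near = lambda a, b: abs(a - b) < tol
    if near(x[0], 1) and near(x[1], 1):      # top-right corner -> origin
        return [x[0] - 1.0, x[1] - 1.0]
    elif near(x[0], 1):                      # right edge -> left edge
        return [x[0] - 1.0, x[1]]
    else:                                    # top edge -> bottom edge
        return [x[0], x[1] - 1.0]

print(periodic_map([1.0, 1.0]))  # [0.0, 0.0]
print(periodic_map([1.0, 0.3]))  # [0.0, 0.3]
print(periodic_map([0.7, 1.0]))  # [0.7, 0.0]
```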
    def xest_1d_myosin_displacement_only(self):
        # Parameters
        total_time = 1.0
        number_of_time_steps = 100
        # delta_t = fenics.Constant(total_time/number_of_time_steps)
        delta_t = total_time/number_of_time_steps
        nx = 1000
        domain_size = 1.0
        b = fenics.Constant(6.0)
        k = fenics.Constant(0.5)
        z_1 = fenics.Constant(-5.5)  # always negative
        # z_1 = fenics.Constant(0.0)  # always negative
        z_2 = 0.0  # always positive
        xi_0 = fenics.Constant(1.0)  # always positive
        xi_1 = fenics.Constant(1.0)  # always positive
        xi_2 = 0.0  # always positive
        xi_3 = 0.0  # always negative
        d = fenics.Constant(0.15)
        alpha = fenics.Constant(1.0)
        c = fenics.Constant(0.1)

        # Sub domain for Periodic boundary condition
        class PeriodicBoundary(fenics.SubDomain):
            # Left boundary is "target domain" G
            def inside(self, x, on_boundary):
                return bool(-fenics.DOLFIN_EPS < x[0] < fenics.DOLFIN_EPS and on_boundary)

            def map(self, x, y):
                y[0] = x[0] - 1

        periodic_boundary_condition = PeriodicBoundary()
        # Set up finite elements
        mesh = fenics.IntervalMesh(nx, 0.0, 1.0)
        vector_element = fenics.FiniteElement('P', fenics.interval, 1)
        single_element = fenics.FiniteElement('P', fenics.interval, 1)
        mixed_element = fenics.MixedElement(vector_element, single_element)
        V = fenics.FunctionSpace(mesh, mixed_element, constrained_domain=periodic_boundary_condition)
        # V = fenics.FunctionSpace(mesh, mixed_element)
        v, r = fenics.TestFunctions(V)
        full_trial_function = fenics.Function(V)
        u, rho = fenics.split(full_trial_function)
        full_trial_function_n = fenics.Function(V)
        u_n, rho_n = fenics.split(full_trial_function_n)
        u_initial = fenics.Constant(0.0)
        rho_initial = fenics.Constant(1.0/k)
        # u_n = fenics.interpolate(u_initial, V.sub(0).collapse())
        # rho_n = fenics.interpolate(rho_initial, V.sub(1).collapse())
        # perturbation = np.zeros(rho_n.vector().size())
        # perturbation[:int(perturbation.shape[0]/2)] = 1.0
        # rho_n.vector().set_local(np.array(rho_n.vector()) + 1.0*(0.5 - np.random.random(rho_n.vector().size())))
        # u_n.vector().set_local(np.array(u_n.vector()) + 4.0*(0.5 - np.random.random(u_n.vector().size())))
        initial_condition_expression = fenics.Expression(('0.0', '5.0*sin(pi*x[0])*sin(pi*x[0])'), degree=2)
        initial_condition = fenics.project(initial_condition_expression, V)
        fenics.assign(full_trial_function_n, initial_condition)
        initial_u, initial_rho = fenics.split(initial_condition)
        u_n, rho_n = fenics.split(full_trial_function_n)
        F = (u*v*fenics.dx - u_n*v*fenics.dx
             + delta_t*(b + (z_1*initial_rho)/(1 + z_2*initial_rho)*c*xi_1)*u.dx(0)*v.dx(0)*fenics.dx
             # - delta_t*(z_1*rho)/(1 + z_2*rho)*c*c*xi_2/2.0*u.dx(0)*u.dx(0)*v.dx(0)*fenics.dx
             # + delta_t*(z_1*rho)/(1 + z_2*rho)*c*c*c*xi_3/6.0*u.dx(0)*u.dx(0)*u.dx(0)*v.dx(0)*fenics.dx
             - delta_t*z_1*initial_rho/(1 + z_2*initial_rho)*xi_0*v.dx(0)*fenics.dx
             + u.dx(0)*v.dx(0)*fenics.dx - u_n.dx(0)*v.dx(0)*fenics.dx
             # + initial_rho*r*fenics.dx - initial_rho*r*fenics.dx
             - rho*u*r.dx(0)*fenics.dx + rho*u_n*r.dx(0)*fenics.dx
             + delta_t*d*rho.dx(0)*r.dx(0)*fenics.dx
             + delta_t*k*fenics.exp(alpha*u.dx(0))*rho*r*fenics.dx
             - delta_t*r*fenics.dx
             + delta_t*c*u.dx(0)*r*fenics.dx)
        vtkfile_rho = fenics.File(os.path.join(os.path.dirname(__file__), 'output', 'myosin_2d', 'solution_rho.pvd'))
        vtkfile_u = fenics.File(os.path.join(os.path.dirname(__file__), 'output', 'myosin_2d', 'solution_u.pvd'))
        # rho_0 = fenics.Expression(((('0.0'), ('0.0'), ('0.0')), ('sin(x[0])')), degree=1)
        # full_trial_function_n = fenics.project(rho_0, V)
        # print('initial u and rho')
        # print(u_n.vector())
        # print(rho_n.vector())
        time = 0.0
        not_initialised = True
        plt.figure()
        for time_index in range(number_of_time_steps):
            # Update current time
            time += delta_t
            # Compute solution
            fenics.solve(F == 0, full_trial_function)
            # Save to file and plot solution
            vis_u, vis_rho = full_trial_function.split()
            plt.subplot(411)
            fenics.plot(vis_u, color='blue')
            plt.ylim(-0.5, 0.5)
            plt.title('displacement')
            plt.subplot(412)
            fenics.plot(-vis_u.dx(0), color='blue')
            plt.ylim(-10, 10)
            plt.title('-displacement divergence')
            plt.subplot(413)
            fenics.plot(vis_rho, color='blue')
            plt.title('myosin density')
            plt.ylim(0, 7)
            plt.subplot(414)
            fenics.plot(initial_rho, color='blue')
            plt.title('imposed myosin density')
            plt.ylim(0, 7)
            plt.tight_layout()
            if not_initialised:
                animation_camera = celluloid.Camera(plt.gcf())
                not_initialised = False
            animation_camera.snap()
            print('time is')
            print(time)
            # plt.savefig(os.path.join(os.path.dirname(__file__), 'output', 'this_output_at_time_' + '{:04d}'.format(time_index) + '.png'))
            # print('this u and rho')
            # print(np.array(vis_u.vector()))
            # print(np.array(vis_rho.vector()))
            # vtkfile_rho << (vis_rho, time)
            # vtkfile_u << (vis_u, time)
            full_trial_function_n.assign(full_trial_function)
        animation = animation_camera.animate()
        animation.save(os.path.join(os.path.dirname(__file__), 'output', 'myosin_1D_displacement_only.mp4'))
        # movie_command = "ffmpeg -r 1 -i " + os.path.join(os.path.dirname(__file__), 'output', 'this_output_at_time_%04d.png') + " -vcodec mpeg4 -y " + \
        #     os.path.join(os.path.dirname(__file__), 'output', 'movie.mp4')
        # print(movie_command)
        # os.system(movie_command)
    def xest_implement_1d_myosin(self):
        # Parameters
        total_time = 10.0
        number_of_time_steps = 1000
        # delta_t = fenics.Constant(total_time/number_of_time_steps)
        delta_t = total_time/number_of_time_steps
        nx = 1000
        domain_size = 1.0
        b = fenics.Constant(6.0)
        k = fenics.Constant(0.5)
        z_1 = fenics.Constant(-10.5)  # always negative
        # z_1 = fenics.Constant(0.0)  # always negative
        z_2 = 0.1  # always positive
        xi_0 = fenics.Constant(1.0)  # always positive
        xi_1 = fenics.Constant(1.0)  # always positive
        xi_2 = 0.001  # always positive
        xi_3 = 0.0001  # always negative
        d = fenics.Constant(0.15)
        alpha = fenics.Constant(1.0)
        c = fenics.Constant(0.1)

        # Sub domain for Periodic boundary condition
        class PeriodicBoundary(fenics.SubDomain):
            # Left boundary is "target domain" G
            def inside(self, x, on_boundary):
                return bool(-fenics.DOLFIN_EPS < x[0] < fenics.DOLFIN_EPS and on_boundary)

            def map(self, x, y):
                y[0] = x[0] - 1

        periodic_boundary_condition = PeriodicBoundary()
        # Set up finite elements
        mesh = fenics.IntervalMesh(nx, 0.0, 1.0)
        vector_element = fenics.FiniteElement('P', fenics.interval, 1)
        single_element = fenics.FiniteElement('P', fenics.interval, 1)
        mixed_element = fenics.MixedElement(vector_element, single_element)
        V = fenics.FunctionSpace(mesh, mixed_element, constrained_domain=periodic_boundary_condition)
        # V = fenics.FunctionSpace(mesh, mixed_element)
        v, r = fenics.TestFunctions(V)
        full_trial_function = fenics.Function(V)
        u, rho = fenics.split(full_trial_function)
        full_trial_function_n = fenics.Function(V)
        u_n, rho_n = fenics.split(full_trial_function_n)
        u_initial = fenics.Constant(0.0)
        # rho_initial = fenics.Expression('1.0*sin(pi*x[0])*sin(pi*x[0])+1.0/k0', degree=2, k0=k)
        rho_initial = fenics.Expression('1/k0', degree=2, k0=k)
        u_n = fenics.interpolate(u_initial, V.sub(0).collapse())
        rho_n = fenics.interpolate(rho_initial, V.sub(1).collapse())
        # perturbation = np.zeros(rho_n.vector().size())
        # perturbation[:int(perturbation.shape[0]/2)] = 1.0
        rho_n.vector().set_local(np.array(rho_n.vector()) + 1.0*(0.5 - np.random.random(rho_n.vector().size())))
        # u_n.vector().set_local(np.array(u_n.vector()) + 4.0*(0.5 - np.random.random(u_n.vector().size())))
        fenics.assign(full_trial_function_n, [u_n, rho_n])
        u_n, rho_n = fenics.split(full_trial_function_n)
        F = (u*v*fenics.dx - u_n*v*fenics.dx
             + delta_t*(b + (z_1*rho)/(1 + z_2*rho)*c*xi_1)*u.dx(0)*v.dx(0)*fenics.dx
             - delta_t*(z_1*rho)/(1 + z_2*rho)*c*c*xi_2/2.0*u.dx(0)*u.dx(0)*v.dx(0)*fenics.dx
             + delta_t*(z_1*rho)/(1 + z_2*rho)*c*c*c*xi_3/6.0*u.dx(0)*u.dx(0)*u.dx(0)*v.dx(0)*fenics.dx
             - delta_t*z_1*rho/(1 + z_2*rho)*xi_0*v.dx(0)*fenics.dx
             + u.dx(0)*v.dx(0)*fenics.dx - u_n.dx(0)*v.dx(0)*fenics.dx
             + rho*r*fenics.dx - rho_n*r*fenics.dx
             - rho*u*r.dx(0)*fenics.dx + rho*u_n*r.dx(0)*fenics.dx
             + delta_t*d*rho.dx(0)*r.dx(0)*fenics.dx
             + delta_t*k*fenics.exp(alpha*u.dx(0))*rho*r*fenics.dx
             - delta_t*r*fenics.dx
             + delta_t*c*u.dx(0)*r*fenics.dx)
        vtkfile_rho = fenics.File(os.path.join(os.path.dirname(__file__), 'output', 'myosin_2d', 'solution_rho.pvd'))
        vtkfile_u = fenics.File(os.path.join(os.path.dirname(__file__), 'output', 'myosin_2d', 'solution_u.pvd'))
        # rho_0 = fenics.Expression(((('0.0'), ('0.0'), ('0.0')), ('sin(x[0])')), degree=1)
        # full_trial_function_n = fenics.project(rho_0, V)
        # print('initial u and rho')
        # print(u_n.vector())
        # print(rho_n.vector())
        time = 0.0
        not_initialised = True
        plt.figure()
        for time_index in range(number_of_time_steps):
            # Update current time
            time += delta_t
            # Compute solution
            fenics.solve(F == 0, full_trial_function)
            # Save to file and plot solution
            vis_u, vis_rho = full_trial_function.split()
            plt.subplot(311)
            fenics.plot(vis_u, color='blue')
            plt.ylim(-0.5, 0.5)
            plt.subplot(312)
            fenics.plot(-vis_u.dx(0), color='blue')
            plt.ylim(-2, 2)
            plt.title('actin density change')
            plt.subplot(313)
            fenics.plot(vis_rho, color='blue')
            plt.title('myosin density')
            plt.ylim(0, 7)
            plt.tight_layout()
            if not_initialised:
                animation_camera = celluloid.Camera(plt.gcf())
                not_initialised = False
            animation_camera.snap()
            print('time is')
            print(time)
            # plt.savefig(os.path.join(os.path.dirname(__file__), 'output', 'this_output_at_time_' + '{:04d}'.format(time_index) + '.png'))
            # print('this u and rho')
            # print(np.array(vis_u.vector()))
            # print(np.array(vis_rho.vector()))
            # vtkfile_rho << (vis_rho, time)
            # vtkfile_u << (vis_u, time)
            full_trial_function_n.assign(full_trial_function)
        animation = animation_camera.animate()
        animation.save(os.path.join(os.path.dirname(__file__), 'output', 'myosin_1D.mp4'))
        # movie_command = "ffmpeg -r 1 -i " + os.path.join(os.path.dirname(__file__), 'output', 'this_output_at_time_%04d.png') + " -vcodec mpeg4 -y " + \
        #     os.path.join(os.path.dirname(__file__), 'output', 'movie.mp4')
        # print(movie_command)
        # os.system(movie_command)
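One way to read the reaction terms in `F` above (`+ delta_t*k*exp(alpha*u.dx(0))*rho*r*dx - delta_t*r*dx`): myosin unbinds at the strain-dependent rate `k*exp(alpha*u_x)` and binds at a constant rate of 1, so a uniform steady state satisfies `rho = 1/(k*exp(alpha*u_x))`. At zero strain this is `1/k`, which is exactly the `rho_initial` used above. A tiny sketch of that balance (my reading of the source, not stated in it):

```python
import math

# Parameter values matching the test above.
k, alpha = 0.5, 1.0

def steady_rho(u_x):
    # Uniform steady state of d(rho)/dt = 1 - k*exp(alpha*u_x)*rho.
    return 1.0 / (k * math.exp(alpha * u_x))

print(steady_rho(0.0))  # 2.0, i.e. 1/k, the unperturbed initial condition
```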
    def test_plot_elastic_free_energy(self):
        z_2 = 0.1
        c = 0.1
        rho_b = 10
        b = 5
        xi_1 = 1.0
        xi_2 = 10.0
        xi_3 = -55.0
        epsilon = np.linspace(-7, 4, 100)

        def elastic_free_energy(z_1):
            this_elastic_free_energy = (0.5*(b + (z_1*rho_b)/(1 + z_2*rho_b)*c*xi_1)*epsilon*epsilon
                                        - 1.0/3.0*(z_1*rho_b)/(1 + z_2*rho_b)*c*c*xi_2/2.0*epsilon*epsilon*epsilon
                                        + 1.0/4.0*(z_1*rho_b)/(1 + z_2*rho_b)*c*c*c*xi_3/6.0*epsilon*epsilon*epsilon*epsilon)
            return this_elastic_free_energy

        z1_values = [-0.1, -6.0, -8.0]
        plt.figure(figsize=(4.5, 2.5))
        for z_1 in z1_values:
            this_free_energy = elastic_free_energy(z_1)
            plt.plot(epsilon, this_free_energy, label=str(z_1))
        plt.ylim(-20, 30)
        # raw strings so the backslashes in the TeX labels are not treated as escapes
        plt.xlabel(r'$\epsilon$')
        plt.ylabel(r'$\Phi(\epsilon)$')
        plt.legend()
        plt.tight_layout()
        plt.savefig(os.path.join(os.path.dirname(__file__), 'output', 'elastic_free_energy.pdf'))
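For reference, a standalone scalar version of the quartic free energy plotted above, with the repeated `(z_1*rho_b)/(1 + z_2*rho_b)` factor collected into a prefactor. The helper signature is mine; the coefficients (`1/3 * xi_2/2 = xi_2/6`, `1/4 * xi_3/6 = xi_3/24`) are taken term by term from the test:

```python
# Parameter values matching test_plot_elastic_free_energy.
z_2, c, rho_b, b = 0.1, 0.1, 10, 5
xi_1, xi_2, xi_3 = 1.0, 10.0, -55.0

def elastic_free_energy(z_1, epsilon):
    prefactor = (z_1 * rho_b) / (1 + z_2 * rho_b)
    return (0.5 * (b + prefactor * c * xi_1) * epsilon**2
            - prefactor * c**2 * xi_2 / 6.0 * epsilon**3
            + prefactor * c**3 * xi_3 / 24.0 * epsilon**4)

# At epsilon = 0 the energy vanishes; for z_1 = 0 it reduces to the
# purely elastic parabola 0.5 * b * epsilon**2.
print(elastic_free_energy(-6.0, 0.0))
print(elastic_free_energy(0.0, 2.0))
```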