hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
607ca84470c8804722dc9c2950ea75318ec4a1d6 | 31 | py | Python | gscraper/__init__.py | AlexNilsson/python-image-scraper | 03d88a24cfdca4f4e155932240109db7c2b2d86a | [
"MIT"
] | null | null | null | gscraper/__init__.py | AlexNilsson/python-image-scraper | 03d88a24cfdca4f4e155932240109db7c2b2d86a | [
"MIT"
] | null | null | null | gscraper/__init__.py | AlexNilsson/python-image-scraper | 03d88a24cfdca4f4e155932240109db7c2b2d86a | [
"MIT"
] | null | null | null | from .core import scrapeImages
| 15.5 | 30 | 0.83871 | 4 | 31 | 6.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 31 | 1 | 31 | 31 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
60d1d7d51896f12ced46ac9ad6e38698310badf8 | 27 | py | Python | adpengine/__init__.py | dice-project/DICE-Anomaly-Detection-Tool | a5eeacb9e888348adbe97be0c26a500f2f03ec6f | [
"Apache-2.0"
] | 4 | 2017-02-06T15:33:06.000Z | 2018-05-08T01:43:03.000Z | adpengine/__init__.py | dice-project/DICE-Anomaly-Detection-Tool | a5eeacb9e888348adbe97be0c26a500f2f03ec6f | [
"Apache-2.0"
] | null | null | null | adpengine/__init__.py | dice-project/DICE-Anomaly-Detection-Tool | a5eeacb9e888348adbe97be0c26a500f2f03ec6f | [
"Apache-2.0"
] | null | null | null | from dmonadpengine import * | 27 | 27 | 0.851852 | 3 | 27 | 7.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 27 | 1 | 27 | 27 | 0.958333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
7160fd9e42bbfc4f6c897a1882e9aa96a7560b97 | 31 | py | Python | TOPSIS-ParthArora-101853039/__init__.py | parthrr510/TOPSIS-ParthArora-101853039 | 6233ddb24e174eb2e561c288c822a4daa1258684 | [
"MIT"
] | null | null | null | TOPSIS-ParthArora-101853039/__init__.py | parthrr510/TOPSIS-ParthArora-101853039 | 6233ddb24e174eb2e561c288c822a4daa1258684 | [
"MIT"
] | null | null | null | TOPSIS-ParthArora-101853039/__init__.py | parthrr510/TOPSIS-ParthArora-101853039 | 6233ddb24e174eb2e561c288c822a4daa1258684 | [
"MIT"
] | null | null | null | from Assignment6 import topsis
| 15.5 | 30 | 0.870968 | 4 | 31 | 6.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.129032 | 31 | 1 | 31 | 31 | 0.962963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
719e77a965d4c33d2c025e9d5e30397a8e2ac23c | 67 | py | Python | py_tdlib/constructors/get_country_code.py | Mr-TelegramBot/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 24 | 2018-10-05T13:04:30.000Z | 2020-05-12T08:45:34.000Z | py_tdlib/constructors/get_country_code.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 3 | 2019-06-26T07:20:20.000Z | 2021-05-24T13:06:56.000Z | py_tdlib/constructors/get_country_code.py | MrMahdi313/python-tdlib | 2e2d21a742ebcd439971a32357f2d0abd0ce61eb | [
"MIT"
] | 5 | 2018-10-05T14:29:28.000Z | 2020-08-11T15:04:10.000Z | from ..factory import Method
class getCountryCode(Method):
pass
| 11.166667 | 29 | 0.776119 | 8 | 67 | 6.5 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149254 | 67 | 5 | 30 | 13.4 | 0.912281 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
71e749ade64252247b048f84b0217018506fe040 | 9,673 | py | Python | tests/core/test_runner.py | iri6e4k0/vedro | dd51c16400993d0fe1fd34bba57edff710ac2638 | [
"Apache-2.0"
] | 2 | 2021-08-24T12:49:30.000Z | 2022-01-23T07:21:25.000Z | tests/core/test_runner.py | iri6e4k0/vedro | dd51c16400993d0fe1fd34bba57edff710ac2638 | [
"Apache-2.0"
] | 20 | 2015-12-09T11:04:23.000Z | 2022-03-20T09:18:17.000Z | tests/core/test_runner.py | iri6e4k0/vedro | dd51c16400993d0fe1fd34bba57edff710ac2638 | [
"Apache-2.0"
] | 3 | 2015-12-09T07:31:23.000Z | 2022-01-28T11:03:24.000Z | import sys
from pytest import raises
if sys.version_info >= (3, 8):
from unittest.mock import AsyncMock
else:
from asynctest.mock import CoroutineMock as AsyncMock
from unittest.mock import Mock, call
import pytest
from baby_steps import given, then, when
from vedro import Scenario
from vedro.core import Dispatcher, Runner, VirtualScenario, VirtualStep
from vedro.events import (
ExceptionRaisedEvent,
ScenarioFailedEvent,
ScenarioPassedEvent,
ScenarioRunEvent,
StepFailedEvent,
StepPassedEvent,
StepRunEvent,
)
@pytest.fixture()
def dispatcher_():
return AsyncMock(Dispatcher())
@pytest.fixture()
def runner(dispatcher_: Dispatcher):
return Runner(dispatcher_)
@pytest.mark.asyncio
@pytest.mark.parametrize("method_mock_factory", (Mock, AsyncMock,))
async def test_runner_run_step_passed(method_mock_factory: Mock, *,
runner: Runner, dispatcher_: Dispatcher):
with given:
scenario_ = Mock(Scenario, step=method_mock_factory(return_value=None))
step = VirtualStep(scenario_.step)
with when:
step_result = await runner.run_step(step, scenario_)
with then:
assert scenario_.mock_calls == [call.step(scenario_)]
assert step_result.is_passed() is True
assert step_result.exc_info is None
assert isinstance(step_result.started_at, float)
assert isinstance(step_result.ended_at, float)
assert dispatcher_.mock_calls == [
call.fire(StepRunEvent(step_result)),
call.fire(StepPassedEvent(step_result)),
]
@pytest.mark.asyncio
@pytest.mark.parametrize("method_mock_factory", (Mock, AsyncMock))
async def test_runner_run_step_failed(method_mock_factory: Mock, *,
runner: Runner, dispatcher_: Dispatcher):
with given:
exception = AssertionError()
scenario_ = Mock(Scenario, step=method_mock_factory(side_effect=exception))
step = VirtualStep(scenario_.step)
with when:
step_result = await runner.run_step(step, scenario_)
with given:
assert scenario_.mock_calls == [call.step(scenario_)]
assert step_result.is_failed() is True
assert step_result.exc_info.value == exception
assert isinstance(step_result.started_at, float)
assert isinstance(step_result.ended_at, float)
assert dispatcher_.mock_calls == [
call.fire(StepRunEvent(step_result)),
call.fire(ExceptionRaisedEvent(step_result.exc_info)),
call.fire(StepFailedEvent(step_result)),
]
@pytest.mark.asyncio
@pytest.mark.parametrize("method_mock_factory", (Mock, AsyncMock))
async def test_runner_run_step_interrupted(*, method_mock_factory: Mock, dispatcher_: Dispatcher):
with given:
interrupt_exception = KeyboardInterrupt
scenario_ = Mock(Scenario, step=method_mock_factory(side_effect=interrupt_exception))
virtual_step = VirtualStep(scenario_.step)
runner = Runner(dispatcher_, interrupt_exceptions=(interrupt_exception,))
with when, raises(BaseException) as exception:
await runner.run_step(virtual_step, scenario_)
with given:
assert exception.type is interrupt_exception
assert scenario_.mock_calls == [call.step(scenario_)]
@pytest.mark.asyncio
async def test_runner_run_scenario_no_steps_passed(*, runner: Runner, dispatcher_: Dispatcher):
with given:
scenario_ = Mock(Scenario, step=Mock(return_value=None), __file__="/tmp/scenario.py")
virtual_scenario = VirtualScenario(scenario_, [])
with when:
scenario_result = await runner.run_scenario(virtual_scenario)
with then:
assert scenario_result.is_passed() is True
assert isinstance(scenario_result.started_at, float)
assert isinstance(scenario_result.ended_at, float)
assert dispatcher_.mock_calls == [
call.fire(ScenarioRunEvent(scenario_result)),
call.fire(ScenarioPassedEvent(scenario_result)),
]
@pytest.mark.asyncio
async def test_runner_run_scenario_single_step_passed(*, runner: Runner, dispatcher_: Dispatcher):
with given:
scenario_ = Mock(Scenario, step=Mock(return_value=None), __file__="/tmp/scenario.py")
step = VirtualStep(scenario_.step)
virtual_scenario = VirtualScenario(scenario_, [step])
with when:
scenario_result = await runner.run_scenario(virtual_scenario)
with then:
assert scenario_result.is_passed() is True
assert isinstance(scenario_result.started_at, float)
assert isinstance(scenario_result.ended_at, float)
step_results = scenario_result.step_results
assert dispatcher_.mock_calls == [
call.fire(ScenarioRunEvent(scenario_result)),
call.fire(StepRunEvent(step_results[0])),
call.fire(StepPassedEvent(step_results[0])),
call.fire(ScenarioPassedEvent(scenario_result)),
]
@pytest.mark.asyncio
async def test_runner_run_scenario_single_step_failed(*, runner: Runner, dispatcher_: Dispatcher):
with given:
exception = AssertionError()
scenario_ = Mock(Scenario, step=Mock(side_effect=exception), __file__="/tmp/scenario.py")
step = VirtualStep(scenario_.step)
virtual_scenario = VirtualScenario(scenario_, [step])
with when:
scenario_result = await runner.run_scenario(virtual_scenario)
with then:
assert scenario_result.is_failed() is True
assert isinstance(scenario_result.started_at, float)
assert isinstance(scenario_result.ended_at, float)
step_results = scenario_result.step_results
assert dispatcher_.mock_calls == [
call.fire(ScenarioRunEvent(scenario_result)),
call.fire(StepRunEvent(step_results[0])),
call.fire(ExceptionRaisedEvent(step_results[0].exc_info)),
call.fire(StepFailedEvent(step_results[0])),
call.fire(ScenarioFailedEvent(scenario_result)),
]
@pytest.mark.asyncio
async def test_runner_run_scenario_multiple_steps_passed(*, runner: Runner,
dispatcher_: Dispatcher):
with given:
scenario_ = Mock(Scenario, __file__="/tmp/scenario.py",
first_step=Mock(return_value=None),
second_step=Mock(return_value=None))
first_step = VirtualStep(scenario_.first_step)
second_step = VirtualStep(scenario_.second_step)
virtual_scenario = VirtualScenario(scenario_, [first_step, second_step])
with when:
scenario_result = await runner.run_scenario(virtual_scenario)
with then:
assert scenario_result.is_passed() is True
assert isinstance(scenario_result.started_at, float)
assert isinstance(scenario_result.ended_at, float)
first_step_result = scenario_result.step_results[0]
second_step_result = scenario_result.step_results[1]
assert dispatcher_.mock_calls == [
call.fire(ScenarioRunEvent(scenario_result)),
call.fire(StepRunEvent(first_step_result)),
call.fire(StepPassedEvent(first_step_result)),
call.fire(StepRunEvent(second_step_result)),
call.fire(StepPassedEvent(second_step_result)),
call.fire(ScenarioPassedEvent(scenario_result)),
]
@pytest.mark.asyncio
async def test_runner_run_scenario_multiple_steps_failed():
with given:
dispatcher = AsyncMock(Dispatcher)
runner = Runner(dispatcher)
exception = AssertionError()
scenario_ = Mock(Scenario, __file__="/tmp/scenario.py",
first_step=Mock(return_value=None),
second_step=Mock(side_effect=exception),
third_step=Mock(return_value=None))
first_step = VirtualStep(scenario_.first_step)
second_step = VirtualStep(scenario_.second_step)
third_step = VirtualStep(scenario_.third_step)
scenario = VirtualScenario(scenario_, [first_step, second_step, third_step])
with when:
scenario_result = await runner.run_scenario(scenario)
with then:
assert scenario_result.is_failed() is True
assert isinstance(scenario_result.started_at, float)
assert isinstance(scenario_result.ended_at, float)
first_step_result = scenario_result.step_results[0]
second_step_result = scenario_result.step_results[1]
assert dispatcher.mock_calls == [
call.fire(ScenarioRunEvent(scenario_result)),
call.fire(StepRunEvent(first_step_result)),
call.fire(StepPassedEvent(first_step_result)),
call.fire(StepRunEvent(second_step_result)),
call.fire(ExceptionRaisedEvent(second_step_result.exc_info)),
call.fire(StepFailedEvent(second_step_result)),
call.fire(ScenarioFailedEvent(scenario_result)),
]
@pytest.mark.asyncio
async def test_runner_interrupted_scenario(*, dispatcher_: Dispatcher):
with given:
interrupt_exception = KeyboardInterrupt
runner = Runner(dispatcher_, interrupt_exceptions=(interrupt_exception,))
step_ = Mock(side_effect=interrupt_exception)
scenario_ = Mock(Scenario, step=step_, __file__="/tmp/scenario.py")
virtual_scenario = VirtualScenario(scenario_, [VirtualStep(step_)])
with when, raises(BaseException) as exception:
await runner.run_scenario(virtual_scenario)
with then:
assert exception.type is interrupt_exception
| 35.693727 | 98 | 0.69389 | 1,049 | 9,673 | 6.080076 | 0.086749 | 0.079022 | 0.032926 | 0.028222 | 0.848855 | 0.814675 | 0.796174 | 0.712135 | 0.697554 | 0.648479 | 0 | 0.001456 | 0.21896 | 9,673 | 270 | 99 | 35.825926 | 0.842753 | 0 | 0 | 0.625616 | 0 | 0 | 0.015817 | 0 | 0 | 0 | 0 | 0 | 0.187192 | 1 | 0.009852 | false | 0.08867 | 0.049261 | 0.009852 | 0.068966 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
e082b8fd0e8142bceca268718bb070271419def5 | 27 | py | Python | print('hello world').py | BradLyman/MafiaGame | c6f14a303b126886e8593ed91d034650cb7d82b0 | [
"MIT"
] | null | null | null | print('hello world').py | BradLyman/MafiaGame | c6f14a303b126886e8593ed91d034650cb7d82b0 | [
"MIT"
] | null | null | null | print('hello world').py | BradLyman/MafiaGame | c6f14a303b126886e8593ed91d034650cb7d82b0 | [
"MIT"
] | null | null | null | print('hello stupid Brad.') | 27 | 27 | 0.740741 | 4 | 27 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 27 | 1 | 27 | 27 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
e08cf6eb7f3ec9dc6ba30a19c42910ee06b33e5d | 29 | py | Python | worker/__init__.py | Alan-Rick/flask_template_two | 053b611c687eeee874b941daf4237eec5524ee96 | [
"MIT"
] | null | null | null | worker/__init__.py | Alan-Rick/flask_template_two | 053b611c687eeee874b941daf4237eec5524ee96 | [
"MIT"
] | null | null | null | worker/__init__.py | Alan-Rick/flask_template_two | 053b611c687eeee874b941daf4237eec5524ee96 | [
"MIT"
] | null | null | null | from .celery_worker import *
| 14.5 | 28 | 0.793103 | 4 | 29 | 5.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e0bf22ecd82f86fc5e036d4827e425cc9f3a0dfd | 167 | py | Python | src/combine.py | pranayjoshi/Medico | 2508a39d58eec50f5e94f3c878c00f599fff6629 | [
"MIT"
] | 13 | 2020-09-04T09:16:15.000Z | 2021-01-27T07:03:12.000Z | src/combine.py | bhargavaganti/Medico | 9059c59f49211f48a27805a00807121ac6f27b27 | [
"MIT"
] | 1 | 2020-10-04T03:23:45.000Z | 2020-10-04T03:23:45.000Z | src/combine.py | bhargavaganti/Medico | 9059c59f49211f48a27805a00807121ac6f27b27 | [
"MIT"
] | 2 | 2020-11-27T12:25:10.000Z | 2022-01-11T06:25:33.000Z | import sys
sys.path.append("./src/AI_files")
sys.path.append("./src/Speech_process")
import speech_to_text
import detector
speech_to_text.run()
print(detector.run()) | 20.875 | 39 | 0.778443 | 27 | 167 | 4.592593 | 0.518519 | 0.112903 | 0.209677 | 0.258065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065868 | 167 | 8 | 40 | 20.875 | 0.794872 | 0 | 0 | 0 | 0 | 0 | 0.202381 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.428571 | 0 | 0.428571 | 0.142857 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
1cb65ed814911faaacc978cdeca369e6ad621eff | 47 | py | Python | android/red.py | Abdulhadi5692HDI/BIGTEXT | af5e9ac9d89d8a3719f2535129e73ed553043db4 | [
"Unlicense"
] | null | null | null | android/red.py | Abdulhadi5692HDI/BIGTEXT | af5e9ac9d89d8a3719f2535129e73ed553043db4 | [
"Unlicense"
] | null | null | null | android/red.py | Abdulhadi5692HDI/BIGTEXT | af5e9ac9d89d8a3719f2535129e73ed553043db4 | [
"Unlicense"
] | null | null | null | from colorama import Fore
print(Fore.RED + "")
| 15.666667 | 25 | 0.723404 | 7 | 47 | 4.857143 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148936 | 47 | 2 | 26 | 23.5 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
1cd499a19627451cb8cb27576706d997a3ee807b | 51 | py | Python | Class Notes/Flask_app/compute.py | alannanoguchi/DS-2.3-Data-Science-in-Production | df0ebef3db963d848a7a8fdc585da769dcb2c865 | [
"MIT"
] | null | null | null | Class Notes/Flask_app/compute.py | alannanoguchi/DS-2.3-Data-Science-in-Production | df0ebef3db963d848a7a8fdc585da769dcb2c865 | [
"MIT"
] | null | null | null | Class Notes/Flask_app/compute.py | alannanoguchi/DS-2.3-Data-Science-in-Production | df0ebef3db963d848a7a8fdc585da769dcb2c865 | [
"MIT"
] | null | null | null | import math
def compute(r):
return math.sin(r) | 12.75 | 22 | 0.686275 | 9 | 51 | 3.888889 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.196078 | 51 | 4 | 22 | 12.75 | 0.853659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
1cf84206043ce917f0b51b9586a1a238a7c2d6dc | 2,108 | py | Python | Ago-Dic-2020/flores-fernandez-fernando/Practicas/Practica-5/Test_Practica-5_strategy.py | bryanbalderas/DAS_Sistemas | 1e31f088c0de7134471025a5730b0abfc19d936e | [
"MIT"
] | 41 | 2017-09-26T09:36:32.000Z | 2022-03-19T18:05:25.000Z | Ago-Dic-2020/flores-fernandez-fernando/Practicas/Practica-5/Test_Practica-5_strategy.py | bryanbalderas/DAS_Sistemas | 1e31f088c0de7134471025a5730b0abfc19d936e | [
"MIT"
] | 67 | 2017-09-11T05:06:12.000Z | 2022-02-14T04:44:04.000Z | Ago-Dic-2020/flores-fernandez-fernando/Practicas/Practica-5/Test_Practica-5_strategy.py | bryanbalderas/DAS_Sistemas | 1e31f088c0de7134471025a5730b0abfc19d936e | [
"MIT"
] | 210 | 2017-09-01T00:10:08.000Z | 2022-03-19T18:05:12.000Z | import unittest
from Practica5_strategy import *
class StrategyTest(unittest.TestCase):
def test_basic_auth_strategy(self):
context = AuthContext(BasicAuthConcreteStrategy(usr='tintin', passwd='123456'))
self.assertEqual(
context.authenticate(),
'### Authenticated with Basic Auth\n\tUser: tintin\n\tPass: 123456'
)
def test_oauth_strategy(self):
cred = {
"access_token": "una cadena muy larga",
"token_type": "Bearer",
"expires_in": 3600,
"refresh_token": "una cadena muy larga 2",
"scope": "readAndWrite"
}
context = AuthContext(OauthAuthConcreteStrategy(credentials=cred))
self.assertEqual(
context.authenticate(),
'### Authenticated with OAuth\n\tCredentials: {"access_token":"una cadena muy larga","token_type":"Bearer","expires_in":3600,"refresh_token":"una cadena muy larga 2","scope":"readAndWrite"}'
)
def test_api_key_strategy(self):
context = AuthContext(ApiKeyConcreteStrategy(api_key='tintin-123456'))
self.assertEqual(
context.authenticate(),
'### Authenticated with API Key\n\tKey: tintin-123456'
)
def test_default_strategy(self):
self.assertEqual(
AuthContext().authenticate(),
'### Authenticated with OAuth\n\tCredentials: {"access_token":"una cadena muy larga","token_type":"Bearer","expires_in":3600,"refresh_token":"una cadena muy larga 2","scope":"default"}'
)
def test_updating_strategy(self):
context = AuthContext(BasicAuthConcreteStrategy(usr='tintin', passwd='123456'))
self.assertEqual(
context.authenticate(),
'### Authenticated with Basic Auth\n\tUser: tintin\n\tPass: 123456'
)
context.set_strategy(ApiKeyConcreteStrategy(api_key='tintin-123456'))
self.assertEqual(
context.authenticate(),
'### Authenticated with API Key\n\tKey: tintin-123456'
)
if __name__ == "__main__":
unittest.main() | 39.773585 | 202 | 0.63093 | 210 | 2,108 | 6.161905 | 0.271429 | 0.069552 | 0.134467 | 0.078825 | 0.742658 | 0.742658 | 0.725657 | 0.725657 | 0.725657 | 0.725657 | 0 | 0.040151 | 0.243833 | 2,108 | 53 | 203 | 39.773585 | 0.771644 | 0 | 0 | 0.369565 | 0 | 0.043478 | 0.366524 | 0.105737 | 0 | 0 | 0 | 0 | 0.130435 | 1 | 0.108696 | false | 0.086957 | 0.043478 | 0 | 0.173913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
e8304bc90b617140933d9402630552eae52716b4 | 121 | py | Python | python/faceonnx/__init__.py | QuantumLiu/FaceONNX | cc630020f51f0e8b05e9839c58aa4bd1ac040409 | [
"MIT"
] | 22 | 2021-08-02T05:09:13.000Z | 2022-03-23T18:44:10.000Z | python/faceonnx/__init__.py | QuantumLiu/FaceONNX | cc630020f51f0e8b05e9839c58aa4bd1ac040409 | [
"MIT"
] | 6 | 2021-10-02T22:17:58.000Z | 2022-03-27T01:42:44.000Z | python/faceonnx/__init__.py | QuantumLiu/FaceONNX | cc630020f51f0e8b05e9839c58aa4bd1ac040409 | [
"MIT"
] | 7 | 2021-08-10T02:41:26.000Z | 2022-03-23T18:40:10.000Z | from .engine import *
from .imaging import *
from .landmarks import *
from .embeddings import *
from .rectangles import * | 24.2 | 25 | 0.760331 | 15 | 121 | 6.133333 | 0.466667 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157025 | 121 | 5 | 26 | 24.2 | 0.901961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1c2927cf1a445c9ca44b216f8f7d15432031eac4 | 3,208 | py | Python | test/test_epifm.py | ecell/scopyon | 99436fbfd34bb684966846eba75b206c2806f69c | [
"BSD-3-Clause"
] | 6 | 2018-12-24T16:20:55.000Z | 2021-06-12T20:50:04.000Z | test/test_epifm.py | ecell/bioimaging | 99436fbfd34bb684966846eba75b206c2806f69c | [
"BSD-3-Clause"
] | 9 | 2019-03-03T15:30:37.000Z | 2020-08-27T05:48:33.000Z | test/test_epifm.py | ecell/scopyon | 99436fbfd34bb684966846eba75b206c2806f69c | [
"BSD-3-Clause"
] | 3 | 2019-03-05T22:51:38.000Z | 2020-02-03T13:58:48.000Z | import unittest
class TestEPIFM(unittest.TestCase):
def setUp(self):
self.radial_cutoff = 1000.0e-9
self.radial_resolution = 1.0e-9
def tearDown(self):
pass
def test1(self):
import scopyon._epifm
def test2(self):
print('Testing TRITC ...')
import numpy
from scopyon._epifm import PointSpreadingFunction
psf = PointSpreadingFunction(psf_radial_cutoff=self.radial_cutoff, psf_radial_width=None, psf_depth_cutoff=1000.0e-9, fluorophore_type="Tetramethylrhodamine(TRITC)", psf_wavelength=5.78e-07)
depth = 0.0
radial = numpy.arange(0.0, self.radial_cutoff, self.radial_resolution, dtype=float)
psf_r = psf.get_distribution(radial, depth)
self.assertIs(type(psf_r), numpy.ndarray)
self.assertEqual(psf_r.ndim, 1)
self.assertEqual(psf_r.size, radial.size)
self.assertTrue((psf_r >= 0.0).all())
tot_r = numpy.sum(2 * numpy.pi * radial * psf_r) * self.radial_resolution
print(f'Integral of radial distribution = {tot_r}')
psf_cart = psf.radial_to_cartesian(radial, psf_r, self.radial_cutoff, self.radial_resolution)
tot_cart = psf_cart.sum() * (self.radial_resolution * self.radial_resolution)
print(f'Integral of cartesian distribution = {tot_cart}')
camera = numpy.zeros((512, 512))
pixel_length = 4.444444444444444e-08
psf.overlay_signal_(camera, psf_cart, numpy.zeros(3, dtype=float), pixel_length, self.radial_resolution, 1.0)
# tot_camera = camera.sum() * (self.radial_resolution * self.radial_resolution)
tot_camera = camera.sum()
print(f'Integral of detected = {tot_camera}')
def test3(self):
print('Testing Gaussian ...')
import numpy
from scopyon._epifm import PointSpreadingFunction
psf = PointSpreadingFunction(psf_radial_cutoff=self.radial_cutoff, psf_radial_width=1.0e-7, psf_depth_cutoff=1000.0e-9, fluorophore_type="Gaussian", psf_wavelength=6.0e-7)
depth = 0.0
radial = numpy.arange(0.0, self.radial_cutoff, self.radial_resolution, dtype=float)
psf_r = psf.get_distribution(radial, depth)
self.assertIs(type(psf_r), numpy.ndarray)
self.assertEqual(psf_r.ndim, 1)
self.assertEqual(psf_r.size, radial.size)
self.assertTrue((psf_r >= 0.0).all())
tot_r = numpy.sum(2 * numpy.pi * radial * psf_r) * self.radial_resolution
print(f'Integral of radial distribution = {tot_r}')
psf_cart = psf.radial_to_cartesian(radial, psf_r, self.radial_cutoff, self.radial_resolution)
tot_cart = psf_cart.sum() * (self.radial_resolution * self.radial_resolution)
print(f'Integral of cartesian distribution = {tot_cart}')
camera = numpy.zeros((512, 512))
pixel_length = 4.444444444444444e-08
psf.overlay_signal_(camera, psf_cart, numpy.zeros(3, dtype=float), pixel_length, self.radial_resolution, 1.0)
# tot_camera = camera.sum() * (self.radial_resolution * self.radial_resolution)
tot_camera = camera.sum()
print(f'Integral of detected = {tot_camera}')
if __name__ == '__main__':
unittest.main()
| 42.773333 | 198 | 0.679239 | 425 | 3,208 | 4.894118 | 0.190588 | 0.115385 | 0.163462 | 0.063462 | 0.85 | 0.85 | 0.85 | 0.85 | 0.815385 | 0.815385 | 0 | 0.041225 | 0.206047 | 3,208 | 74 | 199 | 43.351351 | 0.775422 | 0.048317 | 0 | 0.690909 | 0 | 0 | 0.106885 | 0.008852 | 0 | 0 | 0 | 0 | 0.145455 | 1 | 0.090909 | false | 0.018182 | 0.109091 | 0 | 0.218182 | 0.145455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1c47869bfa0f88eba2e94f57df3c36bcb2331ede | 404 | py | Python | server/src/prefect_server/utilities/__init__.py | louisditzel/prefect | b1a02fee623b965e756a38aa09059db780ab67eb | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2020-05-10T14:32:32.000Z | 2020-05-10T14:32:32.000Z | server/src/prefect_server/utilities/__init__.py | louisditzel/prefect | b1a02fee623b965e756a38aa09059db780ab67eb | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2022-02-14T11:25:57.000Z | 2022-02-27T16:25:14.000Z | server/src/prefect_server/utilities/__init__.py | louisditzel/prefect | b1a02fee623b965e756a38aa09059db780ab67eb | [
"ECL-2.0",
"Apache-2.0"
] | 1 | 2020-05-31T04:42:56.000Z | 2020-05-31T04:42:56.000Z | # Licensed under the Prefect Community License, available at
# https://www.prefect.io/legal/prefect-community-license
import prefect_server.utilities.context
import prefect_server.utilities.exceptions
import prefect_server.utilities.graphql
import prefect_server.utilities.logging
import prefect_server.utilities.names
import prefect_server.utilities.tests
import prefect_server.utilities.asynchronous
# File: toontown/pets/PetDCImportsAI.py (MasterLoopyBM/Toontown, license: MIT)
if hasattr(simbase, 'wantPets') and simbase.wantPets:
import DistributedPetAI
import DistributedPetUD
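The guard above imports the pet modules only when the engine exposes a truthy `wantPets` flag. A minimal stand-alone sketch of the same pattern — here `simbase` is a stand-in namespace, and the module names are merely recorded rather than imported:

```python
import types

simbase = types.SimpleNamespace(wantPets=True)  # stand-in for the engine-provided global

loaded = []
if hasattr(simbase, 'wantPets') and simbase.wantPets:
    # In the real file these are actual imports; names kept for illustration only
    loaded.extend(['DistributedPetAI', 'DistributedPetUD'])
```

Because `hasattr` is evaluated first and `and` short-circuits, the guard is safe even when the flag was never defined on `simbase`.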
# File: mllib/supervised/__init__.py (posterrieri/mllib, license: MIT)
from .parametric import LinearRegression
from .parametric import LogisticRegressionClassifier
from .non_parametric import kNearestNeighbors
__all__ = ['LinearRegression',
'LogisticRegressionClassifier',
           'kNearestNeighbors']
# File: p8_test/test_local/test_eta1.py (crazynayan/tpf1, license: MIT)
from p1_utils.data_type import DataType
from p8_test.test_local import TestDebug
class Eta1Test(TestDebug):
DEBUG_DATA = list()
SEGMENT = "ETA1"
def test_eta1_vanilla(self):
test_data = self.tpf_server.run("ETA1", self.test_data)
self.output = test_data.output
self.assertEqual("ETAX265.6", self.output.last_line, self.output.last_node)
self.assertIn("UNABLE TO END TRANSACTION - NO PNR PRESENT IN WORK AREA", self.output.messages)
self.assertEqual(list(), self.output.dumps)
def test_eta1_el_restricted(self):
self.test_data.add_fields([("EBW000", 10)], "EB0EB")
self.test_data.set_field("MI0ACC", DataType("C", input="EL").to_bytes())
self.test_data.set_field("WA0FNS", DataType("X", input="10").to_bytes())
self.test_data.set_field("WA0UB4", DataType("X", input="08").to_bytes())
test_data = self.tpf_server.run("ETA1", self.test_data)
self.output = test_data.output
self.assertEqual("$$UIO1$$.2", self.output.last_line, self.output.last_node)
self.assertIn("RESTRICTED" + 40 * " ", self.output.messages)
def test_eta1_e_no_error(self):
self.test_data.set_field("MI0ACC", DataType("C", input="E").to_bytes())
self.test_data.set_field("WA0FNS", DataType("X", input="10").to_bytes())
self.test_data.set_field("WA0UB4", DataType("X", input="08").to_bytes())
test_data = self.tpf_server.run("ETA1", self.test_data)
self.output = test_data.output
self.assertEqual("ETAX265.6", self.output.last_line, self.output.last_node)
self.assertIn("UNABLE TO END TRANSACTION - NO PNR PRESENT IN WORK AREA", self.output.messages)
def test_eta1_el_plus_off_queue(self):
self.test_data.set_field("MI0ACC", DataType("C", input="EL+").to_bytes())
test_data = self.tpf_server.run("ETA1", self.test_data)
self.output = test_data.output
self.assertEqual("$$UIO1$$.2", self.output.last_line, self.output.last_node)
off_queue = "CANNOT DO THIS IF OFF QUEUE"
self.assertIn(off_queue + (50 - len(off_queue)) * " ", self.output.messages)
def test_eta1_el_off_queue(self):
self.test_data.set_field("MI0ACC", DataType("C", input="EL").to_bytes())
test_data = self.tpf_server.run("ETA1", self.test_data)
self.output = test_data.output
self.assertEqual("ETAX265.6", self.output.last_line, self.output.last_node)
self.assertIn("UNABLE TO END TRANSACTION - NO PNR PRESENT IN WORK AREA", self.output.messages)
# File: tests/test_bpf_idea.py (ulugbekna/angr-platforms, license: BSD-2-Clause)
import nose
import os
import angr
from angr_platforms.bpf import *
from angr_platforms.bpf.lift_bpf import MAX_INSTR_ID
TEST_PROGRAMS_BASE = str(os.path.join(os.path.dirname(os.path.realpath(__file__)), '..', 'test_programs', 'bpf'))
def test_idea_correct_flag():
idea_bpf = os.path.join(TEST_PROGRAMS_BASE, 'idea.bpf')
proj = angr.Project(idea_bpf, main_opts={'backend': 'bpf'})
assert proj.arch.name == 'BPF'
state = proj.factory.entry_state()
simgr = proj.factory.simulation_manager(state)
# Initialize the state with the correct flag
flag = "w0w_y0u_are_Master-0F-secc0mp///>_w_<///"
# the syscall number must be 0x1337
state.memory.store(proj.arch.DATA_BASE, 0x1337, endness='Iend_LE')
# input variables
for i in range(0, len(flag), 4):
state.memory.store(proj.arch.DATA_BASE + 0x10 + i, state.solver.BVV(ord(flag[i]), 8))
state.memory.store(proj.arch.DATA_BASE + 0x10 + i + 1, state.solver.BVV(ord(flag[i+1]), 8))
state.memory.store(proj.arch.DATA_BASE + 0x10 + i + 2, state.solver.BVV(ord(flag[i+2]), 8))
state.memory.store(proj.arch.DATA_BASE + 0x10 + i + 3, state.solver.BVV(ord(flag[i+3]), 8))
# Execute until it returns
simgr.explore(find=(MAX_INSTR_ID * 8,))
nose.tools.assert_equal(len(simgr.found), 1)
nose.tools.assert_equal(simgr.found[0].history.addr, 4058 * 8) # executed until "ret ALLOW"
nose.tools.assert_equal(simgr.found[0].regs._res._model_concrete.value, 1) # the result is ALLOW
def test_idea_incorrect_flag():
idea_bpf = os.path.join(TEST_PROGRAMS_BASE, 'idea.bpf')
proj = angr.Project(idea_bpf, main_opts={'backend': 'bpf'})
assert proj.arch.name == 'BPF'
state = proj.factory.entry_state()
simgr = proj.factory.simulation_manager(state)
# Initialize the state with the incorrect flag
flag = "w0w_y0u_are_Master-0F-secc0mp///>_w_<//\\"
# the syscall number must be 0x1337
state.memory.store(proj.arch.DATA_BASE, 0x1337, endness='Iend_LE')
# input variables
for i in range(0, len(flag), 4):
state.memory.store(proj.arch.DATA_BASE + 0x10 + i, state.solver.BVV(ord(flag[i]), 8))
state.memory.store(proj.arch.DATA_BASE + 0x10 + i + 1, state.solver.BVV(ord(flag[i+1]), 8))
state.memory.store(proj.arch.DATA_BASE + 0x10 + i + 2, state.solver.BVV(ord(flag[i+2]), 8))
state.memory.store(proj.arch.DATA_BASE + 0x10 + i + 3, state.solver.BVV(ord(flag[i+3]), 8))
# Execute until it returns
simgr.explore(find=(MAX_INSTR_ID * 8,))
nose.tools.assert_equal(len(simgr.found), 1)
nose.tools.assert_equal(simgr.found[0].history.addr, 4045 * 8) # executed until "ret DENY"
nose.tools.assert_equal(simgr.found[0].regs._res._model_concrete.value, 0) # the result is DENY
def main():
test_idea_correct_flag()
test_idea_incorrect_flag()
if __name__ == '__main__':
main()
# File: pokemongo_bot/walkers/__init__.py (islanderman/PokemonGo-Bot, license: MIT)
from polyline_generator import Polyline
from polyline_walker import PolylineWalker
# File: Python/Scheduling Algorithms/fcfs.py (shruti8301/Algorithms-Cheatsheet-Resources, license: MIT)
import numpy as np  # Contains the method to calculate Standard Deviation
def fcfs(process):
case = int(input('Without Interrupt (1) or With Interrupt (2): '))
# If the relative arrival time is 0 this sort according to priority
for i in range(1, len(process)):
if process[i][1] == '0':
if process[i][5] < process[i - 1][5]:
j = i
                while j >= 1 and process[j][5] < process[j - 1][5]:
                    process[j], process[j - 1] = process[j - 1], process[j]
                    j -= 1
# Case for without interrupt
if case == 1:
print('Process\tTurnaround Time\t\tWaiting Time\tCompletion Time')
print(process[0][0], "\t\t", int(process[0][2]) - int(process[0][1]), "\t\t\t", 0, "\t\t", int(process[0][2]) + int(process[0][1]))
arrivalTime = 0 # Arrival Time of every process
completionTime = int(process[0][2]) # Overall completion time
turnAround = [int(process[0][2]) - int(process[0][1])] # Stores Turnaround time of every process
waiting = [0] # Stores Waiting Time of every process
for i in range(1, len(process)):
arrivalTime += int(process[i][1]) # Arrival time of the current process
completionTime = int(completionTime + max(0,int(process[i][1])- int(process[i-1][2]))+int(process[i][2])) # Completion time of the current process
#totalCompletion += int(process[i][2]) # Total completion time till current process
turnAroundTime = completionTime - arrivalTime # Turnaround Time of the current process
turnAround.append(turnAroundTime) # Adding Turnaround time of the current process to the list
waiting.append(max(0, turnAroundTime - int(process[i][2]))) # Adding waiting time of the current process to the list
print(process[i][0], "\t\t", turnAroundTime, "\t\t\t", waiting[i], "\t\t", completionTime)
print('Average Waiting Time : ', sum(waiting) / len(waiting)) # Printing average waiting time
print('Standard Deviation of Turnaround Time : ', np.std(turnAround)) # Printing Standard Deviation of the Turnaround time
# Case for with interrupt
elif case == 2:
print('Process\tTurnaround Time\t\tWaiting Time\tCompletion Time')
print(process[0][0], "\t\t", int(process[0][2]) + float(process[0][3]) + float(process[0][4]) - int(process[0][1]), "\t\t\t", float(process[0][3]) + float(process[0][4]), "\t\t", int(process[0][2]) + float(process[0][3]) + float(process[0][4]) + int(process[0][1]))
arrivalTime = 0 # Arrival Time of every process
completionTime = int(process[0][2]) + float(process[0][3]) + float(process[0][4]) # Overall completion time
turnAround = [int(process[0][2]) + float(process[0][3]) + float(process[0][4]) - int(process[0][1])] # Stores Turnaround time of every process
waiting = [float(process[0][3]) + float(process[0][4])] # Stores Waiting Time of every process
for i in range(1, len(process)):
arrivalTime += int(process[i][1]) # Arrival time of the current process
completionTime = float(completionTime + max(0,int(process[i][1])-int(process[i-1][2]) - float(process[i-1][3]) - float(process[i-1][4])) + float(process[i][2]) + float(process[i][3]) + float(process[i][4])) # Completion time of the current process
turnAroundTime = completionTime - arrivalTime # Turnaround Time of the current process
            turnAround.append(turnAroundTime) # Adding Turnaround time of the current process to the list
            waiting.append(max(0, turnAroundTime - (int(process[i][2])))) # Adding waiting time of the current process to the list
print(process[i][0], "\t\t", turnAroundTime, "\t\t\t", waiting[i], "\t\t", completionTime)
print('Average Waiting Time : ', sum(waiting) / len(waiting)) # Printing average waiting time
print('Standard Deviation of Turnaround Time : ', np.std(turnAround)) # Printing Standard Deviation of the Turnaround time
# Case for invalid input
else:
print('Invalid Input\nGive Valid input')
fcfs(process)
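The bookkeeping in the function above can be sketched non-interactively. This is a simplified illustration under stated assumptions — absolute (not relative) arrival times, no I/O or interrupt overhead — and `fcfs_times` is a hypothetical helper, not part of the file:

```python
def fcfs_times(bursts, arrivals):
    """FCFS completion, turnaround, and waiting times.

    bursts[i]   -- CPU burst of process i
    arrivals[i] -- absolute arrival time of process i (sorted ascending)
    """
    clock = 0
    completion, turnaround, waiting = [], [], []
    for burst, arrival in zip(bursts, arrivals):
        clock = max(clock, arrival) + burst      # CPU idles until the process arrives
        completion.append(clock)
        turnaround.append(clock - arrival)       # turnaround = completion - arrival
        waiting.append(clock - arrival - burst)  # waiting = turnaround - burst
    return completion, turnaround, waiting

c, t, w = fcfs_times([5, 3, 8], [0, 1, 2])
# c == [5, 8, 16], t == [5, 7, 14], w == [0, 4, 6]
```

The `max(clock, arrival)` term plays the same role as the `max(0, ...)` gap handling in the loop above, which works with relative arrival times instead.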
# File: flask_app/__init__.py (marilynwaldman/flask_gunicorn_nginx_docker, license: MIT)
from flask_app.app import app as application
# File: src/bnn_priors/bnn_priors/data/__init__.py (activatedgeek/uncertainty-da-bayesian-classification, license: Apache-2.0)
from .base import *
from .toy_data import *
from .UCI.uci import *
from .CIFAR.cifar import *
from .MNIST.mnist import *
# File: src/masonite/filesystem/providers/__init__.py (cercos/masonite, license: MIT)
from .StorageProvider import StorageProvider
# File: test_scripts/ns_instance/duan/service/vfc/nfvo/lcm/lcm/ns/tests/test_query_subscriptions.py (lremember/VFC, license: MIT)
# Copyright (c) 2019, CMCC Technologies Co., Ltd.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
from django.test import TestCase
from rest_framework import status
from rest_framework.test import APIClient
from lcm.pub.database.models import SubscriptionModel
from lcm.ns.tests import SUBSCRIPTION_DICT
class TestQuerySubscriptions(TestCase):
def setUp(self):
self.client = APIClient()
self.test_single_subscription = SUBSCRIPTION_DICT.copy()
self.subscription_id = "99442b18-a5c7-11e8-998c-bf1755941f16"
self.ns_instance_id = "cd552c9c-ab6f-11e8-b354-236c32aa91a1"
SubscriptionModel.objects.all().delete()
def tearDown(self):
pass
def test_get_subscriptions(self):
ns_instance_filter = {
"nsdIds": [],
"nsInstanceIds": [self.ns_instance_id],
"nsInstanceNames": []
}
links = {
"self": {
"href": "/api/v1/subscriptions/99442b18-a5c7-11e8-998c-bf1755941f16"
}
}
SubscriptionModel(
subscription_id=self.subscription_id,
callback_uri="http://aurl.com",
auth_info="{}",
notification_types="['NsLcmOperationOccurrenceNotification']",
operation_types="['INSTANTIATE']",
operation_states="['STARTING']",
links=json.dumps(links),
ns_instance_filter=json.dumps(ns_instance_filter)).save()
response = self.client.get("/api/nslcm/v1/subscriptions", format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual([self.test_single_subscription], response.data)
def test_get_subscriptions_with_ns_instance_id(self):
ns_instance_filter = {
"nsdIds": [],
"nsInstanceIds": [self.ns_instance_id],
"nsInstanceNames": []
}
links = {
"self": {
"href": "/api/v1/subscriptions/99442b18-a5c7-11e8-998c-bf1755941f16"
}
}
SubscriptionModel(
subscription_id=self.subscription_id,
callback_uri="http://aurl.com",
auth_info="{}",
notification_types="['NsLcmOperationOccurrenceNotification']",
operation_types="['INSTANTIATE']",
operation_states="['STARTING']",
links=json.dumps(links),
ns_instance_filter=json.dumps(ns_instance_filter)).save()
dummy_ns_id = "584b35e2-b2a2-11e8-8e11-645106374fd3"
dummy_subscription_id = "947dcd2c-b2a2-11e8-b365-645106374fd4"
ns_instance_filter["nsInstanceIds"].append(dummy_ns_id)
SubscriptionModel(
subscription_id=dummy_subscription_id,
callback_uri="http://aurl.com",
auth_info="{}",
notification_types="['NsLcmOperationOccurrenceNotification']",
operation_types="['INSTANTIATE']",
operation_states="['STARTING']",
links=json.dumps(links),
ns_instance_filter=json.dumps(ns_instance_filter)).save()
response = self.client.get("/api/nslcm/v1/subscriptions?nsInstanceId=" + dummy_ns_id, format='json')
expected_response = self.test_single_subscription.copy()
expected_response["id"] = dummy_subscription_id
expected_response["filter"]["nsInstanceSubscriptionFilter"]["nsInstanceIds"] = ns_instance_filter["nsInstanceIds"]
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual([expected_response], response.data)
def test_get_subscriptions_with_unknown_ns_instance_id(self):
ns_instance_filter = {
"nsdIds": [],
"nsInstanceIds": [self.ns_instance_id],
"nsInstanceNames": []
}
links = {
"self": {
"href": "/api/v1/subscriptions/99442b18-a5c7-11e8-998c-bf1755941f16"
}
}
SubscriptionModel(
subscription_id=self.subscription_id,
callback_uri="http://aurl.com",
auth_info="{}",
notification_types="['NsLcmOperationOccurrenceNotification']",
operation_types="['INSTANTIATE']",
operation_states="['STARTING']",
links=json.dumps(links),
ns_instance_filter=json.dumps(ns_instance_filter)).save()
response = self.client.get("/api/nslcm/v1/subscriptions?nsInstanceId=dummy", format='json')
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual([], response.data)
def test_get_subscriptions_with_invalid_filter(self):
ns_instance_filter = {
"nsdIds": [],
"nsInstanceIds": [self.ns_instance_id],
"nsInstanceNames": []
}
links = {
"self": {
"href": "/api/v1/subscriptions/99442b18-a5c7-11e8-998c-bf1755941f16"
}
}
SubscriptionModel(
subscription_id=self.subscription_id,
callback_uri="http://aurl.com",
auth_info="{}",
notification_types="['NsLcmOperationOccurrenceNotification']",
operation_types="['INSTANTIATE']",
operation_states="['STARTING']",
links=json.dumps(links),
ns_instance_filter=json.dumps(ns_instance_filter)).save()
response = self.client.get("/api/nslcm/v1/subscriptions?dummy=dummy", format='json')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_get_subscriptions_with_operation_type_filter(self):
ns_instance_filter = {
"nsdIds": [],
"nsInstanceIds": [self.ns_instance_id],
"nsInstanceNames": []
}
links = {
"self": {
"href": "/api/v1/subscriptions/99442b18-a5c7-11e8-998c-bf1755941f16"
}
}
SubscriptionModel(
subscription_id=self.subscription_id,
callback_uri="http://aurl.com",
auth_info="{}",
notification_types="['NsLcmOperationOccurrenceNotification']",
operation_types="['INSTANTIATE']",
operation_states="['STARTING']",
links=json.dumps(links),
ns_instance_filter=json.dumps(ns_instance_filter)).save()
dummy_ns_id = "584b35e2-b2a2-11e8-8e11-645106374fd3"
dummy_subscription_id = "947dcd2c-b2a2-11e8-b365-645106374fd4"
ns_instance_filter["nsInstanceIds"].append(dummy_ns_id)
SubscriptionModel(
subscription_id=dummy_subscription_id,
callback_uri="http://aurl.com",
auth_info="{}",
notification_types="['NsLcmOperationOccurrenceNotification']",
operation_types="['SCALE']",
operation_states="['STARTING']",
links=json.dumps(links),
ns_instance_filter=json.dumps(ns_instance_filter)).save()
response = self.client.get("/api/nslcm/v1/subscriptions?operationTypes=SCALE", format='json')
expected_response = self.test_single_subscription.copy()
expected_response["id"] = dummy_subscription_id
expected_response["filter"]["nsInstanceSubscriptionFilter"]["nsInstanceIds"] = ns_instance_filter["nsInstanceIds"]
expected_response["filter"]["operationTypes"] = ["SCALE"]
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual([expected_response], response.data)
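The fixtures above persist `ns_instance_filter` and `links` as JSON strings via `json.dumps`. A minimal sketch of that serialize/deserialize round-trip, with values copied from the test data:

```python
import json

ns_instance_filter = {
    "nsdIds": [],
    "nsInstanceIds": ["cd552c9c-ab6f-11e8-b354-236c32aa91a1"],
    "nsInstanceNames": [],
}

stored = json.dumps(ns_instance_filter)  # what the model's text column holds
restored = json.loads(stored)            # what a query handler parses back
```

The round-trip is lossless for the dict-of-lists shape used here, which is why the tests can compare the parsed filter directly against the fixture.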
# File: web/contact_us/services.py (bandirom/django-blog, license: MIT)
from django.conf import settings
from . import models
class ContactUsService:
pass
# File: tests/ipv6_link_local_test.py (chaoskao/sonic-utilities, license: Apache-2.0)
import os
from click.testing import CliRunner
import config.main as config
import show.main as show
from utilities_common.db import Db
show_ipv6_link_local_mode_output="""\
+------------------+----------+
| Interface Name | Mode |
+==================+==========+
| Ethernet0 | Disabled |
+------------------+----------+
| PortChannel0001 | Disabled |
+------------------+----------+
"""
class TestIPv6LinkLocal(object):
@classmethod
def setup_class(cls):
os.environ['UTILITIES_UNIT_TESTING'] = "1"
print("SETUP")
def test_show_ipv6_link_local_mode(self):
runner = CliRunner()
db = Db()
obj = {'db':db.cfgdb}
# show ipv6 link-local-mode output
result = runner.invoke(show.cli.commands["ipv6"].commands["link-local-mode"], [], obj=obj)
print(result.output)
assert result.output == show_ipv6_link_local_mode_output
def test_config_enable_disable_ipv6_link_local_on_physical_interface(self):
runner = CliRunner()
db = Db()
obj = {'db':db.cfgdb}
# Enable ipv6 link local on Ethernet0
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["enable"].commands["use-link-local-only"], ["Ethernet0"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code == 0
assert result.output == ''
# Disable ipv6 link local on Ethernet0
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["disable"].commands["use-link-local-only"], ["Ethernet0"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code == 0
assert result.output == ''
def test_config_enable_disable_ipv6_link_local_on_portchannel_interface(self):
runner = CliRunner()
db = Db()
obj = {'db':db.cfgdb}
# Enable ipv6 link local on PortChannel0001
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["enable"].commands["use-link-local-only"], ["PortChannel0001"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code == 0
assert result.output == ''
# Disable ipv6 link local on PortChannel0001
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["disable"].commands["use-link-local-only"], ["PortChannel0001"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code == 0
assert result.output == ''
def test_config_enable_disable_ipv6_link_local_on_invalid_interface(self):
runner = CliRunner()
db = Db()
obj = {'db':db.cfgdb}
# Enable ipv6 link local on PortChannel1
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["enable"].commands["use-link-local-only"], ["PortChannel1"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code != 0
assert 'Error: Interface name PortChannel1 is invalid. Please enter a valid interface name!!' in result.output
# Disable ipv6 link local on Ethernet500
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["disable"].commands["use-link-local-only"], ["Ethernet500"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code != 0
assert 'Error: Interface name Ethernet500 is invalid. Please enter a valid interface name!!' in result.output
def test_config_enable_disable_ipv6_link_local_on_interface_which_is_member_of_vlan(self):
runner = CliRunner()
db = Db()
obj = {'db':db.cfgdb}
# Enable ipv6 link local on Ethernet16
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["enable"].commands["use-link-local-only"], ["Ethernet16"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code != 0
assert 'Error: Ethernet16 is configured as a member of vlan. Cannot configure the IPv6 link local mode!' in result.output
# Disable ipv6 link local on Ethernet16
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["disable"].commands["use-link-local-only"], ["Ethernet16"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code != 0
assert 'Error: Ethernet16 is configured as a member of vlan. Cannot configure the IPv6 link local mode!' in result.output
def test_config_enable_disable_ipv6_link_local_on_interface_which_is_member_of_portchannel(self):
runner = CliRunner()
db = Db()
obj = {'db':db.cfgdb}
# Enable ipv6 link local on Ethernet32
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["enable"].commands["use-link-local-only"], ["Ethernet32"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code != 0
assert 'Error: Ethernet32 is configured as a member of portchannel. Cannot configure the IPv6 link local mode!' in result.output
# Disable ipv6 link local on Ethernet32
result = runner.invoke(config.config.commands["interface"].commands["ipv6"].commands["disable"].commands["use-link-local-only"], ["Ethernet32"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code != 0
assert 'Error: Ethernet32 is configured as a member of portchannel. Cannot configure the IPv6 link local mode!' in result.output
def test_config_enable_disable_ipv6_link_local_on_all_valid_interfaces(self):
runner = CliRunner()
db = Db()
obj = {'db':db.cfgdb}
# Enable ipv6 link local on all interfaces
result = runner.invoke(config.config.commands["ipv6"].commands["enable"].commands["link-local"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code == 0
assert result.output == ''
# Disable ipv6 link local on all interfaces
result = runner.invoke(config.config.commands["ipv6"].commands["disable"].commands["link-local"], obj=obj)
print(result.exit_code)
print(result.output)
assert result.exit_code == 0
assert result.output == ''
@classmethod
def teardown_class(cls):
os.environ['UTILITIES_UNIT_TESTING'] = "0"
print("TEARDOWN")
# File: rnaindel/__init__.py (repo: rawagiha/RNAIndel, license: Apache-2.0)
from .analysis import *
from .occurrence import *
from .training import *
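A tiny sketch (with a hypothetical `toy_analysis` module, not rnaindel's) of what these wildcard imports do: `from pkg import *` pulls in every public name, or only the names the module lists in `__all__` when it defines one.

```python
import sys
import types

# Build a throwaway module in memory so the example is self-contained.
mod = types.ModuleType("toy_analysis")
mod.run = lambda: "ok"
mod._private = "hidden"
mod.__all__ = ["run"]       # only 'run' is exported by a star import
sys.modules["toy_analysis"] = mod

namespace = {}
exec("from toy_analysis import *", namespace)
```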
# File: pipe/network/sample_pyramid_add_kpn_FiveLevel.py (repo: dong1015323606/tof_mpi_remove, license: MIT)
import sys
sys.path.insert(0, './module/')
import tensorflow as tf
from dataset import *
from activation import *
from conv import conv
from dfus_block import dfus_block
tf.logging.set_verbosity(tf.logging.INFO)
PI = 3.14159265358979323846
flg = False
dtype = tf.float32
def feature_extractor_subnet(x, flg, regular):
"""Build a U-Net architecture"""
""" Args: x is the input, 4-D tensor (BxHxWxC)
flg represent weather add the BN
regular represent the regularizer number
Return: output is 4-D Tensor (BxHxWxC)
"""
pref = 'feature_extractor_subnet_'
# whether to train flag
train_ae = flg
# define initializer for the network
keys = ['conv', 'upsample']
keys_avoid = ['OptimizeLoss']
inits = []
init_net = None
if init_net != None:
for name in init_net.get_variable_names():
# select certain variables
flag_init = False
for key in keys:
if key in name:
flag_init = True
for key in keys_avoid:
if key in name:
flag_init = False
if flag_init:
name_f = name.replace('/', '_')
num = str(init_net.get_variable_value(name).tolist())
# self define the initializer function
from tensorflow.python.framework import dtypes
from tensorflow.python.ops.init_ops import Initializer
exec(
"class " + name_f + "(Initializer):\n def __init__(self,dtype=tf.float32): self.dtype=dtype \n def __call__(self,shape,dtype=None,partition_info=None): return tf.cast(np.array(" + num + "),dtype=self.dtype)\n def get_config(self):return {\"dtype\": self.dtype.name}")
inits.append(name_f)
# autoencoder
n_filters = [
16, 16,
32, 32,
96, 96,
128, 128,
256, 256,
]
filter_sizes = [
3, 3,
3, 3,
3, 3,
3, 3,
3, 3,
]
pool_sizes = [ \
1, 1,
2, 1,
2, 1,
2, 1,
2, 1,
]
pool_strides = [
1, 1,
2, 1,
2, 1,
2, 1,
2, 1,
]
skips = [ \
False, False,
True, False,
True, False,
True, False,
True, False,
]
# change space
ae_inputs = tf.identity(x, name='ae_inputs')
# prepare input
current_input = tf.identity(ae_inputs, name="input")
####################################################################################################################
# convolutional layers: feature extractor
features = []
for i in range(0, len(n_filters)):
name = pref + "conv_" + str(i)
# define the initializer
if name + '_bias' in inits:
bias_init = eval(name + '_bias()')
else:
bias_init = tf.zeros_initializer()
if name + '_kernel' in inits:
kernel_init = eval(name + '_kernel()')
else:
kernel_init = None
# convolution
current_input = tf.layers.conv2d(
inputs=current_input,
filters=n_filters[i],
kernel_size=[filter_sizes[i], filter_sizes[i]],
padding="same",
activation=relu,
trainable=train_ae,
kernel_initializer=kernel_init,
bias_initializer=bias_init,
name=name,
)
if pool_sizes[i] == 1 and pool_strides[i] == 1:
current_input = current_input
if (i == len(n_filters) - 1) or (pool_sizes[i + 1] == 2 and pool_strides[i + 1] == 2):
features.append(current_input)
else:
current_input = tf.layers.max_pooling2d( \
inputs=current_input,
pool_size=[pool_sizes[i], pool_sizes[i]],
strides=pool_strides[i],
name=pref + "pool_" + str(i)
)
return features
def depth_residual_regresssion_subnet(x, flg, regular, subnet_num):
"""Build a U-Net architecture"""
""" Args: x is the input, 4-D tensor (BxHxWxC)
flg represent weather add the BN
regular represent the regularizer number
Return: output is 4-D Tensor (BxHxWxC)
"""
pref = 'depth_regression_subnet_' + str(subnet_num) + '_'
# whether to train flag
train_ae = flg
# define initializer for the network
keys = ['conv', 'upsample']
keys_avoid = ['OptimizeLoss']
inits = []
init_net = None
if init_net != None:
for name in init_net.get_variable_names():
# select certain variables
flag_init = False
for key in keys:
if key in name:
flag_init = True
for key in keys_avoid:
if key in name:
flag_init = False
if flag_init:
name_f = name.replace('/', '_')
num = str(init_net.get_variable_value(name).tolist())
# self define the initializer function
from tensorflow.python.framework import dtypes
from tensorflow.python.ops.init_ops import Initializer
exec(
"class " + name_f + "(Initializer):\n def __init__(self,dtype=tf.float32): self.dtype=dtype \n def __call__(self,shape,dtype=None,partition_info=None): return tf.cast(np.array(" + num + "),dtype=self.dtype)\n def get_config(self):return {\"dtype\": self.dtype.name}")
inits.append(name_f)
# autoencoder
n_filters = [
128, 96,
64, 32,
16, 1,
]
filter_sizes = [
3, 3,
3, 3,
3, 3,
]
pool_sizes = [ \
1, 1,
1, 1,
1, 1,
]
pool_strides = [
1, 1,
1, 1,
1, 1,
]
skips = [ \
False, False,
False, False,
False, False,
]
# change space
ae_inputs = tf.identity(x, name='ae_inputs')
# prepare input
current_input = tf.identity(ae_inputs, name="input")
####################################################################################################################
# convolutional layers: depth regression
feature = []
for i in range(0, len(n_filters)):
name = pref + "conv_" + str(i)
# define the initializer
if name + '_bias' in inits:
bias_init = eval(name + '_bias()')
else:
bias_init = tf.zeros_initializer()
if name + '_kernel' in inits:
kernel_init = eval(name + '_kernel()')
else:
kernel_init = None
if i == (len(n_filters) - 1):
activation = None
else:
activation = relu
# convolution
current_input = tf.layers.conv2d(
inputs=current_input,
filters=n_filters[i],
kernel_size=[filter_sizes[i], filter_sizes[i]],
padding="same",
activation=activation,
trainable=train_ae,
kernel_initializer=kernel_init,
bias_initializer=bias_init,
name=name,
)
if pool_sizes[i] == 1 and pool_strides[i] == 1:
feature.append(current_input)
else:
feature.append(
tf.layers.max_pooling2d( \
inputs=current_input,
pool_size=[pool_sizes[i], pool_sizes[i]],
strides=pool_strides[i],
name=pref + "pool_" + str(i)
)
)
current_input = feature[-1]
depth_coarse = tf.identity(feature[-1], name='depth_coarse_output')
return depth_coarse
def residual_output_subnet(x, flg, regular, subnet_num):
"""Build a U-Net architecture"""
""" Args: x is the input, 4-D tensor (BxHxWxC)
flg represent weather add the BN
regular represent the regularizer number
Return: output is 4-D Tensor (BxHxWxC)
"""
pref = 'residual_output_subnet_' + str(subnet_num) + '_'
# whether to train flag
train_ae = flg
# define initializer for the network
keys = ['conv', 'upsample']
keys_avoid = ['OptimizeLoss']
inits = []
init_net = None
if init_net != None:
for name in init_net.get_variable_names():
# select certain variables
flag_init = False
for key in keys:
if key in name:
flag_init = True
for key in keys_avoid:
if key in name:
flag_init = False
if flag_init:
name_f = name.replace('/', '_')
num = str(init_net.get_variable_value(name).tolist())
# self define the initializer function
from tensorflow.python.framework import dtypes
from tensorflow.python.ops.init_ops import Initializer
exec(
"class " + name_f + "(Initializer):\n def __init__(self,dtype=tf.float32): self.dtype=dtype \n def __call__(self,shape,dtype=None,partition_info=None): return tf.cast(np.array(" + num + "),dtype=self.dtype)\n def get_config(self):return {\"dtype\": self.dtype.name}")
inits.append(name_f)
# autoencoder
n_filters = [
1
]
filter_sizes = [
1
]
pool_sizes = [ \
1
]
pool_strides = [
1
]
skips = [ \
False
]
# change space
ae_inputs = tf.identity(x, name='ae_inputs')
# prepare input
current_input = tf.identity(ae_inputs, name="input")
####################################################################################################################
# convolutional layers: depth regression
feature = []
for i in range(0, len(n_filters)):
name = pref + "conv_" + str(i)
# define the initializer
if name + '_bias' in inits:
bias_init = eval(name + '_bias()')
else:
bias_init = tf.zeros_initializer()
if name + '_kernel' in inits:
kernel_init = eval(name + '_kernel()')
else:
kernel_init = None
if i == (len(n_filters) - 1):
activation = None
else:
activation = relu
# convolution
current_input = tf.layers.conv2d(
inputs=current_input,
filters=n_filters[i],
kernel_size=[filter_sizes[i], filter_sizes[i]],
padding="same",
activation=activation,
trainable=train_ae,
kernel_initializer=kernel_init,
bias_initializer=bias_init,
name=name,
)
if pool_sizes[i] == 1 and pool_strides[i] == 1:
feature.append(current_input)
else:
feature.append(
tf.layers.max_pooling2d( \
inputs=current_input,
pool_size=[pool_sizes[i], pool_sizes[i]],
strides=pool_strides[i],
name=pref + "pool_" + str(i)
)
)
current_input = feature[-1]
depth_residual_coarse = tf.identity(feature[-1], name='depth_coarse_residual_output')
return depth_residual_coarse
def unet_subnet(x, flg, regular):
"""Build a U-Net architecture"""
""" Args: x is the input, 4-D tensor (BxHxWxC)
flg represent weather add the BN
regular represent the regularizer number
Return: output is 4-D Tensor (BxHxWxC)
"""
pref = 'unet_subnet_'
# whether to train flag
train_ae = flg
# define initializer for the network
keys = ['conv', 'upsample']
keys_avoid = ['OptimizeLoss']
inits = []
init_net = None
if init_net != None:
for name in init_net.get_variable_names():
# select certain variables
flag_init = False
for key in keys:
if key in name:
flag_init = True
for key in keys_avoid:
if key in name:
flag_init = False
if flag_init:
name_f = name.replace('/', '_')
num = str(init_net.get_variable_value(name).tolist())
# self define the initializer function
from tensorflow.python.framework import dtypes
from tensorflow.python.ops.init_ops import Initializer
exec(
"class " + name_f + "(Initializer):\n def __init__(self,dtype=tf.float32): self.dtype=dtype \n def __call__(self,shape,dtype=None,partition_info=None): return tf.cast(np.array(" + num + "),dtype=self.dtype)\n def get_config(self):return {\"dtype\": self.dtype.name}")
inits.append(name_f)
# autoencoder
n_filters = [
16, 16,
32, 32,
64, 64,
128, 128,
]
filter_sizes = [
3, 3,
3, 3,
3, 3,
3, 3,
]
pool_sizes = [ \
1, 1,
2, 1,
2, 1,
2, 1,
]
pool_strides = [
1, 1,
2, 1,
2, 1,
2, 1,
]
skips = [ \
False, False,
True, False,
True, False,
True, False,
]
# change space
ae_inputs = tf.identity(x, name='ae_inputs')
# prepare input
current_input = tf.identity(ae_inputs, name="input")
####################################################################################################################
# convolutional layers: encoder
conv = []
pool = [current_input]
for i in range(0, len(n_filters)):
name = pref + "conv_" + str(i)
# define the initializer
if name + '_bias' in inits:
bias_init = eval(name + '_bias()')
else:
bias_init = tf.zeros_initializer()
if name + '_kernel' in inits:
kernel_init = eval(name + '_kernel()')
else:
kernel_init = None
# convolution
conv.append( \
tf.layers.conv2d( \
inputs=current_input,
filters=n_filters[i],
kernel_size=[filter_sizes[i], filter_sizes[i]],
padding="same",
activation=relu,
trainable=train_ae,
kernel_initializer=kernel_init,
bias_initializer=bias_init,
name=name,
)
)
if pool_sizes[i] == 1 and pool_strides[i] == 1:
pool.append(conv[-1])
else:
pool.append( \
tf.layers.max_pooling2d( \
inputs=conv[-1],
pool_size=[pool_sizes[i], pool_sizes[i]],
strides=pool_strides[i],
name=pref + "pool_" + str(i)
)
)
current_input = pool[-1]
####################################################################################################################
# convolutional layer: decoder
# upsampling
upsamp = []
current_input = pool[-1]
for i in range((len(n_filters) - 1) - 1, 0, -1):
name = pref + "upsample_" + str(i)
# define the initializer
if name + '_bias' in inits:
bias_init = eval(name + '_bias()')
else:
bias_init = tf.zeros_initializer()
if name + '_kernel' in inits:
kernel_init = eval(name + '_kernel()')
else:
kernel_init = None
## change the kernel size in upsample process
if skips[i] == False and skips[i + 1] == True:
filter_sizes[i] = 4
# upsampling
current_input = tf.layers.conv2d_transpose( \
inputs=current_input,
filters=n_filters[i],
kernel_size=[filter_sizes[i], filter_sizes[i]],
strides=(pool_strides[i], pool_strides[i]),
padding="same",
activation=relu,
trainable=train_ae,
kernel_initializer=kernel_init,
bias_initializer=bias_init,
name=name
)
upsamp.append(current_input)
# current_input = tf.layers.batch_normalization(
# inputs=current_input,
# training=train_ae,
# name=pref + "upsamp_BN_" + str(i))
# skip connection
if skips[i] == False and skips[i - 1] == True:
current_input = tf.concat([current_input, pool[i + 1]], axis=-1)
####################################################################################################################
features = tf.identity(upsamp[-1], name='ae_output')
return features
def depth_output_subnet(inputs, flg, regular, kernel_size): ## x (B,H,W,1), features:(B,H,W,64), samples:(B,H,W,9)
pref = 'depth_output_subnet_'
# whether to train flag
train_ae = flg
current_input = inputs
# define initializer for the network
keys = ['conv', 'upsample']
keys_avoid = ['OptimizeLoss']
inits = []
init_net = None
if init_net != None:
for name in init_net.get_variable_names():
# select certain variables
flag_init = False
for key in keys:
if key in name:
flag_init = True
for key in keys_avoid:
if key in name:
flag_init = False
if flag_init:
name_f = name.replace('/', '_')
num = str(init_net.get_variable_value(name).tolist())
# self define the initializer function
from tensorflow.python.framework import dtypes
from tensorflow.python.ops.init_ops import Initializer
exec(
"class " + name_f + "(Initializer):\n def __init__(self,dtype=tf.float32): self.dtype=dtype \n def __call__(self,shape,dtype=None,partition_info=None): return tf.cast(np.array(" + num + "),dtype=self.dtype)\n def get_config(self):return {\"dtype\": self.dtype.name}")
inits.append(name_f)
n_filters_mix = [kernel_size ** 2]
filter_sizes_mix = [1]
mix = []
for i in range(len(n_filters_mix)):
name = pref + "conv_" + str(i)
# define the initializer
if name + '_bias' in inits:
bias_init = eval(name + '_bias()')
else:
bias_init = tf.zeros_initializer()
if name + '_kernel' in inits:
kernel_init = eval(name + '_kernel()')
else:
kernel_init = None
if i == (len(n_filters_mix) - 1):
activation = sigmoid
else:
activation = relu
# convolution
mix.append( \
tf.layers.conv2d( \
inputs=current_input,
filters=n_filters_mix[i],
kernel_size=[filter_sizes_mix[i], filter_sizes_mix[i]],
padding="same",
activation=activation,
trainable=train_ae,
kernel_initializer=kernel_init,
bias_initializer=bias_init,
name=name,
)
)
current_input = mix[-1]
return current_input
def dear_kpn(x, flg, regular):
    """Kernel prediction network: predict a 3x3 filter per pixel and apply
    it to the input via an im2col weighted sum."""
    kernel_size = 3
features = unet_subnet(x, flg, regular)
weights = depth_output_subnet(features, flg, regular, kernel_size=kernel_size)
weights = weights / tf.reduce_sum(tf.abs(weights) + 1e-6, axis=-1, keep_dims=True)
column = im2col(x, kernel_size=kernel_size)
current_output = tf.reduce_sum(column * weights, axis=-1, keep_dims=True)
depth_output = tf.identity(current_output, name='depth_output')
return depth_output
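For intuition, here is a standalone NumPy sketch of the weighted-sum step dear_kpn performs (zero padding and a single-channel input are my assumptions): per-pixel 3x3 kernels are normalised by the sum of their absolute values, then applied to the im2col neighbourhood of the depth map.

```python
import numpy as np

def im2col_np(x, k=3):
    """Return (H, W, k*k) neighbourhood patches of a 2-D array, zero-padded."""
    p = k // 2
    xp = np.pad(x, p, mode="constant")
    h, w = x.shape
    # For offset (i, j), entry at pixel (r, c) is xp[i + r, j + c].
    return np.stack([xp[i:i + h, j:j + w]
                     for i in range(k) for j in range(k)], axis=-1)

def kpn_filter(depth, weights, k=3):
    """Apply per-pixel predicted kernels; weights has shape (H, W, k*k)."""
    weights = weights / (np.sum(np.abs(weights), axis=-1, keepdims=True) + 1e-6)
    return np.sum(im2col_np(depth, k) * weights, axis=-1)

depth = np.ones((4, 4), dtype=np.float32)
uniform = np.ones((4, 4, 9), dtype=np.float32)   # degenerates to a box filter
out = kpn_filter(depth, uniform)
```

With uniform weights this reduces to a 3x3 box filter: interior pixels stay at 1.0, while corner pixels drop to 4/9 because five of their nine taps fall on zero padding.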
def sample_pyramid_add_kpn_FiveLevel(x, flg, regular, batch_size, deformable_range):
    """Coarse-to-fine depth refinement over a five-level feature pyramid:
    predict a depth residual at each scale, fuse the upsampled residuals with
    a 1x1 convolution, then refine the summed depth with dear_kpn."""
    depth_residual = []
    depth_residual_input = []
h_max = tf.shape(x)[1]
w_max = tf.shape(x)[2]
depth = tf.expand_dims(x[:, :, :, 0], axis=-1)
depth_and_amplitude = x[:, :, :, 0:2]
rgb = x[:, :, :, 2:5]
features = feature_extractor_subnet(depth_and_amplitude, flg, regular)
for i in range(1, len(features) + 1):
if i == 1:
inputs = features[len(features) - i]
else:
feature_input = features[len(features) - i]
h_max_low_scale = tf.shape(feature_input)[1]
w_max_low_scale = tf.shape(feature_input)[2]
depth_coarse_input = tf.image.resize_bicubic(depth_residual[-1], size=(h_max_low_scale, w_max_low_scale),
align_corners=True)
inputs = tf.concat([feature_input, depth_coarse_input], axis=-1)
current_depth_residual = depth_residual_regresssion_subnet(inputs, flg, regular, subnet_num=i)
depth_residual.append(current_depth_residual)
current_depth_residual_input = tf.image.resize_bicubic(current_depth_residual, size=(h_max, w_max),
align_corners=True)
depth_residual_input.append(current_depth_residual_input)
depth_coarse_residual_input = tf.concat(depth_residual_input, axis=-1)
final_depth_residual_output = residual_output_subnet(depth_coarse_residual_input, flg, regular, subnet_num=0)
current_final_depth_output = depth + final_depth_residual_output
final_depth_output = dear_kpn(current_final_depth_output, flg, regular)
depth_residual_input.append(final_depth_residual_output)
depth_residual_input.append(final_depth_output - current_final_depth_output)
depth_residual_input.append(final_depth_output - current_final_depth_output)
    return final_depth_output, depth_residual_input
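The coarse-to-fine accumulation above can be illustrated with a toy NumPy sketch (nearest-neighbour upsampling stands in for tf.image.resize_bicubic, and the sizes and residual values are arbitrary): residuals predicted at each pyramid level are upsampled to full resolution and summed onto the input depth.

```python
import numpy as np

def upsample_nn(x, h, w):
    """Nearest-neighbour resize of a 2-D array to (h, w)."""
    ry = np.linspace(0, x.shape[0] - 1, h).round().astype(int)
    rx = np.linspace(0, x.shape[1] - 1, w).round().astype(int)
    return x[np.ix_(ry, rx)]

depth = np.zeros((8, 8))
# One constant residual per pyramid level, coarsest first.
residuals = [np.full((2, 2), 0.5), np.full((4, 4), 0.25), np.full((8, 8), 0.25)]
refined = depth + sum(upsample_nn(r, 8, 8) for r in residuals)
```

Each level only has to explain what the coarser levels missed, which is why the residuals here sum to the full correction (0.5 + 0.25 + 0.25 = 1.0 everywhere).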
# File: manila/tests/share/test_manager.py (repo: deiter/manila, license: Apache-2.0)
# Copyright 2014 Mirantis Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Test of Share Manager for Manila."""
import datetime
import random
import ddt
import mock
from oslo_concurrency import lockutils
from oslo_serialization import jsonutils
from oslo_utils import importutils
from oslo_utils import timeutils
import six
from manila.common import constants
from manila import context
from manila.data import rpcapi as data_rpc
from manila import db
from manila.db.sqlalchemy import models
from manila import exception
from manila import quota
from manila.share import access as share_access
from manila.share import drivers_private_data
from manila.share import manager
from manila.share import migration as migration_api
from manila.share import rpcapi
from manila.share import share_types
from manila import test
from manila.tests.api import fakes as test_fakes
from manila.tests import db_utils
from manila.tests import fake_share as fakes
from manila.tests import fake_utils
from manila.tests import utils as test_utils
from manila import utils
def fake_replica(**kwargs):
return fakes.fake_replica(for_manager=True, **kwargs)
class LockedOperationsTestCase(test.TestCase):
class FakeManager(object):
@manager.locked_share_replica_operation
def fake_replica_operation(self, context, replica, share_id=None):
pass
def setUp(self):
super(self.__class__, self).setUp()
self.manager = self.FakeManager()
self.fake_context = test_fakes.FakeRequestContext
self.lock_call = self.mock_object(
utils, 'synchronized', mock.Mock(return_value=lambda f: f))
@ddt.data({'id': 'FAKE_REPLICA_ID'}, 'FAKE_REPLICA_ID')
@ddt.unpack
def test_locked_share_replica_operation(self, **replica):
self.manager.fake_replica_operation(self.fake_context, replica,
share_id='FAKE_SHARE_ID')
self.assertTrue(self.lock_call.called)
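A minimal sketch of the decorator pattern this test exercises (the names `locked_share_operation` and `FakeManager` here are illustrative, not manila's API): serialise calls per share id while passing the replica argument through unchanged.

```python
import threading
from collections import defaultdict
from functools import wraps

_locks = defaultdict(threading.Lock)  # one lock per share id

def locked_share_operation(fn):
    @wraps(fn)
    def wrapper(self, ctxt, replica, share_id=None):
        with _locks[share_id]:        # serialise operations on the same share
            return fn(self, ctxt, replica, share_id=share_id)
    return wrapper

class FakeManager(object):
    @locked_share_operation
    def operation(self, ctxt, replica, share_id=None):
        return (replica, share_id)

result = FakeManager().operation(None, {'id': 'R1'}, share_id='S1')
```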
@ddt.ddt
class ShareManagerTestCase(test.TestCase):
def setUp(self):
super(ShareManagerTestCase, self).setUp()
self.flags(share_driver='manila.tests.fake_driver.FakeShareDriver')
        # Define the class directly, because this test suite is dedicated
        # to a specific manager.
self.share_manager = importutils.import_object(
"manila.share.manager.ShareManager")
self.mock_object(self.share_manager.driver, 'do_setup')
self.mock_object(self.share_manager.driver, 'check_for_setup_error')
self.context = context.get_admin_context()
self.share_manager.driver.initialized = True
mock.patch.object(
lockutils, 'lock', fake_utils.get_fake_lock_context())
self.synchronized_lock_decorator_call = self.mock_object(
utils, 'synchronized', mock.Mock(return_value=lambda f: f))
def test_share_manager_instance(self):
fake_service_name = "fake_service"
import_mock = mock.Mock()
self.mock_object(importutils, "import_object", import_mock)
private_data_mock = mock.Mock()
self.mock_object(drivers_private_data, "DriverPrivateData",
private_data_mock)
self.mock_object(manager.ShareManager, '_init_hook_drivers')
share_manager = manager.ShareManager(service_name=fake_service_name)
private_data_mock.assert_called_once_with(
context=mock.ANY,
backend_host=share_manager.host,
config_group=fake_service_name
)
self.assertTrue(import_mock.called)
self.assertTrue(manager.ShareManager._init_hook_drivers.called)
def test__init_hook_drivers(self):
fake_service_name = "fake_service"
import_mock = mock.Mock()
self.mock_object(importutils, "import_object", import_mock)
self.mock_object(drivers_private_data, "DriverPrivateData")
share_manager = manager.ShareManager(service_name=fake_service_name)
share_manager.configuration.safe_get = mock.Mock(
return_value=["Foo", "Bar"])
self.assertEqual(0, len(share_manager.hooks))
import_mock.reset()
share_manager._init_hook_drivers()
self.assertEqual(
len(share_manager.configuration.safe_get.return_value),
len(share_manager.hooks))
import_mock.assert_has_calls([
mock.call(
hook,
configuration=share_manager.configuration,
host=share_manager.host
) for hook in share_manager.configuration.safe_get.return_value
], any_order=True)
def test__execute_periodic_hook(self):
share_instances_mock = mock.Mock()
hook_data_mock = mock.Mock()
self.mock_object(
self.share_manager.db,
"share_instances_get_all_by_host",
share_instances_mock)
self.mock_object(
self.share_manager.driver,
"get_periodic_hook_data",
hook_data_mock)
self.share_manager.hooks = [mock.Mock(return_value=i) for i in (0, 1)]
self.share_manager._execute_periodic_hook(self.context)
share_instances_mock.assert_called_once_with(
context=self.context, host=self.share_manager.host)
hook_data_mock.assert_called_once_with(
context=self.context,
share_instances=share_instances_mock.return_value)
for mock_hook in self.share_manager.hooks:
mock_hook.execute_periodic_hook.assert_called_once_with(
context=self.context,
periodic_hook_data=hook_data_mock.return_value)
def test_init_host_with_no_shares(self):
self.mock_object(self.share_manager.db,
'share_instances_get_all_by_host',
mock.Mock(return_value=[]))
self.share_manager.init_host()
self.assertTrue(self.share_manager.driver.initialized)
self.share_manager.db.share_instances_get_all_by_host.\
assert_called_once_with(utils.IsAMatcher(context.RequestContext),
self.share_manager.host)
self.share_manager.driver.do_setup.assert_called_once_with(
utils.IsAMatcher(context.RequestContext))
self.share_manager.driver.check_for_setup_error.\
assert_called_once_with()
@ddt.data(
"migration_get_driver_info",
"migration_get_info",
"migration_cancel",
"migration_get_progress",
"migration_complete",
"migration_start",
"create_share_instance",
"manage_share",
"unmanage_share",
"delete_share_instance",
"delete_free_share_servers",
"create_snapshot",
"delete_snapshot",
"allow_access",
"deny_access",
"_report_driver_status",
"_execute_periodic_hook",
"publish_service_capabilities",
"delete_share_server",
"extend_share",
"shrink_share",
"create_consistency_group",
"delete_consistency_group",
"create_cgsnapshot",
"delete_cgsnapshot",
"create_share_replica",
"delete_share_replica",
"promote_share_replica",
"periodic_share_replica_update",
"update_share_replica",
"create_replicated_snapshot",
"delete_replicated_snapshot",
"periodic_share_replica_snapshot_update",
)
def test_call_driver_when_its_init_failed(self, method_name):
self.mock_object(self.share_manager.driver, 'do_setup',
mock.Mock(side_effect=Exception()))
self.share_manager.init_host()
self.assertRaises(
exception.DriverNotInitialized,
getattr(self.share_manager, method_name),
'foo', 'bar', 'quuz'
)
@ddt.data("do_setup", "check_for_setup_error")
def test_init_host_with_driver_failure(self, method_name):
self.mock_object(self.share_manager.driver, method_name,
mock.Mock(side_effect=Exception()))
self.mock_object(manager.LOG, 'exception')
self.share_manager.driver.initialized = False
self.share_manager.init_host()
manager.LOG.exception.assert_called_once_with(
mock.ANY, {'name': self.share_manager.driver.__class__.__name__,
'host': self.share_manager.host,
'exc': mock.ANY})
self.assertFalse(self.share_manager.driver.initialized)
def _setup_init_mocks(self, setup_access_rules=True):
instances = [
db_utils.create_share(id='fake_id_1',
status=constants.STATUS_AVAILABLE,
display_name='fake_name_1').instance,
db_utils.create_share(id='fake_id_2',
status=constants.STATUS_ERROR,
display_name='fake_name_2').instance,
db_utils.create_share(id='fake_id_3',
status=constants.STATUS_AVAILABLE,
display_name='fake_name_3').instance,
db_utils.create_share(
id='fake_id_4',
status=constants.STATUS_AVAILABLE,
task_state=constants.TASK_STATE_MIGRATION_IN_PROGRESS,
display_name='fake_name_4').instance,
db_utils.create_share(id='fake_id_5',
status=constants.STATUS_AVAILABLE,
display_name='fake_name_5').instance,
]
instances[4]['access_rules_status'] = constants.STATUS_OUT_OF_SYNC
if not setup_access_rules:
return instances
rules = [
db_utils.create_access(share_id='fake_id_1'),
db_utils.create_access(share_id='fake_id_3'),
]
return instances, rules
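The `side_effect` list pattern used throughout these tests (e.g. to feed successive instances to `share_instance_get`) can be shown in isolation with the standard mock library: each call returns the next item, and exhausting the list raises StopIteration.

```python
from unittest import mock

stub = mock.Mock(side_effect=['first', 'second'])
a = stub()
b = stub()
try:
    stub()                 # third call: the iterable is exhausted
    exhausted = False
except StopIteration:
    exhausted = True
```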
def test_init_host_with_shares_and_rules(self):
# initialization of test data
def raise_share_access_exists(*args, **kwargs):
raise exception.ShareAccessExists(
access_type='fake_access_type', access='fake_access')
instances, rules = self._setup_init_mocks()
fake_export_locations = ['fake/path/1', 'fake/path']
share_server = 'fake_share_server_type_does_not_matter'
self.mock_object(self.share_manager.db,
'share_instances_get_all_by_host',
mock.Mock(return_value=instances))
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(side_effect=[instances[0], instances[2],
instances[4]]))
self.mock_object(self.share_manager.db,
'share_export_locations_update')
self.mock_object(self.share_manager.driver, 'ensure_share',
mock.Mock(return_value=fake_export_locations))
self.mock_object(self.share_manager, '_ensure_share_instance_has_pool')
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=share_server))
self.mock_object(self.share_manager, 'publish_service_capabilities',
mock.Mock())
self.mock_object(self.share_manager.db,
'share_access_get_all_for_share',
mock.Mock(return_value=rules))
self.mock_object(
self.share_manager.access_helper,
'update_access_rules',
mock.Mock(side_effect=raise_share_access_exists)
)
# call of 'init_host' method
self.share_manager.init_host()
# verification of call
self.share_manager.db.share_instances_get_all_by_host.\
assert_called_once_with(utils.IsAMatcher(context.RequestContext),
self.share_manager.host)
exports_update = self.share_manager.db.share_export_locations_update
exports_update.assert_has_calls([
mock.call(mock.ANY, instances[0]['id'], fake_export_locations),
mock.call(mock.ANY, instances[2]['id'], fake_export_locations)
])
self.share_manager.driver.do_setup.assert_called_once_with(
utils.IsAMatcher(context.RequestContext))
self.share_manager.driver.check_for_setup_error.\
assert_called_once_with()
self.share_manager._ensure_share_instance_has_pool.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0]),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2]),
])
self.share_manager._get_share_server.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0]),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2]),
])
self.share_manager.driver.ensure_share.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0],
share_server=share_server),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2],
share_server=share_server),
])
self.share_manager.publish_service_capabilities.\
assert_called_once_with(
utils.IsAMatcher(context.RequestContext))
self.share_manager.access_helper.update_access_rules.assert_has_calls([
mock.call(mock.ANY, instances[4]['id'], share_server=share_server),
])

def test_init_host_with_exception_on_ensure_share(self):
        """Test init_host tolerates ensure_share raising an exception."""
def raise_exception(*args, **kwargs):
raise exception.ManilaException(message="Fake raise")
instances = self._setup_init_mocks(setup_access_rules=False)
share_server = 'fake_share_server_type_does_not_matter'
self.mock_object(self.share_manager.db,
'share_instances_get_all_by_host',
mock.Mock(return_value=instances))
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(side_effect=[instances[0], instances[2],
instances[3]]))
self.mock_object(self.share_manager.driver, 'ensure_share',
mock.Mock(side_effect=raise_exception))
self.mock_object(self.share_manager, '_ensure_share_instance_has_pool')
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=share_server))
self.mock_object(self.share_manager, 'publish_service_capabilities')
self.mock_object(manager.LOG, 'error')
self.mock_object(manager.LOG, 'info')
        # call the 'init_host' method
self.share_manager.init_host()
        # verify the resulting calls
self.share_manager.db.share_instances_get_all_by_host.\
assert_called_once_with(utils.IsAMatcher(context.RequestContext),
self.share_manager.host)
self.share_manager.driver.do_setup.assert_called_once_with(
utils.IsAMatcher(context.RequestContext))
self.share_manager.driver.check_for_setup_error.assert_called_with()
self.share_manager._ensure_share_instance_has_pool.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0]),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2]),
])
self.share_manager._get_share_server.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0]),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2]),
])
self.share_manager.driver.ensure_share.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0],
share_server=share_server),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2],
share_server=share_server),
])
self.share_manager.publish_service_capabilities.\
assert_called_once_with(
utils.IsAMatcher(context.RequestContext))
manager.LOG.info.assert_any_call(
mock.ANY,
{'task': constants.TASK_STATE_MIGRATION_IN_PROGRESS,
'id': instances[3]['id']},
)
manager.LOG.info.assert_any_call(
mock.ANY,
{'id': instances[1]['id'], 'status': instances[1]['status']},
)

def test_init_host_with_exception_on_update_access_rules(self):
        """Test init_host logs an error when updating access rules fails."""
def raise_exception(*args, **kwargs):
raise exception.ManilaException(message="Fake raise")
instances, rules = self._setup_init_mocks()
share_server = 'fake_share_server_type_does_not_matter'
smanager = self.share_manager
self.mock_object(smanager.db, 'share_instances_get_all_by_host',
mock.Mock(return_value=instances))
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(side_effect=[instances[0], instances[2],
instances[4]]))
self.mock_object(self.share_manager.driver, 'ensure_share',
mock.Mock(return_value=None))
self.mock_object(smanager, '_ensure_share_instance_has_pool')
self.mock_object(smanager, '_get_share_server',
mock.Mock(return_value=share_server))
self.mock_object(smanager, 'publish_service_capabilities')
self.mock_object(manager.LOG, 'error')
self.mock_object(manager.LOG, 'info')
self.mock_object(smanager.db, 'share_access_get_all_for_share',
mock.Mock(return_value=rules))
self.mock_object(smanager.access_helper, 'update_access_rules',
mock.Mock(side_effect=raise_exception))
        # call the 'init_host' method
smanager.init_host()
        # verify the resulting calls
smanager.db.share_instances_get_all_by_host.\
assert_called_once_with(utils.IsAMatcher(context.RequestContext),
smanager.host)
smanager.driver.do_setup.assert_called_once_with(
utils.IsAMatcher(context.RequestContext))
smanager.driver.check_for_setup_error.assert_called_with()
smanager._ensure_share_instance_has_pool.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0]),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2]),
])
smanager._get_share_server.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0]),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2]),
])
smanager.driver.ensure_share.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext), instances[0],
share_server=share_server),
mock.call(utils.IsAMatcher(context.RequestContext), instances[2],
share_server=share_server),
])
self.share_manager.publish_service_capabilities.\
assert_called_once_with(
utils.IsAMatcher(context.RequestContext))
manager.LOG.info.assert_any_call(
mock.ANY,
{'task': constants.TASK_STATE_MIGRATION_IN_PROGRESS,
'id': instances[3]['id']},
)
manager.LOG.info.assert_any_call(
mock.ANY,
{'id': instances[1]['id'], 'status': instances[1]['status']},
)
smanager.access_helper.update_access_rules.assert_has_calls([
mock.call(utils.IsAMatcher(context.RequestContext),
instances[4]['id'], share_server=share_server),
])
manager.LOG.error.assert_has_calls([
mock.call(mock.ANY, mock.ANY),
])

def test_create_share_instance_from_snapshot_with_server(self):
"""Test share can be created from snapshot if server exists."""
network = db_utils.create_share_network()
server = db_utils.create_share_server(
share_network_id=network['id'], host='fake_host',
backend_details=dict(fake='fake'))
parent_share = db_utils.create_share(share_network_id='net-id',
share_server_id=server['id'])
share = db_utils.create_share()
share_id = share['id']
snapshot = db_utils.create_snapshot(share_id=parent_share['id'])
snapshot_id = snapshot['id']
self.share_manager.create_share_instance(
self.context, share.instance['id'], snapshot_id=snapshot_id)
self.assertEqual(share_id, db.share_get(context.get_admin_context(),
share_id).id)
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_AVAILABLE, shr['status'])
self.assertEqual(server['id'], shr['instance']['share_server_id'])

def test_create_share_instance_from_snapshot_with_server_not_found(self):
"""Test creation from snapshot fails if server not found."""
parent_share = db_utils.create_share(share_network_id='net-id',
share_server_id='fake-id')
share = db_utils.create_share()
share_id = share['id']
snapshot = db_utils.create_snapshot(share_id=parent_share['id'])
snapshot_id = snapshot['id']
self.assertRaises(exception.ShareServerNotFound,
self.share_manager.create_share_instance,
self.context,
share.instance['id'],
snapshot_id=snapshot_id
)
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_ERROR, shr['status'])

def test_create_share_instance_from_snapshot(self):
"""Test share can be created from snapshot."""
share = db_utils.create_share()
share_id = share['id']
snapshot = db_utils.create_snapshot(share_id=share_id)
snapshot_id = snapshot['id']
self.share_manager.create_share_instance(
self.context, share.instance['id'], snapshot_id=snapshot_id)
self.assertEqual(share_id, db.share_get(context.get_admin_context(),
share_id).id)
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_AVAILABLE, shr['status'])
        self.assertGreater(len(shr['export_location']), 0)
self.assertEqual(2, len(shr['export_locations']))

def test_create_share_instance_for_share_with_replication_support(self):
"""Test update call is made to update replica_state."""
share = db_utils.create_share(replication_type='writable')
share_id = share['id']
self.share_manager.create_share_instance(self.context,
share.instance['id'])
self.assertEqual(share_id, db.share_get(context.get_admin_context(),
share_id).id)
shr = db.share_get(self.context, share_id)
shr_instance = db.share_instance_get(self.context,
share.instance['id'])
        self.assertEqual(constants.STATUS_AVAILABLE, shr['status'])
self.assertEqual(constants.REPLICA_STATE_ACTIVE,
shr_instance['replica_state'])

@ddt.data([], None)
def test_create_share_replica_no_active_replicas(self, active_replicas):
        """Test replica creation fails when no active replica exists."""
replica = fake_replica()
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=active_replicas))
self.mock_object(
db, 'share_replica_get', mock.Mock(return_value=replica))
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_driver_replica_call = self.mock_object(
self.share_manager.driver, 'create_replica')
self.assertRaises(exception.ReplicationException,
self.share_manager.create_share_replica,
self.context, replica)
mock_replica_update_call.assert_called_once_with(
mock.ANY, replica['id'], {'status': constants.STATUS_ERROR,
'replica_state': constants.STATUS_ERROR})
self.assertFalse(mock_driver_replica_call.called)

def test_create_share_replica_with_share_network_id_and_not_dhss(self):
        """Test replica creation fails with share network and DHSS=False."""
replica = fake_replica()
manager.CONF.set_default('driver_handles_share_servers', False)
self.mock_object(db, 'share_access_get_all_for_share',
mock.Mock(return_value=[]))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=fake_replica(id='fake2')))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_driver_replica_call = self.mock_object(
self.share_manager.driver, 'create_replica')
self.assertRaises(exception.InvalidDriverMode,
self.share_manager.create_share_replica,
self.context, replica)
mock_replica_update_call.assert_called_once_with(
mock.ANY, replica['id'], {'status': constants.STATUS_ERROR,
'replica_state': constants.STATUS_ERROR})
self.assertFalse(mock_driver_replica_call.called)

def test_create_share_replica_with_share_server_exception(self):
        """Test replica creation fails when no share server is found."""
replica = fake_replica()
manager.CONF.set_default('driver_handles_share_servers', True)
self.mock_object(db, 'share_instance_access_copy',
mock.Mock(return_value=[]))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=fake_replica(id='fake2')))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_driver_replica_call = self.mock_object(
self.share_manager.driver, 'create_replica')
self.assertRaises(exception.NotFound,
self.share_manager.create_share_replica,
self.context, replica)
mock_replica_update_call.assert_called_once_with(
mock.ANY, replica['id'], {'status': constants.STATUS_ERROR,
'replica_state': constants.STATUS_ERROR})
self.assertFalse(mock_driver_replica_call.called)

def test_create_share_replica_driver_error_on_creation(self):
        """Test replica status is set to 'error' on driver failure."""
fake_access_rules = [{'id': '1'}, {'id': '2'}, {'id': '3'}]
replica = fake_replica(share_network_id='')
replica_2 = fake_replica(id='fake2')
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_instance_access_copy',
mock.Mock(return_value=fake_access_rules))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=replica_2))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, replica_2]))
self.mock_object(self.share_manager,
'_provide_share_server_for_share',
mock.Mock(return_value=('FAKE_SERVER', replica)))
self.mock_object(self.share_manager,
'_get_replica_snapshots_for_snapshot',
mock.Mock(return_value=[]))
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_export_locs_update_call = self.mock_object(
db, 'share_export_locations_update')
mock_log_error = self.mock_object(manager.LOG, 'error')
mock_log_info = self.mock_object(manager.LOG, 'info')
self.mock_object(db, 'share_instance_access_get',
mock.Mock(return_value=fake_access_rules[0]))
mock_share_replica_access_update = self.mock_object(
db, 'share_instance_update_access_status')
self.mock_object(self.share_manager, '_get_share_server')
driver_call = self.mock_object(
self.share_manager.driver, 'create_replica',
mock.Mock(side_effect=exception.ManilaException))
self.assertRaises(exception.ManilaException,
self.share_manager.create_share_replica,
self.context, replica)
mock_replica_update_call.assert_called_once_with(
mock.ANY, replica['id'], {'status': constants.STATUS_ERROR,
'replica_state': constants.STATUS_ERROR})
self.assertEqual(1, mock_share_replica_access_update.call_count)
self.assertFalse(mock_export_locs_update_call.called)
self.assertTrue(mock_log_error.called)
self.assertFalse(mock_log_info.called)
self.assertTrue(driver_call.called)

def test_create_share_replica_invalid_locations_state(self):
        """Test invalid export locations from the driver are not saved."""
driver_retval = {
'export_locations': 'FAKE_EXPORT_LOC',
}
replica = fake_replica(share_network='')
replica_2 = fake_replica(id='fake2')
fake_access_rules = [{'id': '1'}, {'id': '2'}]
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=replica_2))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, replica_2]))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_instance_access_copy',
mock.Mock(return_value=fake_access_rules))
self.mock_object(self.share_manager,
'_provide_share_server_for_share',
mock.Mock(return_value=('FAKE_SERVER', replica)))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
self.mock_object(self.share_manager,
'_get_replica_snapshots_for_snapshot',
mock.Mock(return_value=[]))
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_export_locs_update_call = self.mock_object(
db, 'share_export_locations_update')
mock_log_info = self.mock_object(manager.LOG, 'info')
mock_log_warning = self.mock_object(manager.LOG, 'warning')
mock_log_error = self.mock_object(manager.LOG, 'error')
driver_call = self.mock_object(
self.share_manager.driver, 'create_replica',
mock.Mock(return_value=driver_retval))
self.mock_object(db, 'share_instance_access_get',
mock.Mock(return_value=fake_access_rules[0]))
mock_share_replica_access_update = self.mock_object(
db, 'share_instance_update_access_status')
self.share_manager.create_share_replica(self.context, replica)
self.assertFalse(mock_replica_update_call.called)
self.assertEqual(1, mock_share_replica_access_update.call_count)
self.assertFalse(mock_export_locs_update_call.called)
self.assertTrue(mock_log_info.called)
self.assertTrue(mock_log_warning.called)
self.assertFalse(mock_log_error.called)
self.assertTrue(driver_call.called)
call_args = driver_call.call_args_list[0][0]
replica_list_arg = call_args[1]
r_ids = [r['id'] for r in replica_list_arg]
for r in (replica, replica_2):
self.assertIn(r['id'], r_ids)
self.assertEqual(2, len(r_ids))

def test_create_share_replica_no_availability_zone(self):
        """Test replica gets the default AZ when none is specified."""
replica = fake_replica(
availability_zone=None, share_network='',
replica_state=constants.REPLICA_STATE_OUT_OF_SYNC)
replica_2 = fake_replica(id='fake2')
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, replica_2]))
manager.CONF.set_default('storage_availability_zone', 'fake_az')
fake_access_rules = [{'id': '1'}, {'id': '2'}, {'id': '3'}]
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_instance_access_copy',
mock.Mock(return_value=fake_access_rules))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=replica_2))
self.mock_object(self.share_manager,
'_provide_share_server_for_share',
mock.Mock(return_value=('FAKE_SERVER', replica)))
self.mock_object(self.share_manager,
'_get_replica_snapshots_for_snapshot',
mock.Mock(return_value=[]))
mock_replica_update_call = self.mock_object(
db, 'share_replica_update', mock.Mock(return_value=replica))
mock_calls = [
mock.call(mock.ANY, replica['id'],
{'availability_zone': 'fake_az'}, with_share_data=True),
mock.call(mock.ANY, replica['id'],
{'status': constants.STATUS_AVAILABLE,
'replica_state': constants.REPLICA_STATE_OUT_OF_SYNC}),
]
mock_export_locs_update_call = self.mock_object(
db, 'share_export_locations_update')
mock_log_info = self.mock_object(manager.LOG, 'info')
mock_log_warning = self.mock_object(manager.LOG, 'warning')
        mock_log_error = self.mock_object(manager.LOG, 'error')
self.mock_object(db, 'share_instance_access_get',
mock.Mock(return_value=fake_access_rules[0]))
mock_share_replica_access_update = self.mock_object(
self.share_manager, '_update_share_replica_access_rules_state')
driver_call = self.mock_object(
self.share_manager.driver, 'create_replica',
mock.Mock(return_value=replica))
self.mock_object(self.share_manager, '_get_share_server', mock.Mock())
self.share_manager.create_share_replica(self.context, replica)
mock_replica_update_call.assert_has_calls(mock_calls, any_order=False)
mock_share_replica_access_update.assert_called_once_with(
mock.ANY, replica['id'], replica['access_rules_status'])
self.assertTrue(mock_export_locs_update_call.called)
self.assertTrue(mock_log_info.called)
self.assertFalse(mock_log_warning.called)
self.assertFalse(mock_log_error.called)
self.assertTrue(driver_call.called)

@ddt.data(True, False)
def test_create_share_replica(self, has_snapshots):
        """Test replica creation succeeds with and without snapshots."""
replica = fake_replica(
share_network='', replica_state=constants.REPLICA_STATE_IN_SYNC)
replica_2 = fake_replica(id='fake2')
snapshots = ([fakes.fake_snapshot(create_instance=True)]
if has_snapshots else [])
snapshot_instances = [
fakes.fake_snapshot_instance(share_instance_id=replica['id']),
fakes.fake_snapshot_instance(share_instance_id='fake2'),
]
fake_access_rules = [{'id': '1'}, {'id': '2'}, {'id': '3'}]
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_instance_access_copy',
mock.Mock(return_value=fake_access_rules))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=replica_2))
self.mock_object(self.share_manager,
'_provide_share_server_for_share',
mock.Mock(return_value=('FAKE_SERVER', replica)))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, replica_2]))
self.mock_object(db, 'share_snapshot_get_all_for_share', mock.Mock(
return_value=snapshots))
mock_instance_get_call = self.mock_object(
db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_export_locs_update_call = self.mock_object(
db, 'share_export_locations_update')
mock_log_info = self.mock_object(manager.LOG, 'info')
mock_log_warning = self.mock_object(manager.LOG, 'warning')
        mock_log_error = self.mock_object(manager.LOG, 'error')
self.mock_object(db, 'share_instance_access_get',
mock.Mock(return_value=fake_access_rules[0]))
mock_share_replica_access_update = self.mock_object(
db, 'share_instance_update_access_status')
driver_call = self.mock_object(
self.share_manager.driver, 'create_replica',
mock.Mock(return_value=replica))
self.mock_object(self.share_manager, '_get_share_server')
self.share_manager.create_share_replica(self.context, replica)
mock_replica_update_call.assert_called_once_with(
mock.ANY, replica['id'],
{'status': constants.STATUS_AVAILABLE,
'replica_state': constants.REPLICA_STATE_IN_SYNC})
self.assertEqual(1, mock_share_replica_access_update.call_count)
self.assertTrue(mock_export_locs_update_call.called)
self.assertTrue(mock_log_info.called)
self.assertFalse(mock_log_warning.called)
self.assertFalse(mock_log_error.called)
self.assertTrue(driver_call.called)
call_args = driver_call.call_args_list[0][0]
replica_list_arg = call_args[1]
snapshot_list_arg = call_args[4]
r_ids = [r['id'] for r in replica_list_arg]
for r in (replica, replica_2):
self.assertIn(r['id'], r_ids)
self.assertEqual(2, len(r_ids))
if has_snapshots:
for snapshot_dict in snapshot_list_arg:
                self.assertIn('active_replica_snapshot', snapshot_dict)
                self.assertIn('share_replica_snapshot', snapshot_dict)
else:
self.assertFalse(mock_instance_get_call.called)

def test_delete_share_replica_access_rules_exception(self):
        """Test replica deletion fails when access rule removal fails."""
replica = fake_replica()
replica_2 = fake_replica(id='fake_2')
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, replica_2]))
active_replica = fake_replica(
id='Current_active_replica',
replica_state=constants.REPLICA_STATE_ACTIVE)
mock_exception_log = self.mock_object(manager.LOG, 'exception')
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=active_replica))
self.mock_object(self.share_manager, '_get_share_server')
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_replica_delete_call = self.mock_object(db, 'share_replica_delete')
mock_drv_delete_replica_call = self.mock_object(
self.share_manager.driver, 'delete_replica')
self.mock_object(
self.share_manager.access_helper, 'update_access_rules',
mock.Mock(side_effect=exception.ManilaException))
self.assertRaises(exception.ManilaException,
self.share_manager.delete_share_replica,
self.context, replica['id'],
share_id=replica['share_id'])
mock_replica_update_call.assert_called_once_with(
mock.ANY, replica['id'], {'status': constants.STATUS_ERROR})
self.assertFalse(mock_drv_delete_replica_call.called)
self.assertFalse(mock_replica_delete_call.called)
self.assertFalse(mock_exception_log.called)

def test_delete_share_replica_drv_misbehavior_ignored_with_the_force(self):
        """Test driver errors are ignored when deletion is forced."""
replica = fake_replica()
active_replica = fake_replica(id='Current_active_replica')
mock_exception_log = self.mock_object(manager.LOG, 'exception')
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, active_replica]))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=active_replica))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
self.mock_object(self.share_manager.access_helper,
'update_access_rules')
self.mock_object(
db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=[]))
mock_snap_instance_delete = self.mock_object(
db, 'share_snapshot_instance_delete')
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_replica_delete_call = self.mock_object(db, 'share_replica_delete')
mock_drv_delete_replica_call = self.mock_object(
self.share_manager.driver, 'delete_replica',
mock.Mock(side_effect=exception.ManilaException))
self.share_manager.delete_share_replica(
self.context, replica['id'], share_id=replica['share_id'],
force=True)
self.assertFalse(mock_replica_update_call.called)
self.assertTrue(mock_replica_delete_call.called)
self.assertEqual(1, mock_exception_log.call_count)
self.assertTrue(mock_drv_delete_replica_call.called)
self.assertFalse(mock_snap_instance_delete.called)

def test_delete_share_replica_driver_exception(self):
        """Test replica deletion fails when the driver raises."""
replica = fake_replica()
active_replica = fake_replica(id='Current_active_replica')
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, active_replica]))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=active_replica))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
mock_snapshot_get_call = self.mock_object(
db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=[]))
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_replica_delete_call = self.mock_object(db, 'share_replica_delete')
self.mock_object(
self.share_manager.access_helper, 'update_access_rules')
mock_drv_delete_replica_call = self.mock_object(
self.share_manager.driver, 'delete_replica',
mock.Mock(side_effect=exception.ManilaException))
self.assertRaises(exception.ManilaException,
self.share_manager.delete_share_replica,
self.context, replica['id'],
share_id=replica['share_id'])
self.assertTrue(mock_replica_update_call.called)
self.assertFalse(mock_replica_delete_call.called)
self.assertTrue(mock_drv_delete_replica_call.called)
self.assertTrue(mock_snapshot_get_call.called)

def test_delete_share_replica_both_exceptions_ignored_with_the_force(self):
        """Test driver and access errors are both ignored with force."""
replica = fake_replica()
active_replica = fake_replica(id='Current_active_replica')
snapshots = [
fakes.fake_snapshot(share_id=replica['id'],
status=constants.STATUS_AVAILABLE),
fakes.fake_snapshot(share_id=replica['id'],
id='test_creating_to_err',
status=constants.STATUS_CREATING)
]
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, active_replica]))
mock_exception_log = self.mock_object(manager.LOG, 'exception')
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=active_replica))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
self.mock_object(
db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshots))
mock_snapshot_instance_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_replica_delete_call = self.mock_object(db, 'share_replica_delete')
self.mock_object(
self.share_manager.access_helper, 'update_access_rules',
mock.Mock(side_effect=exception.ManilaException))
mock_drv_delete_replica_call = self.mock_object(
self.share_manager.driver, 'delete_replica',
mock.Mock(side_effect=exception.ManilaException))
self.share_manager.delete_share_replica(
self.context, replica['id'], share_id=replica['share_id'],
force=True)
mock_replica_update_call.assert_called_once_with(
mock.ANY, replica['id'], {'status': constants.STATUS_ERROR})
self.assertTrue(mock_replica_delete_call.called)
self.assertEqual(2, mock_exception_log.call_count)
self.assertTrue(mock_drv_delete_replica_call.called)
self.assertEqual(2, mock_snapshot_instance_delete_call.call_count)

def test_delete_share_replica(self):
        """Test replica deletion also deletes its snapshot instances."""
replica = fake_replica()
active_replica = fake_replica(id='current_active_replica')
snapshots = [
fakes.fake_snapshot(share_id=replica['share_id'],
status=constants.STATUS_AVAILABLE),
fakes.fake_snapshot(share_id=replica['share_id'],
id='test_creating_to_err',
status=constants.STATUS_CREATING)
]
self.mock_object(
db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshots))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, active_replica]))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=active_replica))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
mock_info_log = self.mock_object(manager.LOG, 'info')
mock_snapshot_instance_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
mock_replica_update_call = self.mock_object(db, 'share_replica_update')
mock_replica_delete_call = self.mock_object(db, 'share_replica_delete')
self.mock_object(
self.share_manager.access_helper, 'update_access_rules')
mock_drv_delete_replica_call = self.mock_object(
self.share_manager.driver, 'delete_replica')
self.share_manager.delete_share_replica(self.context, replica)
self.assertFalse(mock_replica_update_call.called)
self.assertTrue(mock_replica_delete_call.called)
self.assertTrue(mock_info_log.called)
self.assertTrue(mock_drv_delete_replica_call.called)
self.assertEqual(2, mock_snapshot_instance_delete_call.call_count)

def test_promote_share_replica_no_active_replica(self):
        """Test promotion fails when there is no active replica."""
replica = fake_replica()
replica_list = [replica]
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_replicas_get_available_active_replica',
mock.Mock(return_value=replica_list))
mock_info_log = self.mock_object(manager.LOG, 'info')
mock_driver_call = self.mock_object(self.share_manager.driver,
'promote_replica')
mock_replica_update = self.mock_object(db, 'share_replica_update')
expected_update_call = mock.call(
mock.ANY, replica['id'], {'status': constants.STATUS_AVAILABLE})
self.assertRaises(exception.ReplicationException,
self.share_manager.promote_share_replica,
self.context, replica)
self.assertFalse(mock_info_log.called)
self.assertFalse(mock_driver_call.called)
mock_replica_update.assert_has_calls([expected_update_call])

def test_promote_share_replica_driver_exception(self):
        """Test promotion sets replicas to 'error' on driver failure."""
replica = fake_replica()
active_replica = fake_replica(
id='current_active_replica',
replica_state=constants.REPLICA_STATE_ACTIVE)
replica_list = [replica, active_replica]
self.mock_object(db, 'share_access_get_all_for_share',
mock.Mock(return_value=[]))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replica_list))
self.mock_object(self.share_manager.driver, 'promote_replica',
mock.Mock(side_effect=exception.ManilaException))
mock_info_log = self.mock_object(manager.LOG, 'info')
mock_replica_update = self.mock_object(db, 'share_replica_update')
expected_update_calls = [mock.call(
mock.ANY, r['id'], {'status': constants.STATUS_ERROR})
            for r in (replica, active_replica)]
self.assertRaises(exception.ManilaException,
self.share_manager.promote_share_replica,
self.context, replica)
mock_replica_update.assert_has_calls(expected_update_calls)
self.assertFalse(mock_info_log.called)

@ddt.data([], None)
def test_promote_share_replica_driver_update_nothing_has_snaps(self,
retval):
        """Test promotion when the driver returns no replica updates."""
replica = fake_replica()
active_replica = fake_replica(
id='current_active_replica',
replica_state=constants.REPLICA_STATE_ACTIVE)
snapshots_instances = [
fakes.fake_snapshot(create_instance=True,
share_id=replica['share_id'],
status=constants.STATUS_AVAILABLE),
fakes.fake_snapshot(create_instance=True,
share_id=replica['share_id'],
id='test_creating_to_err',
status=constants.STATUS_CREATING)
]
replica_list = [replica, active_replica]
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_access_get_all_for_share',
mock.Mock(return_value=[]))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replica_list))
self.mock_object(
db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshots_instances))
self.mock_object(
self.share_manager.driver, 'promote_replica',
mock.Mock(return_value=retval))
mock_snap_instance_update = self.mock_object(
db, 'share_snapshot_instance_update')
mock_info_log = self.mock_object(manager.LOG, 'info')
mock_export_locs_update = self.mock_object(
db, 'share_export_locations_update')
mock_replica_update = self.mock_object(db, 'share_replica_update')
call_1 = mock.call(mock.ANY, replica['id'],
{'status': constants.STATUS_AVAILABLE,
'replica_state': constants.REPLICA_STATE_ACTIVE})
call_2 = mock.call(
mock.ANY, 'current_active_replica',
{'replica_state': constants.REPLICA_STATE_OUT_OF_SYNC})
expected_update_calls = [call_1, call_2]
self.share_manager.promote_share_replica(self.context, replica)
self.assertFalse(mock_export_locs_update.called)
mock_replica_update.assert_has_calls(expected_update_calls,
any_order=True)
mock_snap_instance_update.assert_called_once_with(
mock.ANY, 'test_creating_to_err',
{'status': constants.STATUS_ERROR})
self.assertEqual(2, mock_info_log.call_count)
def test_promote_share_replica_driver_updates_replica_list(self):
"""Driver-supplied replica updates must be persisted on promotion."""
replica = fake_replica()
active_replica = fake_replica(
id='current_active_replica',
replica_state=constants.REPLICA_STATE_ACTIVE)
replica_list = [replica, active_replica, fake_replica(id=3)]
updated_replica_list = [
{
'id': replica['id'],
'export_locations': ['TEST1', 'TEST2'],
'replica_state': constants.REPLICA_STATE_ACTIVE,
},
{
'id': 'current_active_replica',
'export_locations': 'junk_return_value',
'replica_state': constants.REPLICA_STATE_IN_SYNC,
},
{
'id': 'other_replica',
'export_locations': ['TEST1', 'TEST2'],
},
]
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(
db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=[]))
self.mock_object(db, 'share_access_get_all_for_share',
mock.Mock(return_value=[]))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replica_list))
mock_snap_instance_update = self.mock_object(
db, 'share_snapshot_instance_update')
self.mock_object(
self.share_manager.driver, 'promote_replica',
mock.Mock(return_value=updated_replica_list))
mock_info_log = self.mock_object(manager.LOG, 'info')
mock_export_locs_update = self.mock_object(
db, 'share_export_locations_update')
mock_replica_update = self.mock_object(db, 'share_replica_update')
reset_replication_change_call = mock.call(
mock.ANY, replica['id'],
{'replica_state': constants.REPLICA_STATE_ACTIVE,
'status': constants.STATUS_AVAILABLE})
self.share_manager.promote_share_replica(self.context, replica)
self.assertEqual(2, mock_export_locs_update.call_count)
self.assertEqual(2, mock_replica_update.call_count)
self.assertIn(
reset_replication_change_call, mock_replica_update.mock_calls)
self.assertTrue(mock_info_log.called)
self.assertFalse(mock_snap_instance_update.called)
@ddt.data('openstack1@watson#_pool0', 'openstack1@newton#_pool0')
def test_periodic_share_replica_update(self, host):
"""Only replicas belonging to this manager's host are updated."""
mock_debug_log = self.mock_object(manager.LOG, 'debug')
replicas = [
fake_replica(host='openstack1@watson#pool4'),
fake_replica(host='openstack1@watson#pool5'),
fake_replica(host='openstack1@newton#pool5'),
fake_replica(host='openstack1@newton#pool5'),
]
self.mock_object(self.share_manager.db, 'share_replicas_get_all',
mock.Mock(return_value=replicas))
mock_update_method = self.mock_object(
self.share_manager, '_share_replica_update')
self.share_manager.host = host
self.share_manager.periodic_share_replica_update(self.context)
self.assertEqual(2, mock_update_method.call_count)
self.assertEqual(1, mock_debug_log.call_count)
@ddt.data(constants.REPLICA_STATE_IN_SYNC,
constants.REPLICA_STATE_OUT_OF_SYNC)
def test__share_replica_update_driver_exception(self, replica_state):
"""A driver error during the update sets the replica to 'error'."""
mock_debug_log = self.mock_object(manager.LOG, 'debug')
replica = fake_replica(replica_state=replica_state)
active_replica = fake_replica(
replica_state=constants.REPLICA_STATE_ACTIVE)
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, active_replica]))
self.mock_object(self.share_manager.db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_server_get',
mock.Mock(return_value='fake_share_server'))
self.mock_object(self.share_manager.driver, 'update_replica_state',
mock.Mock(side_effect=exception.ManilaException))
mock_db_update_call = self.mock_object(
self.share_manager.db, 'share_replica_update')
self.share_manager._share_replica_update(
self.context, replica, share_id=replica['share_id'])
mock_db_update_call.assert_called_once_with(
self.context, replica['id'],
{'replica_state': constants.STATUS_ERROR,
'status': constants.STATUS_ERROR}
)
self.assertEqual(1, mock_debug_log.call_count)
def test__share_replica_update_driver_exception_ignored(self):
mock_debug_log = self.mock_object(manager.LOG, 'debug')
replica = fake_replica(replica_state=constants.STATUS_ERROR)
active_replica = fake_replica(
replica_state=constants.REPLICA_STATE_ACTIVE)
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, active_replica]))
self.mock_object(self.share_manager.db, 'share_replica_get',
mock.Mock(return_value=replica))
self.mock_object(db, 'share_server_get',
mock.Mock(return_value='fake_share_server'))
self.share_manager.host = replica['host']
self.mock_object(self.share_manager.driver, 'update_replica_state',
mock.Mock(side_effect=exception.ManilaException))
mock_db_update_call = self.mock_object(
self.share_manager.db, 'share_replica_update')
self.share_manager._share_replica_update(
self.context, replica, share_id=replica['share_id'])
mock_db_update_call.assert_called_once_with(
self.context, replica['id'],
{'replica_state': constants.STATUS_ERROR,
'status': constants.STATUS_ERROR}
)
self.assertEqual(1, mock_debug_log.call_count)
@ddt.data({'status': constants.STATUS_AVAILABLE,
'replica_state': constants.REPLICA_STATE_ACTIVE, },
{'status': constants.STATUS_DELETING,
'replica_state': constants.REPLICA_STATE_IN_SYNC, },
{'status': constants.STATUS_CREATING,
'replica_state': constants.REPLICA_STATE_OUT_OF_SYNC, },
{'status': constants.STATUS_MANAGING,
'replica_state': constants.REPLICA_STATE_OUT_OF_SYNC, },
{'status': constants.STATUS_UNMANAGING,
'replica_state': constants.REPLICA_STATE_ACTIVE, },
{'status': constants.STATUS_EXTENDING,
'replica_state': constants.REPLICA_STATE_IN_SYNC, },
{'status': constants.STATUS_SHRINKING,
'replica_state': constants.REPLICA_STATE_IN_SYNC, })
def test__share_replica_update_unqualified_replica(self, state):
"""Replicas in transitional or active states are skipped entirely."""
mock_debug_log = self.mock_object(manager.LOG, 'debug')
mock_warning_log = self.mock_object(manager.LOG, 'warning')
mock_driver_call = self.mock_object(
self.share_manager.driver, 'update_replica_state')
mock_db_update_call = self.mock_object(
self.share_manager.db, 'share_replica_update')
replica = fake_replica(**state)
self.mock_object(db, 'share_server_get',
mock.Mock(return_value='fake_share_server'))
self.mock_object(db, 'share_replica_get',
mock.Mock(return_value=replica))
self.share_manager._share_replica_update(self.context, replica,
share_id=replica['share_id'])
self.assertFalse(mock_debug_log.called)
self.assertFalse(mock_warning_log.called)
self.assertFalse(mock_driver_call.called)
self.assertFalse(mock_db_update_call.called)
@ddt.data(None, constants.REPLICA_STATE_IN_SYNC,
constants.REPLICA_STATE_OUT_OF_SYNC,
constants.REPLICA_STATE_ACTIVE,
constants.STATUS_ERROR)
def test__share_replica_update(self, retval):
mock_debug_log = self.mock_object(manager.LOG, 'debug')
mock_warning_log = self.mock_object(manager.LOG, 'warning')
replica_states = [constants.REPLICA_STATE_IN_SYNC,
constants.REPLICA_STATE_OUT_OF_SYNC]
replica = fake_replica(replica_state=random.choice(replica_states),
share_server='fake_share_server')
active_replica = fake_replica(
id='fake2', replica_state=constants.REPLICA_STATE_ACTIVE)
snapshots = [fakes.fake_snapshot(
create_instance=True, aggregate_status=constants.STATUS_AVAILABLE)]
snapshot_instances = [
fakes.fake_snapshot_instance(share_instance_id=replica['id']),
fakes.fake_snapshot_instance(share_instance_id='fake2'),
]
del replica['availability_zone']
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica, active_replica]))
self.mock_object(db, 'share_server_get',
mock.Mock(return_value='fake_share_server'))
mock_db_update_calls = []
self.mock_object(self.share_manager.db, 'share_replica_get',
mock.Mock(return_value=replica))
mock_driver_call = self.mock_object(
self.share_manager.driver, 'update_replica_state',
mock.Mock(return_value=retval))
mock_db_update_call = self.mock_object(
self.share_manager.db, 'share_replica_update')
self.mock_object(db, 'share_snapshot_get_all_for_share',
mock.Mock(return_value=snapshots))
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.share_manager._share_replica_update(
self.context, replica, share_id=replica['share_id'])
if retval == constants.REPLICA_STATE_ACTIVE:
self.assertEqual(1, mock_warning_log.call_count)
elif retval:
self.assertEqual(0, mock_warning_log.call_count)
self.assertTrue(mock_driver_call.called)
snapshot_list_arg = mock_driver_call.call_args[0][4]
self.assertIn('active_replica_snapshot', snapshot_list_arg[0])
self.assertIn('share_replica_snapshot', snapshot_list_arg[0])
mock_db_update_call.assert_has_calls(mock_db_update_calls)
self.assertEqual(1, mock_debug_log.call_count)
def test_update_share_replica_replica_not_found(self):
replica = fake_replica()
self.mock_object(
self.share_manager.db, 'share_replica_get', mock.Mock(
side_effect=exception.ShareReplicaNotFound(replica_id='fake')))
self.mock_object(self.share_manager, '_get_share_server')
driver_call = self.mock_object(
self.share_manager, '_share_replica_update')
self.assertRaises(
exception.ShareReplicaNotFound,
self.share_manager.update_share_replica,
self.context, replica, share_id=replica['share_id'])
self.assertFalse(driver_call.called)
def test_update_share_replica_replica(self):
replica_update_call = self.mock_object(
self.share_manager, '_share_replica_update')
self.mock_object(self.share_manager.db, 'share_replica_get')
retval = self.share_manager.update_share_replica(
self.context, 'fake_replica_id', share_id='fake_share_id')
self.assertIsNone(retval)
self.assertTrue(replica_update_call.called)
def test_create_delete_share_snapshot(self):
"""Test share's snapshot can be created and deleted."""
def _fake_create_snapshot(self, snapshot, **kwargs):
snapshot['progress'] = '99%'
return snapshot.to_dict()
self.mock_object(self.share_manager.driver, "create_snapshot",
_fake_create_snapshot)
share = db_utils.create_share()
share_id = share['id']
snapshot = db_utils.create_snapshot(share_id=share_id)
snapshot_id = snapshot['id']
self.share_manager.create_snapshot(self.context, share_id,
snapshot_id)
self.assertEqual(share_id,
db.share_snapshot_get(context.get_admin_context(),
snapshot_id).share_id)
snap = db.share_snapshot_get(self.context, snapshot_id)
self.assertEqual(constants.STATUS_AVAILABLE, snap['status'])
self.share_manager.delete_snapshot(self.context, snapshot_id)
self.assertRaises(exception.NotFound,
db.share_snapshot_get,
self.context,
snapshot_id)
def test_create_delete_share_snapshot_error(self):
"""Test snapshot can be created and deleted with error."""
def _raise_not_found(self, *args, **kwargs):
raise exception.NotFound()
self.mock_object(self.share_manager.driver, "create_snapshot",
mock.Mock(side_effect=_raise_not_found))
self.mock_object(self.share_manager.driver, "delete_snapshot",
mock.Mock(side_effect=_raise_not_found))
share = db_utils.create_share()
share_id = share['id']
snapshot = db_utils.create_snapshot(share_id=share_id)
snapshot_id = snapshot['id']
self.assertRaises(exception.NotFound,
self.share_manager.create_snapshot,
self.context, share_id, snapshot_id)
snap = db.share_snapshot_get(self.context, snapshot_id)
self.assertEqual(constants.STATUS_ERROR, snap['status'])
self.assertRaises(exception.NotFound,
self.share_manager.delete_snapshot,
self.context, snapshot_id)
self.assertEqual(
constants.STATUS_ERROR_DELETING,
db.share_snapshot_get(self.context, snapshot_id).status)
self.share_manager.driver.create_snapshot.assert_called_once_with(
self.context, utils.IsAMatcher(models.ShareSnapshotInstance),
share_server=None)
self.share_manager.driver.delete_snapshot.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
utils.IsAMatcher(models.ShareSnapshotInstance),
share_server=None)
def test_delete_snapshot_quota_error(self):
"""Snapshot deletion proceeds even if the quota reservation fails."""
snapshot = db_utils.create_snapshot(
with_share=True, status=constants.STATUS_AVAILABLE)
snapshot_id = snapshot['id']
self.mock_object(quota.QUOTAS, 'reserve',
mock.Mock(side_effect=exception.QuotaError('fake')))
self.mock_object(quota.QUOTAS, 'commit')
self.share_manager.delete_snapshot(self.context, snapshot_id)
quota.QUOTAS.reserve.assert_called_once_with(
mock.ANY,
project_id=six.text_type(snapshot['project_id']),
snapshots=-1,
snapshot_gigabytes=-snapshot['size'],
user_id=six.text_type(snapshot['user_id'])
)
self.assertFalse(quota.QUOTAS.commit.called)
def test_delete_snapshot_if_busy(self):
"""Test snapshot could not be deleted if busy."""
def _raise_share_snapshot_is_busy(self, *args, **kwargs):
raise exception.ShareSnapshotIsBusy(snapshot_name='fakename')
self.mock_object(self.share_manager.driver, "delete_snapshot",
mock.Mock(side_effect=_raise_share_snapshot_is_busy))
share = db_utils.create_share(status=constants.STATUS_ACTIVE)
snapshot = db_utils.create_snapshot(share_id=share['id'])
snapshot_id = snapshot['id']
self.share_manager.delete_snapshot(self.context, snapshot_id)
snap = db.share_snapshot_get(self.context, snapshot_id)
self.assertEqual(constants.STATUS_AVAILABLE, snap['status'])
self.share_manager.driver.delete_snapshot.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
utils.IsAMatcher(models.ShareSnapshotInstance),
share_server=None)
def test_create_share_instance_with_share_network_dhss_false(self):
manager.CONF.set_default('driver_handles_share_servers', False)
self.mock_object(
self.share_manager.driver.configuration, 'safe_get',
mock.Mock(return_value=False))
share_network_id = 'fake_sn'
share_instance = db_utils.create_share(
share_network_id=share_network_id).instance
self.mock_object(
self.share_manager.db, 'share_instance_get',
mock.Mock(return_value=share_instance))
self.mock_object(self.share_manager.db, 'share_instance_update')
self.assertRaisesRegex(
exception.ManilaException,
'.*%s.*' % share_instance['id'],
self.share_manager.create_share_instance, self.context,
share_instance['id'])
self.share_manager.db.share_instance_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
share_instance['id'],
with_share_data=True
)
self.share_manager.db.share_instance_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), share_instance['id'],
{'status': constants.STATUS_ERROR})
def test_create_share_instance_with_share_network_server_not_exists(self):
"""Test a share server is set up when none exists for the network."""
share_net = db_utils.create_share_network()
share = db_utils.create_share(share_network_id=share_net['id'])
share_id = share['id']
def fake_setup_server(context, share_network, *args, **kwargs):
return db_utils.create_share_server(
share_network_id=share_network['id'],
host='fake_host')
self.mock_object(manager.LOG, 'info')
self.share_manager.driver.create_share = mock.Mock(
return_value='fake_location')
self.share_manager._setup_server = fake_setup_server
self.share_manager.create_share_instance(self.context,
share.instance['id'])
self.assertEqual(share_id, db.share_get(context.get_admin_context(),
share_id).id)
manager.LOG.info.assert_called_with(mock.ANY, share.instance['id'])
def test_create_share_instance_with_share_network_server_fail(self):
fake_share = db_utils.create_share(share_network_id='fake_sn_id',
size=1)
fake_server = {
'id': 'fake_srv_id',
'status': constants.STATUS_CREATING,
}
self.mock_object(db, 'share_server_create',
mock.Mock(return_value=fake_server))
self.mock_object(db, 'share_instance_update',
mock.Mock(return_value=fake_share.instance))
self.mock_object(db, 'share_instance_get',
mock.Mock(return_value=fake_share.instance))
self.mock_object(manager.LOG, 'error')
def raise_share_server_not_found(*args, **kwargs):
raise exception.ShareServerNotFound(
share_server_id=fake_server['id'])
def raise_manila_exception(*args, **kwargs):
raise exception.ManilaException()
self.mock_object(db,
'share_server_get_all_by_host_and_share_net_valid',
mock.Mock(side_effect=raise_share_server_not_found))
self.mock_object(self.share_manager, '_setup_server',
mock.Mock(side_effect=raise_manila_exception))
self.assertRaises(
exception.ManilaException,
self.share_manager.create_share_instance,
self.context,
fake_share.instance['id'],
)
db.share_server_get_all_by_host_and_share_net_valid.\
assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
self.share_manager.host,
fake_share['share_network_id'],
)
db.share_server_create.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), mock.ANY)
db.share_instance_update.assert_has_calls([
mock.call(
utils.IsAMatcher(context.RequestContext),
fake_share.instance['id'],
{'status': constants.STATUS_ERROR},
)
])
self.share_manager._setup_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), fake_server)
manager.LOG.error.assert_called_with(mock.ANY,
fake_share.instance['id'])
def test_create_share_instance_with_share_network_not_found(self):
"""Test creation fails if share network not found."""
self.mock_object(manager.LOG, 'error')
share = db_utils.create_share(share_network_id='fake-net-id')
share_id = share['id']
self.assertRaises(
exception.ShareNetworkNotFound,
self.share_manager.create_share_instance,
self.context,
share.instance['id']
)
manager.LOG.error.assert_called_with(mock.ANY, share.instance['id'])
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_ERROR, shr['status'])
def test_create_share_instance_with_share_network_server_exists(self):
"""Test share can be created with existing share server."""
share_net = db_utils.create_share_network()
share = db_utils.create_share(share_network_id=share_net['id'])
share_srv = db_utils.create_share_server(
share_network_id=share_net['id'], host=self.share_manager.host)
share_id = share['id']
self.mock_object(manager.LOG, 'info')
driver_mock = mock.Mock()
driver_mock.create_share.return_value = "fake_location"
driver_mock.choose_share_server_compatible_with_share.return_value = (
share_srv
)
self.share_manager.driver = driver_mock
self.share_manager.create_share_instance(self.context,
share.instance['id'])
self.assertFalse(self.share_manager.driver.setup_network.called)
self.assertEqual(share_id, db.share_get(context.get_admin_context(),
share_id).id)
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_AVAILABLE, shr['status'])
self.assertEqual(share_srv['id'], shr['share_server_id'])
self.assertTrue(len(shr['export_location']) > 0)
self.assertEqual(1, len(shr['export_locations']))
manager.LOG.info.assert_called_with(mock.ANY, share.instance['id'])
@ddt.data('export_location', 'export_locations')
def test_create_share_instance_with_error_in_driver(self, details_key):
"""Test db updates if share creation fails in driver."""
share = db_utils.create_share()
share_id = share['id']
some_data = 'fake_location'
self.share_manager.driver = mock.Mock()
e = exception.ManilaException(detail_data={details_key: some_data})
self.share_manager.driver.create_share.side_effect = e
self.assertRaises(
exception.ManilaException,
self.share_manager.create_share_instance,
self.context,
share.instance['id']
)
self.assertTrue(self.share_manager.driver.create_share.called)
shr = db.share_get(self.context, share_id)
self.assertEqual(some_data, shr['export_location'])
def test_create_share_instance_with_server_created(self):
"""Test share can be created and share server is created."""
share_net = db_utils.create_share_network()
share = db_utils.create_share(share_network_id=share_net['id'])
db_utils.create_share_server(
share_network_id=share_net['id'], host=self.share_manager.host,
status=constants.STATUS_ERROR)
share_id = share['id']
fake_server = {
'id': 'fake_srv_id',
'status': constants.STATUS_CREATING,
}
self.mock_object(db, 'share_server_create',
mock.Mock(return_value=fake_server))
self.mock_object(self.share_manager, '_setup_server',
mock.Mock(return_value=fake_server))
self.share_manager.create_share_instance(self.context,
share.instance['id'])
self.assertEqual(share_id, db.share_get(context.get_admin_context(),
share_id).id)
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_AVAILABLE, shr['status'])
self.assertEqual('fake_srv_id', shr['share_server_id'])
db.share_server_create.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), mock.ANY)
self.share_manager._setup_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), fake_server)
def test_create_share_instance_update_replica_state(self):
share_net = db_utils.create_share_network()
share = db_utils.create_share(share_network_id=share_net['id'],
replication_type='dr')
db_utils.create_share_server(
share_network_id=share_net['id'], host=self.share_manager.host,
status=constants.STATUS_ERROR)
share_id = share['id']
fake_server = {
'id': 'fake_srv_id',
'status': constants.STATUS_CREATING,
}
self.mock_object(db, 'share_server_create',
mock.Mock(return_value=fake_server))
self.mock_object(self.share_manager, '_setup_server',
mock.Mock(return_value=fake_server))
self.share_manager.create_share_instance(self.context,
share.instance['id'])
self.assertEqual(share_id, db.share_get(context.get_admin_context(),
share_id).id)
shr = db.share_get(self.context, share_id)
shr_instances = db.share_instances_get_all_by_share(
self.context, shr['id'])
self.assertEqual(1, len(shr_instances))
self.assertEqual(constants.STATUS_AVAILABLE, shr['status'])
self.assertEqual(
constants.REPLICA_STATE_ACTIVE, shr_instances[0]['replica_state'])
self.assertEqual('fake_srv_id', shr['share_server_id'])
db.share_server_create.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), mock.ANY)
self.share_manager._setup_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), fake_server)
@ddt.data(True, False)
def test_create_delete_share_instance_error(self, exception_update_access):
"""Test share can be created and deleted with error."""
def _raise_exception(self, *args, **kwargs):
raise exception.ManilaException('fake')
self.mock_object(self.share_manager.driver, "create_share",
mock.Mock(side_effect=_raise_exception))
self.mock_object(self.share_manager.driver, "delete_share",
mock.Mock(side_effect=_raise_exception))
if exception_update_access:
self.mock_object(
self.share_manager.access_helper, "update_access_rules",
mock.Mock(side_effect=_raise_exception))
share = db_utils.create_share()
share_id = share['id']
self.assertRaises(exception.ManilaException,
self.share_manager.create_share_instance,
self.context,
share.instance['id'])
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_ERROR, shr['status'])
self.assertRaises(exception.ManilaException,
self.share_manager.delete_share_instance,
self.context,
share.instance['id'])
shr = db.share_get(self.context, share_id)
self.assertEqual(constants.STATUS_ERROR_DELETING, shr['status'])
self.share_manager.driver.create_share.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
utils.IsAMatcher(models.ShareInstance),
share_server=None)
if not exception_update_access:
self.share_manager.driver.delete_share.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
utils.IsAMatcher(models.ShareInstance),
share_server=None)
def test_create_share_instance_update_availability_zone(self):
share = db_utils.create_share(availability_zone=None)
share_id = share['id']
self.share_manager.create_share_instance(
self.context, share.instance['id'])
actual_share = db.share_get(context.get_admin_context(), share_id)
self.assertIsNotNone(actual_share.availability_zone)
self.assertEqual(manager.CONF.storage_availability_zone,
actual_share.availability_zone)
def test_provide_share_server_for_share_incompatible_servers(self):
fake_exception = exception.ManilaException("fake")
fake_share_server = {'id': 'fake'}
share = db_utils.create_share()
self.mock_object(db,
'share_server_get_all_by_host_and_share_net_valid',
mock.Mock(return_value=[fake_share_server]))
self.mock_object(
self.share_manager.driver,
"choose_share_server_compatible_with_share",
mock.Mock(side_effect=fake_exception)
)
self.assertRaises(exception.ManilaException,
self.share_manager._provide_share_server_for_share,
self.context, "fake_id", share.instance)
driver_mock = self.share_manager.driver
driver_method_mock = (
driver_mock.choose_share_server_compatible_with_share
)
driver_method_mock.assert_called_once_with(
self.context, [fake_share_server], share.instance, snapshot=None,
consistency_group=None)
def test_provide_share_server_for_share_invalid_arguments(self):
self.assertRaises(ValueError,
self.share_manager._provide_share_server_for_share,
self.context, None, None)
def test_provide_share_server_for_share_parent_ss_not_found(self):
fake_parent_id = "fake_server_id"
fake_exception = exception.ShareServerNotFound("fake")
share = db_utils.create_share()
fake_snapshot = {
'share': {
'instance': {
'share_server_id': fake_parent_id
}
}
}
self.mock_object(db, 'share_server_get',
mock.Mock(side_effect=fake_exception))
self.assertRaises(exception.ShareServerNotFound,
self.share_manager._provide_share_server_for_share,
self.context, "fake_id", share.instance,
snapshot=fake_snapshot)
db.share_server_get.assert_called_once_with(
self.context, fake_parent_id)
def test_provide_share_server_for_share_parent_ss_invalid(self):
fake_parent_id = "fake_server_id"
share = db_utils.create_share()
fake_snapshot = {
'share': {
'instance': {
'share_server_id': fake_parent_id
}
}
}
fake_parent_share_server = {'status': 'fake'}
self.mock_object(db, 'share_server_get',
mock.Mock(return_value=fake_parent_share_server))
self.assertRaises(exception.InvalidShareServer,
self.share_manager._provide_share_server_for_share,
self.context, "fake_id", share.instance,
snapshot=fake_snapshot)
db.share_server_get.assert_called_once_with(
self.context, fake_parent_id)
def test_provide_share_server_for_cg_incompatible_servers(self):
fake_exception = exception.ManilaException("fake")
fake_share_server = {'id': 'fake'}
cg = db_utils.create_consistency_group()
self.mock_object(db,
'share_server_get_all_by_host_and_share_net_valid',
mock.Mock(return_value=[fake_share_server]))
self.mock_object(
self.share_manager.driver,
"choose_share_server_compatible_with_cg",
mock.Mock(side_effect=fake_exception)
)
self.assertRaises(exception.ManilaException,
self.share_manager._provide_share_server_for_cg,
self.context, "fake_id", cg)
driver_mock = self.share_manager.driver
driver_method_mock = (
driver_mock.choose_share_server_compatible_with_cg
)
driver_method_mock.assert_called_once_with(
self.context, [fake_share_server], cg, cgsnapshot=None)
def test_provide_share_server_for_cg_invalid_arguments(self):
self.assertRaises(exception.InvalidInput,
self.share_manager._provide_share_server_for_cg,
self.context, None, None)
def test_manage_share_invalid_driver(self):
self.mock_object(self.share_manager, 'driver', mock.Mock())
self.share_manager.driver.driver_handles_share_servers = True
self.mock_object(share_types,
'get_share_type_extra_specs',
mock.Mock(return_value='False'))
self.mock_object(self.share_manager.db, 'share_update', mock.Mock())
share = db_utils.create_share()
share_id = share['id']
self.assertRaises(
exception.InvalidDriverMode,
self.share_manager.manage_share, self.context, share_id, {})
self.share_manager.db.share_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), share_id,
{'status': constants.STATUS_MANAGE_ERROR, 'size': 1})
def test_manage_share_invalid_share_type(self):
self.mock_object(self.share_manager, 'driver', mock.Mock())
self.share_manager.driver.driver_handles_share_servers = False
self.mock_object(share_types,
'get_share_type_extra_specs',
mock.Mock(return_value='True'))
self.mock_object(self.share_manager.db, 'share_update', mock.Mock())
share = db_utils.create_share()
share_id = share['id']
self.assertRaises(
exception.ManageExistingShareTypeMismatch,
self.share_manager.manage_share, self.context, share_id, {})
self.share_manager.db.share_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), share_id,
{'status': constants.STATUS_MANAGE_ERROR, 'size': 1})
def test_manage_share_driver_exception(self):
CustomException = type('CustomException', (Exception,), dict())
self.mock_object(self.share_manager, 'driver', mock.Mock())
self.share_manager.driver.driver_handles_share_servers = False
self.mock_object(self.share_manager.driver,
'manage_existing',
mock.Mock(side_effect=CustomException))
self.mock_object(share_types,
'get_share_type_extra_specs',
mock.Mock(return_value='False'))
self.mock_object(self.share_manager.db, 'share_update', mock.Mock())
share = db_utils.create_share()
share_id = share['id']
driver_options = {'fake': 'fake'}
self.assertRaises(
CustomException,
self.share_manager.manage_share,
self.context, share_id, driver_options)
self.share_manager.driver.manage_existing.\
assert_called_once_with(mock.ANY, driver_options)
self.share_manager.db.share_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), share_id,
{'status': constants.STATUS_MANAGE_ERROR, 'size': 1})
def test_manage_share_invalid_size(self):
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = False
self.mock_object(share_types,
'get_share_type_extra_specs',
mock.Mock(return_value='False'))
self.mock_object(self.share_manager.driver,
"manage_existing",
mock.Mock(return_value=None))
self.mock_object(self.share_manager.db, 'share_update', mock.Mock())
share = db_utils.create_share()
share_id = share['id']
driver_options = {'fake': 'fake'}
self.assertRaises(
exception.InvalidShare,
self.share_manager.manage_share,
self.context, share_id, driver_options)
        self.share_manager.driver.manage_existing.\
            assert_called_once_with(mock.ANY, driver_options)
        self.share_manager.db.share_update.assert_called_once_with(
            utils.IsAMatcher(context.RequestContext), share_id,
            {'status': constants.STATUS_MANAGE_ERROR, 'size': 1})

    def test_manage_share_quota_error(self):
        self.mock_object(self.share_manager, 'driver')
        self.share_manager.driver.driver_handles_share_servers = False
        self.mock_object(share_types,
                         'get_share_type_extra_specs',
                         mock.Mock(return_value='False'))
        self.mock_object(self.share_manager.driver,
                         "manage_existing",
                         mock.Mock(return_value={'size': 3}))
        self.mock_object(self.share_manager, '_update_quota_usages',
                         mock.Mock(side_effect=exception.QuotaError))
        self.mock_object(self.share_manager.db, 'share_update', mock.Mock())
        share = db_utils.create_share()
        share_id = share['id']
        driver_options = {'fake': 'fake'}

        self.assertRaises(
            exception.QuotaError,
            self.share_manager.manage_share,
            self.context, share_id, driver_options)

        self.share_manager.driver.manage_existing.\
            assert_called_once_with(mock.ANY, driver_options)
        self.share_manager.db.share_update.assert_called_once_with(
            mock.ANY, share_id,
            {'status': constants.STATUS_MANAGE_ERROR, 'size': 1})
        self.share_manager._update_quota_usages.assert_called_once_with(
            utils.IsAMatcher(context.RequestContext),
            share['project_id'], {'shares': 1, 'gigabytes': 3})

    @ddt.data(
        {'size': 1},
        {'size': 2, 'name': 'fake'},
        {'size': 3, 'export_locations': ['foo', 'bar', 'quuz']})
    def test_manage_share_valid_share(self, driver_data):
        export_locations = driver_data.get('export_locations')
        self.mock_object(self.share_manager.db, 'share_update', mock.Mock())
        self.mock_object(self.share_manager, 'driver', mock.Mock())
        self.mock_object(self.share_manager, '_update_quota_usages',
                         mock.Mock())
        self.mock_object(
            self.share_manager.db,
            'share_export_locations_update',
            mock.Mock(side_effect=(
                self.share_manager.db.share_export_locations_update)))
        self.share_manager.driver.driver_handles_share_servers = False
        self.mock_object(share_types,
                         'get_share_type_extra_specs',
                         mock.Mock(return_value='False'))
        self.mock_object(self.share_manager.driver,
                         "manage_existing",
                         mock.Mock(return_value=driver_data))
        share = db_utils.create_share()
        share_id = share['id']
        driver_options = {'fake': 'fake'}

        self.share_manager.manage_share(self.context, share_id, driver_options)

        self.share_manager.driver.manage_existing.\
            assert_called_once_with(mock.ANY, driver_options)
        if export_locations:
            self.share_manager.db.share_export_locations_update.\
                assert_called_once_with(
                    utils.IsAMatcher(context.RequestContext),
                    share.instance['id'], export_locations, delete=True)
        else:
            self.assertFalse(
                self.share_manager.db.share_export_locations_update.called)
        valid_share_data = {
            'status': constants.STATUS_AVAILABLE, 'launched_at': mock.ANY}
        valid_share_data.update(driver_data)
        self.share_manager.db.share_update.assert_called_once_with(
            utils.IsAMatcher(context.RequestContext),
            share_id, valid_share_data)

    def test_update_quota_usages_update(self):
        # quota_usage_get returns an existing record, so the existing
        # usage must be updated (not created).
        self.mock_object(self.share_manager.db, 'quota_usage_get',
                         mock.Mock(return_value={'in_use': 1}))
        self.mock_object(self.share_manager.db, 'quota_usage_update')
        project_id = 'fake_project_id'
        resource_name = 'fake'
        usage = 1

        self.share_manager._update_quota_usages(
            self.context, project_id, {resource_name: usage})

        self.share_manager.db.quota_usage_get.assert_called_once_with(
            mock.ANY, project_id, resource_name, mock.ANY)
        self.share_manager.db.quota_usage_update.assert_called_once_with(
            mock.ANY, project_id, mock.ANY, resource_name, in_use=2)

    def test_update_quota_usages_new(self):
        # quota_usage_get raises QuotaUsageNotFound, so a new usage
        # record must be created.
        project_id = 'fake_project_id'
        resource_name = 'fake'
        usage = 1
        side_effect = exception.QuotaUsageNotFound(project_id=project_id)
        self.mock_object(
            self.share_manager.db,
            'quota_usage_get',
            mock.Mock(side_effect=side_effect))
        self.mock_object(self.share_manager.db, 'quota_usage_create')

        self.share_manager._update_quota_usages(
            self.context, project_id, {resource_name: usage})

        self.share_manager.db.quota_usage_get.assert_called_once_with(
            mock.ANY, project_id, resource_name, mock.ANY)
        self.share_manager.db.quota_usage_create.assert_called_once_with(
            mock.ANY, project_id, mock.ANY, resource_name, usage)

    def _setup_unmanage_mocks(self, mock_driver=True, mock_unmanage=None):
        if mock_driver:
            self.mock_object(self.share_manager, 'driver')
        if mock_unmanage:
            self.mock_object(self.share_manager.driver, "unmanage",
                             mock_unmanage)
        self.mock_object(self.share_manager.db, 'share_update')
        self.mock_object(self.share_manager.db, 'share_instance_delete')

    @ddt.data(True, False)
    def test_unmanage_share_invalid_driver(self, driver_handles_share_servers):
        self._setup_unmanage_mocks()
        self.share_manager.driver.driver_handles_share_servers = (
            driver_handles_share_servers
        )
        share_net = db_utils.create_share_network()
        share_srv = db_utils.create_share_server(
            share_network_id=share_net['id'], host=self.share_manager.host)
        share = db_utils.create_share(share_network_id=share_net['id'],
                                      share_server_id=share_srv['id'])

        self.share_manager.unmanage_share(self.context, share['id'])

        self.share_manager.db.share_update.assert_called_once_with(
            mock.ANY, share['id'], {'status': constants.STATUS_UNMANAGE_ERROR})

    def test_unmanage_share_invalid_share(self):
        unmanage = mock.Mock(side_effect=exception.InvalidShare(reason="fake"))
        self._setup_unmanage_mocks(mock_driver=False, mock_unmanage=unmanage)
        share = db_utils.create_share()

        self.share_manager.unmanage_share(self.context, share['id'])

        self.share_manager.db.share_update.assert_called_once_with(
            mock.ANY, share['id'], {'status': constants.STATUS_UNMANAGE_ERROR})

    def test_unmanage_share_valid_share(self):
        manager.CONF.set_default('driver_handles_share_servers', False)
        self._setup_unmanage_mocks(mock_driver=False,
                                   mock_unmanage=mock.Mock())
        share = db_utils.create_share()
        share_id = share['id']
        share_instance_id = share.instance['id']

        self.share_manager.unmanage_share(self.context, share_id)

        self.share_manager.driver.unmanage.\
            assert_called_once_with(mock.ANY)
        self.share_manager.db.share_instance_delete.assert_called_once_with(
            mock.ANY, share_instance_id)

    def test_unmanage_share_valid_share_with_quota_error(self):
        manager.CONF.set_default('driver_handles_share_servers', False)
        self._setup_unmanage_mocks(mock_driver=False,
                                   mock_unmanage=mock.Mock())
        self.mock_object(quota.QUOTAS, 'reserve',
                         mock.Mock(side_effect=Exception()))
        share = db_utils.create_share()
        share_instance_id = share.instance['id']

        self.share_manager.unmanage_share(self.context, share['id'])

        self.share_manager.driver.unmanage.\
            assert_called_once_with(mock.ANY)
        self.share_manager.db.share_instance_delete.assert_called_once_with(
            mock.ANY, share_instance_id)

    def test_unmanage_share_remove_access_rules_error(self):
        manager.CONF.set_default('driver_handles_share_servers', False)
        manager.CONF.unmanage_remove_access_rules = True
        self._setup_unmanage_mocks(mock_driver=False,
                                   mock_unmanage=mock.Mock())
        self.mock_object(
            self.share_manager.access_helper,
            'update_access_rules',
            mock.Mock(side_effect=Exception())
        )
        self.mock_object(quota.QUOTAS, 'reserve', mock.Mock(return_value=[]))
        share = db_utils.create_share()

        self.share_manager.unmanage_share(self.context, share['id'])

        self.share_manager.db.share_update.assert_called_once_with(
            mock.ANY, share['id'], {'status': constants.STATUS_UNMANAGE_ERROR})

    def test_unmanage_share_valid_share_remove_access_rules(self):
        manager.CONF.set_default('driver_handles_share_servers', False)
        manager.CONF.unmanage_remove_access_rules = True
        self._setup_unmanage_mocks(mock_driver=False,
                                   mock_unmanage=mock.Mock())
        smanager = self.share_manager
        self.mock_object(smanager.access_helper, 'update_access_rules')
        self.mock_object(quota.QUOTAS, 'reserve', mock.Mock(return_value=[]))
        share = db_utils.create_share()
        share_id = share['id']
        share_instance_id = share.instance['id']

        smanager.unmanage_share(self.context, share_id)

        smanager.driver.unmanage.assert_called_once_with(mock.ANY)
        smanager.access_helper.update_access_rules.assert_called_once_with(
            mock.ANY, mock.ANY, delete_rules='all', share_server=None
        )
        smanager.db.share_instance_delete.assert_called_once_with(
            mock.ANY, share_instance_id)

    def test_delete_share_instance_share_server_not_found(self):
        share_net = db_utils.create_share_network()
        share = db_utils.create_share(share_network_id=share_net['id'],
                                      share_server_id='fake-id')

        self.assertRaises(
            exception.ShareServerNotFound,
            self.share_manager.delete_share_instance,
            self.context,
            share.instance['id']
        )

    @ddt.data(True, False)
    def test_delete_share_instance_last_on_srv_with_sec_service(
            self, with_details):
        share_net = db_utils.create_share_network()
        sec_service = db_utils.create_security_service(
            share_network_id=share_net['id'])
        backend_details = dict(
            security_service_ldap=jsonutils.dumps(sec_service))
        if with_details:
            share_srv = db_utils.create_share_server(
                share_network_id=share_net['id'],
                host=self.share_manager.host,
                backend_details=backend_details)
        else:
            share_srv = db_utils.create_share_server(
                share_network_id=share_net['id'],
                host=self.share_manager.host)
            db.share_server_backend_details_set(
                context.get_admin_context(), share_srv['id'], backend_details)
        share = db_utils.create_share(share_network_id=share_net['id'],
                                      share_server_id=share_srv['id'])
        self.share_manager.driver = mock.Mock()
        manager.CONF.delete_share_server_with_last_share = True

        self.share_manager.delete_share_instance(self.context,
                                                 share.instance['id'])

        self.share_manager.driver.teardown_server.assert_called_once_with(
            server_details=backend_details,
            security_services=[jsonutils.loads(
                backend_details['security_service_ldap'])])

    @ddt.data({'force': True, 'side_effect': 'update_access'},
              {'force': True, 'side_effect': 'delete_share'},
              {'force': False, 'side_effect': None})
    @ddt.unpack
    def test_delete_share_instance_last_on_server(self, force, side_effect):
        share_net = db_utils.create_share_network()
        share_srv = db_utils.create_share_server(
            share_network_id=share_net['id'],
            host=self.share_manager.host
        )
        share = db_utils.create_share(share_network_id=share_net['id'],
                                      share_server_id=share_srv['id'])
        self.share_manager.driver = mock.Mock()
        if side_effect == 'update_access':
            self.mock_object(
                self.share_manager.access_helper, 'update_access_rules',
                mock.Mock(side_effect=Exception('fake')))
        if side_effect == 'delete_share':
            self.mock_object(self.share_manager.driver, 'delete_share',
                             mock.Mock(side_effect=Exception('fake')))
        self.mock_object(manager.LOG, 'error')
        manager.CONF.delete_share_server_with_last_share = True

        self.share_manager.delete_share_instance(
            self.context, share.instance['id'], force=force)

        self.share_manager.driver.teardown_server.assert_called_once_with(
            server_details=share_srv.get('backend_details'),
            security_services=[])
        self.assertEqual(force, manager.LOG.error.called)

    def test_delete_share_instance_last_on_server_deletion_disabled(self):
        share_net = db_utils.create_share_network()
        share_srv = db_utils.create_share_server(
            share_network_id=share_net['id'],
            host=self.share_manager.host
        )
        share = db_utils.create_share(share_network_id=share_net['id'],
                                      share_server_id=share_srv['id'])
        manager.CONF.delete_share_server_with_last_share = False
        self.share_manager.driver = mock.Mock()

        self.share_manager.delete_share_instance(self.context,
                                                 share.instance['id'])

        self.assertFalse(self.share_manager.driver.teardown_network.called)

    def test_delete_share_instance_not_last_on_server(self):
        share_net = db_utils.create_share_network()
        share_srv = db_utils.create_share_server(
            share_network_id=share_net['id'],
            host=self.share_manager.host
        )
        share = db_utils.create_share(share_network_id=share_net['id'],
                                      share_server_id=share_srv['id'])
        db_utils.create_share(share_network_id=share_net['id'],
                              share_server_id=share_srv['id'])
        manager.CONF.delete_share_server_with_last_share = True
        self.share_manager.driver = mock.Mock()

        self.share_manager.delete_share_instance(self.context,
                                                 share.instance['id'])

        self.assertFalse(self.share_manager.driver.teardown_network.called)

    @ddt.data('update_access', 'delete_share')
    def test_delete_share_instance_not_found(self, side_effect):
        share_net = db_utils.create_share_network()
        share_srv = db_utils.create_share_server(
            share_network_id=share_net['id'],
            host=self.share_manager.host)
        share = db_utils.create_share(share_network_id=share_net['id'],
                                      share_server_id=share_srv['id'])
        access = db_utils.create_access(share_id=share['id'])
        db_utils.create_share(share_network_id=share_net['id'],
                              share_server_id=share_srv['id'])
        manager.CONF.delete_share_server_with_last_share = False
        self.mock_object(db, 'share_server_get',
                         mock.Mock(return_value=share_srv))
        self.mock_object(db, 'share_instance_get',
                         mock.Mock(return_value=share.instance))
        self.mock_object(db, 'share_access_get_all_for_instance',
                         mock.Mock(return_value=[access]))
        self.share_manager.driver = mock.Mock()
        self.share_manager.access_helper.driver = mock.Mock()
        if side_effect == 'update_access':
            self.mock_object(
                self.share_manager.access_helper.driver, 'update_access',
                mock.Mock(side_effect=exception.ShareResourceNotFound(
                    share_id=share['id'])))
        if side_effect == 'delete_share':
            self.mock_object(
                self.share_manager.driver, 'delete_share',
                mock.Mock(side_effect=exception.ShareResourceNotFound(
                    share_id=share['id'])))
        self.mock_object(
            self.share_manager.access_helper, '_check_needs_refresh',
            mock.Mock(return_value=False)
        )
        self.mock_object(manager.LOG, 'warning')

        self.share_manager.delete_share_instance(self.context,
                                                 share.instance['id'])

        self.assertFalse(self.share_manager.driver.teardown_network.called)
        (self.share_manager.access_helper.driver.update_access.
            assert_called_once_with(utils.IsAMatcher(
                context.RequestContext), share.instance, [], add_rules=[],
                delete_rules=[access], share_server=share_srv))
        self.assertTrue(manager.LOG.warning.called)

    def test_allow_deny_access(self):
        """Test that access rules for a share can be created and deleted."""
        self.mock_object(share_access.LOG, 'info')
        share = db_utils.create_share()
        share_id = share['id']
        share_instance = db_utils.create_share_instance(
            share_id=share_id,
            access_rules_status=constants.STATUS_OUT_OF_SYNC)
        share_instance_id = share_instance['id']
        access = db_utils.create_access(share_id=share_id,
                                        share_instance_id=share_instance_id)
        access_id = access['id']

        self.share_manager.allow_access(self.context, share_instance_id,
                                        [access_id])
        self.assertEqual('active', db.share_instance_get(
            self.context, share_instance_id).access_rules_status)
        share_access.LOG.info.assert_called_with(mock.ANY,
                                                 share_instance_id)
        share_access.LOG.info.reset_mock()

        self.share_manager.deny_access(self.context, share_instance_id,
                                       [access_id])
        share_access.LOG.info.assert_called_with(mock.ANY,
                                                 share_instance_id)
        share_access.LOG.info.reset_mock()

    def test_allow_deny_access_error(self):
        """Test that driver errors on allow/deny access are handled."""

        def _fake_allow_access(self, *args, **kwargs):
            raise exception.NotFound()

        def _fake_deny_access(self, *args, **kwargs):
            raise exception.NotFound()

        self.mock_object(self.share_manager.access_helper.driver,
                         "allow_access", _fake_allow_access)
        self.mock_object(self.share_manager.access_helper.driver,
                         "deny_access", _fake_deny_access)
        share = db_utils.create_share()
        share_id = share['id']
        share_instance = db_utils.create_share_instance(
            share_id=share_id,
            access_rules_status=constants.STATUS_OUT_OF_SYNC)
        share_instance_id = share_instance['id']
        access = db_utils.create_access(share_id=share_id,
                                        share_instance_id=share_instance_id)
        access_id = access['id']

        def validate(method):
            self.assertRaises(exception.ManilaException, method, self.context,
                              share_instance_id, [access_id])
            inst = db.share_instance_get(self.context, share_instance_id)
            self.assertEqual(constants.STATUS_ERROR,
                             inst['access_rules_status'])

        validate(self.share_manager.allow_access)
        validate(self.share_manager.deny_access)

    def test_setup_server(self):
        # Setup required test data
        share_server = {
            'id': 'fake_id',
            'share_network_id': 'fake_sn_id',
        }
        metadata = {'fake_metadata_key': 'fake_metadata_value'}
        share_network = {'id': 'fake_sn_id'}
        network_info = {'security_services': []}
        for ss_type in constants.SECURITY_SERVICES_ALLOWED_TYPES:
            network_info['security_services'].append({
                'name': 'fake_name' + ss_type,
                'domain': 'fake_domain' + ss_type,
                'server': 'fake_server' + ss_type,
                'dns_ip': 'fake_dns_ip' + ss_type,
                'user': 'fake_user' + ss_type,
                'type': ss_type,
                'password': 'fake_password' + ss_type,
            })
        sec_services = network_info['security_services']
        server_info = {'fake_server_info_key': 'fake_server_info_value'}
        network_info['network_type'] = 'fake_network_type'

        # mock required stuff
        self.mock_object(self.share_manager.db, 'share_network_get',
                         mock.Mock(return_value=share_network))
        self.mock_object(self.share_manager.driver, 'allocate_network')
        self.mock_object(self.share_manager, '_form_server_setup_info',
                         mock.Mock(return_value=network_info))
        self.mock_object(self.share_manager, '_validate_segmentation_id')
        self.mock_object(self.share_manager.driver, 'setup_server',
                         mock.Mock(return_value=server_info))
        self.mock_object(self.share_manager.db,
                         'share_server_backend_details_set')
        self.mock_object(self.share_manager.db, 'share_server_update',
                         mock.Mock(return_value=share_server))

        # execute method _setup_server
        result = self.share_manager._setup_server(
            self.context, share_server, metadata=metadata)

        # verify results
        self.assertEqual(share_server, result)
        self.share_manager.db.share_network_get.assert_has_calls([
            mock.call(self.context, share_server['share_network_id']),
            mock.call(self.context, share_server['share_network_id']),
        ])
        self.share_manager.driver.allocate_network.assert_called_once_with(
            self.context, share_server, share_network)
        self.share_manager._form_server_setup_info.assert_called_once_with(
            self.context, share_server, share_network)
        self.share_manager._validate_segmentation_id.assert_called_once_with(
            network_info)
        self.share_manager.driver.setup_server.assert_called_once_with(
            network_info, metadata=metadata)
        self.share_manager.db.share_server_backend_details_set.\
            assert_has_calls([
                mock.call(self.context, share_server['id'],
                          {'security_service_' + sec_services[0]['type']:
                              jsonutils.dumps(sec_services[0])}),
                mock.call(self.context, share_server['id'],
                          {'security_service_' + sec_services[1]['type']:
                              jsonutils.dumps(sec_services[1])}),
                mock.call(self.context, share_server['id'],
                          {'security_service_' + sec_services[2]['type']:
                              jsonutils.dumps(sec_services[2])}),
                mock.call(self.context, share_server['id'], server_info),
            ])
        self.share_manager.db.share_server_update.assert_called_once_with(
            self.context, share_server['id'],
            {'status': constants.STATUS_ACTIVE})

    def test_setup_server_server_info_not_present(self):
        # Setup required test data
        share_server = {
            'id': 'fake_id',
            'share_network_id': 'fake_sn_id',
        }
        metadata = {'fake_metadata_key': 'fake_metadata_value'}
        share_network = {'id': 'fake_sn_id'}
        network_info = {
            'fake_network_info_key': 'fake_network_info_value',
            'security_services': [],
            'network_type': 'fake_network_type',
        }
        server_info = {}

        # mock required stuff
        self.mock_object(self.share_manager.db, 'share_network_get',
                         mock.Mock(return_value=share_network))
        self.mock_object(self.share_manager, '_form_server_setup_info',
                         mock.Mock(return_value=network_info))
        self.mock_object(self.share_manager.driver, 'setup_server',
                         mock.Mock(return_value=server_info))
        self.mock_object(self.share_manager.db, 'share_server_update',
                         mock.Mock(return_value=share_server))
        self.mock_object(self.share_manager.driver, 'allocate_network')

        # execute method _setup_server
        result = self.share_manager._setup_server(
            self.context, share_server, metadata=metadata)

        # verify results
        self.assertEqual(share_server, result)
        self.share_manager.db.share_network_get.assert_has_calls([
            mock.call(self.context, share_server['share_network_id']),
            mock.call(self.context, share_server['share_network_id'])])
        self.share_manager._form_server_setup_info.assert_called_once_with(
            self.context, share_server, share_network)
        self.share_manager.driver.setup_server.assert_called_once_with(
            network_info, metadata=metadata)
        self.share_manager.db.share_server_update.assert_called_once_with(
            self.context, share_server['id'],
            {'status': constants.STATUS_ACTIVE})
        self.share_manager.driver.allocate_network.assert_called_once_with(
            self.context, share_server, share_network)

    def setup_server_raise_exception(self, detail_data_proper):
        # Setup required test data
        share_server = {
            'id': 'fake_id',
            'share_network_id': 'fake_sn_id',
        }
        server_info = {'details_key': 'value'}
        share_network = {'id': 'fake_sn_id'}
        network_info = {
            'fake_network_info_key': 'fake_network_info_value',
            'security_services': [],
            'network_type': 'fake_network_type',
        }
        if detail_data_proper:
            detail_data = {'server_details': server_info}
            self.mock_object(self.share_manager.db,
                             'share_server_backend_details_set')
        else:
            detail_data = 'not dictionary detail data'

        # Mock required parameters
        self.mock_object(self.share_manager.db, 'share_network_get',
                         mock.Mock(return_value=share_network))
        self.mock_object(self.share_manager.db, 'share_server_update')
        for m in ['deallocate_network', 'allocate_network']:
            self.mock_object(self.share_manager.driver, m)
        self.mock_object(self.share_manager, '_form_server_setup_info',
                         mock.Mock(return_value=network_info))
        self.mock_object(self.share_manager.db,
                         'share_server_backend_details_set')
        self.mock_object(self.share_manager.driver, 'setup_server',
                         mock.Mock(side_effect=exception.ManilaException(
                             detail_data=detail_data)))

        # execute method _setup_server
        self.assertRaises(
            exception.ManilaException,
            self.share_manager._setup_server,
            self.context,
            share_server,
        )

        # verify results
        if detail_data_proper:
            self.share_manager.db.share_server_backend_details_set.\
                assert_called_once_with(
                    self.context, share_server['id'], server_info)
        self.share_manager._form_server_setup_info.assert_called_once_with(
            self.context, share_server, share_network)
        self.share_manager.db.share_server_update.assert_called_once_with(
            self.context, share_server['id'],
            {'status': constants.STATUS_ERROR})
        self.share_manager.db.share_network_get.assert_has_calls([
            mock.call(self.context, share_server['share_network_id']),
            mock.call(self.context, share_server['share_network_id'])])
        self.share_manager.driver.allocate_network.assert_has_calls([
            mock.call(self.context, share_server, share_network)])
        self.share_manager.driver.deallocate_network.assert_has_calls([
            mock.call(self.context, share_server['id'])])

    def test_setup_server_incorrect_detail_data(self):
        self.setup_server_raise_exception(detail_data_proper=False)

    def test_setup_server_exception_in_driver(self):
        self.setup_server_raise_exception(detail_data_proper=True)

    @ddt.data({},
              {'detail_data': 'fake'},
              {'detail_data': {'server_details': 'fake'}},
              {'detail_data': {'server_details': {'fake': 'fake'}}},
              {'detail_data': {
                  'server_details': {'fake': 'fake', 'fake2': 'fake2'}}},)
    def test_setup_server_exception_in_cleanup_after_error(self, data):

        def get_server_details_from_data(data):
            d = data.get('detail_data')
            if not isinstance(d, dict):
                return {}
            d = d.get('server_details')
            if not isinstance(d, dict):
                return {}
            return d

        share_server = {'id': 'fake', 'share_network_id': 'fake'}
        details = get_server_details_from_data(data)
        exc_mock = mock.Mock(side_effect=exception.ManilaException(**data))
        details_mock = mock.Mock(side_effect=exception.ManilaException())
        self.mock_object(self.share_manager.db, 'share_network_get', exc_mock)
        self.mock_object(self.share_manager.db,
                         'share_server_backend_details_set', details_mock)
        self.mock_object(self.share_manager.db, 'share_server_update')
        self.mock_object(self.share_manager.driver, 'deallocate_network')
        self.mock_object(manager.LOG, 'debug')
        self.mock_object(manager.LOG, 'warning')

        self.assertRaises(
            exception.ManilaException,
            self.share_manager._setup_server,
            self.context,
            share_server,
        )

        self.assertTrue(self.share_manager.db.share_network_get.called)
        if details:
            self.assertEqual(len(details), details_mock.call_count)
            expected = [mock.call(mock.ANY, share_server['id'], {k: v})
                        for k, v in details.items()]
            self.assertEqual(expected, details_mock.call_args_list)
        self.share_manager.db.share_server_update.assert_called_once_with(
            self.context,
            share_server['id'],
            {'status': constants.STATUS_ERROR})
        self.share_manager.driver.deallocate_network.assert_called_once_with(
            self.context, share_server['id']
        )
        self.assertFalse(manager.LOG.warning.called)
        if get_server_details_from_data(data):
            self.assertTrue(manager.LOG.debug.called)

    def test_ensure_share_instance_has_pool_with_only_host(self):
        fake_share = {
            'status': constants.STATUS_AVAILABLE, 'host': 'host1', 'id': 1}

        host = self.share_manager._ensure_share_instance_has_pool(
            context.get_admin_context(), fake_share)

        self.assertIsNone(host)

    def test_ensure_share_instance_has_pool_with_full_pool_name(self):
        fake_share = {'host': 'host1#pool0', 'id': 1,
                      'status': constants.STATUS_AVAILABLE}
        fake_share_expected_value = 'pool0'

        host = self.share_manager._ensure_share_instance_has_pool(
            context.get_admin_context(), fake_share)

        self.assertEqual(fake_share_expected_value, host)

    def test_ensure_share_instance_has_pool_unable_to_fetch_share(self):
        fake_share = {'host': 'host@backend', 'id': 1,
                      'status': constants.STATUS_AVAILABLE}
        with mock.patch.object(self.share_manager.driver, 'get_pool',
                               side_effect=Exception):
            with mock.patch.object(manager, 'LOG') as mock_LOG:
                self.share_manager._ensure_share_instance_has_pool(
                    context.get_admin_context(), fake_share)
                self.assertEqual(1, mock_LOG.error.call_count)

    def test__form_server_setup_info(self):
        def fake_network_allocations_get_for_share_server(*args, **kwargs):
            if kwargs.get('label') != 'admin':
                return ['foo', 'bar']
            return ['admin-foo', 'admin-bar']

        self.mock_object(
            self.share_manager.db, 'network_allocations_get_for_share_server',
            mock.Mock(
                side_effect=fake_network_allocations_get_for_share_server))
        fake_share_server = dict(
            id='fake_share_server_id', backend_details=dict(foo='bar'))
        fake_share_network = dict(
            segmentation_id='fake_segmentation_id',
            cidr='fake_cidr',
            neutron_net_id='fake_neutron_net_id',
            neutron_subnet_id='fake_neutron_subnet_id',
            nova_net_id='fake_nova_net_id',
            security_services='fake_security_services',
            network_type='fake_network_type')
        expected = dict(
            server_id=fake_share_server['id'],
            segmentation_id=fake_share_network['segmentation_id'],
            cidr=fake_share_network['cidr'],
            neutron_net_id=fake_share_network['neutron_net_id'],
            neutron_subnet_id=fake_share_network['neutron_subnet_id'],
            nova_net_id=fake_share_network['nova_net_id'],
            security_services=fake_share_network['security_services'],
            network_allocations=(
                fake_network_allocations_get_for_share_server()),
            admin_network_allocations=(
                fake_network_allocations_get_for_share_server(label='admin')),
            backend_details=fake_share_server['backend_details'],
            network_type=fake_share_network['network_type'])

        network_info = self.share_manager._form_server_setup_info(
            self.context, fake_share_server, fake_share_network)

        self.assertEqual(expected, network_info)
        self.share_manager.db.network_allocations_get_for_share_server.\
            assert_has_calls([
                mock.call(self.context, fake_share_server['id'], label='user'),
                mock.call(self.context, fake_share_server['id'], label='admin')
            ])

    @ddt.data(
        {'network_info': {'network_type': 'vlan', 'segmentation_id': '100'}},
        {'network_info': {'network_type': 'vlan', 'segmentation_id': '1'}},
        {'network_info': {'network_type': 'vlan', 'segmentation_id': '4094'}},
        {'network_info': {'network_type': 'vxlan', 'segmentation_id': '100'}},
        {'network_info': {'network_type': 'vxlan', 'segmentation_id': '1'}},
        {'network_info': {'network_type': 'vxlan',
                          'segmentation_id': '16777215'}},
        {'network_info': {'network_type': 'gre', 'segmentation_id': '100'}},
        {'network_info': {'network_type': 'gre', 'segmentation_id': '1'}},
        {'network_info': {'network_type': 'gre',
                          'segmentation_id': '4294967295'}},
        {'network_info': {'network_type': 'flat', 'segmentation_id': None}},
        {'network_info': {'network_type': 'flat', 'segmentation_id': 0}},
        {'network_info': {'network_type': None, 'segmentation_id': None}},
        {'network_info': {'network_type': None, 'segmentation_id': 0}})
    @ddt.unpack
    def test_validate_segmentation_id_with_valid_values(self, network_info):
        self.share_manager._validate_segmentation_id(network_info)

    @ddt.data(
        {'network_info': {'network_type': 'vlan', 'segmentation_id': None}},
        {'network_info': {'network_type': 'vlan', 'segmentation_id': -1}},
        {'network_info': {'network_type': 'vlan', 'segmentation_id': 0}},
        {'network_info': {'network_type': 'vlan', 'segmentation_id': '4095'}},
        {'network_info': {'network_type': 'vxlan', 'segmentation_id': None}},
        {'network_info': {'network_type': 'vxlan', 'segmentation_id': 0}},
        {'network_info': {'network_type': 'vxlan',
                          'segmentation_id': '16777216'}},
        {'network_info': {'network_type': 'gre', 'segmentation_id': None}},
        {'network_info': {'network_type': 'gre', 'segmentation_id': 0}},
        {'network_info': {'network_type': 'gre',
                          'segmentation_id': '4294967296'}},
        {'network_info': {'network_type': 'flat', 'segmentation_id': '1000'}},
        {'network_info': {'network_type': None, 'segmentation_id': '1000'}})
    @ddt.unpack
    def test_validate_segmentation_id_with_invalid_values(self, network_info):
        self.assertRaises(exception.NetworkBadConfigurationException,
                          self.share_manager._validate_segmentation_id,
                          network_info)

    @ddt.data(5, 70)
    def test_verify_server_cleanup_interval_invalid_cases(self, val):
        data = dict(DEFAULT=dict(unused_share_server_cleanup_interval=val))
        with test_utils.create_temp_config_with_opts(data):
            self.assertRaises(exception.InvalidParameterValue,
                              manager.ShareManager)

    @ddt.data(10, 36, 60)
    def test_verify_server_cleanup_interval_valid_cases(self, val):
        data = dict(DEFAULT=dict(unused_share_server_cleanup_interval=val))
        with test_utils.create_temp_config_with_opts(data):
            manager.ShareManager()

    @mock.patch.object(db, 'share_server_get_all_unused_deletable',
                       mock.Mock())
    @mock.patch.object(manager.ShareManager, 'delete_share_server',
                       mock.Mock())
    def test_delete_free_share_servers_cleanup_disabled(self):
        data = dict(DEFAULT=dict(automatic_share_server_cleanup=False))
        with test_utils.create_temp_config_with_opts(data):
            share_manager = manager.ShareManager()
            share_manager.driver.initialized = True
            share_manager.delete_free_share_servers(self.context)
            self.assertFalse(db.share_server_get_all_unused_deletable.called)

    @mock.patch.object(db, 'share_server_get_all_unused_deletable',
                       mock.Mock())
    @mock.patch.object(manager.ShareManager, 'delete_share_server',
                       mock.Mock())
    def test_delete_free_share_servers_driver_handles_ss_disabled(self):
        data = dict(DEFAULT=dict(driver_handles_share_servers=False))
        with test_utils.create_temp_config_with_opts(data):
            share_manager = manager.ShareManager()
            share_manager.driver.initialized = True
            share_manager.delete_free_share_servers(self.context)
            self.assertFalse(db.share_server_get_all_unused_deletable.called)
            self.assertFalse(share_manager.delete_share_server.called)

    @mock.patch.object(db, 'share_server_get_all_unused_deletable',
                       mock.Mock(return_value=['server1', ]))
    @mock.patch.object(manager.ShareManager, 'delete_share_server',
                       mock.Mock())
    @mock.patch.object(timeutils, 'utcnow', mock.Mock(
                       return_value=datetime.timedelta(minutes=20)))
    def test_delete_free_share_servers(self):
        self.share_manager.delete_free_share_servers(self.context)

        db.share_server_get_all_unused_deletable.assert_called_once_with(
            self.context,
            self.share_manager.host,
            datetime.timedelta(minutes=10))
        self.share_manager.delete_share_server.assert_called_once_with(
            self.context,
            'server1')
        timeutils.utcnow.assert_called_once_with()

    def test_extend_share_invalid(self):
        share = db_utils.create_share()
        share_id = share['id']
        reservations = {}

        self.mock_object(self.share_manager, 'driver')
        self.mock_object(self.share_manager.db, 'share_update')
        self.mock_object(quota.QUOTAS, 'rollback')
        self.mock_object(self.share_manager.driver, 'extend_share',
                         mock.Mock(side_effect=Exception('fake')))

        self.assertRaises(
            exception.ShareExtendingError,
            self.share_manager.extend_share, self.context, share_id, 123, {})

        quota.QUOTAS.rollback.assert_called_once_with(
            mock.ANY,
            reservations,
            project_id=six.text_type(share['project_id']),
            user_id=six.text_type(share['user_id'])
        )

    def test_extend_share(self):
        share = db_utils.create_share()
        share_id = share['id']
        new_size = 123
        shr_update = {
            'size': int(new_size),
            'status': constants.STATUS_AVAILABLE
        }
        reservations = {}
        fake_share_server = 'fake'
        manager = self.share_manager
        self.mock_object(manager, 'driver')
        self.mock_object(manager.db, 'share_get',
                         mock.Mock(return_value=share))
        self.mock_object(manager.db, 'share_update',
                         mock.Mock(return_value=share))
        self.mock_object(quota.QUOTAS, 'commit')
        self.mock_object(manager.driver, 'extend_share')
        self.mock_object(manager, '_get_share_server',
                         mock.Mock(return_value=fake_share_server))

        self.share_manager.extend_share(self.context, share_id,
                                        new_size, reservations)

        self.assertTrue(manager._get_share_server.called)
        manager.driver.extend_share.assert_called_once_with(
            utils.IsAMatcher(models.ShareInstance),
            new_size, share_server=fake_share_server
        )
        quota.QUOTAS.commit.assert_called_once_with(
            mock.ANY, reservations, project_id=share['project_id'],
            user_id=share['user_id'])
        manager.db.share_update.assert_called_once_with(
            mock.ANY, share_id, shr_update
        )
def test_shrink_share_quota_error(self):
size = 5
new_size = 1
share = db_utils.create_share(size=size)
share_id = share['id']
self.mock_object(self.share_manager.db, 'share_update')
self.mock_object(quota.QUOTAS, 'reserve',
mock.Mock(side_effect=Exception('fake')))
self.assertRaises(
exception.ShareShrinkingError,
self.share_manager.shrink_share, self.context, share_id, new_size)
quota.QUOTAS.reserve.assert_called_with(
mock.ANY,
project_id=six.text_type(share['project_id']),
user_id=six.text_type(share['user_id']),
gigabytes=new_size - size
)
self.assertTrue(self.share_manager.db.share_update.called)
@ddt.data({'exc': exception.InvalidShare('fake'),
'status': constants.STATUS_SHRINKING_ERROR},
{'exc': exception.ShareShrinkingPossibleDataLoss("fake"),
'status': constants.STATUS_SHRINKING_POSSIBLE_DATA_LOSS_ERROR})
@ddt.unpack
def test_shrink_share_invalid(self, exc, status):
share = db_utils.create_share()
new_size = 1
share_id = share['id']
size_decrease = int(share['size']) - new_size
self.mock_object(self.share_manager, 'driver')
self.mock_object(self.share_manager.db, 'share_update')
self.mock_object(self.share_manager.db, 'share_get',
mock.Mock(return_value=share))
self.mock_object(quota.QUOTAS, 'reserve')
self.mock_object(quota.QUOTAS, 'rollback')
self.mock_object(self.share_manager.driver, 'shrink_share',
mock.Mock(side_effect=exc))
self.assertRaises(
exception.ShareShrinkingError,
self.share_manager.shrink_share, self.context, share_id, new_size)
self.share_manager.driver.shrink_share.assert_called_once_with(
utils.IsAMatcher(models.ShareInstance),
new_size, share_server=None
)
self.share_manager.db.share_update.assert_called_once_with(
mock.ANY, share_id, {'status': status}
)
quota.QUOTAS.reserve.assert_called_once_with(
mock.ANY, gigabytes=-size_decrease, project_id=share['project_id'],
user_id=share['user_id']
)
quota.QUOTAS.rollback.assert_called_once_with(
mock.ANY, mock.ANY, project_id=share['project_id'],
user_id=share['user_id']
)
self.assertTrue(self.share_manager.db.share_get.called)

def test_shrink_share(self):
share = db_utils.create_share()
share_id = share['id']
new_size = 123
shr_update = {
'size': int(new_size),
'status': constants.STATUS_AVAILABLE
}
fake_share_server = 'fake'
size_decrease = int(share['size']) - new_size
manager = self.share_manager
self.mock_object(manager, 'driver')
self.mock_object(manager.db, 'share_get',
mock.Mock(return_value=share))
self.mock_object(manager.db, 'share_update',
mock.Mock(return_value=share))
self.mock_object(quota.QUOTAS, 'commit')
self.mock_object(quota.QUOTAS, 'reserve')
self.mock_object(manager.driver, 'shrink_share')
self.mock_object(manager, '_get_share_server',
mock.Mock(return_value=fake_share_server))
self.share_manager.shrink_share(self.context, share_id, new_size)
self.assertTrue(manager._get_share_server.called)
manager.driver.shrink_share.assert_called_once_with(
utils.IsAMatcher(models.ShareInstance),
new_size, share_server=fake_share_server
)
quota.QUOTAS.reserve.assert_called_once_with(
mock.ANY, gigabytes=-size_decrease, project_id=share['project_id'],
user_id=share['user_id']
)
quota.QUOTAS.commit.assert_called_once_with(
mock.ANY, mock.ANY, project_id=share['project_id'],
user_id=share['user_id']
)
manager.db.share_update.assert_called_once_with(
mock.ANY, share_id, shr_update
)

def test_report_driver_status_driver_handles_ss_false(self):
fake_stats = {'field': 'val'}
fake_pool = {'name': 'pool1'}
self.share_manager.last_capabilities = {'field': 'old_val'}
self.mock_object(self.share_manager, 'driver', mock.Mock())
driver = self.share_manager.driver
driver.get_share_stats = mock.Mock(return_value=fake_stats)
self.mock_object(db, 'share_server_get_all_by_host', mock.Mock())
driver.driver_handles_share_servers = False
driver.get_share_server_pools = mock.Mock(return_value=fake_pool)
self.share_manager._report_driver_status(self.context)
driver.get_share_stats.assert_called_once_with(
refresh=True)
self.assertFalse(db.share_server_get_all_by_host.called)
self.assertFalse(driver.get_share_server_pools.called)
self.assertEqual(fake_stats, self.share_manager.last_capabilities)

def test_report_driver_status_driver_handles_ss(self):
fake_stats = {'field': 'val'}
fake_ss = {'id': '1234'}
fake_pool = {'name': 'pool1'}
self.mock_object(self.share_manager, 'driver', mock.Mock())
driver = self.share_manager.driver
driver.get_share_stats = mock.Mock(return_value=fake_stats)
self.mock_object(db, 'share_server_get_all_by_host', mock.Mock(
return_value=[fake_ss]))
driver.driver_handles_share_servers = True
driver.get_share_server_pools = mock.Mock(return_value=fake_pool)
self.share_manager._report_driver_status(self.context)
driver.get_share_stats.assert_called_once_with(refresh=True)
db.share_server_get_all_by_host.assert_called_once_with(
self.context,
self.share_manager.host)
driver.get_share_server_pools.assert_called_once_with(fake_ss)
expected_stats = {
'field': 'val',
'server_pools_mapping': {
'1234': fake_pool},
}
self.assertEqual(expected_stats, self.share_manager.last_capabilities)

def test_report_driver_status_empty_share_stats(self):
old_capabilities = {'field': 'old_val'}
fake_pool = {'name': 'pool1'}
self.share_manager.last_capabilities = old_capabilities
self.mock_object(self.share_manager, 'driver', mock.Mock())
driver = self.share_manager.driver
driver.get_share_stats = mock.Mock(return_value={})
self.mock_object(db, 'share_server_get_all_by_host', mock.Mock())
driver.driver_handles_share_servers = True
driver.get_share_server_pools = mock.Mock(return_value=fake_pool)
self.share_manager._report_driver_status(self.context)
driver.get_share_stats.assert_called_once_with(refresh=True)
self.assertFalse(db.share_server_get_all_by_host.called)
self.assertFalse(driver.get_share_server_pools.called)
self.assertEqual(old_capabilities,
self.share_manager.last_capabilities)

def test_create_consistency_group(self):
fake_cg = {'id': 'fake_id'}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'create_consistency_group',
mock.Mock(return_value=None))
self.share_manager.create_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})

def test_create_cg_with_share_network_driver_not_handles_servers(self):
manager.CONF.set_default('driver_handles_share_servers', False)
self.mock_object(
self.share_manager.driver.configuration, 'safe_get',
mock.Mock(return_value=False))
cg_id = 'fake_cg_id'
share_network_id = 'fake_sn'
fake_cg = {'id': 'fake_id', 'share_network_id': share_network_id}
self.mock_object(
self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update')
self.assertRaises(
exception.ManilaException,
self.share_manager.create_consistency_group, self.context, cg_id)
self.share_manager.db.consistency_group_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), cg_id)
self.share_manager.db.consistency_group_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), cg_id,
{'status': constants.STATUS_ERROR})

def test_create_cg_with_share_network_driver_handles_servers(self):
manager.CONF.set_default('driver_handles_share_servers', True)
self.mock_object(
self.share_manager.driver.configuration, 'safe_get',
mock.Mock(return_value=True))
share_network_id = 'fake_sn'
fake_cg = {'id': 'fake_id', 'share_network_id': share_network_id,
'host': "fake_host"}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager, '_provide_share_server_for_cg',
mock.Mock(return_value=({}, fake_cg)))
self.mock_object(self.share_manager.driver,
'create_consistency_group',
mock.Mock(return_value=None))
self.share_manager.create_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})

def test_create_consistency_group_with_update(self):
fake_cg = {'id': 'fake_id'}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'create_consistency_group',
mock.Mock(return_value={'foo': 'bar'}))
self.share_manager.create_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_any_call(mock.ANY, 'fake_id', {'foo': 'bar'})
self.share_manager.db.consistency_group_update.\
assert_any_call(mock.ANY, 'fake_id',
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})

def test_create_consistency_group_with_error(self):
fake_cg = {'id': 'fake_id'}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'create_consistency_group',
mock.Mock(side_effect=exception.Error))
self.assertRaises(exception.Error,
self.share_manager.create_consistency_group,
self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_ERROR})

def test_create_consistency_group_from_cgsnapshot(self):
fake_cg = {'id': 'fake_id', 'source_cgsnapshot_id': 'fake_snap_id',
'shares': [], 'share_server_id': 'fake_ss_id'}
fake_ss = {'id': 'fake_ss_id', 'share_network_id': 'fake_sn'}
fake_snap = {'id': 'fake_snap_id', 'cgsnapshot_members': [],
'consistency_group': {'share_server_id': fake_ss['id']}}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(
return_value=fake_ss))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'create_consistency_group_from_cgsnapshot',
mock.Mock(return_value=(None, None)))
self.share_manager.create_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})
self.share_manager.db.share_server_get.assert_called_once_with(
    mock.ANY, 'fake_ss_id')
self.share_manager.driver.create_consistency_group_from_cgsnapshot.\
assert_called_once_with(
mock.ANY, fake_cg, fake_snap, share_server=fake_ss)

def test_create_cg_cgsnapshot_share_network_driver_not_handles_servers(
self):
manager.CONF.set_default('driver_handles_share_servers', False)
self.mock_object(
self.share_manager.driver.configuration, 'safe_get',
mock.Mock(return_value=False))
cg_id = 'fake_cg_id'
share_network_id = 'fake_sn'
fake_cg = {'id': 'fake_id', 'source_cgsnapshot_id': 'fake_snap_id',
'shares': [], 'share_network_id': share_network_id,
'host': "fake_host"}
self.mock_object(
self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
fake_snap = {'id': 'fake_snap_id', 'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'consistency_group_update')
self.assertRaises(exception.ManilaException,
self.share_manager.create_consistency_group,
self.context, cg_id)
self.share_manager.db.consistency_group_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), cg_id)
self.share_manager.db.consistency_group_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), cg_id,
{'status': constants.STATUS_ERROR})

def test_create_cg_from_cgsnapshot_share_network_driver_handles_servers(
self):
manager.CONF.set_default('driver_handles_share_servers', True)
self.mock_object(self.share_manager.driver.configuration, 'safe_get',
mock.Mock(return_value=True))
share_network_id = 'fake_sn'
fake_cg = {'id': 'fake_id', 'source_cgsnapshot_id': 'fake_snap_id',
'shares': [], 'share_network_id': share_network_id}
fake_snap = {'id': 'fake_snap_id', 'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager, '_provide_share_server_for_cg',
mock.Mock(return_value=({}, fake_cg)))
self.mock_object(self.share_manager.driver,
'create_consistency_group_from_cgsnapshot',
mock.Mock(return_value=(None, None)))
self.share_manager.create_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})

def test_create_consistency_group_from_cgsnapshot_with_update(self):
fake_cg = {'id': 'fake_id', 'source_cgsnapshot_id': 'fake_snap_id',
'shares': []}
fake_snap = {'id': 'fake_snap_id', 'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'create_consistency_group_from_cgsnapshot',
mock.Mock(return_value=({'foo': 'bar'}, None)))
self.share_manager.create_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_any_call(mock.ANY, 'fake_id', {'foo': 'bar'})
self.share_manager.db.consistency_group_update.\
assert_any_call(mock.ANY, 'fake_id',
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})

def test_create_consistency_group_from_cgsnapshot_with_share_update(self):
fake_share = {'id': 'fake_share_id'}
fake_export_locations = ['my_export_location']
fake_cg = {'id': 'fake_id', 'source_cgsnapshot_id': 'fake_snap_id',
'shares': [fake_share]}
fake_snap = {'id': 'fake_snap_id', 'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'consistency_group_update')
self.mock_object(self.share_manager.db, 'share_instance_update')
self.mock_object(self.share_manager.db,
'share_export_locations_update')
fake_share_update_list = [{'id': fake_share['id'],
'foo': 'bar',
'export_locations': fake_export_locations}]
self.mock_object(self.share_manager.driver,
'create_consistency_group_from_cgsnapshot',
mock.Mock(
return_value=(None, fake_share_update_list)))
self.share_manager.create_consistency_group(self.context, "fake_id")
self.share_manager.db.share_instance_update.\
assert_any_call(mock.ANY, 'fake_share_id', {'foo': 'bar'})
self.share_manager.db.share_export_locations_update.\
assert_any_call(mock.ANY, 'fake_share_id', fake_export_locations)
self.share_manager.db.consistency_group_update.\
assert_any_call(mock.ANY, 'fake_id',
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})

def test_create_consistency_group_from_cgsnapshot_with_error(self):
fake_cg = {'id': 'fake_id', 'source_cgsnapshot_id': 'fake_snap_id',
'shares': []}
fake_snap = {'id': 'fake_snap_id', 'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db,
'share_instances_get_all_by_consistency_group_id',
mock.Mock(return_value=[]))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'create_consistency_group_from_cgsnapshot',
mock.Mock(side_effect=exception.Error))
self.assertRaises(exception.Error,
self.share_manager.create_consistency_group,
self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_ERROR})

def test_create_consistency_group_from_cgsnapshot_with_share_error(self):
fake_share = {'id': 'fake_share_id'}
fake_cg = {'id': 'fake_id', 'source_cgsnapshot_id': 'fake_snap_id',
'shares': [fake_share]}
fake_snap = {'id': 'fake_snap_id', 'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db,
'share_instances_get_all_by_consistency_group_id',
mock.Mock(return_value=[fake_share]))
self.mock_object(self.share_manager.db, 'consistency_group_update')
self.mock_object(self.share_manager.db, 'share_instance_update')
self.mock_object(self.share_manager.driver,
'create_consistency_group_from_cgsnapshot',
mock.Mock(side_effect=exception.Error))
self.assertRaises(exception.Error,
self.share_manager.create_consistency_group,
self.context, "fake_id")
self.share_manager.db.share_instance_update.\
assert_any_call(mock.ANY, 'fake_share_id',
{'status': constants.STATUS_ERROR})
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_ERROR})

def test_delete_consistency_group(self):
fake_cg = {'id': 'fake_id'}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_destroy',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'delete_consistency_group',
mock.Mock(return_value=None))
self.share_manager.delete_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_destroy.\
assert_called_once_with(mock.ANY, 'fake_id')

def test_delete_consistency_group_with_update(self):
fake_cg = {'id': 'fake_id'}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_destroy',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'delete_consistency_group',
mock.Mock(return_value={'foo': 'bar'}))
self.share_manager.delete_consistency_group(self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id', {'foo': 'bar'})
self.share_manager.db.consistency_group_destroy.\
assert_called_once_with(mock.ANY, 'fake_id')

def test_delete_consistency_group_with_error(self):
fake_cg = {'id': 'fake_id'}
self.mock_object(self.share_manager.db, 'consistency_group_get',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.db, 'consistency_group_update',
mock.Mock(return_value=fake_cg))
self.mock_object(self.share_manager.driver,
'delete_consistency_group',
mock.Mock(side_effect=exception.Error))
self.assertRaises(exception.Error,
self.share_manager.delete_consistency_group,
self.context, "fake_id")
self.share_manager.db.consistency_group_update.\
assert_called_once_with(mock.ANY, 'fake_id',
{'status': constants.STATUS_ERROR})

def test_create_cgsnapshot(self):
fake_snap = {'id': 'fake_snap_id', 'consistency_group': {},
'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'cgsnapshot_update',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.driver,
'create_cgsnapshot',
mock.Mock(return_value=(None, None)))
self.share_manager.create_cgsnapshot(self.context, fake_snap['id'])
self.share_manager.db.cgsnapshot_update.\
assert_called_once_with(mock.ANY, fake_snap['id'],
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})

def test_create_cgsnapshot_with_update(self):
fake_snap = {'id': 'fake_snap_id', 'consistency_group': {},
'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'cgsnapshot_update',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.driver,
'create_cgsnapshot',
mock.Mock(return_value=({'foo': 'bar'}, None)))
self.share_manager.create_cgsnapshot(self.context, fake_snap['id'])
self.share_manager.db.cgsnapshot_update.\
assert_any_call(mock.ANY, 'fake_snap_id', {'foo': 'bar'})
self.share_manager.db.cgsnapshot_update.assert_any_call(
mock.ANY, fake_snap['id'],
{'status': constants.STATUS_AVAILABLE, 'created_at': mock.ANY})

def test_create_cgsnapshot_with_member_update(self):
fake_member = {
'id': 'fake_member_id',
'share_instance_id': 'blah',
}
fake_member_update = {
'id': 'fake_member_id',
'foo': 'bar'
}
fake_snap = {'id': 'fake_snap_id', 'consistency_group': {},
'cgsnapshot_members': [fake_member]}
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'cgsnapshot_update',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'cgsnapshot_member_update')
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(return_value={'id': 'blah'}))
self.mock_object(self.share_manager.driver, 'create_cgsnapshot',
mock.Mock(return_value=(None, [fake_member_update])))
self.share_manager.create_cgsnapshot(self.context, fake_snap['id'])
self.share_manager.db.cgsnapshot_update.assert_any_call(
mock.ANY, fake_snap['id'],
{'cgsnapshot_members': [fake_member_update]})
self.share_manager.db.cgsnapshot_update.\
assert_any_call(mock.ANY, fake_snap['id'],
{'status': constants.STATUS_AVAILABLE,
'created_at': mock.ANY})
self.assertTrue(self.share_manager.db.cgsnapshot_member_update.called)

def test_create_cgsnapshot_with_error(self):
fake_snap = {'id': 'fake_snap_id', 'consistency_group': {},
'cgsnapshot_members': []}
self.mock_object(self.share_manager.db, 'cgsnapshot_get',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.db, 'cgsnapshot_update',
mock.Mock(return_value=fake_snap))
self.mock_object(self.share_manager.driver,
'create_cgsnapshot',
mock.Mock(side_effect=exception.Error))
self.assertRaises(exception.Error,
self.share_manager.create_cgsnapshot,
self.context, fake_snap['id'])
self.share_manager.db.cgsnapshot_update.\
assert_called_once_with(mock.ANY, fake_snap['id'],
{'status': constants.STATUS_ERROR})

def test_migration_get_info(self):
share_instance = {'share_server_id': 'fake_server_id'}
share_instance_id = 'fake_id'
share_server = 'fake_share_server'
migration_info = 'fake_info'
# mocks
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(return_value=share_instance))
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(return_value=share_server))
self.mock_object(self.share_manager.driver, 'migration_get_info',
mock.Mock(return_value=migration_info))
# run
result = self.share_manager.migration_get_info(
self.context, share_instance_id)
# asserts
self.assertEqual(migration_info, result)
self.share_manager.db.share_instance_get.assert_called_once_with(
self.context, share_instance_id, with_share_data=True)
self.share_manager.driver.migration_get_info.assert_called_once_with(
self.context, share_instance, share_server)

def test_migration_get_driver_info(self):
share_instance = {'share_server_id': 'fake_server_id'}
share_instance_id = 'fake-id'
share_server = 'fake-share-server'
migration_info = 'fake_info'
# mocks
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(return_value=share_instance))
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(return_value=share_server))
self.mock_object(self.share_manager.driver,
'migration_get_driver_info',
mock.Mock(return_value=migration_info))
# run
result = self.share_manager.migration_get_driver_info(
self.context, share_instance_id)
# asserts
self.assertEqual(migration_info, result)
self.share_manager.db.share_instance_get.assert_called_once_with(
self.context, share_instance_id, with_share_data=True)
self.share_manager.driver.migration_get_driver_info.\
assert_called_once_with(self.context, share_instance, share_server)

@ddt.data((True, 'fake_model_update'), exception.ManilaException())
def test_migration_start(self, exc):
server = 'fake_share_server'
instance = db_utils.create_share_instance(
share_id='fake_id',
status=constants.STATUS_AVAILABLE,
share_server_id='fake_server_id')
share = db_utils.create_share(id='fake_id', instances=[instance])
host = 'fake_host'
driver_migration_info = 'driver_fake_info'
# mocks
self.mock_object(self.share_manager.db, 'share_get',
mock.Mock(return_value=share))
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(return_value=instance))
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(return_value=server))
self.mock_object(self.share_manager.db, 'share_update')
self.mock_object(self.share_manager.db, 'share_instance_update')
self.mock_object(rpcapi.ShareAPI, 'migration_get_driver_info',
mock.Mock(return_value=driver_migration_info))
if isinstance(exc, exception.ManilaException):
self.mock_object(self.share_manager.driver, 'migration_start',
mock.Mock(side_effect=exc))
self.mock_object(self.share_manager, '_migration_start_generic',
mock.Mock(side_effect=Exception('fake')))
self.mock_object(manager.LOG, 'exception')
else:
self.mock_object(self.share_manager.driver, 'migration_start',
mock.Mock(return_value=exc))
# run
if isinstance(exc, exception.ManilaException):
self.assertRaises(exception.ShareMigrationFailed,
self.share_manager.migration_start,
self.context, 'fake_id', host, False, False)
else:
self.share_manager.migration_start(
self.context, 'fake_id', host, False, False)
# asserts
self.share_manager.db.share_get.assert_called_once_with(
self.context, share['id'])
self.share_manager.db.share_instance_get.assert_called_once_with(
self.context, instance['id'], with_share_data=True)
self.share_manager.db.share_server_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
instance['share_server_id'])
share_update_calls = [
mock.call(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_IN_PROGRESS}),
mock.call(
self.context, share['id'],
{'task_state': (
constants.TASK_STATE_MIGRATION_DRIVER_IN_PROGRESS)})
]
share_instance_update_calls = [
mock.call(self.context, instance['id'],
{'status': constants.STATUS_MIGRATING})
]
if isinstance(exc, exception.ManilaException):
share_update_calls.append(mock.call(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_ERROR}))
share_instance_update_calls.append(
mock.call(self.context, instance['id'],
{'status': constants.STATUS_AVAILABLE}))
self.share_manager._migration_start_generic.\
assert_called_once_with(self.context, share, instance, host,
False)
self.assertTrue(manager.LOG.exception.called)
else:
share_update_calls.append(mock.call(
self.context, share['id'],
{'task_state':
constants.TASK_STATE_MIGRATION_DRIVER_PHASE1_DONE}))
share_instance_update_calls.append(
mock.call(self.context, instance['id'], 'fake_model_update'))
self.share_manager.db.share_update.assert_has_calls(share_update_calls)
self.share_manager.db.share_instance_update.assert_has_calls(
share_instance_update_calls)
rpcapi.ShareAPI.migration_get_driver_info.assert_called_once_with(
self.context, instance)
self.share_manager.driver.migration_start.assert_called_once_with(
self.context, instance, server, host, driver_migration_info, False)

@ddt.data(None, Exception('fake'))
def test__migration_start_generic(self, exc):
instance = db_utils.create_share_instance(
share_id='fake_id',
status=constants.STATUS_AVAILABLE,
share_server_id='fake_server_id')
new_instance = db_utils.create_share_instance(
share_id='new_fake_id',
status=constants.STATUS_AVAILABLE)
share = db_utils.create_share(id='fake_id', instances=[instance])
server = 'share_server'
src_migration_info = 'src_fake_info'
dest_migration_info = 'dest_fake_info'
# mocks
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(return_value=server))
self.mock_object(self.share_manager.db, 'share_instance_update',
mock.Mock(return_value=server))
self.mock_object(migration_api.ShareMigrationHelper,
'change_to_read_only')
if exc is None:
self.mock_object(migration_api.ShareMigrationHelper,
'create_instance_and_wait',
mock.Mock(return_value=new_instance))
self.mock_object(self.share_manager.driver, 'migration_get_info',
mock.Mock(return_value=src_migration_info))
self.mock_object(rpcapi.ShareAPI, 'migration_get_info',
mock.Mock(return_value=dest_migration_info))
self.mock_object(data_rpc.DataAPI, 'migration_start',
mock.Mock(side_effect=Exception('fake')))
self.mock_object(migration_api.ShareMigrationHelper,
'cleanup_new_instance')
else:
self.mock_object(migration_api.ShareMigrationHelper,
'create_instance_and_wait',
mock.Mock(side_effect=exc))
self.mock_object(migration_api.ShareMigrationHelper,
'cleanup_access_rules')
# run
self.assertRaises(
exception.ShareMigrationFailed,
self.share_manager._migration_start_generic,
self.context, share, instance, 'fake_host', False)
# asserts
self.share_manager.db.share_server_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
instance['share_server_id'])
migration_api.ShareMigrationHelper.change_to_read_only.\
assert_called_once_with(instance, server, True,
self.share_manager.driver)
migration_api.ShareMigrationHelper.create_instance_and_wait.\
assert_called_once_with(share, instance, 'fake_host')
migration_api.ShareMigrationHelper.\
cleanup_access_rules.assert_called_once_with(
instance, server, self.share_manager.driver)
if exc is None:
self.share_manager.db.share_instance_update.\
assert_called_once_with(
self.context, new_instance['id'],
{'status': constants.STATUS_MIGRATING_TO})
self.share_manager.driver.migration_get_info.\
assert_called_once_with(self.context, instance, server)
rpcapi.ShareAPI.migration_get_info.assert_called_once_with(
self.context, new_instance)
data_rpc.DataAPI.migration_start.assert_called_once_with(
self.context, share['id'], ['lost+found'], instance['id'],
new_instance['id'], src_migration_info, dest_migration_info,
False)
migration_api.ShareMigrationHelper.\
cleanup_new_instance.assert_called_once_with(new_instance)

@ddt.data('fake_model_update', Exception('fake'))
def test_migration_complete_driver(self, exc):
server = 'fake_server'
model_update = 'fake_model_update'
instance = db_utils.create_share_instance(
share_id='fake_id',
status=constants.STATUS_AVAILABLE,
share_server_id='fake_server_id')
share = db_utils.create_share(
id='fake_id',
instances=[instance],
task_state=constants.TASK_STATE_MIGRATION_DRIVER_PHASE1_DONE)
# mocks
self.mock_object(self.share_manager.db, 'share_get',
mock.Mock(return_value=share))
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(return_value=instance))
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(return_value=server))
self.mock_object(self.share_manager.db, 'share_update')
if isinstance(exc, Exception):
self.mock_object(self.share_manager.driver, 'migration_complete',
mock.Mock(side_effect=exc))
else:
self.mock_object(self.share_manager.driver, 'migration_complete',
mock.Mock(return_value=exc))
self.mock_object(self.share_manager.db, 'share_instance_update')
self.mock_object(rpcapi.ShareAPI, 'migration_get_driver_info',
mock.Mock(return_value='fake_info'))
self.mock_object(manager.LOG, 'exception')
# run
if isinstance(exc, Exception):
self.assertRaises(
exception.ShareMigrationFailed,
self.share_manager.migration_complete,
self.context, 'fake_id', 'fake_ins_id', 'new_fake_ins_id')
else:
self.share_manager.migration_complete(
self.context, 'fake_id', 'fake_ins_id', 'new_fake_ins_id')
# asserts
self.share_manager.db.share_get.assert_called_once_with(
self.context, share['id'])
self.share_manager.db.share_instance_get.assert_called_once_with(
self.context, instance['id'], with_share_data=True)
self.share_manager.db.share_server_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), 'fake_server_id')
self.share_manager.driver.migration_complete.assert_called_once_with(
self.context, instance, server, 'fake_info')
rpcapi.ShareAPI.migration_get_driver_info.assert_called_once_with(
self.context, instance)
if isinstance(exc, Exception):
self.share_manager.db.share_update.assert_called_once_with(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_ERROR})
self.assertTrue(manager.LOG.exception.called)
else:
self.share_manager.db.share_update.assert_called_once_with(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_SUCCESS})
self.share_manager.db.share_instance_update.\
assert_called_once_with(self.context, instance['id'],
model_update)

    @ddt.data(None, Exception('fake'))
def test_migration_complete_generic(self, exc):
share = db_utils.create_share(
id='fake_id',
task_state=constants.TASK_STATE_DATA_COPYING_COMPLETED)
# mocks
self.mock_object(self.share_manager.db, 'share_get',
mock.Mock(return_value=share))
self.mock_object(self.share_manager, '_migration_complete',
mock.Mock(side_effect=exc))
self.mock_object(self.share_manager.db, 'share_update')
self.mock_object(self.share_manager.db, 'share_instance_update')
self.mock_object(manager.LOG, 'exception')
# run
if exc:
self.assertRaises(
exception.ShareMigrationFailed,
self.share_manager.migration_complete,
self.context, 'fake_id', 'fake_ins_id', 'new_fake_ins_id')
else:
self.share_manager.migration_complete(
self.context, 'fake_id', 'fake_ins_id', 'new_fake_ins_id')
# asserts
self.share_manager.db.share_get.assert_called_once_with(
self.context, share['id'])
self.share_manager._migration_complete.assert_called_once_with(
self.context, share, 'fake_ins_id', 'new_fake_ins_id')
if exc:
self.share_manager.db.share_update.assert_called_once_with(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_ERROR})
self.share_manager.db.share_instance_update.\
assert_called_once_with(
self.context, 'fake_ins_id',
{'status': constants.STATUS_AVAILABLE})
self.assertTrue(manager.LOG.exception.called)

    @ddt.data(constants.TASK_STATE_DATA_COPYING_ERROR,
constants.TASK_STATE_DATA_COPYING_CANCELLED,
constants.TASK_STATE_DATA_COPYING_COMPLETED,
'other')
def test__migration_complete_status(self, status):
instance = db_utils.create_share_instance(
share_id='fake_id',
share_server_id='fake_server_id')
new_instance = db_utils.create_share_instance(share_id='fake_id')
share = db_utils.create_share(id='fake_id', task_state=status)
server = 'fake_server'
# mocks
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(side_effect=[instance, new_instance]))
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(return_value=server))
self.mock_object(migration_api.ShareMigrationHelper,
'cleanup_new_instance')
self.mock_object(migration_api.ShareMigrationHelper,
'cleanup_access_rules')
self.mock_object(self.share_manager.db, 'share_instance_update')
self.mock_object(self.share_manager.db, 'share_update')
if status == constants.TASK_STATE_DATA_COPYING_COMPLETED:
self.mock_object(migration_api.ShareMigrationHelper,
'apply_new_access_rules',
mock.Mock(side_effect=Exception('fake')))
self.mock_object(manager.LOG, 'exception')
# run
if status == constants.TASK_STATE_DATA_COPYING_CANCELLED:
self.share_manager._migration_complete(
self.context, share, instance['id'], new_instance['id'])
else:
self.assertRaises(
exception.ShareMigrationFailed,
self.share_manager._migration_complete, self.context, share,
instance['id'], new_instance['id'])
# asserts
self.share_manager.db.share_instance_get.assert_has_calls([
mock.call(self.context, instance['id'], with_share_data=True),
mock.call(self.context, new_instance['id'], with_share_data=True)
])
self.share_manager.db.share_server_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), 'fake_server_id')
if status != 'other':
migration_api.ShareMigrationHelper.cleanup_new_instance.\
assert_called_once_with(new_instance)
migration_api.ShareMigrationHelper.cleanup_access_rules.\
assert_called_once_with(instance, server,
self.share_manager.driver)
        if status == constants.TASK_STATE_DATA_COPYING_CANCELLED:
self.share_manager.db.share_instance_update.\
assert_called_once_with(self.context, instance['id'],
{'status': constants.STATUS_AVAILABLE})
self.share_manager.db.share_update.assert_called_once_with(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_CANCELLED})
if status == constants.TASK_STATE_DATA_COPYING_COMPLETED:
migration_api.ShareMigrationHelper.apply_new_access_rules.\
assert_called_once_with(new_instance)
self.assertTrue(manager.LOG.exception.called)

    def test__migration_complete(self):
instance = db_utils.create_share_instance(
share_id='fake_id',
share_server_id='fake_server_id')
new_instance = db_utils.create_share_instance(share_id='fake_id')
share = db_utils.create_share(
id='fake_id',
task_state=constants.TASK_STATE_DATA_COPYING_COMPLETED)
server = 'fake_server'
# mocks
self.mock_object(self.share_manager.db, 'share_instance_get',
mock.Mock(side_effect=[instance, new_instance]))
self.mock_object(self.share_manager.db, 'share_server_get',
mock.Mock(return_value=server))
self.mock_object(self.share_manager.db, 'share_instance_update')
self.mock_object(self.share_manager.db, 'share_update')
self.mock_object(migration_api.ShareMigrationHelper,
'delete_instance_and_wait')
self.mock_object(migration_api.ShareMigrationHelper,
'apply_new_access_rules')
# run
self.share_manager._migration_complete(
self.context, share, instance['id'], new_instance['id'])
# asserts
self.share_manager.db.share_instance_get.assert_has_calls([
mock.call(self.context, instance['id'], with_share_data=True),
mock.call(self.context, new_instance['id'], with_share_data=True)
])
self.share_manager.db.share_server_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), 'fake_server_id')
self.share_manager.db.share_instance_update.assert_has_calls([
mock.call(self.context, new_instance['id'],
{'status': constants.STATUS_AVAILABLE}),
mock.call(self.context, instance['id'],
{'status': constants.STATUS_INACTIVE})
])
self.share_manager.db.share_update.assert_has_calls([
mock.call(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_COMPLETING}),
mock.call(
self.context, share['id'],
{'task_state': constants.TASK_STATE_MIGRATION_SUCCESS}),
])
migration_api.ShareMigrationHelper.apply_new_access_rules.\
assert_called_once_with(new_instance)
migration_api.ShareMigrationHelper.delete_instance_and_wait.\
assert_called_once_with(instance)

    def test_migration_cancel(self):
server = db_utils.create_share_server()
share = db_utils.create_share(
task_state=constants.TASK_STATE_MIGRATION_DRIVER_IN_PROGRESS,
share_server_id=server['id'])
self.mock_object(db, 'share_get', mock.Mock(return_value=share))
self.mock_object(db, 'share_server_get',
mock.Mock(return_value=server))
self.mock_object(rpcapi.ShareAPI, 'migration_get_driver_info',
mock.Mock(return_value='migration_info'))
self.mock_object(self.share_manager.driver, 'migration_cancel')
self.share_manager.migration_cancel(self.context, share)
rpcapi.ShareAPI.migration_get_driver_info.assert_called_once_with(
self.context, share.instance)
self.share_manager.driver.migration_cancel.assert_called_once_with(
self.context, share.instance, server, 'migration_info')

    def test_migration_cancel_invalid(self):
share = db_utils.create_share()
self.mock_object(db, 'share_get', mock.Mock(return_value=share))
self.assertRaises(
exception.InvalidShare, self.share_manager.migration_cancel,
self.context, share)

    def test_migration_get_progress(self):
server = db_utils.create_share_server()
share = db_utils.create_share(
task_state=constants.TASK_STATE_MIGRATION_DRIVER_IN_PROGRESS,
share_server_id=server['id'])
expected = 'fake_progress'
self.mock_object(db, 'share_get', mock.Mock(return_value=share))
self.mock_object(db, 'share_server_get',
mock.Mock(return_value=server))
self.mock_object(rpcapi.ShareAPI, 'migration_get_driver_info',
mock.Mock(return_value='migration_info'))
self.mock_object(self.share_manager.driver, 'migration_get_progress',
mock.Mock(return_value=expected))
result = self.share_manager.migration_get_progress(self.context, share)
self.assertEqual(expected, result)
rpcapi.ShareAPI.migration_get_driver_info.assert_called_once_with(
self.context, share.instance)
self.share_manager.driver.migration_get_progress.\
assert_called_once_with(
self.context, share.instance, server, 'migration_info')

    def test_migration_get_progress_invalid(self):
share = db_utils.create_share()
self.mock_object(db, 'share_get', mock.Mock(return_value=share))
self.assertRaises(
exception.InvalidShare, self.share_manager.migration_get_progress,
self.context, share)

    def test_manage_snapshot_invalid_driver_mode(self):
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = True
share = db_utils.create_share()
snapshot = db_utils.create_snapshot(share_id=share['id'])
driver_options = {'fake': 'fake'}
self.assertRaises(
exception.InvalidDriverMode,
self.share_manager.manage_snapshot, self.context,
snapshot['id'], driver_options)

    def test_manage_snapshot_invalid_snapshot(self):
fake_share_server = 'fake_share_server'
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = False
mock_get_share_server = self.mock_object(
self.share_manager,
'_get_share_server',
mock.Mock(return_value=fake_share_server))
share = db_utils.create_share()
snapshot = db_utils.create_snapshot(share_id=share['id'])
driver_options = {'fake': 'fake'}
mock_get = self.mock_object(self.share_manager.db,
'share_snapshot_get',
mock.Mock(return_value=snapshot))
self.assertRaises(
exception.InvalidShareSnapshot,
self.share_manager.manage_snapshot, self.context,
snapshot['id'], driver_options)
mock_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['id'])
mock_get_share_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['share'])

    def test_manage_snapshot_driver_exception(self):
CustomException = type('CustomException', (Exception,), {})
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = False
mock_manage = self.mock_object(self.share_manager.driver,
'manage_existing_snapshot',
mock.Mock(side_effect=CustomException))
mock_get_share_server = self.mock_object(self.share_manager,
'_get_share_server',
mock.Mock(return_value=None))
share = db_utils.create_share()
snapshot = db_utils.create_snapshot(share_id=share['id'])
driver_options = {}
mock_get = self.mock_object(self.share_manager.db,
'share_snapshot_get',
mock.Mock(return_value=snapshot))
self.assertRaises(
CustomException,
self.share_manager.manage_snapshot,
self.context, snapshot['id'], driver_options)
mock_manage.assert_called_once_with(mock.ANY, driver_options)
mock_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['id'])
mock_get_share_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['share'])

    @ddt.data(
{'size': 1},
{'size': 2, 'name': 'fake'},
{'size': 3})
def test_manage_snapshot_valid_snapshot(self, driver_data):
mock_get_share_server = self.mock_object(self.share_manager,
'_get_share_server',
mock.Mock(return_value=None))
self.mock_object(self.share_manager.db, 'share_snapshot_update')
self.mock_object(self.share_manager, 'driver')
self.mock_object(self.share_manager, '_update_quota_usages')
self.share_manager.driver.driver_handles_share_servers = False
mock_manage = self.mock_object(
self.share_manager.driver,
"manage_existing_snapshot",
mock.Mock(return_value=driver_data))
size = driver_data['size']
share = db_utils.create_share(size=size)
snapshot = db_utils.create_snapshot(share_id=share['id'], size=size)
snapshot_id = snapshot['id']
driver_options = {}
mock_get = self.mock_object(self.share_manager.db,
'share_snapshot_get',
mock.Mock(return_value=snapshot))
self.share_manager.manage_snapshot(self.context, snapshot_id,
driver_options)
mock_manage.assert_called_once_with(mock.ANY, driver_options)
valid_snapshot_data = {
'status': constants.STATUS_AVAILABLE}
valid_snapshot_data.update(driver_data)
self.share_manager.db.share_snapshot_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
snapshot_id, valid_snapshot_data)
self.share_manager._update_quota_usages.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
snapshot['project_id'],
{'snapshots': 1, 'snapshot_gigabytes': size})
mock_get_share_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['share'])
mock_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot_id)

    def test_unmanage_snapshot_invalid_driver_mode(self):
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = True
share = db_utils.create_share()
snapshot = db_utils.create_snapshot(share_id=share['id'])
self.mock_object(self.share_manager.db, 'share_snapshot_update')
ret = self.share_manager.unmanage_snapshot(self.context,
snapshot['id'])
self.assertIsNone(ret)
self.share_manager.db.share_snapshot_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
snapshot['id'],
{'status': constants.STATUS_UNMANAGE_ERROR})

    def test_unmanage_snapshot_invalid_snapshot(self):
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = False
mock_get_share_server = self.mock_object(
self.share_manager,
'_get_share_server',
mock.Mock(return_value='fake_share_server'))
self.mock_object(self.share_manager.db, 'share_snapshot_update')
share = db_utils.create_share()
snapshot = db_utils.create_snapshot(share_id=share['id'])
mock_get = self.mock_object(self.share_manager.db,
'share_snapshot_get',
mock.Mock(return_value=snapshot))
ret = self.share_manager.unmanage_snapshot(self.context,
snapshot['id'])
self.assertIsNone(ret)
self.share_manager.db.share_snapshot_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext),
snapshot['id'],
{'status': constants.STATUS_UNMANAGE_ERROR})
mock_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['id'])
mock_get_share_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['share'])

    def test_unmanage_snapshot_invalid_share(self):
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = False
mock_unmanage = mock.Mock(
side_effect=exception.UnmanageInvalidShareSnapshot(reason="fake"))
self.mock_object(self.share_manager.driver, "unmanage_snapshot",
mock_unmanage)
mock_get_share_server = self.mock_object(
self.share_manager,
'_get_share_server',
mock.Mock(return_value=None))
self.mock_object(self.share_manager.db, 'share_snapshot_update')
share = db_utils.create_share()
snapshot = db_utils.create_snapshot(share_id=share['id'])
mock_get = self.mock_object(self.share_manager.db,
'share_snapshot_get',
mock.Mock(return_value=snapshot))
self.share_manager.unmanage_snapshot(self.context, snapshot['id'])
self.share_manager.db.share_snapshot_update.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['id'],
{'status': constants.STATUS_UNMANAGE_ERROR})
self.share_manager.driver.unmanage_snapshot.assert_called_once_with(
mock.ANY)
mock_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['id'])
mock_get_share_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['share'])

    @ddt.data(False, True)
def test_unmanage_snapshot_valid_snapshot(self, quota_error):
if quota_error:
self.mock_object(quota.QUOTAS, 'reserve', mock.Mock(
side_effect=exception.ManilaException(message='error')))
mock_log_warning = self.mock_object(manager.LOG, 'warning')
self.mock_object(self.share_manager, 'driver')
self.share_manager.driver.driver_handles_share_servers = False
self.mock_object(self.share_manager.driver, "unmanage_snapshot")
mock_get_share_server = self.mock_object(
self.share_manager,
'_get_share_server',
mock.Mock(return_value=None))
mock_snapshot_instance_destroy_call = self.mock_object(
self.share_manager.db, 'share_snapshot_instance_delete')
share = db_utils.create_share()
snapshot = db_utils.create_snapshot(share_id=share['id'])
mock_get = self.mock_object(self.share_manager.db,
'share_snapshot_get',
mock.Mock(return_value=snapshot))
self.share_manager.unmanage_snapshot(self.context, snapshot['id'])
self.share_manager.driver.unmanage_snapshot.assert_called_once_with(
mock.ANY)
mock_snapshot_instance_destroy_call.assert_called_once_with(
mock.ANY, snapshot['instance']['id'])
mock_get.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['id'])
mock_get_share_server.assert_called_once_with(
utils.IsAMatcher(context.RequestContext), snapshot['share'])
if quota_error:
self.assertTrue(mock_log_warning.called)

    def _setup_crud_replicated_snapshot_data(self):
snapshot = fakes.fake_snapshot(create_instance=True)
snapshot_instance = fakes.fake_snapshot_instance(
base_snapshot=snapshot)
snapshot_instances = [snapshot['instance'], snapshot_instance]
replicas = [fake_replica(), fake_replica()]
return snapshot, snapshot_instances, replicas

    def test_create_replicated_snapshot_driver_exception(self):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'create_replicated_snapshot',
mock.Mock(side_effect=exception.ManilaException))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
self.assertRaises(exception.ManilaException,
self.share_manager.create_replicated_snapshot,
self.context, snapshot['id'], share_id='fake_share')
mock_db_update_call.assert_has_calls([
mock.call(
self.context, snapshot['instance']['id'],
{'status': constants.STATUS_ERROR}),
mock.call(
self.context, snapshot_instances[1]['id'],
{'status': constants.STATUS_ERROR}),
])

    @ddt.data(None, [])
def test_create_replicated_snapshot_driver_updates_nothing(self, retval):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'create_replicated_snapshot',
mock.Mock(return_value=retval))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
return_value = self.share_manager.create_replicated_snapshot(
self.context, snapshot['id'], share_id='fake_share')
self.assertIsNone(return_value)
self.assertFalse(mock_db_update_call.called)

    def test_create_replicated_snapshot_driver_updates_snapshot(self):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
snapshot_dict = {
'status': constants.STATUS_AVAILABLE,
'provider_location': 'spinners_end',
'progress': '100%',
'id': snapshot['instance']['id'],
}
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'create_replicated_snapshot',
mock.Mock(return_value=[snapshot_dict]))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
return_value = self.share_manager.create_replicated_snapshot(
self.context, snapshot['id'], share_id='fake_share')
self.assertIsNone(return_value)
mock_db_update_call.assert_called_once_with(
self.context, snapshot['instance']['id'], snapshot_dict)

    def test_delete_replicated_snapshot_driver_exception(self):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'delete_replicated_snapshot',
mock.Mock(side_effect=exception.ManilaException))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
self.assertRaises(exception.ManilaException,
self.share_manager.delete_replicated_snapshot,
self.context, snapshot['id'], share_id='fake_share')
mock_db_update_call.assert_has_calls([
mock.call(
self.context, snapshot['instance']['id'],
{'status': constants.STATUS_ERROR_DELETING}),
mock.call(
self.context, snapshot_instances[1]['id'],
{'status': constants.STATUS_ERROR_DELETING}),
])
self.assertFalse(mock_db_delete_call.called)

    def test_delete_replicated_snapshot_driver_exception_ignored_with_force(
            self):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'delete_replicated_snapshot',
mock.Mock(side_effect=exception.ManilaException))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
retval = self.share_manager.delete_replicated_snapshot(
self.context, snapshot['id'], share_id='fake_share')
self.assertIsNone(retval)
mock_db_delete_call.assert_has_calls([
mock.call(
self.context, snapshot['instance']['id']),
mock.call(
self.context, snapshot_instances[1]['id']),
])
self.assertFalse(mock_db_update_call.called)

    @ddt.data(None, [])
    def test_delete_replicated_snapshot_driver_updates_nothing(self, retval):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'delete_replicated_snapshot',
mock.Mock(return_value=retval))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
return_value = self.share_manager.delete_replicated_snapshot(
self.context, snapshot['id'], share_id='fake_share')
self.assertIsNone(return_value)
self.assertFalse(mock_db_delete_call.called)
self.assertFalse(mock_db_update_call.called)

    def test_delete_replicated_snapshot_driver_deletes_snapshots(self):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
retval = [{
'status': constants.STATUS_DELETED,
'id': snapshot['instance']['id'],
}]
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'delete_replicated_snapshot',
mock.Mock(return_value=retval))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
return_value = self.share_manager.delete_replicated_snapshot(
self.context, snapshot['id'], share_id='fake_share')
self.assertIsNone(return_value)
mock_db_delete_call.assert_called_once_with(
self.context, snapshot['instance']['id'])
self.assertFalse(mock_db_update_call.called)

    @ddt.data(True, False)
    def test_delete_replicated_snapshot_drv_del_and_updates_snapshots(
            self, force):
snapshot, snapshot_instances, replicas = (
self._setup_crud_replicated_snapshot_data()
)
updated_instance_details = {
'status': constants.STATUS_ERROR,
'id': snapshot_instances[1]['id'],
'provider_location': 'azkaban',
}
retval = [
{
'status': constants.STATUS_DELETED,
'id': snapshot['instance']['id'],
},
]
retval.append(updated_instance_details)
self.mock_object(
db, 'share_snapshot_get', mock.Mock(return_value=snapshot))
self.mock_object(self.share_manager, '_get_share_server')
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=replicas))
self.mock_object(
self.share_manager.driver, 'delete_replicated_snapshot',
mock.Mock(return_value=retval))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
return_value = self.share_manager.delete_replicated_snapshot(
self.context, snapshot['id'], share_id='fake_share', force=force)
self.assertIsNone(return_value)
if force:
            self.assertEqual(2, mock_db_delete_call.call_count)
self.assertFalse(mock_db_update_call.called)
else:
mock_db_delete_call.assert_called_once_with(
self.context, snapshot['instance']['id'])
mock_db_update_call.assert_called_once_with(
self.context, snapshot_instances[1]['id'],
updated_instance_details)

    def test_periodic_share_replica_snapshot_update(self):
mock_debug_log = self.mock_object(manager.LOG, 'debug')
replicas = 3 * [
fake_replica(host='malfoy@manor#_pool0',
replica_state=constants.REPLICA_STATE_IN_SYNC)
]
replicas.append(fake_replica(replica_state=constants.STATUS_ACTIVE))
snapshot = fakes.fake_snapshot(create_instance=True,
status=constants.STATUS_DELETING)
snapshot_instances = 3 * [
fakes.fake_snapshot_instance(base_snapshot=snapshot)
]
self.mock_object(
db, 'share_replicas_get_all', mock.Mock(return_value=replicas))
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(return_value=snapshot_instances))
mock_snapshot_update_call = self.mock_object(
self.share_manager, '_update_replica_snapshot')
retval = self.share_manager.periodic_share_replica_snapshot_update(
self.context)
self.assertIsNone(retval)
self.assertEqual(1, mock_debug_log.call_count)
self.assertEqual(0, mock_snapshot_update_call.call_count)

    @ddt.data(True, False)
def test_periodic_share_replica_snapshot_update_nothing_to_update(
self, has_instances):
mock_debug_log = self.mock_object(manager.LOG, 'debug')
replicas = 3 * [
fake_replica(host='malfoy@manor#_pool0',
replica_state=constants.REPLICA_STATE_IN_SYNC)
]
replicas.append(fake_replica(replica_state=constants.STATUS_ACTIVE))
snapshot = fakes.fake_snapshot(create_instance=True,
status=constants.STATUS_DELETING)
snapshot_instances = 3 * [
fakes.fake_snapshot_instance(base_snapshot=snapshot)
]
self.mock_object(db, 'share_replicas_get_all',
mock.Mock(side_effect=[[], replicas]))
self.mock_object(db, 'share_snapshot_instance_get_all_with_filters',
mock.Mock(side_effect=[snapshot_instances, []]))
mock_snapshot_update_call = self.mock_object(
self.share_manager, '_update_replica_snapshot')
retval = self.share_manager.periodic_share_replica_snapshot_update(
self.context)
self.assertIsNone(retval)
self.assertEqual(1, mock_debug_log.call_count)
self.assertEqual(0, mock_snapshot_update_call.call_count)

    def test__update_replica_snapshot_replica_deleted_from_database(self):
replica_not_found = exception.ShareReplicaNotFound(replica_id='xyzzy')
self.mock_object(db, 'share_replica_get', mock.Mock(
side_effect=replica_not_found))
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
mock_driver_update_call = self.mock_object(
self.share_manager.driver, 'update_replicated_snapshot')
        snapshot_instance = fakes.fake_snapshot_instance()
        retval = self.share_manager._update_replica_snapshot(
            self.context, snapshot_instance)
        self.assertIsNone(retval)
        mock_db_delete_call.assert_called_once_with(
            self.context, snapshot_instance['id'])
self.assertFalse(mock_driver_update_call.called)
self.assertFalse(mock_db_update_call.called)

    def test__update_replica_snapshot_both_deleted_from_database(self):
replica_not_found = exception.ShareReplicaNotFound(replica_id='xyzzy')
instance_not_found = exception.ShareSnapshotInstanceNotFound(
instance_id='spoon!')
self.mock_object(db, 'share_replica_get', mock.Mock(
side_effect=replica_not_found))
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete', mock.Mock(
side_effect=instance_not_found))
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
mock_driver_update_call = self.mock_object(
self.share_manager.driver, 'update_replicated_snapshot')
snapshot_instance = fakes.fake_snapshot_instance()
retval = self.share_manager._update_replica_snapshot(
self.context, snapshot_instance)
self.assertIsNone(retval)
mock_db_delete_call.assert_called_once_with(
self.context, snapshot_instance['id'])
self.assertFalse(mock_driver_update_call.called)
self.assertFalse(mock_db_update_call.called)

    def test__update_replica_snapshot_driver_raises_Not_Found_exception(self):
mock_debug_log = self.mock_object(manager.LOG, 'debug')
replica = fake_replica()
snapshot_instance = fakes.fake_snapshot_instance(
status=constants.STATUS_DELETING)
self.mock_object(
db, 'share_replica_get', mock.Mock(return_value=replica))
        self.mock_object(db, 'share_snapshot_instance_get',
                         mock.Mock(return_value=snapshot_instance))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica]))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
self.mock_object(
self.share_manager.driver, 'update_replicated_snapshot',
mock.Mock(
side_effect=exception.SnapshotResourceNotFound(name='abc')))
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
retval = self.share_manager._update_replica_snapshot(
self.context, snapshot_instance, replica_snapshots=None)
self.assertIsNone(retval)
self.assertEqual(1, mock_debug_log.call_count)
mock_db_delete_call.assert_called_once_with(
self.context, snapshot_instance['id'])
self.assertFalse(mock_db_update_call.called)

    @ddt.data(exception.NotFound, exception.ManilaException)
def test__update_replica_snapshot_driver_raises_other_exception(self, exc):
mock_debug_log = self.mock_object(manager.LOG, 'debug')
mock_info_log = self.mock_object(manager.LOG, 'info')
mock_exception_log = self.mock_object(manager.LOG, 'exception')
replica = fake_replica()
snapshot_instance = fakes.fake_snapshot_instance(
status=constants.STATUS_CREATING)
self.mock_object(
db, 'share_replica_get', mock.Mock(return_value=replica))
self.mock_object(db, 'share_snapshot_instance_get',
mock.Mock(return_value=snapshot_instance))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica]))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
self.mock_object(self.share_manager.driver,
'update_replicated_snapshot',
mock.Mock(side_effect=exc))
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
retval = self.share_manager._update_replica_snapshot(
self.context, snapshot_instance)
self.assertIsNone(retval)
self.assertEqual(1, mock_exception_log.call_count)
self.assertEqual(1, mock_debug_log.call_count)
self.assertFalse(mock_info_log.called)
mock_db_update_call.assert_called_once_with(
self.context, snapshot_instance['id'], {'status': 'error'})
self.assertFalse(mock_db_delete_call.called)
@ddt.data(True, False)
def test__update_replica_snapshot_driver_updates_replica(self, update):
replica = fake_replica()
snapshot_instance = fakes.fake_snapshot_instance()
driver_update = {}
if update:
driver_update = {
'id': snapshot_instance['id'],
'provider_location': 'knockturn_alley',
'status': constants.STATUS_AVAILABLE,
}
mock_debug_log = self.mock_object(manager.LOG, 'debug')
mock_info_log = self.mock_object(manager.LOG, 'info')
self.mock_object(
db, 'share_replica_get', mock.Mock(return_value=replica))
self.mock_object(db, 'share_snapshot_instance_get',
mock.Mock(return_value=snapshot_instance))
self.mock_object(db, 'share_replicas_get_all_by_share',
mock.Mock(return_value=[replica]))
self.mock_object(self.share_manager, '_get_share_server',
mock.Mock(return_value=None))
self.mock_object(self.share_manager.driver,
'update_replicated_snapshot',
mock.Mock(return_value=driver_update))
mock_db_delete_call = self.mock_object(
db, 'share_snapshot_instance_delete')
mock_db_update_call = self.mock_object(
db, 'share_snapshot_instance_update')
retval = self.share_manager._update_replica_snapshot(
self.context, snapshot_instance, replica_snapshots=None)
driver_update['progress'] = '100%'
self.assertIsNone(retval)
self.assertEqual(1, mock_debug_log.call_count)
self.assertFalse(mock_info_log.called)
if update:
mock_db_update_call.assert_called_once_with(
self.context, snapshot_instance['id'], driver_update)
else:
self.assertFalse(mock_db_update_call.called)
self.assertFalse(mock_db_delete_call.called)
@ddt.ddt
class HookWrapperTestCase(test.TestCase):
def setUp(self):
super(HookWrapperTestCase, self).setUp()
self.configuration = mock.Mock()
self.configuration.safe_get.return_value = True
@manager.add_hooks
def _fake_wrapped_method(self, some_arg, some_kwarg):
return "foo"
def test_hooks_enabled(self):
self.hooks = [mock.Mock(return_value=i) for i in range(2)]
result = self._fake_wrapped_method(
"some_arg", some_kwarg="some_kwarg_value")
self.assertEqual("foo", result)
for i, mock_hook in enumerate(self.hooks):
mock_hook.execute_pre_hook.assert_called_once_with(
"some_arg",
func_name="_fake_wrapped_method",
some_kwarg="some_kwarg_value")
mock_hook.execute_post_hook.assert_called_once_with(
"some_arg",
func_name="_fake_wrapped_method",
driver_action_results="foo",
pre_hook_data=self.hooks[i].execute_pre_hook.return_value,
some_kwarg="some_kwarg_value")
def test_hooks_disabled(self):
self.hooks = []
result = self._fake_wrapped_method(
"some_arg", some_kwarg="some_kwarg_value")
self.assertEqual("foo", result)
for mock_hook in self.hooks:
self.assertFalse(mock_hook.execute_pre_hook.called)
self.assertFalse(mock_hook.execute_post_hook.called)
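The `HookWrapperTestCase` above relies on `manager.add_hooks` calling every hook's `execute_pre_hook` before the wrapped method and `execute_post_hook` after it, forwarding the wrapped call's result and the pre-hook data. A minimal decorator with the same calling convention (a hypothetical sketch, not Manila's actual implementation) could look like:

```python
import functools

def add_hooks(func):
    """Run each registered hook before and after the wrapped method (sketch)."""
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        hooks = getattr(self, 'hooks', [])
        pre_data = [hook.execute_pre_hook(*args, func_name=func.__name__, **kwargs)
                    for hook in hooks]
        result = func(self, *args, **kwargs)
        for hook, pre in zip(hooks, pre_data):
            hook.execute_post_hook(*args, func_name=func.__name__,
                                   driver_action_results=result,
                                   pre_hook_data=pre, **kwargs)
        return result
    return wrapper

class Demo:
    def __init__(self, hooks):
        self.hooks = hooks

    @add_hooks
    def do_work(self, some_arg, some_kwarg=None):
        return "foo"
```

With no hooks registered the wrapper degenerates to a plain call, which is exactly what `test_hooks_disabled` checks.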
| 47.575963 | 79 | 0.640871 | 25,557 | 224,844 | 5.236139 | 0.022186 | 0.067882 | 0.087999 | 0.056554 | 0.89113 | 0.858661 | 0.826297 | 0.784053 | 0.751128 | 0.715603 | 0 | 0.002028 | 0.263276 | 224,844 | 4,725 | 80 | 47.586032 | 0.805829 | 0.009224 | 0 | 0.674481 | 0 | 0 | 0.110083 | 0.042909 | 0 | 0 | 0 | 0 | 0.131868 | 1 | 0.044933 | false | 0.000488 | 0.009035 | 0.000733 | 0.057875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
407fbde3aec5a5072ee1f3a83d887c1fa5e36d45 | 85 | py | Python | sympy/codegen/pynodes.py | utkarshdeorah/sympy | dcdf59bbc6b13ddbc329431adf72fcee294b6389 | [
"BSD-3-Clause"
] | 1 | 2020-01-12T17:16:05.000Z | 2020-01-12T17:16:05.000Z | sympy/codegen/pynodes.py | utkarshdeorah/sympy | dcdf59bbc6b13ddbc329431adf72fcee294b6389 | [
"BSD-3-Clause"
] | 14 | 2018-02-08T10:11:03.000Z | 2019-04-16T10:32:46.000Z | sympy/codegen/pynodes.py | utkarshdeorah/sympy | dcdf59bbc6b13ddbc329431adf72fcee294b6389 | [
"BSD-3-Clause"
] | 1 | 2022-02-04T13:50:29.000Z | 2022-02-04T13:50:29.000Z | from .abstract_nodes import List as AbstractList
class List(AbstractList):
pass
| 17 | 48 | 0.788235 | 11 | 85 | 6 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164706 | 85 | 4 | 49 | 21.25 | 0.929577 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
40eac44c20cf2d38970f8eaaaab7f040cae8c601 | 27 | py | Python | adet/modeling/solov2/__init__.py | manusheoran/AdelaiDet_DA | 04f0843c6be8e436716783300abcba715d560853 | [
"BSD-2-Clause"
] | 2,597 | 2020-03-15T06:01:23.000Z | 2022-03-31T18:21:31.000Z | adet/modeling/solov2/__init__.py | manusheoran/AdelaiDet_DA | 04f0843c6be8e436716783300abcba715d560853 | [
"BSD-2-Clause"
] | 467 | 2020-03-16T11:31:52.000Z | 2022-03-31T08:50:15.000Z | adet/modeling/solov2/__init__.py | manusheoran/AdelaiDet_DA | 04f0843c6be8e436716783300abcba715d560853 | [
"BSD-2-Clause"
] | 584 | 2020-03-15T05:53:40.000Z | 2022-03-26T02:56:30.000Z | from .solov2 import SOLOv2
| 13.5 | 26 | 0.814815 | 4 | 27 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086957 | 0.148148 | 27 | 1 | 27 | 27 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
906eb50ae4dbd119ca154893bb9c1ee0b7b463db | 4,059 | py | Python | code/nll_grad_fb.py | mjvakili/supermean | 4388a164cc1da41776c0f8dd0060ada6db7e1c9e | [
"MIT"
] | 1 | 2016-12-12T20:58:43.000Z | 2016-12-12T20:58:43.000Z | code/nll_grad_fb.py | mjvakili/supermean | 4388a164cc1da41776c0f8dd0060ada6db7e1c9e | [
"MIT"
] | null | null | null | code/nll_grad_fb.py | mjvakili/supermean | 4388a164cc1da41776c0f8dd0060ada6db7e1c9e | [
"MIT"
] | null | null | null | import numpy as np
import scipy.optimize as op
def fit_single_patch(data, mask , psf, theta, floor, gain):
"""
Inputs:
data = patch,
mask = True for healthy pixels, False for flagged pixels
psf = psf model rendered at the data grid
theta = [old_bkg, old_flux],
where:
old_flux = current flux estimate for a patch
old_back = current bkg estimate for the patch
floor = floor variance of the noise model
gain = gain of the noise model
Outputs:
(non-regularized) NLL of the patch, and derivative
of (non-regularized) NLL w.r.t flux and bkg of the patch.
Note that the regularization is independent of F, B.
"""
var = floor + gain * np.abs(theta[1] * psf[mask] + theta[0])
A = np.ones((var.size, 2))
A[:, 1] = psf[mask]
model = np.dot(A, theta) #masked model
res = data[mask] - model
func = 0.5*np.sum(((res)**2.)/var) + 0.5*np.sum(np.log(var))
Grad = 0.5*gain*(1./var - res*res/(var*var)) - res/var
grad = np.sum(Grad[:, None]*A , axis = 0)
return func , grad
# different version of the function defined above. We'll see which one is faster:
def v2_fit_single_patch(theta, masked_data, masked_psf, floor, gain):
"""
Inputs:
theta = [old_bkg, old_flux],
where:
old_flux = current flux estimate for a patch
old_back = current bkg estimate for the patch
masked_data = patch with flagged pixels masked out
masked_psf = psf model rendered at the data grid
masked out
where pixels are flagged
floor = floor variance of the noise model
gain = gain of the noise model
Outputs:
(non-regularized) NLL of the patch, and derivative
of (non-regularized) NLL w.r.t flux and bkg of the patch.
Note that the regularization is independent of F, B.
"""
var = floor + gain * np.abs(theta[1]* masked_psf + theta[0])
A = np.ones((var.size, 2))
A[:, 1] = masked_psf
model = np.dot(A , theta) #masked model
res = masked_data - model
func = 0.5*np.sum(((res)**2.)/var) + 0.5*np.sum(np.log(var))
Grad = 0.5*gain*(1./var - res*res/(var*var)) - res/var
grad = np.sum(Grad[:, None]*A , axis = 0) #this could unnecessarily slow down the code!!
print(func.shape, grad.shape)
return func , grad
##### this is probably the fastest version!
def v3_fit_single_patch(theta, masked_data, masked_psf, floor, gain):
"""
Inputs:
theta = [old_bkg, old_flux],
where:
old_flux = current flux estimate for a patch
old_back = current bkg estimate for the patch
masked_data = patch with flagged pixels masked out
masked_psf = psf model rendered at the data grid
masked out
where pixels are flagged
floor = floor variance of the noise model
gain = gain of the noise model
Outputs:
(non-regularized) NLL of the patch, and derivative
of (non-regularized) NLL w.r.t flux and bkg of the patch.
Note that the regularization is independent of F, B.
"""
grad = np.zeros((2))
var = floor + gain * np.abs(theta[1] * masked_psf + theta[0])
model = theta[0] + theta[1]*masked_psf #masked model
res = masked_data - model
func = 0.5*np.sum(((res)**2.)/var) + 0.5*np.sum(np.log(var))
gain_term_b = - (gain/2.)*(res**2./var**2.) + (gain/2.)*(var**-1.)
#gainp[modelp<0] *= -1. #var=f+g|model| to account for numerical artifacts when sr model is sampled at the data grid
grad[0] = -1.*np.sum(res/var) + np.sum(gain_term_b)
gain_term_f = (gain/2.)*masked_psf*(var**-1. - res**2./var**2.)
#gainp[modelp<0] *= -1. #var=f+g|model| to account for numerical artifacts when sr model is sampled at the data grid
grad[1] = np.sum(-1.*res*masked_psf/var) + np.sum(gain_term_f)
return func, grad
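Since `v3_fit_single_patch` is the reference implementation, its analytic gradient can be sanity-checked against central finite differences of the same NLL. The helper below re-derives the identical objective locally (`nll_and_grad` is a name introduced here for illustration) and assumes the model stays positive so that `|model|` is differentiable:

```python
import numpy as np

def nll_and_grad(theta, data, psf, floor, gain):
    # Same objective as v3_fit_single_patch: model = bkg + flux * psf,
    # var = floor + gain * |model| (model > 0 assumed here).
    model = theta[0] + theta[1] * psf
    var = floor + gain * np.abs(model)
    res = data - model
    func = 0.5 * np.sum(res ** 2 / var) + 0.5 * np.sum(np.log(var))
    # per-pixel coefficient of dNLL/dtheta_j before projecting onto A = [1, psf]
    common = 0.5 * gain * (1. / var - res ** 2 / var ** 2) - res / var
    return func, np.array([np.sum(common), np.sum(common * psf)])

rng = np.random.RandomState(0)
psf = rng.rand(25)
theta = np.array([2., 5.])          # [bkg, flux], keeps the model positive
data = theta[0] + theta[1] * psf + 0.01 * rng.randn(25)
func, grad = nll_and_grad(theta, data, psf, floor=0.05, gain=0.1)

eps = 1e-6
num_grad = np.zeros(2)
for i in range(2):
    tp, tm = theta.copy(), theta.copy()
    tp[i] += eps
    tm[i] -= eps
    num_grad[i] = (nll_and_grad(tp, data, psf, 0.05, 0.1)[0]
                   - nll_and_grad(tm, data, psf, 0.05, 0.1)[0]) / (2. * eps)
```

The two gradients agree to within finite-difference accuracy, which confirms the per-pixel derivative terms used above.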
| 35.295652 | 125 | 0.598916 | 625 | 4,059 | 3.8192 | 0.1792 | 0.027231 | 0.025136 | 0.037704 | 0.780059 | 0.766653 | 0.766653 | 0.766653 | 0.733976 | 0.733976 | 0 | 0.018933 | 0.284306 | 4,059 | 114 | 126 | 35.605263 | 0.802754 | 0.107169 | 0 | 0.470588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.058824 | null | null | 0.029412 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
909680a4f50d339905058a47255f30cb43acfb09 | 33 | py | Python | cct/core2/devices/vacuumgauge/generic/__init__.py | awacha/cct | be1adbed2533df15c778051f3f4f9da0749c873a | [
"BSD-3-Clause"
] | 1 | 2015-11-04T16:37:39.000Z | 2015-11-04T16:37:39.000Z | cct/core2/devices/vacuumgauge/generic/__init__.py | awacha/cct | be1adbed2533df15c778051f3f4f9da0749c873a | [
"BSD-3-Clause"
] | null | null | null | cct/core2/devices/vacuumgauge/generic/__init__.py | awacha/cct | be1adbed2533df15c778051f3f4f9da0749c873a | [
"BSD-3-Clause"
] | 1 | 2020-03-05T02:50:43.000Z | 2020-03-05T02:50:43.000Z | from .frontend import VacuumGauge | 33 | 33 | 0.878788 | 4 | 33 | 7.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 33 | 1 | 33 | 33 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
90ac11663dc8b10598fe5f239bff717ec89796ac | 78 | py | Python | src/evaluation/__init__.py | mfederici/dsit | 7f26f7ce93edb2075fba4aa965aa1ad9bf773aa5 | [
"MIT"
] | 17 | 2021-11-02T17:51:02.000Z | 2022-02-21T02:48:56.000Z | src/evaluation/__init__.py | mfederici/dsit | 7f26f7ce93edb2075fba4aa965aa1ad9bf773aa5 | [
"MIT"
] | null | null | null | src/evaluation/__init__.py | mfederici/dsit | 7f26f7ce93edb2075fba4aa965aa1ad9bf773aa5 | [
"MIT"
] | null | null | null | from src.evaluation.accuracy import AccuracyEvaluation, CrossEntropyEvaluation | 78 | 78 | 0.910256 | 7 | 78 | 10.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051282 | 78 | 1 | 78 | 78 | 0.959459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
90c262f280d42f7c5aea2d42d91082a004cbe89b | 25,992 | py | Python | Pyrado/pyrado/environments/rcspysim/box_lifting.py | jacarvalho/SimuRLacra | a6c982862e2ab39a9f65d1c09aa59d9a8b7ac6c5 | [
"BSD-3-Clause"
] | null | null | null | Pyrado/pyrado/environments/rcspysim/box_lifting.py | jacarvalho/SimuRLacra | a6c982862e2ab39a9f65d1c09aa59d9a8b7ac6c5 | [
"BSD-3-Clause"
] | null | null | null | Pyrado/pyrado/environments/rcspysim/box_lifting.py | jacarvalho/SimuRLacra | a6c982862e2ab39a9f65d1c09aa59d9a8b7ac6c5 | [
"BSD-3-Clause"
] | null | null | null | import functools
import numpy as np
import os.path as osp
from init_args_serializer import Serializable
from typing import Sequence
import rcsenv
from pyrado.environments.rcspysim.base import RcsSim
from pyrado.spaces.box import BoxSpace
from pyrado.spaces.singular import SingularStateSpace
from pyrado.tasks.base import Task
from pyrado.tasks.desired_state import DesStateTask
from pyrado.tasks.endless_flipping import EndlessFlippingTask
from pyrado.tasks.masked import MaskedTask
from pyrado.tasks.reward_functions import ExpQuadrErrRewFcn, MinusOnePerStepRewFcn, AbsErrRewFcn, CosOfOneEleRewFcn, \
CompoundRewFcn
from pyrado.tasks.parallel import ParallelTasks
from pyrado.tasks.utils import proximity_succeeded, never_succeeded
from pyrado.tasks.predefined import create_check_all_boundaries_task, \
create_task_space_discrepancy_task, create_collision_task
from pyrado.utils.data_types import EnvSpec
rcsenv.addResourcePath(rcsenv.RCSPYSIM_CONFIG_PATH)
def create_box_lift_task(env_spec: EnvSpec, continuous_rew_fcn: bool, succ_thold: float):
# Define the indices for selection. This needs to match the observations' names in RcsPySim.
idcs = ['Box_Z']
# Get the masked environment specification
spec = EnvSpec(
env_spec.obs_space,
env_spec.act_space,
env_spec.state_space.subspace(env_spec.state_space.create_mask(idcs))
)
# Create a desired state task
# state_des = np.array([0.3]) # box position is measured relative to the table
state_des = np.array([1.1]) # box position is measured in world coordinates
if continuous_rew_fcn:
Q = np.diag([3e1])
R = 1e0*np.eye(spec.act_space.flat_dim)
rew_fcn = ExpQuadrErrRewFcn(Q, R)
else:
rew_fcn = MinusOnePerStepRewFcn()
dst = DesStateTask(spec, state_des, rew_fcn, functools.partial(proximity_succeeded, thold_dist=succ_thold))
# Return the masked tasks
return MaskedTask(env_spec, dst, idcs)
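The index masking above is the core trick of `MaskedTask`: score only the labelled state dimensions the task cares about. A self-contained sketch of that selection with NumPy (labels, values, and the exponentiated-quadratic reward form are assumptions made here for illustration, not verified Pyrado internals):

```python
import numpy as np

labels = ['Box_X', 'Box_Y', 'Box_Z', 'Box_A']   # hypothetical state labels
state = np.array([0.10, 0.00, 1.10, 0.00])
state_des = np.array([1.1])                     # desired Box_Z, as above

mask = np.isin(labels, ['Box_Z'])               # boolean mask over dimensions
err = state[mask] - state_des

# if the reward has the exponentiated-quadratic form exp(-e^T Q e), it is
# exactly 1 at the desired state (an assumption about ExpQuadrErrRewFcn)
Q = np.diag([3e1])
rew = float(np.exp(-err @ Q @ err))
```

This keeps the task definition independent of the full state layout: only the labelled subset enters the error.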
def create_box_flip_task(env_spec: EnvSpec, continuous_rew_fcn):
# Define the indices for selection. This needs to match the observations' names in RcsPySim.
idcs = ['Box_A']
# Get the masked environment specification
spec = EnvSpec(
env_spec.obs_space,
env_spec.act_space,
env_spec.state_space.subspace(env_spec.state_space.create_mask(idcs))
)
# Create a desired state task
# state_des = np.array([0.3]) # box position is measured relative to the table
state_des = np.array([-np.pi/2]) # box orientation is measured in world coordinates
if continuous_rew_fcn:
q = np.array([0./np.pi])
r = 1e-6*np.ones(spec.act_space.flat_dim)
rew_fcn_act = AbsErrRewFcn(q, r)
rew_fcn_box = CosOfOneEleRewFcn(idx=0)
rew_fcn = CompoundRewFcn([rew_fcn_act, rew_fcn_box])
else:
rew_fcn = MinusOnePerStepRewFcn()
ef_task = EndlessFlippingTask(spec, rew_fcn, init_angle=0.)
# Return the masked tasks
return MaskedTask(env_spec, ef_task, idcs)
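`CosOfOneEleRewFcn(idx=0)` suggests a reward based on the cosine of a single state element, a natural shape for an endless flipping task. Assuming it simply returns `cos(state[idx])` (a hedged reading, not verified against Pyrado), the reward peaks when the tracked angle error is zero and falls off smoothly toward a half turn:

```python
import numpy as np

def cos_reward(state, idx=0):
    """Cosine-of-one-element reward (assumed form, for illustration)."""
    return float(np.cos(state[idx]))

r_at_goal = cos_reward(np.array([0.0]))        # angle error zero
r_quarter = cos_reward(np.array([np.pi / 2]))  # quarter turn off
r_opposed = cos_reward(np.array([np.pi]))      # half turn off
```

The smooth, bounded shape avoids the discontinuities a thresholded angle reward would introduce.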
class BoxLiftingSim(RcsSim, Serializable):
""" Base class for 2-armed humanoid robot lifting a box out of a basket """
def __init__(self,
task_args: dict,
ref_frame: str,
position_mps: bool,
mps_left: [Sequence[dict], None],
mps_right: [Sequence[dict], None],
fixed_init_state: bool = False,
**kwargs):
"""
Constructor
.. note::
This constructor should only be called via the subclasses.
:param task_args: arguments for the task construction
:param ref_frame: reference frame for the position and orientation MPs, e.g. 'world', 'basket', or 'box'
:param position_mps: `True` if the MPs are defined on position level, `False` if defined on velocity level
:param mps_left: left arm's movement primitives holding the dynamical systems and the goal states
:param mps_right: right arm's movement primitives holding the dynamical systems and the goal states
:param fixed_init_state: use an init state space with only one state (e.g. for debugging)
:param kwargs: keyword arguments which are available for all task-based `RcsSim`
taskCombinationMethod: str = 'mean', # 'sum', 'mean', 'product', or 'softmax'
checkJointLimits: bool = False,
collisionAvoidanceIK: bool = True,
observeVelocities: bool = True,
observeCollisionCost: bool = True,
observePredictedCollisionCost: bool = False,
observeManipulabilityIndex: bool = False,
observeCurrentManipulability: bool = True,
observeDynamicalSystemDiscrepancy: bool = False,
observeTaskSpaceDiscrepancy: bool = True,
observeForceTorque: bool = True
"""
Serializable._init(self, locals())
# Forward to the RcsSim's constructor
RcsSim.__init__(
self,
envType='BoxLifting',
physicsConfigFile='pBoxLifting.xml',
extraConfigDir=osp.join(rcsenv.RCSPYSIM_CONFIG_PATH, 'BoxLifting'),
hudColor='BLACK_RUBBER',
task_args=task_args,
refFrame=ref_frame,
positionTasks=position_mps,
tasksLeft=mps_left,
tasksRight=mps_right,
**kwargs
)
# Initial state space definition
if fixed_init_state:
default_init_state = np.array(
[0.2, 0., 0., 0.85, 65.*np.pi/180, -65.*np.pi/180]) # [m, m, rad, m, rad, rad]
self._init_space = SingularStateSpace(default_init_state,
labels=['$x$', '$y$', '$th$', '$z$', '$q_2_L$', '$q_2_R$'])
else:
min_init_state = np.array([0.05, -0.05, -5*np.pi/180, 0.8, 60*np.pi/180, -70*np.pi/180])
max_init_state = np.array([0.25, 0.05, 5*np.pi/180, 0.9, 70*np.pi/180, -60*np.pi/180])
self._init_space = BoxSpace(min_init_state, max_init_state, # [m, m, rad, m, rad, rad]
labels=['$x$', '$y$', '$th$', '$z$', '$q_2$', '$q_4$'])
def _create_task(self, task_args: dict) -> Task:
# Create the tasks
continuous_rew_fcn = task_args.get('continuous_rew_fcn', True)
task_box = create_box_lift_task(self.spec, continuous_rew_fcn, succ_thold=0.03)
task_check_bounds = create_check_all_boundaries_task(self.spec, penalty=1e3)
task_collision = create_collision_task(self.spec, factor=1.)
task_ts_discrepancy = create_task_space_discrepancy_task(self.spec,
AbsErrRewFcn(q=0.5*np.ones(3),
r=np.zeros(self.act_space.shape)))
return ParallelTasks([
task_box,
task_check_bounds,
task_collision,
task_ts_discrepancy
], hold_rew_when_done=False)
@classmethod
def get_nominal_domain_param(cls):
return dict(box_length=0.18,
box_width=0.14,
box_mass=0.3,
box_friction_coefficient=1.4,
basket_mass=0.5,
basket_friction_coefficient=0.6)
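`get_nominal_domain_param` exposes the simulator's nominal physics parameters, which is the natural hook point for domain randomization. A hedged sketch (not Pyrado's actual randomizer API; `sample_domain_params` is a helper invented here) that perturbs each parameter uniformly within ±10 % of nominal:

```python
import numpy as np

nominal = dict(box_length=0.18, box_width=0.14, box_mass=0.3,
               box_friction_coefficient=1.4, basket_mass=0.5,
               basket_friction_coefficient=0.6)

def sample_domain_params(nominal, rel_halfspan=0.1, rng=None):
    """Draw one randomized parameter set around the nominal values."""
    rng = rng if rng is not None else np.random.RandomState(0)
    return {key: val * (1. + rng.uniform(-rel_halfspan, rel_halfspan))
            for key, val in nominal.items()}

params = sample_domain_params(nominal)
```

Relative perturbation keeps every sampled value positive and scales the noise to each parameter's magnitude.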
class BoxLiftingPosMPsSim(BoxLiftingSim, Serializable):
""" Humanoid robot lifting a box out of a basket using two arms and position-level movement primitives """
name: str = 'bl-pos'
def __init__(self,
ref_frame: str,
mps_left: [Sequence[dict], None],
mps_right: [Sequence[dict], None],
continuous_rew_fcn: bool = True,
fixed_init_state: bool = False,
**kwargs):
"""
Constructor
:param ref_frame: reference frame for the position and orientation MPs, e.g. 'world', 'basket', or 'box'
:param mps_left: left arm's movement primitives holding the dynamical systems and the goal states
:param mps_right: right arm's movement primitives holding the dynamical systems and the goal states
:param continuous_rew_fcn: specify if the continuous or an uninformative reward function should be used
:param fixed_init_state: use an init state space with only one state (e.g. for debugging)
:param kwargs: keyword arguments which are available for all task-based `RcsSim`
taskCombinationMethod: str = 'mean', # 'sum', 'mean', 'product', or 'softmax'
checkJointLimits: bool = False,
collisionAvoidanceIK: bool = True,
observeVelocities: bool = True,
observeCollisionCost: bool = True,
observePredictedCollisionCost: bool = False,
observeManipulabilityIndex: bool = False,
observeCurrentManipulability: bool = True,
observeDynamicalSystemDiscrepancy: bool = False,
observeTaskSpaceDiscrepancy: bool = True,
observeForceTorque: bool = True
"""
Serializable._init(self, locals())
# Fall back to some defaults if no MPs are defined (e.g. for testing)
# basket_extends = self.get_body_extents('Basket', 0)
if mps_left is None:
mps_left = [
# Power grasp position in basket frame (basket width = 0.7)
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([0., 0.5, 0.15])}, # [m]
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([0., -0.3, 0.15])}, # [m]
# Power grasp position in box frame (box width = 0.18)
# {'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
# 'goal': np.array([0., 0., 0.1])}, # [m]
# Power grasp orientation in basket frame
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.pi/180*np.array([180, -90, 0.])}, # [rad]
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.pi/180*np.array([120, -90, 0.])}, # [rad]
# Joints SDH
{'function': 'msd_nlin', 'attractorStiffness': 50., 'mass': 1., 'damping': 50.,
'goal': 10/180*np.pi*np.array([0, 2, -1.5, 2, 0, 2, 0])},
]
if mps_right is None:
mps_right = [
# Power grasp position in basket frame (basket width = 0.7)
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([0., -0.5, 0.15])}, # [m]
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([0., 0.3, 0.15])}, # [m]
# Power grasp orientation
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.pi/180*np.array([180, -90, 0.])}, # [rad]
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.pi/180*np.array([240, -90, 0.])}, # [rad]
# Joints SDH
{'function': 'msd_nlin', 'attractorStiffness': 50., 'mass': 1., 'damping': 50.,
'goal': 10/180*np.pi*np.array([0, 1.5, -1, 1, 0, 1.5, 0])},
# Distance
# {'function': 'msd', 'attractorStiffness': 50., 'mass': 1., 'damping': 10.,
{'function': 'lin', 'errorDynamics': 1., # [m/s]
'goal': np.array([0.0])}, # [m]
]
# Forward to the BoxLiftingSim's constructor
super().__init__(
task_args=dict(continuous_rew_fcn=continuous_rew_fcn),
ref_frame=ref_frame,
position_mps=True,
mps_left=mps_left,
mps_right=mps_right,
**kwargs
)
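The `'msd'`/`'msd_nlin'` movement primitives above are parameterized like attractor dynamics (`attractorStiffness`, `mass`, `damping`). A linear mass-spring-damper sketch shows why such a primitive converges to its `goal`; the nonlinear `msd_nlin` variant presumably shapes the transient differently, so this is only an illustration:

```python
def simulate_msd(goal, stiffness=30., mass=1., damping=60., x0=0., dt=0.01, steps=2000):
    """Semi-implicit Euler integration of x'' = (k*(goal - x) - d*x') / m."""
    x, xd = x0, 0.
    for _ in range(steps):
        xdd = (stiffness * (goal - x) - damping * xd) / mass
        xd += dt * xdd
        x += dt * xd
    return x

x_final = simulate_msd(goal=0.15)   # e.g. the 0.15 m grasp height used above
```

With the stiffness/damping values used in the MP definitions above the system is overdamped, so it approaches the goal without overshoot.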
class BoxLiftingVelMPsSim(BoxLiftingSim, Serializable):
""" Humanoid robot lifting a box out of a basket using two arms and velocity-level movement primitives """
name: str = 'bl-vel'
def __init__(self,
ref_frame: str,
mps_left: [Sequence[dict], None],
mps_right: [Sequence[dict], None],
continuous_rew_fcn: bool = True,
fixed_init_state: bool = False,
**kwargs):
"""
Constructor
:param ref_frame: reference frame for the position and orientation MPs, e.g. 'world', 'basket', or 'box'
:param mps_left: left arm's movement primitives holding the dynamical systems and the goal states
:param mps_right: right arm's movement primitives holding the dynamical systems and the goal states
:param continuous_rew_fcn: specify if the continuous or an uninformative reward function should be used
:param fixed_init_state: use an init state space with only one state (e.g. for debugging)
:param kwargs: keyword arguments which are available for all task-based `RcsSim`
taskCombinationMethod: str = 'mean', # 'sum', 'mean', 'product', or 'softmax'
checkJointLimits: bool = False,
collisionAvoidanceIK: bool = True,
observeCollisionCost: bool = True,
observeVelocities: bool = True,
observePredictedCollisionCost: bool = False,
observeManipulabilityIndex: bool = False,
observeCurrentManipulability: bool = True,
observeDynamicalSystemDiscrepancy: bool = False,
observeTaskSpaceDiscrepancy: bool = True,
observeForceTorque: bool = True
"""
Serializable._init(self, locals())
# Fall back to some defaults if no MPs are defined (e.g. for testing)
dt = kwargs.get('dt', 0.01) # 100 Hz is the default
# basket_extends = self.get_body_extents('Basket', 0)
if mps_left is None:
mps_left = [
# Power grasp Xd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.15])}, # [m/s]
# Power grasp Yd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.15])}, # [m/s]
# Power grasp Zd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.15])}, # [m/s]
# Power grasp Ad
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([15/180*np.pi])}, # [rad/s]
# Power grasp Bd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([15/180*np.pi])}, # [rad/s]
# Power grasp Cd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([15/180*np.pi])}, # [rad/s]
# Joints SDH
{'function': 'msd_nlin', 'attractorStiffness': 50., 'mass': 2., 'damping': 50.,
'goal': 10/180*np.pi*np.array([0, 2, -1.5, 2, 0, 2, 0])},
]
if mps_right is None:
mps_right = [
# Power grasp Xd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.15])}, # [m/s]
# Power grasp Yd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.15])}, # [m/s]
# Power grasp Zd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.15])}, # [m/s]
# Power grasp Ad
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([15/180*np.pi])}, # [rad/s]
# Power grasp Bd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([15/180*np.pi])}, # [rad/s]
# Power grasp Cd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([15/180*np.pi])}, # [rad/s]
# Joints SDH
{'function': 'msd_nlin', 'attractorStiffness': 50., 'mass': 2., 'damping': 50.,
'goal': 10/180*np.pi*np.array([0, 1.5, -1, 1, 0, 1.5, 0])},
]
# Forward to the BoxLiftingSim's constructor
super().__init__(
task_args=dict(continuous_rew_fcn=continuous_rew_fcn),
ref_frame=ref_frame,
position_mps=False,
mps_left=mps_left,
mps_right=mps_right,
**kwargs
)
class BoxLiftingSimpleSim(RcsSim, Serializable):
""" Base class for simplified robotic manipulator turning a box in a basket """
def __init__(self,
task_args: dict,
ref_frame: str,
position_mps: bool,
mps_left: [Sequence[dict], None],
**kwargs):
"""
Constructor
.. note::
This constructor should only be called via the subclasses.
:param task_args: arguments for the task construction
:param ref_frame: reference frame for the position and orientation MPs, e.g. 'world', 'basket', or 'box'
:param position_mps: `True` if the MPs are defined on position level, `False` if defined on velocity level
:param mps_left: left arm's movement primitives holding the dynamical systems and the goal states
:param kwargs: keyword arguments which are available for all task-based `RcsSim`
taskCombinationMethod: str = 'mean', # 'sum', 'mean', 'product', or 'softmax'
checkJointLimits: bool = False,
collisionAvoidanceIK: bool = True,
observeVelocities: bool = False,
observeCollisionCost: bool = True,
observePredictedCollisionCost: bool = False,
observeManipulabilityIndex: bool = False,
observeCurrentManipulability: bool = True,
observeDynamicalSystemDiscrepancy: bool = False,
observeTaskSpaceDiscrepancy: bool = True,
observeForceTorque: bool = True
"""
Serializable._init(self, locals())
if kwargs.get('collisionConfig', None) is None:
kwargs.update(collisionConfig={
'pairs': [
{'body1': 'Hand', 'body2': 'Table'},
],
'threshold': 0.07
})
# Forward to the RcsSim's constructor
RcsSim.__init__(
self,
envType='BoxLiftingSimple',
physicsConfigFile='pBoxLifting.xml',
extraConfigDir=osp.join(rcsenv.RCSPYSIM_CONFIG_PATH, 'BoxLifting'),
hudColor='BLACK_RUBBER',
task_args=task_args,
refFrame=ref_frame,
positionTasks=position_mps,
tasksLeft=mps_left,
**kwargs
)
def _create_task(self, task_args: dict) -> Task:
# Create the tasks
continuous_rew_fcn = task_args.get('continuous_rew_fcn', True)
task_box = create_box_flip_task(self.spec, continuous_rew_fcn)
task_check_bounds = create_check_all_boundaries_task(self.spec, penalty=1e3)
# task_collision = create_collision_task(self.spec, factor=5e-2)
from pyrado.environments.rcspysim.box_flipping import create_task_space_discrepancy_task
task_ts_discrepancy = create_task_space_discrepancy_task(self.spec,
AbsErrRewFcn(q=5e-2*np.ones(2),
r=np.zeros(self.act_space.shape)))
return ParallelTasks([
task_box,
task_check_bounds,
# task_collision,
task_ts_discrepancy
], hold_rew_when_done=False)
@classmethod
def get_nominal_domain_param(cls):
return dict(box_length=0.14, # x_world dimension
box_width=0.18, # y_world dimension
box_mass=0.4,
box_friction_coefficient=1.3,
basket_mass=0.5,
basket_friction_coefficient=0.9)
class BoxLiftingSimplePosMPsSim(BoxLiftingSimpleSim, Serializable):
""" Simplified robotic manipulator turning a box in a basket using position-level movement primitives """
name: str = 'bls-pos'
def __init__(self,
ref_frame: str,
mps_left: [Sequence[dict], None],
continuous_rew_fcn: bool = True,
**kwargs):
"""
Constructor
:param ref_frame: reference frame for the position and orientation MPs, e.g. 'world', 'basket', or 'box'
:param mps_left: left arm's movement primitives holding the dynamical systems and the goal states
:param continuous_rew_fcn: specify if the continuous or an uninformative reward function should be used
:param kwargs: keyword arguments which are available for all task-based `RcsSim`
taskCombinationMethod: str = 'mean', # 'sum', 'mean', 'product', or 'softmax'
checkJointLimits: bool = False,
collisionAvoidanceIK: bool = True,
observeVelocities: bool = False,
observeCollisionCost: bool = True,
observePredictedCollisionCost: bool = False,
observeManipulabilityIndex: bool = False,
observeCurrentManipulability: bool = True,
observeDynamicalSystemDiscrepancy: bool = False,
observeTaskSpaceDiscrepancy: bool = True,
observeForceTorque: bool = True
"""
Serializable._init(self, locals())
# Fall back to some defaults if no MPs are defined (e.g. for testing)
if mps_left is None:
mps_left = [
# Y
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([-0.4])}, # [m]
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([+0.4])}, # [m]
# Z
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([-0.05])}, # [m]
{'function': 'msd_nlin', 'attractorStiffness': 30., 'mass': 1., 'damping': 60.,
'goal': np.array([+0.3])}, # [m]
]
# Forward to the BoxLiftingSimpleSim's constructor
super().__init__(
task_args=dict(continuous_rew_fcn=continuous_rew_fcn),
ref_frame=ref_frame,
position_mps=True,
mps_left=mps_left,
**kwargs
)
class BoxLiftingSimpleVelMPsSim(BoxLiftingSimpleSim, Serializable):
""" Simplified robotic manipulator turning a box in a basket using velocity-level movement primitives """
name: str = 'bls-vel'
def __init__(self,
ref_frame: str,
mps_left: [Sequence[dict], None],
continuous_rew_fcn: bool = True,
**kwargs):
"""
Constructor
:param ref_frame: reference frame for the position and orientation MPs, e.g. 'world', 'basket', or 'box'
:param mps_left: left arm's movement primitives holding the dynamical systems and the goal states
:param continuous_rew_fcn: specify if the continuous or an uninformative reward function should be used
:param kwargs: keyword arguments which are available for all task-based `RcsSim`
taskCombinationMethod: str = 'mean', # 'sum', 'mean', 'product', or 'softmax'
checkJointLimits: bool = False,
collisionAvoidanceIK: bool = True,
observeVelocities: bool = False,
observeCollisionCost: bool = True,
observePredictedCollisionCost: bool = False,
observeManipulabilityIndex: bool = False,
observeCurrentManipulability: bool = True,
observeDynamicalSystemDiscrepancy: bool = False,
observeTaskSpaceDiscrepancy: bool = True,
observeForceTorque: bool = True
"""
Serializable._init(self, locals())
# Fall back to some defaults if no MPs are defined (e.g. for testing)
dt = kwargs.get('dt', 0.01) # 100 Hz is the default
# basket_extends = self.get_body_extents('Basket', 0)
if mps_left is None:
mps_left = [
# Yd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.1])}, # [m/s]
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([-0.1])}, # [m/s]
# Zd
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([0.1])}, # [m/s]
{'function': 'lin', 'errorDynamics': 1., 'goal': dt*np.array([-0.1])}, # [m/s]
]
# Forward to the BoxLiftingSimpleSim's constructor
super().__init__(
task_args=dict(continuous_rew_fcn=continuous_rew_fcn),
ref_frame=ref_frame,
position_mps=False,
mps_left=mps_left,
**kwargs
)
| 48.222635 | 118 | 0.565251 | 2,849 | 25,992 | 5.002808 | 0.11934 | 0.020627 | 0.016839 | 0.03936 | 0.856171 | 0.835052 | 0.818073 | 0.803129 | 0.791763 | 0.786852 | 0 | 0.025336 | 0.324254 | 25,992 | 538 | 119 | 48.312268 | 0.786154 | 0.384157 | 0 | 0.622449 | 0 | 0 | 0.102822 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040816 | false | 0 | 0.064626 | 0.006803 | 0.159864 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
90d3a76886172f0d9418858ca88bfecbf99f9a81 | 34,855 | py | Python | sdc/tests/test_groupby.py | akharche/hpat | c7889893b49f7b251cd9f0a0889107593d8f1c4a | [
"BSD-2-Clause"
] | null | null | null | sdc/tests/test_groupby.py | akharche/hpat | c7889893b49f7b251cd9f0a0889107593d8f1c4a | [
"BSD-2-Clause"
] | null | null | null | sdc/tests/test_groupby.py | akharche/hpat | c7889893b49f7b251cd9f0a0889107593d8f1c4a | [
"BSD-2-Clause"
] | null | null | null | # *****************************************************************************
# Copyright (c) 2020, Intel Corporation All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimer.
#
# Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
# THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR
# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS;
# OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR
# OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# *****************************************************************************
import numba
import numpy as np
import pandas as pd
import platform
import pyarrow.parquet as pq
import unittest
from itertools import product
import sdc
from sdc.tests.test_base import TestCase
from sdc.tests.test_utils import (count_array_OneDs,
count_array_REPs,
count_parfor_OneDs,
count_parfor_REPs,
dist_IR_contains,
get_start_end,
skip_numba_jit,
sdc_limitation)
from sdc.tests.test_series import gen_frand_array
_pivot_df1 = pd.DataFrame({"A": ["foo", "foo", "foo", "foo", "foo",
"bar", "bar", "bar", "bar"],
"B": ["one", "one", "one", "two", "two",
"one", "one", "two", "two"],
"C": ["small", "large", "large", "small",
"small", "large", "small", "small",
"large"],
"D": [1, 2, 2, 6, 3, 4, 5, 6, 9]})
_default_df_numeric_data = {
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11, dtype=np.intp),
'C': np.arange(11, dtype=np.float_),
'D': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'E': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
}
class TestGroupBy(TestCase):
@sdc_limitation
def test_dataframe_groupby_index_name(self):
"""SDC indexes do not have names, so index created from a named Series looses it's name."""
def test_impl(df):
return df.groupby('A').min()
hpat_func = self.jit(test_impl)
n = 11
df = pd.DataFrame({
'A': [2, 1, 1, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(n, dtype=np.intp)
})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_frame_equal(result, result_ref)
def test_dataframe_groupby_by_all_dtypes(self):
def test_impl(df):
return df.groupby('A').count()
hpat_func = self.jit(test_impl)
dtype_to_column_data = {
'int': [2, 1, 1, 1, 2, 2, 1, 0, 3, 1, 3],
'float': [2, 1, 1, 1, 2, 2, 1, 3, np.nan, 1, np.nan],
'string': ['b', 'a', 'a', 'a', 'b', 'b', 'a', ' ', None, 'a', None]
}
df = pd.DataFrame(_default_df_numeric_data)
for dtype, col_data in dtype_to_column_data.items():
with self.subTest(by_dtype=dtype, by_data=col_data):
df['A'] = col_data
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_sort(self):
def test_impl(df, param):
return df.groupby('A', sort=param).min()
hpat_func = self.jit(test_impl)
n, m = 1000, 20
np.random.seed(0)
df = pd.DataFrame({
'A': np.random.choice(np.arange(m), n),
'B': np.arange(n, dtype=np.intp),
'C': np.arange(n, dtype=np.float_),
'D': gen_frand_array(n, nancount=n // 2),
})
for value in [True, False]:
with self.subTest(sort=value):
result = hpat_func(df, value) if value else hpat_func(df, value).sort_index()
result_ref = test_impl(df, value) if value else test_impl(df, value).sort_index()
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_count(self):
def test_impl(df):
return df.groupby('A').count()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_count_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').count()
sdc_impl = self.jit(test_impl)
result_jit = sdc_impl()
result_ref = test_impl()
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result_jit, result_ref, check_names=False)
def test_dataframe_groupby_max(self):
def test_impl(df):
return df.groupby('A').max()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_max_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').max()
sdc_impl = self.jit(test_impl)
# TODO: implement index classes, as current indexes do not have names
kwargs = {'check_names': False}
if platform.system() == 'Windows':
# Attribute "dtype" are different on windows int64 vs int32
kwargs['check_dtype'] = False
pd.testing.assert_frame_equal(sdc_impl(), test_impl(), **kwargs)
def test_dataframe_groupby_min(self):
def test_impl(df):
return df.groupby('A').min()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_min_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').min()
sdc_impl = self.jit(test_impl)
# TODO: implement index classes, as current indexes do not have names
kwargs = {'check_names': False}
if platform.system() == 'Windows':
# Attribute "dtype" are different on windows int64 vs int32
kwargs['check_dtype'] = False
pd.testing.assert_frame_equal(sdc_impl(), test_impl(), **kwargs)
def test_dataframe_groupby_mean(self):
def test_impl(df):
return df.groupby('A').mean()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_mean_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').mean()
sdc_impl = self.jit(test_impl)
result_jit = sdc_impl()
result_ref = test_impl()
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result_jit, result_ref, check_names=False)
def test_dataframe_groupby_median(self):
def test_impl(df):
return df.groupby('A').median()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_median_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').median()
sdc_impl = self.jit(test_impl)
result_jit = sdc_impl()
result_ref = test_impl()
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result_jit, result_ref, check_names=False)
@unittest.expectedFailure # pandas groupby.median returns unstable dtype (int or float) unlike series.median
def test_dataframe_groupby_median_result_dtype(self):
def test_impl(df):
return df.groupby('A').median()
hpat_func = self.jit(test_impl)
n = 11
df = pd.DataFrame({
'A': [2, 1, 1, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(n, dtype=np.intp)
})
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_prod(self):
def test_impl(df):
return df.groupby('A').prod()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_prod_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').prod()
sdc_impl = self.jit(test_impl)
# TODO: implement index classes, as current indexes do not have names
kwargs = {'check_names': False}
if platform.system() == 'Windows':
# Attribute "dtype" are different on windows int64 vs int32
kwargs['check_dtype'] = False
pd.testing.assert_frame_equal(sdc_impl(), test_impl(), **kwargs)
@skip_numba_jit("BUG: SDC impl of Series.sum returns float64 on as series of ints")
def test_dataframe_groupby_sum(self):
def test_impl(df):
return df.groupby('A').sum()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_sum_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').sum()
sdc_impl = self.jit(test_impl)
# TODO: implement index classes, as current indexes do not have names
# Attribute "dtype" are different int64 vs int32
kwargs = {'check_names': False, 'check_dtype': False}
pd.testing.assert_frame_equal(sdc_impl(), test_impl(), **kwargs)
def test_dataframe_groupby_std(self):
def test_impl(df):
return df.groupby('A').std()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_std_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').std()
sdc_impl = self.jit(test_impl)
result_jit = sdc_impl()
result_ref = test_impl()
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result_jit, result_ref, check_names=False)
def test_dataframe_groupby_var(self):
def test_impl(df):
return df.groupby('A').var()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_var_no_unboxing(self):
def test_impl():
df = pd.DataFrame({
'A': [2, 1, 2, 1, 2, 2, 1, 0, 3, 1, 3],
'B': np.arange(11),
'C': [np.nan, 2., -1.3, np.nan, 3.5, 0, 10, 0.42, np.nan, -2.5, 23],
'D': [np.inf, 2., -1.3, -np.inf, 3.5, 0, 10, 0.42, np.nan, -2.5, 23]
})
return df.groupby('A').var()
sdc_impl = self.jit(test_impl)
result_jit = sdc_impl()
result_ref = test_impl()
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result_jit, result_ref, check_names=False)
@skip_numba_jit
def test_agg_seq(self):
def test_impl(df):
A = df.groupby('A')['B'].agg(lambda x: x.max() - x.min())
return A.values
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
# np.testing.assert_array_equal(hpat_func(df), test_impl(df))
self.assertEqual(set(hpat_func(df)), set(test_impl(df)))
@skip_numba_jit("BUG: SDC impl of Series.sum returns float64 on as series of ints")
def test_agg_seq_sum(self):
def test_impl(df):
return df.groupby('A')['B'].sum()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_agg_seq_count(self):
def test_impl(df):
return df.groupby('A')['B'].count()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
def test_agg_seq_mean(self):
def test_impl(df):
return df.groupby('A')['B'].mean()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
def test_agg_seq_median(self):
def test_impl(df):
return df.groupby('A')['B'].median()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
def test_agg_seq_min(self):
def test_impl(df):
return df.groupby('A')['B'].min()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
@skip_numba_jit
def test_agg_seq_min_date(self):
def test_impl(df):
df2 = df.groupby('A', as_index=False).min()
return df2
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': pd.date_range('2019-1-3', '2019-1-9')})
self.assertEqual(set(hpat_func(df)), set(test_impl(df)))
def test_agg_seq_max(self):
def test_impl(df):
return df.groupby('A')['B'].max()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
@skip_numba_jit
def test_agg_seq_as_index(self):
def test_impl(df):
df2 = df.groupby('A', as_index=False).mean()
return df2.A.values
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
self.assertEqual(set(hpat_func(df)), set(test_impl(df)))
def test_agg_seq_prod(self):
def test_impl(df):
return df.groupby('A')['B'].prod()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
def test_agg_seq_var(self):
def test_impl(df):
return df.groupby('A')['B'].var()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
def test_agg_seq_std(self):
def test_impl(df):
return df.groupby('A')['B'].std()
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
result = hpat_func(df)
result_ref = test_impl(df)
pd.testing.assert_series_equal(result, result_ref, check_names=False)
@skip_numba_jit
def test_agg_multikey_seq(self):
def test_impl(df):
A = df.groupby(['A', 'C'])['B'].sum()
return A.values
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7],
'C': [3, 5, 6, 5, 4, 4, 3]})
self.assertEqual(set(hpat_func(df)), set(test_impl(df)))
@skip_numba_jit
def test_agg_multikey_parallel(self):
def test_impl(in_A, in_B, in_C):
df = pd.DataFrame({'A': in_A, 'B': in_B, 'C': in_C})
A = df.groupby(['A', 'C'])['B'].sum()
return A.sum()
hpat_func = self.jit(locals={'in_A:input': 'distributed',
'in_B:input': 'distributed',
'in_C:input': 'distributed'})(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7],
'C': [3, 5, 6, 5, 4, 4, 3]})
start, end = get_start_end(len(df))
h_A = df.A.values[start:end]
h_B = df.B.values[start:end]
h_C = df.C.values[start:end]
p_A = df.A.values
p_B = df.B.values
p_C = df.C.values
h_res = hpat_func(h_A, h_B, h_C)
p_res = test_impl(p_A, p_B, p_C)
self.assertEqual(h_res, p_res)
@skip_numba_jit
def test_agg_parallel(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].agg(lambda x: x.max() - x.min())
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_sum(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].sum()
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_count(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].count()
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_mean(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].mean()
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_min(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].min()
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_max(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].max()
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_var(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].var()
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_std(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
A = df.groupby('A')['B'].std()
return A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@unittest.skip('AssertionError - fix needed\n'
'16 != 20\n')
def test_agg_parallel_str(self):
def test_impl():
df = pq.read_table("groupby3.pq").to_pandas()
A = df.groupby('A')['B'].agg(lambda x: x.max() - x.min())
return A.sum()
hpat_func = self.jit(test_impl)
self.assertEqual(hpat_func(), test_impl())
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_all_col(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
df2 = df.groupby('A').max()
return df2.B.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_agg_parallel_as_index(self):
def test_impl(n):
df = pd.DataFrame({'A': np.ones(n, np.int64), 'B': np.arange(n)})
df2 = df.groupby('A', as_index=False).max()
return df2.A.sum()
hpat_func = self.jit(test_impl)
n = 11
self.assertEqual(hpat_func(n), test_impl(n))
self.assertEqual(count_array_REPs(), 0)
self.assertEqual(count_parfor_REPs(), 0)
@skip_numba_jit
def test_muti_hiframes_node_filter_agg(self):
def test_impl(df, cond):
df2 = df[cond]
c = df2.groupby('A')['B'].count()
return df2.C, c
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7], 'C': [2, 3, -1, 1, 2, 3, -1]})
cond = df.A > 1
res = test_impl(df, cond)
h_res = hpat_func(df, cond)
self.assertEqual(set(res[1]), set(h_res[1]))
np.testing.assert_array_equal(res[0], h_res[0])
@skip_numba_jit
def test_agg_seq_str(self):
def test_impl(df):
A = df.groupby('A')['B'].agg(lambda x: (x == 'aa').sum())
return A.values
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': ['aa', 'b', 'b', 'b', 'aa', 'aa', 'b'],
'B': ['ccc', 'a', 'bb', 'aa', 'dd', 'ggg', 'rr']})
# np.testing.assert_array_equal(hpat_func(df), test_impl(df))
self.assertEqual(set(hpat_func(df)), set(test_impl(df)))
@skip_numba_jit
def test_agg_seq_count_str(self):
def test_impl(df):
A = df.groupby('A')['B'].count()
return A.values
hpat_func = self.jit(test_impl)
df = pd.DataFrame({'A': ['aa', 'b', 'b', 'b', 'aa', 'aa', 'b'],
'B': ['ccc', 'a', 'bb', 'aa', 'dd', 'ggg', 'rr']})
# np.testing.assert_array_equal(hpat_func(df), test_impl(df))
self.assertEqual(set(hpat_func(df)), set(test_impl(df)))
@skip_numba_jit
def test_pivot(self):
def test_impl(df):
pt = df.pivot_table(index='A', columns='C', values='D', aggfunc='sum')
return (pt.small.values, pt.large.values)
hpat_func = self.jit(pivots={'pt': ['small', 'large']})(test_impl)
self.assertEqual(
set(hpat_func(_pivot_df1)[0]), set(test_impl(_pivot_df1)[0]))
self.assertEqual(
set(hpat_func(_pivot_df1)[1]), set(test_impl(_pivot_df1)[1]))
@skip_numba_jit
def test_pivot_parallel(self):
def test_impl():
df = pd.read_parquet("pivot2.pq")
pt = df.pivot_table(index='A', columns='C', values='D', aggfunc='sum')
res = pt.small.values.sum()
return res
hpat_func = self.jit(
pivots={'pt': ['small', 'large']})(test_impl)
self.assertEqual(hpat_func(), test_impl())
@skip_numba_jit
def test_crosstab1(self):
def test_impl(df):
pt = pd.crosstab(df.A, df.C)
return (pt.small.values, pt.large.values)
hpat_func = self.jit(pivots={'pt': ['small', 'large']})(test_impl)
self.assertEqual(
set(hpat_func(_pivot_df1)[0]), set(test_impl(_pivot_df1)[0]))
self.assertEqual(
set(hpat_func(_pivot_df1)[1]), set(test_impl(_pivot_df1)[1]))
@skip_numba_jit
def test_crosstab_parallel1(self):
def test_impl():
df = pd.read_parquet("pivot2.pq")
pt = pd.crosstab(df.A, df.C)
res = pt.small.values.sum()
return res
hpat_func = self.jit(
pivots={'pt': ['small', 'large']})(test_impl)
self.assertEqual(hpat_func(), test_impl())
@unittest.skip("Implement groupby(lambda) for DataFrame")
def test_groupby_lambda(self):
def test_impl(df):
group = df.groupby(lambda x: x % 2 == 0)
return group.count()
df = pd.DataFrame({'A': [2, 1, 1, 1, 2, 2, 1], 'B': [-8, 2, 3, 1, 5, 6, 7]})
hpat_func = self.jit(test_impl)
pd.testing.assert_frame_equal(hpat_func(df), test_impl(df))
def test_dataframe_groupby_getitem_literal_tuple(self):
def test_impl(df):
return df.groupby('A')['B', 'C'].count()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_frame_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_getitem_literal_str(self):
def test_impl(df):
return df.groupby('C')['B'].count()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
result = hpat_func(df)
result_ref = test_impl(df)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_series_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_getitem_unicode_str(self):
def test_impl(df, col_name):
return df.groupby('A')[col_name].count()
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
col_name = 'C'
# pandas returns groupby.generic.SeriesGroupBy object in this case, hence align result_ref
result = hpat_func(df, col_name)
result_ref = test_impl(df, col_name)
# TODO: implement index classes, as current indexes do not have names
pd.testing.assert_series_equal(result, result_ref, check_names=False)
def test_dataframe_groupby_getitem_repeated(self):
def test_impl(df):
return df.groupby('A')['B', 'C']['D']
hpat_func = self.jit(test_impl)
df = pd.DataFrame(_default_df_numeric_data)
with self.assertRaises(Exception) as context:
test_impl(df)
pandas_exception = context.exception
self.assertRaises(type(pandas_exception), hpat_func, df)
def test_series_groupby_by_array(self):
def test_impl(A, data):
return A.groupby(data).count()
hpat_func = self.jit(test_impl)
data_to_test = [
[True, False, False, True, False, False, True, False, True, True, False],
[2, 1, 1, 1, 2, 2, 1, 0, 3, 1, 3],
[2, 1, 1, 1, 2, 2, 1, 3, np.nan, 1, np.nan],
['b', 'a', 'a', 'a', 'b', 'b', 'a', ' ', None, 'a', None]
]
for series_data, arr_data in product(data_to_test, data_to_test):
S = pd.Series(series_data)
by_arr = np.asarray(arr_data)
# arrays of dtype object cannot be jitted, so skip group by string data for now
if by_arr.dtype.name == 'object':
continue
with self.subTest(series_data=series_data, by_arr=by_arr):
result = hpat_func(S, by_arr)
result_ref = test_impl(S, by_arr)
pd.testing.assert_series_equal(result, result_ref)
@unittest.skip("getiter for this type is not implemented yet")
def test_series_groupby_iterator_int(self):
def test_impl():
A = pd.Series([13, 11, 21, 13, 13, 51, 42, 21])
grouped = A.groupby(A)
return [i for i in grouped]
hpat_func = self.jit(test_impl)
ref_result = test_impl()
result = hpat_func()
np.testing.assert_array_equal(result, ref_result)
if __name__ == "__main__":
unittest.main()
| 39.428733 | 114 | 0.560953 | 5,072 | 34,855 | 3.656743 | 0.06664 | 0.081091 | 0.062005 | 0.048525 | 0.812369 | 0.781528 | 0.769666 | 0.757373 | 0.749717 | 0.747237 | 0 | 0.036877 | 0.291235 | 34,855 | 883 | 115 | 39.473386 | 0.713892 | 0.109855 | 0 | 0.685377 | 0 | 0 | 0.02994 | 0 | 0 | 0 | 0 | 0.001133 | 0.129985 | 1 | 0.180207 | false | 0 | 0.016248 | 0.039882 | 0.288035 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
90d90b51c2e2b1b6ea5095029ea4b6f9ba8c1bdc | 25 | py | Python | leech/__init__.py | philipphager/leech | 89e9d1be5487f502289529aa59318f5ad8c94ed8 | [
"MIT"
] | 2 | 2018-04-12T06:08:35.000Z | 2018-04-14T05:53:33.000Z | leech/__init__.py | philipphager/leech | 89e9d1be5487f502289529aa59318f5ad8c94ed8 | [
"MIT"
] | 2 | 2018-04-12T07:46:34.000Z | 2018-04-12T07:46:58.000Z | leech/__init__.py | philipphager/leech | 89e9d1be5487f502289529aa59318f5ad8c94ed8 | [
"MIT"
] | null | null | null | from .leech import leech
| 12.5 | 24 | 0.8 | 4 | 25 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 25 | 1 | 25 | 25 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
292caed0910edfcd24f6535a91658873f479dc38 | 26,235 | py | Python | src/cicadad/protos/datastore_pb2_grpc.py | cicadatesting/cicada-distributed | cb9caa4107fd5da30e508f34e6e11d0f8f58c142 | [
"Apache-2.0"
] | 6 | 2021-07-12T20:53:13.000Z | 2022-01-14T19:34:25.000Z | src/cicadad/protos/datastore_pb2_grpc.py | cicadatesting/cicada-distributed | cb9caa4107fd5da30e508f34e6e11d0f8f58c142 | [
"Apache-2.0"
] | 9 | 2021-04-24T04:20:12.000Z | 2022-03-22T02:14:17.000Z | src/cicadad/protos/datastore_pb2_grpc.py | cicadatesting/cicada-distributed | cb9caa4107fd5da30e508f34e6e11d0f8f58c142 | [
"Apache-2.0"
] | null | null | null | # Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from cicadad.protos import datastore_pb2 as cicadad_dot_protos_dot_datastore__pb2
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
class DatastoreStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.AddTestEvent = channel.unary_unary(
'/datastore.Datastore/AddTestEvent',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.AddEventRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GetTestEvents = channel.unary_unary(
'/datastore.Datastore/GetTestEvents',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.GetEventsRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.Events.FromString,
)
self.AddUserResult = channel.unary_unary(
'/datastore.Datastore/AddUserResult',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.AddUserResultRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.SetScenarioResult = channel.unary_unary(
'/datastore.Datastore/SetScenarioResult',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.SetScenarioResultRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.MoveUserResults = channel.unary_unary(
'/datastore.Datastore/MoveUserResults',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.MoveUserResultsRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.MoveUserResultsResponse.FromString,
)
self.MoveScenarioResult = channel.unary_unary(
'/datastore.Datastore/MoveScenarioResult',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.MoveScenarioResultRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.MoveScenarioResultResponse.FromString,
)
self.DistributeWork = channel.unary_unary(
'/datastore.Datastore/DistributeWork',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.DistributeWorkRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GetUserWork = channel.unary_unary(
'/datastore.Datastore/GetUserWork',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.GetUserWorkRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetUserWorkResponse.FromString,
)
self.AddUserEvent = channel.unary_unary(
'/datastore.Datastore/AddUserEvent',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.AddEventRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GetUserEvents = channel.unary_unary(
'/datastore.Datastore/GetUserEvents',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.GetEventsRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.Events.FromString,
)
self.AddMetric = channel.unary_unary(
'/datastore.Datastore/AddMetric',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.AddMetricRequest.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GetMetricTotal = channel.unary_unary(
'/datastore.Datastore/GetMetricTotal',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.MetricTotalResponse.FromString,
)
self.GetLastMetric = channel.unary_unary(
'/datastore.Datastore/GetLastMetric',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.LastMetricResponse.FromString,
)
self.GetMetricRate = channel.unary_unary(
'/datastore.Datastore/GetMetricRate',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRateRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.MetricRateResponse.FromString,
)
self.GetMetricStatistics = channel.unary_unary(
'/datastore.Datastore/GetMetricStatistics',
request_serializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.SerializeToString,
response_deserializer=cicadad_dot_protos_dot_datastore__pb2.MetricStatisticsResponse.FromString,
)
class DatastoreServicer(object):
"""Missing associated documentation comment in .proto file."""
def AddTestEvent(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetTestEvents(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def AddUserResult(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SetScenarioResult(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def MoveUserResults(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def MoveScenarioResult(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DistributeWork(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetUserWork(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def AddUserEvent(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetUserEvents(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def AddMetric(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetMetricTotal(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetLastMetric(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetMetricRate(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetMetricStatistics(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_DatastoreServicer_to_server(servicer, server):
rpc_method_handlers = {
'AddTestEvent': grpc.unary_unary_rpc_method_handler(
servicer.AddTestEvent,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.AddEventRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GetTestEvents': grpc.unary_unary_rpc_method_handler(
servicer.GetTestEvents,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetEventsRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.Events.SerializeToString,
),
'AddUserResult': grpc.unary_unary_rpc_method_handler(
servicer.AddUserResult,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.AddUserResultRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'SetScenarioResult': grpc.unary_unary_rpc_method_handler(
servicer.SetScenarioResult,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.SetScenarioResultRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'MoveUserResults': grpc.unary_unary_rpc_method_handler(
servicer.MoveUserResults,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.MoveUserResultsRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.MoveUserResultsResponse.SerializeToString,
),
'MoveScenarioResult': grpc.unary_unary_rpc_method_handler(
servicer.MoveScenarioResult,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.MoveScenarioResultRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.MoveScenarioResultResponse.SerializeToString,
),
'DistributeWork': grpc.unary_unary_rpc_method_handler(
servicer.DistributeWork,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.DistributeWorkRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GetUserWork': grpc.unary_unary_rpc_method_handler(
servicer.GetUserWork,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetUserWorkRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.GetUserWorkResponse.SerializeToString,
),
'AddUserEvent': grpc.unary_unary_rpc_method_handler(
servicer.AddUserEvent,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.AddEventRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GetUserEvents': grpc.unary_unary_rpc_method_handler(
servicer.GetUserEvents,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetEventsRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.Events.SerializeToString,
),
'AddMetric': grpc.unary_unary_rpc_method_handler(
servicer.AddMetric,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.AddMetricRequest.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GetMetricTotal': grpc.unary_unary_rpc_method_handler(
servicer.GetMetricTotal,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.MetricTotalResponse.SerializeToString,
),
'GetLastMetric': grpc.unary_unary_rpc_method_handler(
servicer.GetLastMetric,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.LastMetricResponse.SerializeToString,
),
'GetMetricRate': grpc.unary_unary_rpc_method_handler(
servicer.GetMetricRate,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRateRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.MetricRateResponse.SerializeToString,
),
'GetMetricStatistics': grpc.unary_unary_rpc_method_handler(
servicer.GetMetricStatistics,
request_deserializer=cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.FromString,
response_serializer=cicadad_dot_protos_dot_datastore__pb2.MetricStatisticsResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'datastore.Datastore', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class Datastore(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def AddTestEvent(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/AddTestEvent',
cicadad_dot_protos_dot_datastore__pb2.AddEventRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetTestEvents(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/GetTestEvents',
cicadad_dot_protos_dot_datastore__pb2.GetEventsRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.Events.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def AddUserResult(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/AddUserResult',
cicadad_dot_protos_dot_datastore__pb2.AddUserResultRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def SetScenarioResult(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/SetScenarioResult',
cicadad_dot_protos_dot_datastore__pb2.SetScenarioResultRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def MoveUserResults(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/MoveUserResults',
cicadad_dot_protos_dot_datastore__pb2.MoveUserResultsRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.MoveUserResultsResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def MoveScenarioResult(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/MoveScenarioResult',
cicadad_dot_protos_dot_datastore__pb2.MoveScenarioResultRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.MoveScenarioResultResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def DistributeWork(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/DistributeWork',
cicadad_dot_protos_dot_datastore__pb2.DistributeWorkRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetUserWork(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/GetUserWork',
cicadad_dot_protos_dot_datastore__pb2.GetUserWorkRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.GetUserWorkResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def AddUserEvent(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/AddUserEvent',
cicadad_dot_protos_dot_datastore__pb2.AddEventRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetUserEvents(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/GetUserEvents',
cicadad_dot_protos_dot_datastore__pb2.GetEventsRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.Events.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def AddMetric(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/AddMetric',
cicadad_dot_protos_dot_datastore__pb2.AddMetricRequest.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetMetricTotal(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/GetMetricTotal',
cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.MetricTotalResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetLastMetric(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/GetLastMetric',
cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.LastMetricResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetMetricRate(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/GetMetricRate',
cicadad_dot_protos_dot_datastore__pb2.GetMetricRateRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.MetricRateResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetMetricStatistics(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/datastore.Datastore/GetMetricStatistics',
cicadad_dot_protos_dot_datastore__pb2.GetMetricRequest.SerializeToString,
cicadad_dot_protos_dot_datastore__pb2.MetricStatisticsResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
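The generated `DatastoreStub` above follows a single pattern throughout: its constructor asks the channel for one callable per fully-qualified RPC name, pairing each with a request serializer and a response deserializer. A minimal stdlib-only sketch of that wiring, with a hypothetical `FakeChannel` standing in for `grpc.Channel` (none of these names are grpc APIs):

```python
# Sketch of the wiring pattern used by generated gRPC stubs: the stub's
# constructor asks a channel for one callable per '/service/Method' name.
# FakeChannel and FakeDatastoreStub are hypothetical stand-ins for demo only.

class FakeChannel:
    def __init__(self, handlers):
        # maps '/service/Method' -> callable taking a request, returning a reply
        self._handlers = handlers

    def unary_unary(self, method, request_serializer=None,
                    response_deserializer=None):
        handler = self._handlers[method]

        def call(request):
            # serialize, dispatch, deserialize; identity funcs when unset
            wire = (request_serializer or (lambda r: r))(request)
            reply = handler(wire)
            return (response_deserializer or (lambda r: r))(reply)

        return call


class FakeDatastoreStub:
    """Mirrors the constructor shape of the generated DatastoreStub."""

    def __init__(self, channel):
        self.AddTestEvent = channel.unary_unary(
            '/datastore.Datastore/AddTestEvent')


channel = FakeChannel({
    '/datastore.Datastore/AddTestEvent':
        lambda req: {'ok': True, 'echo': req},
})
stub = FakeDatastoreStub(channel)
result = stub.AddTestEvent({'event': 'started'})
print(result['ok'])
```

With a real `grpc.Channel`, the serializer pair would be the protobuf `SerializeToString`/`FromString` methods seen in the generated file, and the channel would put the serialized request on the wire instead of dispatching to an in-process handler.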
| 49.5 | 123 | 0.682752 | 2,345 | 26,235 | 7.266098 | 0.054158 | 0.052116 | 0.068549 | 0.081401 | 0.875697 | 0.844885 | 0.841423 | 0.7896 | 0.673748 | 0.637244 | 0 | 0.004774 | 0.249438 | 26,235 | 529 | 124 | 49.593573 | 0.860545 | 0.047875 | 0 | 0.615551 | 1 | 0 | 0.078746 | 0.041928 | 0 | 0 | 0 | 0 | 0 | 1 | 0.069114 | false | 0 | 0.006479 | 0.032397 | 0.114471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
2961ef2e3b4b3491899de158441864bde07be3a2 | 20 | py | Python | msgtracker/endpoints/__init__.py | mpillar/msg-tracker | 16edb9d555795d0eec625dd954e14f914cbbbe2b | [
"MIT"
] | null | null | null | msgtracker/endpoints/__init__.py | mpillar/msg-tracker | 16edb9d555795d0eec625dd954e14f914cbbbe2b | [
"MIT"
] | null | null | null | msgtracker/endpoints/__init__.py | mpillar/msg-tracker | 16edb9d555795d0eec625dd954e14f914cbbbe2b | [
"MIT"
] | null | null | null | from . import slack
| 10 | 19 | 0.75 | 3 | 20 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 20 | 1 | 20 | 20 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
296798d4c7b6fd649de9ae9e5d76dd5f38eaa7e0 | 30 | py | Python | messaging_components/brokers/artemis/management/__init__.py | fgiorgetti/qpid-dispatch-tests | 164c609d28db87692eed53d5361aa1ee5c97375c | [
"Apache-2.0"
] | null | null | null | messaging_components/brokers/artemis/management/__init__.py | fgiorgetti/qpid-dispatch-tests | 164c609d28db87692eed53d5361aa1ee5c97375c | [
"Apache-2.0"
] | null | null | null | messaging_components/brokers/artemis/management/__init__.py | fgiorgetti/qpid-dispatch-tests | 164c609d28db87692eed53d5361aa1ee5c97375c | [
"Apache-2.0"
] | null | null | null | from .jolokia_client import *
| 15 | 29 | 0.8 | 4 | 30 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 1 | 30 | 30 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
466a5f643ce0af3c0d843317db00fa916b3a8405 | 40 | py | Python | doepy/param_covar/__init__.py | scwolof/doepy | acb2cad95428de2c14b28563cff1aa30679e1f39 | [
"MIT"
] | 1 | 2020-04-23T13:43:35.000Z | 2020-04-23T13:43:35.000Z | doepy/param_covar/__init__.py | scwolof/doepy | acb2cad95428de2c14b28563cff1aa30679e1f39 | [
"MIT"
] | null | null | null | doepy/param_covar/__init__.py | scwolof/doepy | acb2cad95428de2c14b28563cff1aa30679e1f39 | [
"MIT"
] | 1 | 2021-06-13T14:38:32.000Z | 2021-06-13T14:38:32.000Z | from .laplace import state_space_laplace | 40 | 40 | 0.9 | 6 | 40 | 5.666667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075 | 40 | 1 | 40 | 40 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
466a895a06163dfc7ddf957ab921a0d41944ec98 | 126 | py | Python | apps/plugins/tanzawa_plugin/now/admin.py | rmdes/tanzawa | d53baa10bd6c217cd18628437a88a43e3bd02b70 | [
"Apache-2.0"
] | 25 | 2021-06-13T03:38:44.000Z | 2022-03-15T15:53:31.000Z | apps/plugins/tanzawa_plugin/now/admin.py | rmdes/tanzawa | d53baa10bd6c217cd18628437a88a43e3bd02b70 | [
"Apache-2.0"
] | 59 | 2021-06-12T23:35:06.000Z | 2022-03-24T21:40:24.000Z | apps/plugins/tanzawa_plugin/now/admin.py | rmdes/tanzawa | d53baa10bd6c217cd18628437a88a43e3bd02b70 | [
"Apache-2.0"
] | null | null | null | from django.contrib import admin
from . import models
admin.site.register(models.TNow)
admin.site.register(models.TFileNow)
| 18 | 36 | 0.809524 | 18 | 126 | 5.666667 | 0.555556 | 0.176471 | 0.333333 | 0.45098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 126 | 6 | 37 | 21 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
46b072cf85bd61275fdf6d2b12fa53f3915d98ce | 22,748 | py | Python | modules/ESP32/greekc.py | ccccmagicboy/MicroPython_fw | d2049bc19e3d5010f5d6d0d17aa13a8693914fbd | [
"MIT"
] | 23 | 2020-01-22T00:40:20.000Z | 2021-08-03T20:42:07.000Z | modules/ESP32/greekc.py | ccccmagicboy/MicroPython_fw | d2049bc19e3d5010f5d6d0d17aa13a8693914fbd | [
"MIT"
] | 10 | 2020-02-18T09:57:04.000Z | 2020-03-04T11:39:17.000Z | modules/ESP32/greekc.py | ccccmagicboy/MicroPython_fw | d2049bc19e3d5010f5d6d0d17aa13a8693914fbd | [
"MIT"
] | 5 | 2020-02-20T09:35:45.000Z | 2022-01-04T16:23:13.000Z | def glyphs():
return 96
_font =\
b'\x00\x4a\x5a\x0e\x4d\x57\x52\x46\x51\x48\x52\x54\x53\x48\x52'\
b'\x46\x20\x52\x52\x48\x52\x4e\x20\x52\x52\x59\x51\x5a\x52\x5b'\
b'\x53\x5a\x52\x59\x0b\x4a\x5a\x4e\x46\x4d\x4d\x20\x52\x4f\x46'\
b'\x4d\x4d\x20\x52\x56\x46\x55\x4d\x20\x52\x57\x46\x55\x4d\x0b'\
b'\x48\x5d\x53\x42\x4c\x62\x20\x52\x59\x42\x52\x62\x20\x52\x4c'\
b'\x4f\x5a\x4f\x20\x52\x4b\x55\x59\x55\x29\x48\x5c\x50\x42\x50'\
b'\x5f\x20\x52\x54\x42\x54\x5f\x20\x52\x58\x49\x57\x4a\x58\x4b'\
b'\x59\x4a\x59\x49\x57\x47\x54\x46\x50\x46\x4d\x47\x4b\x49\x4b'\
b'\x4b\x4c\x4d\x4d\x4e\x4f\x4f\x55\x51\x57\x52\x59\x54\x20\x52'\
b'\x4b\x4b\x4d\x4d\x4f\x4e\x55\x50\x57\x51\x58\x52\x59\x54\x59'\
b'\x58\x57\x5a\x54\x5b\x50\x5b\x4d\x5a\x4b\x58\x4b\x57\x4c\x56'\
b'\x4d\x57\x4c\x58\x1f\x46\x5e\x5b\x46\x49\x5b\x20\x52\x4e\x46'\
b'\x50\x48\x50\x4a\x4f\x4c\x4d\x4d\x4b\x4d\x49\x4b\x49\x49\x4a'\
b'\x47\x4c\x46\x4e\x46\x50\x47\x53\x48\x56\x48\x59\x47\x5b\x46'\
b'\x20\x52\x57\x54\x55\x55\x54\x57\x54\x59\x56\x5b\x58\x5b\x5a'\
b'\x5a\x5b\x58\x5b\x56\x59\x54\x57\x54\x30\x46\x5f\x5b\x4e\x5a'\
b'\x4f\x5b\x50\x5c\x4f\x5c\x4e\x5b\x4d\x5a\x4d\x59\x4e\x58\x50'\
b'\x56\x55\x54\x58\x52\x5a\x50\x5b\x4d\x5b\x4a\x5a\x49\x58\x49'\
b'\x55\x4a\x53\x50\x4f\x52\x4d\x53\x4b\x53\x49\x52\x47\x50\x46'\
b'\x4e\x47\x4d\x49\x4d\x4b\x4e\x4e\x50\x51\x55\x58\x57\x5a\x5a'\
b'\x5b\x5b\x5b\x5c\x5a\x5c\x59\x20\x52\x4d\x5b\x4b\x5a\x4a\x58'\
b'\x4a\x55\x4b\x53\x4d\x51\x20\x52\x4d\x4b\x4e\x4d\x56\x58\x58'\
b'\x5a\x5a\x5b\x05\x4e\x56\x52\x46\x51\x4d\x20\x52\x53\x46\x51'\
b'\x4d\x17\x4b\x59\x55\x42\x53\x44\x51\x47\x4f\x4b\x4e\x50\x4e'\
b'\x54\x4f\x59\x51\x5d\x53\x60\x55\x62\x56\x62\x20\x52\x55\x42'\
b'\x56\x42\x54\x44\x52\x47\x50\x4b\x4f\x50\x4f\x54\x50\x59\x52'\
b'\x5d\x54\x60\x56\x62\x17\x4b\x59\x4e\x42\x50\x44\x52\x47\x54'\
b'\x4b\x55\x50\x55\x54\x54\x59\x52\x5d\x50\x60\x4e\x62\x4f\x62'\
b'\x20\x52\x4e\x42\x4f\x42\x51\x44\x53\x47\x55\x4b\x56\x50\x56'\
b'\x54\x55\x59\x53\x5d\x51\x60\x4f\x62\x08\x4a\x5a\x52\x4c\x52'\
b'\x58\x20\x52\x4d\x4f\x57\x55\x20\x52\x57\x4f\x4d\x55\x05\x45'\
b'\x5f\x52\x49\x52\x5b\x20\x52\x49\x52\x5b\x52\x07\x4e\x56\x53'\
b'\x57\x52\x58\x51\x57\x52\x56\x53\x57\x53\x59\x51\x5b\x02\x45'\
b'\x5f\x49\x52\x5b\x52\x05\x4e\x56\x52\x56\x51\x57\x52\x58\x53'\
b'\x57\x52\x56\x02\x47\x5d\x5b\x42\x49\x62\x27\x48\x5c\x51\x46'\
b'\x4e\x47\x4c\x4a\x4b\x4f\x4b\x52\x4c\x57\x4e\x5a\x51\x5b\x53'\
b'\x5b\x56\x5a\x58\x57\x59\x52\x59\x4f\x58\x4a\x56\x47\x53\x46'\
b'\x51\x46\x20\x52\x51\x46\x4f\x47\x4e\x48\x4d\x4a\x4c\x4f\x4c'\
b'\x52\x4d\x57\x4e\x59\x4f\x5a\x51\x5b\x20\x52\x53\x5b\x55\x5a'\
b'\x56\x59\x57\x57\x58\x52\x58\x4f\x57\x4a\x56\x48\x55\x47\x53'\
b'\x46\x0a\x48\x5c\x4e\x4a\x50\x49\x53\x46\x53\x5b\x20\x52\x52'\
b'\x47\x52\x5b\x20\x52\x4e\x5b\x57\x5b\x2c\x48\x5c\x4c\x4a\x4d'\
b'\x4b\x4c\x4c\x4b\x4b\x4b\x4a\x4c\x48\x4d\x47\x50\x46\x54\x46'\
b'\x57\x47\x58\x48\x59\x4a\x59\x4c\x58\x4e\x55\x50\x50\x52\x4e'\
b'\x53\x4c\x55\x4b\x58\x4b\x5b\x20\x52\x54\x46\x56\x47\x57\x48'\
b'\x58\x4a\x58\x4c\x57\x4e\x54\x50\x50\x52\x20\x52\x4b\x59\x4c'\
b'\x58\x4e\x58\x53\x5a\x56\x5a\x58\x59\x59\x58\x20\x52\x4e\x58'\
b'\x53\x5b\x57\x5b\x58\x5a\x59\x58\x59\x56\x2e\x48\x5c\x4c\x4a'\
b'\x4d\x4b\x4c\x4c\x4b\x4b\x4b\x4a\x4c\x48\x4d\x47\x50\x46\x54'\
b'\x46\x57\x47\x58\x49\x58\x4c\x57\x4e\x54\x4f\x51\x4f\x20\x52'\
b'\x54\x46\x56\x47\x57\x49\x57\x4c\x56\x4e\x54\x4f\x20\x52\x54'\
b'\x4f\x56\x50\x58\x52\x59\x54\x59\x57\x58\x59\x57\x5a\x54\x5b'\
b'\x50\x5b\x4d\x5a\x4c\x59\x4b\x57\x4b\x56\x4c\x55\x4d\x56\x4c'\
b'\x57\x20\x52\x57\x51\x58\x54\x58\x57\x57\x59\x56\x5a\x54\x5b'\
b'\x0c\x48\x5c\x54\x48\x54\x5b\x20\x52\x55\x46\x55\x5b\x20\x52'\
b'\x55\x46\x4a\x55\x5a\x55\x20\x52\x51\x5b\x58\x5b\x26\x48\x5c'\
b'\x4d\x46\x4b\x50\x20\x52\x4b\x50\x4d\x4e\x50\x4d\x53\x4d\x56'\
b'\x4e\x58\x50\x59\x53\x59\x55\x58\x58\x56\x5a\x53\x5b\x50\x5b'\
b'\x4d\x5a\x4c\x59\x4b\x57\x4b\x56\x4c\x55\x4d\x56\x4c\x57\x20'\
b'\x52\x53\x4d\x55\x4e\x57\x50\x58\x53\x58\x55\x57\x58\x55\x5a'\
b'\x53\x5b\x20\x52\x4d\x46\x57\x46\x20\x52\x4d\x47\x52\x47\x57'\
b'\x46\x2f\x48\x5c\x57\x49\x56\x4a\x57\x4b\x58\x4a\x58\x49\x57'\
b'\x47\x55\x46\x52\x46\x4f\x47\x4d\x49\x4c\x4b\x4b\x4f\x4b\x55'\
b'\x4c\x58\x4e\x5a\x51\x5b\x53\x5b\x56\x5a\x58\x58\x59\x55\x59'\
b'\x54\x58\x51\x56\x4f\x53\x4e\x52\x4e\x4f\x4f\x4d\x51\x4c\x54'\
b'\x20\x52\x52\x46\x50\x47\x4e\x49\x4d\x4b\x4c\x4f\x4c\x55\x4d'\
b'\x58\x4f\x5a\x51\x5b\x20\x52\x53\x5b\x55\x5a\x57\x58\x58\x55'\
b'\x58\x54\x57\x51\x55\x4f\x53\x4e\x1e\x48\x5c\x4b\x46\x4b\x4c'\
b'\x20\x52\x4b\x4a\x4c\x48\x4e\x46\x50\x46\x55\x49\x57\x49\x58'\
b'\x48\x59\x46\x20\x52\x4c\x48\x4e\x47\x50\x47\x55\x49\x20\x52'\
b'\x59\x46\x59\x49\x58\x4c\x54\x51\x53\x53\x52\x56\x52\x5b\x20'\
b'\x52\x58\x4c\x53\x51\x52\x53\x51\x56\x51\x5b\x3e\x48\x5c\x50'\
b'\x46\x4d\x47\x4c\x49\x4c\x4c\x4d\x4e\x50\x4f\x54\x4f\x57\x4e'\
b'\x58\x4c\x58\x49\x57\x47\x54\x46\x50\x46\x20\x52\x50\x46\x4e'\
b'\x47\x4d\x49\x4d\x4c\x4e\x4e\x50\x4f\x20\x52\x54\x4f\x56\x4e'\
b'\x57\x4c\x57\x49\x56\x47\x54\x46\x20\x52\x50\x4f\x4d\x50\x4c'\
b'\x51\x4b\x53\x4b\x57\x4c\x59\x4d\x5a\x50\x5b\x54\x5b\x57\x5a'\
b'\x58\x59\x59\x57\x59\x53\x58\x51\x57\x50\x54\x4f\x20\x52\x50'\
b'\x4f\x4e\x50\x4d\x51\x4c\x53\x4c\x57\x4d\x59\x4e\x5a\x50\x5b'\
b'\x20\x52\x54\x5b\x56\x5a\x57\x59\x58\x57\x58\x53\x57\x51\x56'\
b'\x50\x54\x4f\x2f\x48\x5c\x58\x4d\x57\x50\x55\x52\x52\x53\x51'\
b'\x53\x4e\x52\x4c\x50\x4b\x4d\x4b\x4c\x4c\x49\x4e\x47\x51\x46'\
b'\x53\x46\x56\x47\x58\x49\x59\x4c\x59\x52\x58\x56\x57\x58\x55'\
b'\x5a\x52\x5b\x4f\x5b\x4d\x5a\x4c\x58\x4c\x57\x4d\x56\x4e\x57'\
b'\x4d\x58\x20\x52\x51\x53\x4f\x52\x4d\x50\x4c\x4d\x4c\x4c\x4d'\
b'\x49\x4f\x47\x51\x46\x20\x52\x53\x46\x55\x47\x57\x49\x58\x4c'\
b'\x58\x52\x57\x56\x56\x58\x54\x5a\x52\x5b\x0b\x4e\x56\x52\x4f'\
b'\x51\x50\x52\x51\x53\x50\x52\x4f\x20\x52\x52\x56\x51\x57\x52'\
b'\x58\x53\x57\x52\x56\x0d\x4e\x56\x52\x4f\x51\x50\x52\x51\x53'\
b'\x50\x52\x4f\x20\x52\x53\x57\x52\x58\x51\x57\x52\x56\x53\x57'\
b'\x53\x59\x51\x5b\x03\x46\x5e\x5a\x49\x4a\x52\x5a\x5b\x05\x45'\
b'\x5f\x49\x4f\x5b\x4f\x20\x52\x49\x55\x5b\x55\x03\x46\x5e\x4a'\
b'\x49\x5a\x52\x4a\x5b\x1f\x49\x5b\x4d\x4a\x4e\x4b\x4d\x4c\x4c'\
b'\x4b\x4c\x4a\x4d\x48\x4e\x47\x50\x46\x53\x46\x56\x47\x57\x48'\
b'\x58\x4a\x58\x4c\x57\x4e\x56\x4f\x52\x51\x52\x54\x20\x52\x53'\
b'\x46\x55\x47\x56\x48\x57\x4a\x57\x4c\x56\x4e\x54\x50\x20\x52'\
b'\x52\x59\x51\x5a\x52\x5b\x53\x5a\x52\x59\x37\x45\x60\x57\x4e'\
b'\x56\x4c\x54\x4b\x51\x4b\x4f\x4c\x4e\x4d\x4d\x50\x4d\x53\x4e'\
b'\x55\x50\x56\x53\x56\x55\x55\x56\x53\x20\x52\x51\x4b\x4f\x4d'\
b'\x4e\x50\x4e\x53\x4f\x55\x50\x56\x20\x52\x57\x4b\x56\x53\x56'\
b'\x55\x58\x56\x5a\x56\x5c\x54\x5d\x51\x5d\x4f\x5c\x4c\x5b\x4a'\
b'\x59\x48\x57\x47\x54\x46\x51\x46\x4e\x47\x4c\x48\x4a\x4a\x49'\
b'\x4c\x48\x4f\x48\x52\x49\x55\x4a\x57\x4c\x59\x4e\x5a\x51\x5b'\
b'\x54\x5b\x57\x5a\x59\x59\x5a\x58\x20\x52\x58\x4b\x57\x53\x57'\
b'\x55\x58\x56\x11\x48\x5c\x52\x46\x4b\x5b\x20\x52\x52\x46\x59'\
b'\x5b\x20\x52\x52\x49\x58\x5b\x20\x52\x4d\x55\x56\x55\x20\x52'\
b'\x49\x5b\x4f\x5b\x20\x52\x55\x5b\x5b\x5b\x2c\x47\x5d\x4c\x46'\
b'\x4c\x5b\x20\x52\x4d\x46\x4d\x5b\x20\x52\x49\x46\x55\x46\x58'\
b'\x47\x59\x48\x5a\x4a\x5a\x4c\x59\x4e\x58\x4f\x55\x50\x20\x52'\
b'\x55\x46\x57\x47\x58\x48\x59\x4a\x59\x4c\x58\x4e\x57\x4f\x55'\
b'\x50\x20\x52\x4d\x50\x55\x50\x58\x51\x59\x52\x5a\x54\x5a\x57'\
b'\x59\x59\x58\x5a\x55\x5b\x49\x5b\x20\x52\x55\x50\x57\x51\x58'\
b'\x52\x59\x54\x59\x57\x58\x59\x57\x5a\x55\x5b\x14\x48\x5c\x4b'\
b'\x46\x58\x5b\x20\x52\x4c\x46\x59\x5b\x20\x52\x59\x46\x4b\x5b'\
b'\x20\x52\x49\x46\x4f\x46\x20\x52\x55\x46\x5b\x46\x20\x52\x49'\
b'\x5b\x4f\x5b\x20\x52\x55\x5b\x5b\x5b\x0e\x48\x5c\x52\x46\x4a'\
b'\x5b\x20\x52\x52\x46\x5a\x5b\x20\x52\x52\x49\x59\x5b\x20\x52'\
b'\x4b\x5a\x59\x5a\x20\x52\x4a\x5b\x5a\x5b\x15\x47\x5c\x4c\x46'\
b'\x4c\x5b\x20\x52\x4d\x46\x4d\x5b\x20\x52\x53\x4c\x53\x54\x20'\
b'\x52\x49\x46\x59\x46\x59\x4c\x58\x46\x20\x52\x4d\x50\x53\x50'\
b'\x20\x52\x49\x5b\x59\x5b\x59\x55\x58\x5b\x2f\x48\x5d\x52\x46'\
b'\x52\x5b\x20\x52\x53\x46\x53\x5b\x20\x52\x50\x4b\x4d\x4c\x4c'\
b'\x4d\x4b\x4f\x4b\x52\x4c\x54\x4d\x55\x50\x56\x55\x56\x58\x55'\
b'\x59\x54\x5a\x52\x5a\x4f\x59\x4d\x58\x4c\x55\x4b\x50\x4b\x20'\
b'\x52\x50\x4b\x4e\x4c\x4d\x4d\x4c\x4f\x4c\x52\x4d\x54\x4e\x55'\
b'\x50\x56\x20\x52\x55\x56\x57\x55\x58\x54\x59\x52\x59\x4f\x58'\
b'\x4d\x57\x4c\x55\x4b\x20\x52\x4f\x46\x56\x46\x20\x52\x4f\x5b'\
b'\x56\x5b\x0d\x49\x5b\x4e\x46\x4e\x5b\x20\x52\x4f\x46\x4f\x5b'\
b'\x20\x52\x4b\x46\x5a\x46\x5a\x4c\x59\x46\x20\x52\x4b\x5b\x52'\
b'\x5b\x1a\x46\x5e\x4b\x46\x4b\x5b\x20\x52\x4c\x46\x4c\x5b\x20'\
b'\x52\x58\x46\x58\x5b\x20\x52\x59\x46\x59\x5b\x20\x52\x48\x46'\
b'\x4f\x46\x20\x52\x55\x46\x5c\x46\x20\x52\x4c\x50\x58\x50\x20'\
b'\x52\x48\x5b\x4f\x5b\x20\x52\x55\x5b\x5c\x5b\x0b\x4d\x58\x52'\
b'\x46\x52\x5b\x20\x52\x53\x46\x53\x5b\x20\x52\x4f\x46\x56\x46'\
b'\x20\x52\x4f\x5b\x56\x5b\x05\x50\x55\x52\x51\x52\x52\x53\x52'\
b'\x53\x51\x52\x51\x1a\x46\x5c\x4b\x46\x4b\x5b\x20\x52\x4c\x46'\
b'\x4c\x5b\x20\x52\x59\x46\x4c\x53\x20\x52\x51\x4f\x59\x5b\x20'\
b'\x52\x50\x4f\x58\x5b\x20\x52\x48\x46\x4f\x46\x20\x52\x55\x46'\
b'\x5b\x46\x20\x52\x48\x5b\x4f\x5b\x20\x52\x55\x5b\x5b\x5b\x0e'\
b'\x48\x5c\x52\x46\x4b\x5b\x20\x52\x52\x46\x59\x5b\x20\x52\x52'\
b'\x49\x58\x5b\x20\x52\x49\x5b\x4f\x5b\x20\x52\x55\x5b\x5b\x5b'\
b'\x1d\x46\x5f\x4b\x46\x4b\x5b\x20\x52\x4c\x46\x52\x58\x20\x52'\
b'\x4b\x46\x52\x5b\x20\x52\x59\x46\x52\x5b\x20\x52\x59\x46\x59'\
b'\x5b\x20\x52\x5a\x46\x5a\x5b\x20\x52\x48\x46\x4c\x46\x20\x52'\
b'\x59\x46\x5d\x46\x20\x52\x48\x5b\x4e\x5b\x20\x52\x56\x5b\x5d'\
b'\x5b\x14\x47\x5e\x4c\x46\x4c\x5b\x20\x52\x4d\x46\x59\x59\x20'\
b'\x52\x4d\x48\x59\x5b\x20\x52\x59\x46\x59\x5b\x20\x52\x49\x46'\
b'\x4d\x46\x20\x52\x56\x46\x5c\x46\x20\x52\x49\x5b\x4f\x5b\x2b'\
b'\x47\x5d\x51\x46\x4e\x47\x4c\x49\x4b\x4b\x4a\x4f\x4a\x52\x4b'\
b'\x56\x4c\x58\x4e\x5a\x51\x5b\x53\x5b\x56\x5a\x58\x58\x59\x56'\
b'\x5a\x52\x5a\x4f\x59\x4b\x58\x49\x56\x47\x53\x46\x51\x46\x20'\
b'\x52\x51\x46\x4f\x47\x4d\x49\x4c\x4b\x4b\x4f\x4b\x52\x4c\x56'\
b'\x4d\x58\x4f\x5a\x51\x5b\x20\x52\x53\x5b\x55\x5a\x57\x58\x58'\
b'\x56\x59\x52\x59\x4f\x58\x4b\x57\x49\x55\x47\x53\x46\x14\x46'\
b'\x5e\x4b\x46\x4b\x5b\x20\x52\x4c\x46\x4c\x5b\x20\x52\x58\x46'\
b'\x58\x5b\x20\x52\x59\x46\x59\x5b\x20\x52\x48\x46\x5c\x46\x20'\
b'\x52\x48\x5b\x4f\x5b\x20\x52\x55\x5b\x5c\x5b\x37\x47\x5d\x51'\
b'\x46\x4e\x47\x4c\x49\x4b\x4b\x4a\x4f\x4a\x52\x4b\x56\x4c\x58'\
b'\x4e\x5a\x51\x5b\x53\x5b\x56\x5a\x58\x58\x59\x56\x5a\x52\x5a'\
b'\x4f\x59\x4b\x58\x49\x56\x47\x53\x46\x51\x46\x20\x52\x51\x46'\
b'\x4f\x47\x4d\x49\x4c\x4b\x4b\x4f\x4b\x52\x4c\x56\x4d\x58\x4f'\
b'\x5a\x51\x5b\x20\x52\x53\x5b\x55\x5a\x57\x58\x58\x56\x59\x52'\
b'\x59\x4f\x58\x4b\x57\x49\x55\x47\x53\x46\x20\x52\x4f\x4d\x4f'\
b'\x54\x20\x52\x55\x4d\x55\x54\x20\x52\x4f\x50\x55\x50\x20\x52'\
b'\x4f\x51\x55\x51\x1c\x47\x5d\x4c\x46\x4c\x5b\x20\x52\x4d\x46'\
b'\x4d\x5b\x20\x52\x49\x46\x55\x46\x58\x47\x59\x48\x5a\x4a\x5a'\
b'\x4d\x59\x4f\x58\x50\x55\x51\x4d\x51\x20\x52\x55\x46\x57\x47'\
b'\x58\x48\x59\x4a\x59\x4d\x58\x4f\x57\x50\x55\x51\x20\x52\x49'\
b'\x5b\x50\x5b\x13\x48\x5d\x4b\x46\x52\x50\x4a\x5b\x20\x52\x4a'\
b'\x46\x51\x50\x20\x52\x4a\x46\x59\x46\x5a\x4c\x58\x46\x20\x52'\
b'\x4b\x5a\x58\x5a\x20\x52\x4a\x5b\x59\x5b\x5a\x55\x58\x5b\x0f'\
b'\x49\x5c\x52\x46\x52\x5b\x20\x52\x53\x46\x53\x5b\x20\x52\x4c'\
b'\x46\x4b\x4c\x4b\x46\x5a\x46\x5a\x4c\x59\x46\x20\x52\x4f\x5b'\
b'\x56\x5b\x20\x49\x5c\x4b\x4b\x4b\x49\x4c\x47\x4d\x46\x4f\x46'\
b'\x50\x47\x51\x49\x52\x4d\x52\x5b\x20\x52\x4b\x49\x4d\x47\x4f'\
b'\x47\x51\x49\x20\x52\x5a\x4b\x5a\x49\x59\x47\x58\x46\x56\x46'\
b'\x55\x47\x54\x49\x53\x4d\x53\x5b\x20\x52\x5a\x49\x58\x47\x56'\
b'\x47\x54\x49\x20\x52\x4f\x5b\x56\x5b\x0d\x4b\x59\x51\x46\x4f'\
b'\x47\x4e\x49\x4e\x4b\x4f\x4d\x51\x4e\x53\x4e\x55\x4d\x56\x4b'\
b'\x56\x49\x55\x47\x53\x46\x51\x46\x2a\x47\x5d\x4a\x58\x4b\x5b'\
b'\x4f\x5b\x4d\x57\x4b\x53\x4a\x50\x4a\x4c\x4b\x49\x4d\x47\x50'\
b'\x46\x54\x46\x57\x47\x59\x49\x5a\x4c\x5a\x50\x59\x53\x57\x57'\
b'\x55\x5b\x59\x5b\x5a\x58\x20\x52\x4d\x57\x4c\x54\x4b\x50\x4b'\
b'\x4c\x4c\x49\x4e\x47\x50\x46\x20\x52\x54\x46\x56\x47\x58\x49'\
b'\x59\x4c\x59\x50\x58\x54\x57\x57\x20\x52\x4b\x5a\x4e\x5a\x20'\
b'\x52\x56\x5a\x59\x5a\x23\x47\x5d\x4b\x45\x4a\x4a\x20\x52\x5a'\
b'\x45\x59\x4a\x20\x52\x4f\x4e\x4e\x53\x20\x52\x56\x4e\x55\x53'\
b'\x20\x52\x4b\x57\x4a\x5c\x20\x52\x5a\x57\x59\x5c\x20\x52\x4b'\
b'\x47\x59\x47\x20\x52\x4b\x48\x59\x48\x20\x52\x4f\x50\x55\x50'\
b'\x20\x52\x4f\x51\x55\x51\x20\x52\x4b\x59\x59\x59\x20\x52\x4b'\
b'\x5a\x59\x5a\x28\x47\x5e\x52\x46\x52\x5b\x20\x52\x53\x46\x53'\
b'\x5b\x20\x52\x49\x4d\x4a\x4c\x4c\x4d\x4d\x51\x4e\x53\x4f\x54'\
b'\x51\x55\x20\x52\x4a\x4c\x4b\x4d\x4c\x51\x4d\x53\x4e\x54\x51'\
b'\x55\x54\x55\x57\x54\x58\x53\x59\x51\x5a\x4d\x5b\x4c\x20\x52'\
b'\x54\x55\x56\x54\x57\x53\x58\x51\x59\x4d\x5b\x4c\x5c\x4d\x20'\
b'\x52\x4f\x46\x56\x46\x20\x52\x4f\x5b\x56\x5b\x0f\x48\x5c\x58'\
b'\x46\x4b\x5b\x20\x52\x59\x46\x4c\x5b\x20\x52\x4c\x46\x4b\x4c'\
b'\x4b\x46\x59\x46\x20\x52\x4b\x5b\x59\x5b\x59\x55\x58\x5b\x0b'\
b'\x4b\x59\x4f\x42\x4f\x62\x20\x52\x50\x42\x50\x62\x20\x52\x4f'\
b'\x42\x56\x42\x20\x52\x4f\x62\x56\x62\x02\x4b\x59\x4b\x46\x59'\
b'\x5e\x0b\x4b\x59\x54\x42\x54\x62\x20\x52\x55\x42\x55\x62\x20'\
b'\x52\x4e\x42\x55\x42\x20\x52\x4e\x62\x55\x62\x07\x47\x5d\x4a'\
b'\x54\x52\x4f\x5a\x54\x20\x52\x4a\x54\x52\x50\x5a\x54\x02\x48'\
b'\x5c\x48\x62\x5c\x62\x06\x4c\x58\x50\x46\x55\x4c\x20\x52\x50'\
b'\x46\x4f\x47\x55\x4c\x27\x47\x5e\x51\x4d\x4e\x4e\x4c\x50\x4b'\
b'\x52\x4a\x55\x4a\x58\x4b\x5a\x4e\x5b\x50\x5b\x52\x5a\x55\x57'\
b'\x57\x54\x59\x50\x5a\x4d\x20\x52\x51\x4d\x4f\x4e\x4d\x50\x4c'\
b'\x52\x4b\x55\x4b\x58\x4c\x5a\x4e\x5b\x20\x52\x51\x4d\x53\x4d'\
b'\x55\x4e\x56\x50\x58\x58\x59\x5a\x5a\x5b\x20\x52\x53\x4d\x54'\
b'\x4e\x55\x50\x57\x58\x58\x5a\x5a\x5b\x5b\x5b\x38\x47\x5c\x54'\
b'\x46\x51\x47\x4f\x49\x4d\x4d\x4c\x50\x4b\x54\x4a\x5a\x49\x62'\
b'\x20\x52\x54\x46\x52\x47\x50\x49\x4e\x4d\x4d\x50\x4c\x54\x4b'\
b'\x5a\x4a\x62\x20\x52\x54\x46\x56\x46\x58\x47\x59\x48\x59\x4b'\
b'\x58\x4d\x57\x4e\x54\x4f\x50\x4f\x20\x52\x56\x46\x58\x48\x58'\
b'\x4b\x57\x4d\x56\x4e\x54\x4f\x20\x52\x50\x4f\x54\x50\x56\x52'\
b'\x57\x54\x57\x57\x56\x59\x55\x5a\x52\x5b\x50\x5b\x4e\x5a\x4d'\
b'\x59\x4c\x56\x20\x52\x50\x4f\x53\x50\x55\x52\x56\x54\x56\x57'\
b'\x55\x59\x54\x5a\x52\x5b\x16\x49\x5b\x4b\x4d\x4d\x4d\x4f\x4e'\
b'\x50\x50\x55\x5f\x56\x61\x57\x62\x20\x52\x4d\x4d\x4e\x4e\x4f'\
b'\x50\x54\x5f\x55\x61\x57\x62\x59\x62\x20\x52\x5a\x4d\x59\x4f'\
b'\x57\x52\x4d\x5d\x4b\x60\x4a\x62\x2b\x49\x5c\x56\x4e\x54\x4d'\
b'\x52\x4d\x4f\x4e\x4d\x51\x4c\x54\x4c\x57\x4d\x59\x4e\x5a\x50'\
b'\x5b\x52\x5b\x55\x5a\x57\x57\x58\x54\x58\x51\x57\x4f\x53\x4a'\
b'\x52\x48\x52\x46\x53\x45\x55\x45\x57\x46\x59\x48\x20\x52\x52'\
b'\x4d\x50\x4e\x4e\x51\x4d\x54\x4d\x58\x4e\x5a\x20\x52\x52\x5b'\
b'\x54\x5a\x56\x57\x57\x54\x57\x50\x56\x4e\x54\x4b\x53\x49\x53'\
b'\x47\x54\x46\x56\x46\x59\x48\x1f\x49\x5b\x58\x50\x56\x4e\x54'\
b'\x4d\x50\x4d\x4e\x4e\x4e\x50\x50\x52\x53\x53\x20\x52\x50\x4d'\
b'\x4f\x4e\x4f\x50\x51\x52\x53\x53\x20\x52\x53\x53\x4e\x54\x4c'\
b'\x56\x4c\x58\x4d\x5a\x50\x5b\x53\x5b\x55\x5a\x57\x58\x20\x52'\
b'\x53\x53\x4f\x54\x4d\x56\x4d\x58\x4e\x5a\x50\x5b\x24\x47\x5d'\
b'\x4f\x4e\x4d\x4f\x4b\x51\x4a\x54\x4a\x57\x4b\x59\x4c\x5a\x4e'\
b'\x5b\x51\x5b\x54\x5a\x57\x58\x59\x55\x5a\x52\x5a\x4f\x58\x4d'\
b'\x56\x4d\x54\x4f\x52\x53\x50\x58\x4d\x62\x20\x52\x4a\x57\x4c'\
b'\x59\x4e\x5a\x51\x5a\x54\x59\x57\x57\x59\x55\x20\x52\x5a\x4f'\
b'\x58\x4e\x56\x4e\x54\x50\x52\x53\x50\x59\x4e\x62\x1b\x48\x5c'\
b'\x49\x50\x4b\x4e\x4d\x4d\x4f\x4d\x51\x4e\x52\x4f\x53\x52\x53'\
b'\x56\x52\x5a\x4f\x62\x20\x52\x4a\x4f\x4c\x4e\x50\x4e\x52\x4f'\
b'\x20\x52\x5a\x4d\x59\x50\x58\x52\x53\x59\x50\x5e\x4e\x62\x20'\
b'\x52\x59\x4d\x58\x50\x57\x52\x53\x59\x1f\x47\x5d\x48\x51\x49'\
b'\x4f\x4b\x4d\x4e\x4d\x4f\x4e\x4f\x50\x4e\x54\x4c\x5b\x20\x52'\
b'\x4d\x4d\x4e\x4e\x4e\x50\x4d\x54\x4b\x5b\x20\x52\x4e\x54\x50'\
b'\x50\x52\x4e\x54\x4d\x56\x4d\x58\x4e\x59\x4f\x59\x52\x58\x57'\
b'\x55\x62\x20\x52\x56\x4d\x58\x4f\x58\x52\x57\x57\x54\x62\x0e'\
b'\x4c\x58\x52\x4d\x50\x54\x4f\x58\x4f\x5a\x50\x5b\x53\x5b\x55'\
b'\x59\x56\x57\x20\x52\x53\x4d\x51\x54\x50\x58\x50\x5a\x51\x5b'\
b'\x05\x47\x5d\x4b\x4b\x59\x59\x20\x52\x59\x4b\x4b\x59\x1c\x48'\
b'\x5c\x4e\x4d\x4a\x5b\x20\x52\x4f\x4d\x4b\x5b\x20\x52\x58\x4d'\
b'\x59\x4e\x5a\x4e\x59\x4d\x57\x4d\x55\x4e\x51\x52\x4f\x53\x4d'\
b'\x53\x20\x52\x4f\x53\x51\x54\x53\x5a\x54\x5b\x20\x52\x4f\x53'\
b'\x50\x54\x52\x5a\x53\x5b\x55\x5b\x57\x5a\x59\x57\x16\x48\x5c'\
b'\x4b\x46\x4d\x46\x4f\x47\x50\x48\x51\x4a\x57\x58\x58\x5a\x59'\
b'\x5b\x20\x52\x4d\x46\x4f\x48\x50\x4a\x56\x58\x57\x5a\x59\x5b'\
b'\x5a\x5b\x20\x52\x52\x4d\x4a\x5b\x20\x52\x52\x4d\x4b\x5b\x1b'\
b'\x46\x5d\x4d\x4d\x47\x62\x20\x52\x4e\x4d\x48\x62\x20\x52\x4d'\
b'\x50\x4c\x56\x4c\x59\x4e\x5b\x50\x5b\x52\x5a\x54\x58\x56\x55'\
b'\x20\x52\x58\x4d\x55\x58\x55\x5a\x56\x5b\x59\x5b\x5b\x59\x5c'\
b'\x57\x20\x52\x59\x4d\x56\x58\x56\x5a\x57\x5b\x17\x48\x5c\x4e'\
b'\x4d\x4c\x5b\x20\x52\x4f\x4d\x4e\x53\x4d\x58\x4c\x5b\x20\x52'\
b'\x59\x4d\x58\x51\x56\x55\x20\x52\x5a\x4d\x59\x50\x58\x52\x56'\
b'\x55\x54\x57\x51\x59\x4f\x5a\x4c\x5b\x20\x52\x4b\x4d\x4f\x4d'\
b'\x1f\x49\x5b\x52\x4d\x4f\x4e\x4d\x51\x4c\x54\x4c\x57\x4d\x59'\
b'\x4e\x5a\x50\x5b\x52\x5b\x55\x5a\x57\x57\x58\x54\x58\x51\x57'\
b'\x4f\x56\x4e\x54\x4d\x52\x4d\x20\x52\x52\x4d\x50\x4e\x4e\x51'\
b'\x4d\x54\x4d\x58\x4e\x5a\x20\x52\x52\x5b\x54\x5a\x56\x57\x57'\
b'\x54\x57\x50\x56\x4e\x15\x47\x5d\x50\x4e\x4c\x5b\x20\x52\x50'\
b'\x4e\x4d\x5b\x20\x52\x56\x4e\x56\x5b\x20\x52\x56\x4e\x57\x5b'\
b'\x20\x52\x49\x50\x4b\x4e\x4e\x4d\x5b\x4d\x20\x52\x49\x50\x4b'\
b'\x4f\x4e\x4e\x5b\x4e\x2b\x46\x5d\x47\x51\x48\x4f\x4a\x4d\x4d'\
b'\x4d\x4e\x4e\x4e\x50\x4d\x55\x4d\x58\x4e\x5a\x4f\x5b\x20\x52'\
b'\x4c\x4d\x4d\x4e\x4d\x50\x4c\x55\x4c\x58\x4d\x5a\x4f\x5b\x51'\
b'\x5b\x53\x5a\x55\x58\x57\x55\x58\x52\x59\x4d\x59\x49\x58\x47'\
b'\x56\x46\x54\x46\x52\x48\x52\x4a\x53\x4d\x55\x50\x57\x52\x5a'\
b'\x54\x20\x52\x53\x5a\x55\x57\x56\x55\x57\x52\x58\x4d\x58\x49'\
b'\x57\x47\x56\x46\x1e\x48\x5b\x4c\x56\x4d\x59\x4e\x5a\x50\x5b'\
b'\x52\x5b\x55\x5a\x57\x57\x58\x54\x58\x51\x57\x4f\x56\x4e\x54'\
b'\x4d\x52\x4d\x4f\x4e\x4d\x51\x4c\x54\x48\x62\x20\x52\x52\x5b'\
b'\x54\x5a\x56\x57\x57\x54\x57\x50\x56\x4e\x20\x52\x52\x4d\x50'\
b'\x4e\x4e\x51\x4d\x54\x49\x62\x22\x48\x5d\x5b\x4d\x51\x4d\x4e'\
b'\x4e\x4c\x51\x4b\x54\x4b\x57\x4c\x59\x4d\x5a\x4f\x5b\x51\x5b'\
b'\x54\x5a\x56\x57\x57\x54\x57\x51\x56\x4f\x55\x4e\x53\x4d\x20'\
b'\x52\x51\x4d\x4f\x4e\x4d\x51\x4c\x54\x4c\x58\x4d\x5a\x20\x52'\
b'\x51\x5b\x53\x5a\x55\x57\x56\x54\x56\x50\x55\x4e\x20\x52\x55'\
b'\x4e\x5b\x4e\x0f\x48\x5c\x53\x4e\x50\x5b\x20\x52\x53\x4e\x51'\
b'\x5b\x20\x52\x4a\x50\x4c\x4e\x4f\x4d\x5a\x4d\x20\x52\x4a\x50'\
b'\x4c\x4f\x4f\x4e\x5a\x4e\x1e\x48\x5c\x49\x51\x4a\x4f\x4c\x4d'\
b'\x4f\x4d\x50\x4e\x50\x50\x4e\x56\x4e\x59\x50\x5b\x20\x52\x4e'\
b'\x4d\x4f\x4e\x4f\x50\x4d\x56\x4d\x59\x4e\x5a\x50\x5b\x51\x5b'\
b'\x54\x5a\x56\x58\x58\x55\x59\x52\x59\x4f\x58\x4d\x57\x4e\x58'\
b'\x4f\x59\x52\x20\x52\x58\x55\x59\x4f\x0e\x45\x5f\x52\x49\x51'\
b'\x4a\x52\x4b\x53\x4a\x52\x49\x20\x52\x49\x52\x5b\x52\x20\x52'\
b'\x52\x59\x51\x5a\x52\x5b\x53\x5a\x52\x59\x2b\x46\x5d\x4a\x51'\
b'\x4c\x4f\x4f\x4e\x4e\x4d\x4c\x4e\x4a\x51\x49\x54\x49\x57\x4a'\
b'\x5a\x4b\x5b\x4d\x5b\x4f\x5a\x51\x57\x52\x54\x20\x52\x49\x57'\
b'\x4a\x59\x4b\x5a\x4d\x5a\x4f\x59\x51\x57\x20\x52\x51\x54\x51'\
b'\x57\x52\x5a\x53\x5b\x55\x5b\x57\x5a\x59\x57\x5a\x54\x5a\x51'\
b'\x59\x4e\x58\x4d\x57\x4e\x59\x4f\x5a\x51\x20\x52\x51\x57\x52'\
b'\x59\x53\x5a\x55\x5a\x57\x59\x59\x57\x2c\x49\x5a\x54\x46\x52'\
b'\x47\x51\x48\x51\x49\x52\x4a\x55\x4b\x58\x4b\x20\x52\x55\x4b'\
b'\x51\x4c\x4f\x4d\x4e\x4f\x4e\x51\x50\x53\x53\x54\x56\x54\x20'\
b'\x52\x55\x4b\x52\x4c\x50\x4d\x4f\x4f\x4f\x51\x51\x53\x53\x54'\
b'\x20\x52\x53\x54\x4f\x55\x4d\x56\x4c\x58\x4c\x5a\x4e\x5c\x53'\
b'\x5e\x54\x5f\x54\x61\x52\x62\x50\x62\x20\x52\x53\x54\x50\x55'\
b'\x4e\x56\x4d\x58\x4d\x5a\x4f\x5c\x53\x5e\x21\x46\x5d\x55\x46'\
b'\x4f\x62\x20\x52\x56\x46\x4e\x62\x20\x52\x47\x51\x48\x4f\x4a'\
b'\x4d\x4d\x4d\x4e\x4e\x4e\x50\x4d\x55\x4d\x58\x4f\x5a\x52\x5a'\
b'\x54\x59\x57\x56\x59\x53\x20\x52\x4c\x4d\x4d\x4e\x4d\x50\x4c'\
b'\x55\x4c\x58\x4d\x5a\x4f\x5b\x52\x5b\x54\x5a\x56\x58\x58\x55'\
b'\x59\x53\x5b\x4d\x1e\x49\x5b\x54\x46\x52\x47\x51\x48\x51\x49'\
b'\x52\x4a\x55\x4b\x5a\x4b\x5a\x4a\x57\x4b\x53\x4d\x50\x4f\x4d'\
b'\x52\x4c\x55\x4c\x57\x4d\x59\x50\x5b\x53\x5d\x54\x5f\x54\x61'\
b'\x53\x62\x51\x62\x50\x61\x20\x52\x55\x4c\x51\x4f\x4e\x52\x4d'\
b'\x55\x4d\x57\x4e\x59\x50\x5b\x27\x4b\x59\x54\x42\x52\x43\x51'\
b'\x44\x50\x46\x50\x48\x51\x4a\x52\x4b\x53\x4d\x53\x4f\x51\x51'\
b'\x20\x52\x52\x43\x51\x45\x51\x47\x52\x49\x53\x4a\x54\x4c\x54'\
b'\x4e\x53\x50\x4f\x52\x53\x54\x54\x56\x54\x58\x53\x5a\x52\x5b'\
b'\x51\x5d\x51\x5f\x52\x61\x20\x52\x51\x53\x53\x55\x53\x57\x52'\
b'\x59\x51\x5a\x50\x5c\x50\x5e\x51\x60\x52\x61\x54\x62\x02\x4e'\
b'\x56\x52\x42\x52\x62\x27\x4b\x59\x50\x42\x52\x43\x53\x44\x54'\
b'\x46\x54\x48\x53\x4a\x52\x4b\x51\x4d\x51\x4f\x53\x51\x20\x52'\
b'\x52\x43\x53\x45\x53\x47\x52\x49\x51\x4a\x50\x4c\x50\x4e\x51'\
b'\x50\x55\x52\x51\x54\x50\x56\x50\x58\x51\x5a\x52\x5b\x53\x5d'\
b'\x53\x5f\x52\x61\x20\x52\x53\x53\x51\x55\x51\x57\x52\x59\x53'\
b'\x5a\x54\x5c\x54\x5e\x53\x60\x52\x61\x50\x62\x17\x46\x5e\x49'\
b'\x55\x49\x53\x4a\x50\x4c\x4f\x4e\x4f\x50\x50\x54\x53\x56\x54'\
b'\x58\x54\x5a\x53\x5b\x51\x20\x52\x49\x53\x4a\x51\x4c\x50\x4e'\
b'\x50\x50\x51\x54\x54\x56\x55\x58\x55\x5a\x54\x5b\x51\x5b\x4f'\
b'\x22\x4a\x5a\x4a\x46\x4a\x5b\x4b\x5b\x4b\x46\x4c\x46\x4c\x5b'\
b'\x4d\x5b\x4d\x46\x4e\x46\x4e\x5b\x4f\x5b\x4f\x46\x50\x46\x50'\
b'\x5b\x51\x5b\x51\x46\x52\x46\x52\x5b\x53\x5b\x53\x46\x54\x46'\
b'\x54\x5b\x55\x5b\x55\x46\x56\x46\x56\x5b\x57\x5b\x57\x46\x58'\
b'\x46\x58\x5b\x59\x5b\x59\x46\x5a\x46\x5a\x5b'
_index =\
b'\x00\x00\x03\x00\x22\x00\x3b\x00\x54\x00\xa9\x00\xea\x00\x4d'\
b'\x01\x5a\x01\x8b\x01\xbc\x01\xcf\x01\xdc\x01\xed\x01\xf4\x01'\
b'\x01\x02\x08\x02\x59\x02\x70\x02\xcb\x02\x2a\x03\x45\x03\x94'\
b'\x03\xf5\x03\x34\x04\xb3\x04\x14\x05\x2d\x05\x4a\x05\x53\x05'\
b'\x60\x05\x69\x05\xaa\x05\x1b\x06\x40\x06\x9b\x06\xc6\x06\xe5'\
b'\x06\x12\x07\x73\x07\x90\x07\xc7\x07\xe0\x07\xed\x07\x24\x08'\
b'\x43\x08\x80\x08\xab\x08\x04\x09\x2f\x09\xa0\x09\xdb\x09\x04'\
b'\x0a\x25\x0a\x68\x0a\x85\x0a\xdc\x0a\x25\x0b\x78\x0b\x99\x0b'\
b'\xb2\x0b\xb9\x0b\xd2\x0b\xe3\x0b\xea\x0b\xf9\x0b\x4a\x0c\xbd'\
b'\x0c\xec\x0c\x45\x0d\x86\x0d\xd1\x0d\x0a\x0e\x4b\x0e\x6a\x0e'\
b'\x77\x0e\xb2\x0e\xe1\x0e\x1a\x0f\x4b\x0f\x8c\x0f\xb9\x0f\x12'\
b'\x10\x51\x10\x98\x10\xb9\x10\xf8\x10\x17\x11\x70\x11\xcb\x11'\
b'\x10\x12\x4f\x12\xa0\x12\xa7\x12\xf8\x12\x29\x13'
_mvfont = memoryview(_font)
def _chr_addr(ordch):
    # Each index entry is a little-endian 16-bit offset into _font.
    offset = 2 * (ordch - 32)
    return int.from_bytes(_index[offset:offset + 2], 'little')


def get_ch(ordch):
    # Code points outside printable ASCII fall back to '?'.
    offset = _chr_addr(ordch if 32 <= ordch <= 127 else ord('?'))
    count = _font[offset]
    return _mvfont[offset:offset + (count + 2) * 2 - 1]
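The two helpers above implement the usual packed-font lookup: `_index` stores one little-endian 16-bit offset per code point starting at 32 (space), and `get_ch` slices the glyph record out of the font blob. A minimal self-contained sketch of the same offset-table scheme (the index bytes below are illustrative, not taken from this font):

```python
def demo_chr_addr(index, ordch):
    # One 2-byte little-endian offset per code point, starting at ord(' ') == 32.
    offset = 2 * (ordch - 32)
    return int.from_bytes(index[offset:offset + 2], 'little')


# Hypothetical index: glyph records for ' ', '!', '"' start at offsets 0, 16, 37.
demo_index = b'\x00\x00\x10\x00\x25\x00'
print(demo_chr_addr(demo_index, ord('!')))  # prints 16
```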

# File: config/__init__.py (repo: koravel/friends_displayer, license: Apache-2.0)
import os
__root_location = os.path.dirname(os.path.abspath(__file__))
def get_root_location():
    return __root_location

# File: tasks/loader/types/automated/unused_actions.py (repo: WaffleHacks/application-portal, license: MIT)
from .base_action import BaseAction
class Communication(BaseAction):
    pass


class Integrations(BaseAction):
    pass


class Workshops(BaseAction):
    pass

# File: helloworld/config.py (repo: stuhood/example-python, license: Apache-2.0)
# Copyright 2020 Pants project contributors.
# Licensed under the Apache License, Version 2.0 (see LICENSE).
from helloworld.protos.config_pb2 import Config
from helloworld.util.config_loader import load_config_from_json
def load_config() -> Config:
    return load_config_from_json(__name__, "config.json")
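`load_config_from_json` lives in `helloworld.util.config_loader` and is not shown here; presumably it resolves `config.json` relative to the named module and parses it into the protobuf `Config`. A rough, standard-library-only sketch of that module-relative loading pattern (the function and parameter names below are illustrative, not the repo's API):

```python
import json
import os


def load_json_beside(module_file, filename):
    # Look for the data file in the same directory as the given module file.
    path = os.path.join(os.path.dirname(os.path.abspath(module_file)), filename)
    with open(path) as f:
        return json.load(f)
```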

# File: CangJie/CTC/__init__.py (repo: bigdata-ustc/CangJie, license: MIT)
# coding: utf-8
# 2019/12/28 @ tongshiwei

# File: keras2caffe/__init__.py (repo: jelambrar96/keras2caffe-1, license: MIT)
from .convert import convert

# File: test/test_ucx.py (repo: robertmaynard/hpc-container-maker, license: Apache-2.0)
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=invalid-name, too-few-public-methods, bad-continuation
"""Test cases for the ucx module"""
from __future__ import unicode_literals
from __future__ import print_function
import logging # pylint: disable=unused-import
import unittest
from helpers import centos, docker, ppc64le, ubuntu, x86_64
from hpccm.building_blocks.ucx import ucx
class Test_ucx(unittest.TestCase):

    def setUp(self):
        """Disable logging output messages"""
        logging.disable(logging.ERROR)

    @x86_64
    @ubuntu
    @docker
    def test_defaults_ubuntu(self):
        """Default ucx building block"""
        u = ucx()
        self.assertEqual(str(u),
r'''# UCX version 1.9.0
RUN apt-get update -y && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        binutils-dev \
        file \
        libnuma-dev \
        make \
        wget && \
    rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.9.0/ucx-1.9.0.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.9.0.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.9.0 && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/usr/local/cuda && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    rm -rf /var/tmp/ucx-1.9.0 /var/tmp/ucx-1.9.0.tar.gz
ENV CPATH=/usr/local/ucx/include:$CPATH \
    LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
    LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
    PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @centos
    @docker
    def test_defaults_centos(self):
        """Default ucx building block"""
        u = ucx()
        self.assertEqual(str(u),
r'''# UCX version 1.9.0
RUN yum install -y \
        binutils-devel \
        file \
        make \
        numactl-devel \
        wget && \
    rm -rf /var/cache/yum/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.9.0/ucx-1.9.0.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.9.0.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.9.0 && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/usr/local/cuda && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    rm -rf /var/tmp/ucx-1.9.0 /var/tmp/ucx-1.9.0.tar.gz
ENV CPATH=/usr/local/ucx/include:$CPATH \
    LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
    LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
    PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @ubuntu
    @docker
    def test_with_paths_ubuntu(self):
        """ucx building block with paths"""
        u = ucx(cuda='/cuda', gdrcopy='/gdrcopy', knem='/knem', ofed='/ofed',
                xpmem='/xpmem', version='1.8.0')
        self.assertEqual(str(u),
r'''# UCX version 1.8.0
RUN apt-get update -y && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        binutils-dev \
        file \
        libnuma-dev \
        make \
        wget && \
    rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.8.0/ucx-1.8.0.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.8.0.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.8.0 && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/cuda --with-gdrcopy=/gdrcopy --with-knem=/knem --with-rdmacm=/ofed --with-verbs=/ofed --with-xpmem=/xpmem && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    rm -rf /var/tmp/ucx-1.8.0 /var/tmp/ucx-1.8.0.tar.gz
ENV CPATH=/usr/local/ucx/include:$CPATH \
    LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
    LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
    PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @ubuntu
    @docker
    def test_with_true_ubuntu(self):
        """ucx building block with True values"""
        u = ucx(cuda=True, gdrcopy=True, knem=True, ofed=True, xpmem=True,
                version='1.8.0')
        self.assertEqual(str(u),
r'''# UCX version 1.8.0
RUN apt-get update -y && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        binutils-dev \
        file \
        libnuma-dev \
        make \
        wget && \
    rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.8.0/ucx-1.8.0.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.8.0.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.8.0 && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/usr/local/cuda --with-gdrcopy --with-knem --with-rdmacm --with-verbs --with-xpmem && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    rm -rf /var/tmp/ucx-1.8.0 /var/tmp/ucx-1.8.0.tar.gz
ENV CPATH=/usr/local/ucx/include:$CPATH \
    LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
    LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
    PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @ubuntu
    @docker
    def test_with_false_ubuntu(self):
        """ucx building block with False values"""
        u = ucx(cuda=False, gdrcopy=False, knem=False, ofed=False, xpmem=False,
                version='1.8.0')
        self.assertEqual(str(u),
r'''# UCX version 1.8.0
RUN apt-get update -y && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        binutils-dev \
        file \
        libnuma-dev \
        make \
        wget && \
    rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.8.0/ucx-1.8.0.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.8.0.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.8.0 && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --without-cuda --without-gdrcopy --without-knem --without-rdmacm --without-verbs --without-xpmem && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    rm -rf /var/tmp/ucx-1.8.0 /var/tmp/ucx-1.8.0.tar.gz
ENV CPATH=/usr/local/ucx/include:$CPATH \
    LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
    LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
    PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @ubuntu
    @docker
    def test_ldconfig(self):
        """ldconfig option"""
        u = ucx(ldconfig=True, version='1.4.0')
        self.assertEqual(str(u),
r'''# UCX version 1.4.0
RUN apt-get update -y && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        binutils-dev \
        file \
        libnuma-dev \
        make \
        wget && \
    rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.4.0/ucx-1.4.0.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.4.0.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.4.0 && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/usr/local/cuda && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    echo "/usr/local/ucx/lib" >> /etc/ld.so.conf.d/hpccm.conf && ldconfig && \
    rm -rf /var/tmp/ucx-1.4.0 /var/tmp/ucx-1.4.0.tar.gz
ENV CPATH=/usr/local/ucx/include:$CPATH \
    LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
    PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @ubuntu
    @docker
    def test_environment(self):
        """environment option"""
        u = ucx(environment=False, version='1.4.0')
        self.assertEqual(str(u),
r'''# UCX version 1.4.0
RUN apt-get update -y && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        binutils-dev \
        file \
        libnuma-dev \
        make \
        wget && \
    rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.4.0/ucx-1.4.0.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.4.0.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.4.0 && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/usr/local/cuda && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    rm -rf /var/tmp/ucx-1.4.0 /var/tmp/ucx-1.4.0.tar.gz''')

    @ppc64le
    @centos
    @docker
    def test_ppc64le(self):
        """ppc64le"""
        u = ucx(cuda=False, knem='/usr/local/knem', version='1.5.2')
        self.assertEqual(str(u),
r'''# UCX version 1.5.2
RUN yum install -y \
        binutils-devel \
        file \
        make \
        numactl-devel \
        wget && \
    rm -rf /var/cache/yum/*
RUN mkdir -p /var/tmp && wget -q -nc --no-check-certificate -P /var/tmp https://github.com/openucx/ucx/releases/download/v1.5.2/ucx-1.5.2.tar.gz && \
    mkdir -p /var/tmp && tar -x -f /var/tmp/ucx-1.5.2.tar.gz -C /var/tmp -z && \
    cd /var/tmp/ucx-1.5.2 && CFLAGS=-Wno-error=format ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-knem=/usr/local/knem --without-cuda && \
    make -j$(nproc) && \
    make -j$(nproc) install && \
    rm -rf /var/tmp/ucx-1.5.2 /var/tmp/ucx-1.5.2.tar.gz
ENV CPATH=/usr/local/ucx/include:$CPATH \
    LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
    LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
    PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @ubuntu
    @docker
    def test_git_repository_true(self):
        u = ucx(repository=True)
        self.assertEqual(str(u),
r'''# UCX https://github.com/openucx/ucx.git
RUN apt-get update -y && \
DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
autoconf \
automake \
binutils-dev \
ca-certificates \
file \
git \
libnuma-dev \
libtool \
make \
wget && \
rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && cd /var/tmp && git clone --depth=1 https://github.com/openucx/ucx.git ucx && cd - && \
cd /var/tmp/ucx && \
./autogen.sh && \
cd /var/tmp/ucx && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/usr/local/cuda && \
make -j$(nproc) && \
make -j$(nproc) install && \
rm -rf /var/tmp/ucx
ENV CPATH=/usr/local/ucx/include:$CPATH \
LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
PATH=/usr/local/ucx/bin:$PATH''')

    @x86_64
    @ubuntu
    @docker
    def test_git_repository_value(self):
        u = ucx(branch='v1.8.x',
                repository='https://github.com/openucx-fork/ucx.git')
        self.assertEqual(str(u),
r'''# UCX https://github.com/openucx-fork/ucx.git v1.8.x
RUN apt-get update -y && \
DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
autoconf \
automake \
binutils-dev \
ca-certificates \
file \
git \
libnuma-dev \
libtool \
make \
wget && \
rm -rf /var/lib/apt/lists/*
RUN mkdir -p /var/tmp && cd /var/tmp && git clone --depth=1 --branch v1.8.x https://github.com/openucx-fork/ucx.git ucx && cd - && \
cd /var/tmp/ucx && \
./autogen.sh && \
cd /var/tmp/ucx && ./configure --prefix=/usr/local/ucx --disable-assertions --disable-debug --disable-doxygen-doc --disable-logging --disable-params-check --enable-optimizations --with-cuda=/usr/local/cuda && \
make -j$(nproc) && \
make -j$(nproc) install && \
rm -rf /var/tmp/ucx
ENV CPATH=/usr/local/ucx/include:$CPATH \
LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
PATH=/usr/local/ucx/bin:$PATH''')

    @ubuntu
    @docker
    def test_runtime(self):
        """Runtime"""
        u = ucx()
        r = u.runtime()
        self.assertEqual(r,
r'''# UCX
RUN apt-get update -y && \
DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
binutils && \
rm -rf /var/lib/apt/lists/*
COPY --from=0 /usr/local/ucx /usr/local/ucx
ENV CPATH=/usr/local/ucx/include:$CPATH \
LD_LIBRARY_PATH=/usr/local/ucx/lib:$LD_LIBRARY_PATH \
LIBRARY_PATH=/usr/local/ucx/lib:$LIBRARY_PATH \
PATH=/usr/local/ucx/bin:$PATH''')
# mct_camera_trigger/src/mct_camera_trigger/__init__.py (iorodeo/mct, Apache-2.0)
import trigger_device
import camera_trigger
# unrolled-lutnet/training-software/MNIST-CIFAR-SVHN/models/SVHN/scripts/lutnet_init.py (awai54st/LUTNet, BSD-2-Clause)
import h5py
import numpy as np
from shutil import copyfile
copyfile("dummy_lutnet.h5", "pretrained_bin.h5") # create pretrained.h5 using datastructure from dummy.h5
bl = h5py.File("baseline_pruned.h5", 'r')
#dummy = h5py.File("dummy.h5", 'r')
pretrained = h5py.File("pretrained_bin.h5", 'r+')
# conv layer 1
bl_w1 = bl["model_weights"]["binary_conv_1"]["binary_conv_1"]["Variable_1:0"]
#bl_rand_map = bl["model_weights"]["binary_conv_1"]["binary_conv_1"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_conv_1"]["binary_conv_1"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_conv_1"]["binary_conv_1"]["Variable:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_conv_1"]["binary_conv_1"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_conv_1"]["binary_conv_1"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_conv_1"]["binary_conv_1"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_conv_1"]["binary_conv_1"]["Variable:0"]
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# conv layer 2
bl_w1 = bl["model_weights"]["binary_conv_2"]["binary_conv_2"]["Variable_1:0"]
#bl_w2 = bl["model_weights"]["binary_conv_2"]["binary_conv_2"]["Variable_2:0"]
#bl_w3 = bl["model_weights"]["binary_conv_2"]["binary_conv_2"]["Variable_3:0"]
#bl_w4 = bl["model_weights"]["binary_conv_2"]["binary_conv_2"]["Variable_4:0"]
#bl_rand_map = bl["model_weights"]["binary_conv_2"]["binary_conv_2"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_conv_2"]["binary_conv_2"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_conv_2"]["binary_conv_2"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_1"]["residual_sign_1"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_conv_2"]["binary_conv_2"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_conv_2"]["binary_conv_2"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_conv_2"]["binary_conv_2"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_conv_2"]["binary_conv_2"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_1"]["residual_sign_1"]["means:0"]
#weight_shape = np.shape(bl_w1)
#
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# conv layer 3
bl_w1 = bl["model_weights"]["binary_conv_3"]["binary_conv_3"]["Variable_1:0"]
#bl_rand_map = bl["model_weights"]["binary_conv_3"]["binary_conv_3"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_conv_3"]["binary_conv_3"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_conv_3"]["binary_conv_3"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_2"]["residual_sign_2"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_conv_3"]["binary_conv_3"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_conv_3"]["binary_conv_3"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_conv_3"]["binary_conv_3"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_conv_3"]["binary_conv_3"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_2"]["residual_sign_2"]["means:0"]
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# conv layer 4
bl_w1 = bl["model_weights"]["binary_conv_4"]["binary_conv_4"]["Variable_1:0"]
#bl_rand_map = bl["model_weights"]["binary_conv_4"]["binary_conv_4"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_conv_4"]["binary_conv_4"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_conv_4"]["binary_conv_4"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_3"]["residual_sign_3"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_conv_4"]["binary_conv_4"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_conv_4"]["binary_conv_4"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_conv_4"]["binary_conv_4"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_conv_4"]["binary_conv_4"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_3"]["residual_sign_3"]["means:0"]
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# conv layer 5
bl_w1 = bl["model_weights"]["binary_conv_5"]["binary_conv_5"]["Variable_1:0"]
#bl_rand_map = bl["model_weights"]["binary_conv_5"]["binary_conv_5"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_conv_5"]["binary_conv_5"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_conv_5"]["binary_conv_5"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_4"]["residual_sign_4"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_conv_5"]["binary_conv_5"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_conv_5"]["binary_conv_5"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_conv_5"]["binary_conv_5"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_conv_5"]["binary_conv_5"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_4"]["residual_sign_4"]["means:0"]
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# conv layer 6
bl_w1 = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_1:0"]
#bl_w2 = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_2:0"]
#bl_w3 = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_3:0"]
#bl_w4 = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_4:0"]
bl_rand_map_0 = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["rand_map_0:0"]
bl_rand_map_1 = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["rand_map_1:0"]
bl_rand_map_2 = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["rand_map_2:0"]
bl_pruning_mask = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_5"]["residual_sign_5"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_1:0"]
pret_w2 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_2:0"]
pret_w3 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_3:0"]
pret_w4 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_4:0"]
pret_w5 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_5:0"]
pret_w6 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_6:0"]
pret_w7 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_7:0"]
pret_w8 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_8:0"]
pret_w9 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_9:0"]
pret_w10 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_10:0"]
pret_w11 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_11:0"]
pret_w12 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_12:0"]
pret_w13 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_13:0"]
pret_w14 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_14:0"]
pret_w15 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_15:0"]
pret_w16 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_16:0"]
pret_w17 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_17:0"]
pret_w18 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_18:0"]
pret_w19 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_19:0"]
pret_w20 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_20:0"]
pret_w21 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_21:0"]
pret_w22 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_22:0"]
pret_w23 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_23:0"]
pret_w24 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_24:0"]
pret_w25 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_25:0"]
pret_w26 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_26:0"]
pret_w27 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_27:0"]
pret_w28 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_28:0"]
pret_w29 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_29:0"]
pret_w30 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_30:0"]
pret_w31 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_31:0"]
pret_w32 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable_32:0"]
pret_rand_map_0 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["rand_map_0:0"]
pret_rand_map_1 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["rand_map_1:0"]
pret_rand_map_2 = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["rand_map_2:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_conv_6"]["binary_conv_6"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_5"]["residual_sign_5"]["means:0"]
weight_shape = np.shape(bl_w1)
# randomisation and pruning recovery
bl_w1_unroll = np.reshape(np.array(bl_w1), (-1,weight_shape[3]))
bl_w1 = np.array(bl_w1)
rand_map_0 = np.arange(weight_shape[0]*weight_shape[1]*weight_shape[2])
np.random.shuffle(rand_map_0)
rand_map_1 = np.arange(weight_shape[0]*weight_shape[1]*weight_shape[2])
np.random.shuffle(rand_map_1)
rand_map_2 = np.arange(weight_shape[0]*weight_shape[1]*weight_shape[2])
np.random.shuffle(rand_map_2)
pruning_mask = np.array(bl_pruning_mask).astype(bool)
# weights for extra input 0
init_mask = np.logical_not(pruning_mask[rand_map_0])
pruning_mask_recover = np.logical_and(pruning_mask, init_mask)[np.argsort(rand_map_0)]
pruning_mask = np.logical_or(pruning_mask, pruning_mask_recover)
init_mask = np.reshape(init_mask, weight_shape)
bl_w1_rand = bl_w1_unroll[rand_map_0]
bl_w1_rand = np.reshape(bl_w1_rand, weight_shape)
# take independent copies: plain aliasing (w1 = bl_w1) would make every
# positive wK share one buffer, so the masked in-place updates below would
# write through to all of them; unary minus already returns a new array
w1 = np.copy(bl_w1)
w2 = np.copy(bl_w1)
w3 = np.copy(bl_w1)
w4 = np.copy(bl_w1)
w5 = np.copy(bl_w1)
w6 = np.copy(bl_w1)
w7 = np.copy(bl_w1)
w8 = np.copy(bl_w1)
w9 = -bl_w1
w10 = -bl_w1
w11 = -bl_w1
w12 = -bl_w1
w13 = -bl_w1
w14 = -bl_w1
w15 = -bl_w1
w16 = -bl_w1
w17 = np.copy(bl_w1)
w18 = np.copy(bl_w1)
w19 = np.copy(bl_w1)
w20 = np.copy(bl_w1)
w21 = np.copy(bl_w1)
w22 = np.copy(bl_w1)
w23 = np.copy(bl_w1)
w24 = np.copy(bl_w1)
w25 = -bl_w1
w26 = -bl_w1
w27 = -bl_w1
w28 = -bl_w1
w29 = -bl_w1
w30 = -bl_w1
w31 = -bl_w1
w32 = -bl_w1
w1[init_mask] = w1[init_mask] + bl_w1_rand[init_mask]
w2[init_mask] = w2[init_mask] + bl_w1_rand[init_mask]
w3[init_mask] = w3[init_mask] + bl_w1_rand[init_mask]
w4[init_mask] = w4[init_mask] + bl_w1_rand[init_mask]
w5[init_mask] = w5[init_mask] - bl_w1_rand[init_mask]
w6[init_mask] = w6[init_mask] - bl_w1_rand[init_mask]
w7[init_mask] = w7[init_mask] - bl_w1_rand[init_mask]
w8[init_mask] = w8[init_mask] - bl_w1_rand[init_mask]
w9[init_mask] = w9[init_mask] + bl_w1_rand[init_mask]
w10[init_mask] = w10[init_mask] + bl_w1_rand[init_mask]
w11[init_mask] = w11[init_mask] + bl_w1_rand[init_mask]
w12[init_mask] = w12[init_mask] + bl_w1_rand[init_mask]
w13[init_mask] = w13[init_mask] - bl_w1_rand[init_mask]
w14[init_mask] = w14[init_mask] - bl_w1_rand[init_mask]
w15[init_mask] = w15[init_mask] - bl_w1_rand[init_mask]
w16[init_mask] = w16[init_mask] - bl_w1_rand[init_mask]
w17[init_mask] = w17[init_mask] + bl_w1_rand[init_mask]
w18[init_mask] = w18[init_mask] + bl_w1_rand[init_mask]
w19[init_mask] = w19[init_mask] + bl_w1_rand[init_mask]
w20[init_mask] = w20[init_mask] + bl_w1_rand[init_mask]
w21[init_mask] = w21[init_mask] - bl_w1_rand[init_mask]
w22[init_mask] = w22[init_mask] - bl_w1_rand[init_mask]
w23[init_mask] = w23[init_mask] - bl_w1_rand[init_mask]
w24[init_mask] = w24[init_mask] - bl_w1_rand[init_mask]
w25[init_mask] = w25[init_mask] + bl_w1_rand[init_mask]
w26[init_mask] = w26[init_mask] + bl_w1_rand[init_mask]
w27[init_mask] = w27[init_mask] + bl_w1_rand[init_mask]
w28[init_mask] = w28[init_mask] + bl_w1_rand[init_mask]
w29[init_mask] = w29[init_mask] - bl_w1_rand[init_mask]
w30[init_mask] = w30[init_mask] - bl_w1_rand[init_mask]
w31[init_mask] = w31[init_mask] - bl_w1_rand[init_mask]
w32[init_mask] = w32[init_mask] - bl_w1_rand[init_mask]
# weights for extra input 1
init_mask = np.logical_not(pruning_mask[rand_map_1])
pruning_mask_recover = np.logical_and(pruning_mask, init_mask)[np.argsort(rand_map_1)]
pruning_mask = np.logical_or(pruning_mask, pruning_mask_recover)
init_mask = np.reshape(init_mask, weight_shape)
bl_w1_rand = bl_w1_unroll[rand_map_1]
bl_w1_rand = np.reshape(bl_w1_rand, weight_shape)
w1[init_mask] = w1[init_mask] + bl_w1_rand[init_mask]
w2[init_mask] = w2[init_mask] + bl_w1_rand[init_mask]
w3[init_mask] = w3[init_mask] - bl_w1_rand[init_mask]
w4[init_mask] = w4[init_mask] - bl_w1_rand[init_mask]
w5[init_mask] = w5[init_mask] + bl_w1_rand[init_mask]
w6[init_mask] = w6[init_mask] + bl_w1_rand[init_mask]
w7[init_mask] = w7[init_mask] - bl_w1_rand[init_mask]
w8[init_mask] = w8[init_mask] - bl_w1_rand[init_mask]
w9[init_mask] = w9[init_mask] + bl_w1_rand[init_mask]
w10[init_mask] = w10[init_mask] + bl_w1_rand[init_mask]
w11[init_mask] = w11[init_mask] - bl_w1_rand[init_mask]
w12[init_mask] = w12[init_mask] - bl_w1_rand[init_mask]
w13[init_mask] = w13[init_mask] + bl_w1_rand[init_mask]
w14[init_mask] = w14[init_mask] + bl_w1_rand[init_mask]
w15[init_mask] = w15[init_mask] - bl_w1_rand[init_mask]
w16[init_mask] = w16[init_mask] - bl_w1_rand[init_mask]
w17[init_mask] = w17[init_mask] + bl_w1_rand[init_mask]
w18[init_mask] = w18[init_mask] + bl_w1_rand[init_mask]
w19[init_mask] = w19[init_mask] - bl_w1_rand[init_mask]
w20[init_mask] = w20[init_mask] - bl_w1_rand[init_mask]
w21[init_mask] = w21[init_mask] + bl_w1_rand[init_mask]
w22[init_mask] = w22[init_mask] + bl_w1_rand[init_mask]
w23[init_mask] = w23[init_mask] - bl_w1_rand[init_mask]
w24[init_mask] = w24[init_mask] - bl_w1_rand[init_mask]
w25[init_mask] = w25[init_mask] + bl_w1_rand[init_mask]
w26[init_mask] = w26[init_mask] + bl_w1_rand[init_mask]
w27[init_mask] = w27[init_mask] - bl_w1_rand[init_mask]
w28[init_mask] = w28[init_mask] - bl_w1_rand[init_mask]
w29[init_mask] = w29[init_mask] + bl_w1_rand[init_mask]
w30[init_mask] = w30[init_mask] + bl_w1_rand[init_mask]
w31[init_mask] = w31[init_mask] - bl_w1_rand[init_mask]
w32[init_mask] = w32[init_mask] - bl_w1_rand[init_mask]
# weights for extra input 2
init_mask = np.logical_not(pruning_mask[rand_map_2])
pruning_mask_recover = np.logical_and(pruning_mask, init_mask)[np.argsort(rand_map_2)]
pruning_mask = np.logical_or(pruning_mask, pruning_mask_recover)
init_mask = np.reshape(init_mask, weight_shape)
bl_w1_rand = bl_w1_unroll[rand_map_2]
bl_w1_rand = np.reshape(bl_w1_rand, weight_shape)
w1[init_mask] = w1[init_mask] + bl_w1_rand[init_mask]
w2[init_mask] = w2[init_mask] - bl_w1_rand[init_mask]
w3[init_mask] = w3[init_mask] + bl_w1_rand[init_mask]
w4[init_mask] = w4[init_mask] - bl_w1_rand[init_mask]
w5[init_mask] = w5[init_mask] + bl_w1_rand[init_mask]
w6[init_mask] = w6[init_mask] - bl_w1_rand[init_mask]
w7[init_mask] = w7[init_mask] + bl_w1_rand[init_mask]
w8[init_mask] = w8[init_mask] - bl_w1_rand[init_mask]
w9[init_mask] = w9[init_mask] + bl_w1_rand[init_mask]
w10[init_mask] = w10[init_mask] - bl_w1_rand[init_mask]
w11[init_mask] = w11[init_mask] + bl_w1_rand[init_mask]
w12[init_mask] = w12[init_mask] - bl_w1_rand[init_mask]
w13[init_mask] = w13[init_mask] + bl_w1_rand[init_mask]
w14[init_mask] = w14[init_mask] - bl_w1_rand[init_mask]
w15[init_mask] = w15[init_mask] + bl_w1_rand[init_mask]
w16[init_mask] = w16[init_mask] - bl_w1_rand[init_mask]
w17[init_mask] = w17[init_mask] + bl_w1_rand[init_mask]
w18[init_mask] = w18[init_mask] - bl_w1_rand[init_mask]
w19[init_mask] = w19[init_mask] + bl_w1_rand[init_mask]
w20[init_mask] = w20[init_mask] - bl_w1_rand[init_mask]
w21[init_mask] = w21[init_mask] + bl_w1_rand[init_mask]
w22[init_mask] = w22[init_mask] - bl_w1_rand[init_mask]
w23[init_mask] = w23[init_mask] + bl_w1_rand[init_mask]
w24[init_mask] = w24[init_mask] - bl_w1_rand[init_mask]
w25[init_mask] = w25[init_mask] + bl_w1_rand[init_mask]
w26[init_mask] = w26[init_mask] - bl_w1_rand[init_mask]
w27[init_mask] = w27[init_mask] + bl_w1_rand[init_mask]
w28[init_mask] = w28[init_mask] - bl_w1_rand[init_mask]
w29[init_mask] = w29[init_mask] + bl_w1_rand[init_mask]
w30[init_mask] = w30[init_mask] - bl_w1_rand[init_mask]
w31[init_mask] = w31[init_mask] + bl_w1_rand[init_mask]
w32[init_mask] = w32[init_mask] - bl_w1_rand[init_mask]
pret_w1[...] = w1
pret_w2[...] = w2
pret_w3[...] = w3
pret_w4[...] = w4
pret_w5[...] = w5
pret_w6[...] = w6
pret_w7[...] = w7
pret_w8[...] = w8
pret_w9[...] = w9
pret_w10[...] = w10
pret_w11[...] = w11
pret_w12[...] = w12
pret_w13[...] = w13
pret_w14[...] = w14
pret_w15[...] = w15
pret_w16[...] = w16
pret_w17[...] = w17
pret_w18[...] = w18
pret_w19[...] = w19
pret_w20[...] = w20
pret_w21[...] = w21
pret_w22[...] = w22
pret_w23[...] = w23
pret_w24[...] = w24
pret_w25[...] = w25
pret_w26[...] = w26
pret_w27[...] = w27
pret_w28[...] = w28
pret_w29[...] = w29
pret_w30[...] = w30
pret_w31[...] = w31
pret_w32[...] = w32
pret_rand_map_0[...] = np.reshape(rand_map_0, (-1,1)).astype(float)
pret_rand_map_1[...] = np.reshape(rand_map_1, (-1,1)).astype(float)
pret_rand_map_2[...] = np.reshape(rand_map_2, (-1,1)).astype(float)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# dense layer 1
bl_w1 = bl["model_weights"]["binary_dense_1"]["binary_dense_1"]["Variable_1:0"]
#bl_rand_map = bl["model_weights"]["binary_dense_1"]["binary_dense_1"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_dense_1"]["binary_dense_1"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_dense_1"]["binary_dense_1"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_6"]["residual_sign_6"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_dense_1"]["binary_dense_1"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_dense_1"]["binary_dense_1"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_dense_1"]["binary_dense_1"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_dense_1"]["binary_dense_1"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_6"]["residual_sign_6"]["means:0"]
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# dense layer 2
bl_w1 = bl["model_weights"]["binary_dense_2"]["binary_dense_2"]["Variable_1:0"]
#bl_rand_map = bl["model_weights"]["binary_dense_2"]["binary_dense_2"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_dense_2"]["binary_dense_2"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_dense_2"]["binary_dense_2"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_7"]["residual_sign_7"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_dense_2"]["binary_dense_2"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_dense_2"]["binary_dense_2"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_dense_2"]["binary_dense_2"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_dense_2"]["binary_dense_2"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_7"]["residual_sign_7"]["means:0"]
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# dense layer 3
bl_w1 = bl["model_weights"]["binary_dense_3"]["binary_dense_3"]["Variable_1:0"]
#bl_rand_map = bl["model_weights"]["binary_dense_3"]["binary_dense_3"]["rand_map:0"]
bl_pruning_mask = bl["model_weights"]["binary_dense_3"]["binary_dense_3"]["pruning_mask:0"]
bl_gamma = bl["model_weights"]["binary_dense_3"]["binary_dense_3"]["Variable:0"]
bl_means = bl["model_weights"]["residual_sign_8"]["residual_sign_8"]["means:0"]
zero_fill = np.zeros(np.shape(np.array(bl_w1)))
pret_w1 = pretrained["model_weights"]["binary_dense_3"]["binary_dense_3"]["Variable_1:0"]
#pret_rand_map = pretrained["model_weights"]["binary_dense_3"]["binary_dense_3"]["rand_map:0"]
pret_pruning_mask = pretrained["model_weights"]["binary_dense_3"]["binary_dense_3"]["pruning_mask:0"]
p_gamma = pretrained["model_weights"]["binary_dense_3"]["binary_dense_3"]["Variable:0"]
pret_means = pretrained["model_weights"]["residual_sign_8"]["residual_sign_8"]["means:0"]
pret_w1[...] = np.array(bl_w1)
#pret_rand_map[...] = np.array(bl_rand_map)
p_gamma[...] = np.array(bl_gamma)
pret_means[...] = np.array(bl_means)
pret_pruning_mask[...] = np.array(bl_pruning_mask)
print(np.sum(np.array(bl_pruning_mask)), np.prod(np.shape(np.array(bl_pruning_mask))))
# bn 1
bl_beta = bl["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_1"]["batch_normalization_1"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 2
bl_beta = bl["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_2"]["batch_normalization_2"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 3
bl_beta = bl["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_3"]["batch_normalization_3"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 4
bl_beta = bl["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_4"]["batch_normalization_4"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 5
bl_beta = bl["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_5"]["batch_normalization_5"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 6
bl_beta = bl["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_6"]["batch_normalization_6"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 7
bl_beta = bl["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_7"]["batch_normalization_7"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 8
bl_beta = bl["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_8"]["batch_normalization_8"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
# bn 9
bl_beta = bl["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["beta:0"]
bl_gamma = bl["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["gamma:0"]
bl_moving_mean = bl["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["moving_mean:0"]
bl_moving_variance = bl["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["moving_variance:0"]
p_beta = pretrained["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["beta:0"]
p_gamma = pretrained["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["gamma:0"]
p_moving_mean = pretrained["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["moving_mean:0"]
p_moving_variance = pretrained["model_weights"]["batch_normalization_9"]["batch_normalization_9"]["moving_variance:0"]
p_beta[...] = np.array(bl_beta)
p_gamma[...] = np.array(bl_gamma)
p_moving_mean[...] = np.array(bl_moving_mean)
p_moving_variance[...] = np.array(bl_moving_variance)
pretrained.close()
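The per-layer copy blocks above are identical except for the layer index; a loop removes the repetition. This is a sketch only: it assumes `bl` and `pretrained` are open `h5py` files with the `batch_normalization_<i>` group names used above, and `copy_batchnorm_weights` is a new helper name, not part of the original script.

```python
import numpy as np

def copy_batchnorm_weights(src, dst, indices):
    """Copy beta/gamma/moving_mean/moving_variance for each BN layer index."""
    params = ("beta:0", "gamma:0", "moving_mean:0", "moving_variance:0")
    for i in indices:
        name = "batch_normalization_%d" % i
        for param in params:
            # In-place dataset assignment, same as the p_*[...] lines above.
            dst["model_weights"][name][name][param][...] = np.array(
                src["model_weights"][name][name][param]
            )

# copy_batchnorm_weights(bl, pretrained, range(2, 10))  # layers shown above
```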

# --- listcord/__init__.py (imkr-vishal/listcord.py, MIT) ---
import requests
import aiohttp
class Client():
def __init__(self, token: str):
self.token = token
self.baseURL = 'https://listcord.xyz/api'
def get_bot(self, id: str):
data = requests.get(self.baseURL + '/bot/' + id, headers={ 'token': self.token })
return data.json()
async def get_bot_async(self, id: str):
async with aiohttp.ClientSession() as session:
async with session.get(self.baseURL + '/bot/' + id, headers={ 'token': self.token }) as result:
return await result.json()
def get_bot_reviews(self, id: str):
data = requests.get(self.baseURL + '/bot/' + id + '/reviews', headers={ 'token': self.token })
return data.json()
async def get_bot_reviews_async(self, id: str):
async with aiohttp.ClientSession() as session:
async with session.get(self.baseURL + '/bot/' + id + '/reviews', headers={ 'token': self.token }) as result:
return await result.json()
def get_review(self, user_id: str, bot_id: str):
reviews = self.get_bot_reviews(bot_id)
if not isinstance(reviews, list):
return None
for review in reviews:
if review['author_id'] == user_id:
return review
return None
async def get_review_async(self, user_id: str, bot_id: str):
async with aiohttp.ClientSession() as session:
async with session.get(self.baseURL + '/bot/' + bot_id + '/reviews', headers={ 'token': self.token }) as result:
reviews = await result.json()
if not isinstance(reviews, list):
return None
for review in reviews:
if review['author_id'] == user_id:
return review
return None
def has_voted(self, user_id: str, bot_id: str):
data = requests.get(self.baseURL + '/bot/' + bot_id + '/voted', params={ 'user_id': user_id }, headers={ 'token': self.token })
return data.json()
async def has_voted_async(self, user_id: str, bot_id: str):
async with aiohttp.ClientSession() as session:
async with session.get(self.baseURL + '/bot/' + bot_id + '/voted', params={ 'user_id': user_id }, headers={ 'token': self.token }) as result:
return await result.json()
def search(self, q: str):
data = requests.get(self.baseURL + '/bots', params={ 'q': q }, headers={ 'token': self.token })
return data.json()
async def search_async(self, q: str):
async with aiohttp.ClientSession() as session:
async with session.get(self.baseURL + '/bots', params={ 'q': q }, headers={ 'token': self.token }) as result:
return await result.json()
def __str__(self):
return 'Listcord<Client>'
__version__ = '1.5.0'
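The linear scan inside `get_review` can be factored into a pure helper that is unit-testable without any network access. `find_review` is a hypothetical name introduced here; the `'author_id'` field matches the code above.

```python
def find_review(reviews, user_id):
    """Return the review authored by user_id, or None (also on error payloads)."""
    if not isinstance(reviews, list):  # API errors come back as dicts
        return None
    for review in reviews:
        if review['author_id'] == user_id:
            return review
    return None
```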

# --- raiden/messages/withdraw.py (ExchangeUnion/raiden, MIT) ---
from dataclasses import dataclass
from raiden.constants import EMPTY_SIGNATURE
from raiden.messages.abstract import SignedRetrieableMessage
from raiden.messages.cmdid import CmdId
from raiden.utils.signing import pack_data
from raiden.utils.typing import (
Address,
BlockExpiration,
ChainID,
ChannelID,
ClassVar,
Nonce,
TokenNetworkAddress,
WithdrawAmount,
)
from raiden_contracts.constants import MessageTypeId
@dataclass(repr=False, eq=False)
class WithdrawRequest(SignedRetrieableMessage):
""" Requests a signed on-chain withdraw confirmation from partner. """
cmdid: ClassVar[CmdId] = CmdId.WITHDRAW_REQUEST
message_type: ClassVar[int] = MessageTypeId.WITHDRAW
chain_id: ChainID
token_network_address: TokenNetworkAddress
channel_identifier: ChannelID
participant: Address
total_withdraw: WithdrawAmount
nonce: Nonce
expiration: BlockExpiration
@classmethod
def from_event(cls, event):
return cls(
message_identifier=event.message_identifier,
chain_id=event.canonical_identifier.chain_identifier,
token_network_address=event.canonical_identifier.token_network_address,
channel_identifier=event.canonical_identifier.channel_identifier,
total_withdraw=event.total_withdraw,
participant=event.participant,
nonce=event.nonce,
expiration=event.expiration,
signature=EMPTY_SIGNATURE,
)
def _data_to_sign(self) -> bytes:
return pack_data(
(self.token_network_address, "address"),
(self.chain_id, "uint256"),
(self.message_type, "uint256"),
(self.channel_identifier, "uint256"),
(self.participant, "address"),
(self.total_withdraw, "uint256"),
(self.expiration, "uint256"),
)
@dataclass(repr=False, eq=False)
class WithdrawConfirmation(SignedRetrieableMessage):
""" Confirms withdraw to partner with a signature """
cmdid: ClassVar[CmdId] = CmdId.WITHDRAW_CONFIRMATION
message_type: ClassVar[int] = MessageTypeId.WITHDRAW
chain_id: ChainID
token_network_address: TokenNetworkAddress
channel_identifier: ChannelID
participant: Address
total_withdraw: WithdrawAmount
nonce: Nonce
expiration: BlockExpiration
@classmethod
def from_event(cls, event):
return cls(
message_identifier=event.message_identifier,
chain_id=event.canonical_identifier.chain_identifier,
token_network_address=event.canonical_identifier.token_network_address,
channel_identifier=event.canonical_identifier.channel_identifier,
total_withdraw=event.total_withdraw,
participant=event.participant,
nonce=event.nonce,
expiration=event.expiration,
signature=EMPTY_SIGNATURE,
)
def _data_to_sign(self) -> bytes:
return pack_data(
(self.token_network_address, "address"),
(self.chain_id, "uint256"),
(self.message_type, "uint256"),
(self.channel_identifier, "uint256"),
(self.participant, "address"),
(self.total_withdraw, "uint256"),
(self.expiration, "uint256"),
)
@dataclass(repr=False, eq=False)
class WithdrawExpired(SignedRetrieableMessage):
""" Notifies about withdraw expiration/cancellation from partner. """
cmdid: ClassVar[CmdId] = CmdId.WITHDRAW_EXPIRED
message_type: ClassVar[int] = MessageTypeId.WITHDRAW
chain_id: ChainID
token_network_address: TokenNetworkAddress
channel_identifier: ChannelID
participant: Address
total_withdraw: WithdrawAmount
expiration: BlockExpiration
nonce: Nonce
@classmethod
def from_event(cls, event):
return cls(
message_identifier=event.message_identifier,
chain_id=event.canonical_identifier.chain_identifier,
token_network_address=event.canonical_identifier.token_network_address,
channel_identifier=event.canonical_identifier.channel_identifier,
total_withdraw=event.total_withdraw,
participant=event.participant,
nonce=event.nonce,
expiration=event.expiration,
signature=EMPTY_SIGNATURE,
)
def _data_to_sign(self) -> bytes:
return pack_data(
(self.token_network_address, "address"),
(self.chain_id, "uint256"),
(self.message_type, "uint256"),
(self.channel_identifier, "uint256"),
(self.participant, "address"),
(self.total_withdraw, "uint256"),
(self.expiration, "uint256"),
)
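The three `_data_to_sign` bodies above are byte-for-byte identical, so a single module-level helper could replace them. `_pack_data_stub` below is only a stand-in so the sketch runs on its own; the real `pack_data` performs Solidity-style tight packing, and `withdraw_data_to_sign` is an illustrative name, not part of the raiden codebase.

```python
def _pack_data_stub(*pairs):
    # Stand-in for raiden.utils.signing.pack_data (illustration only).
    return b''.join(str(value).encode() for value, _solidity_type in pairs)

def withdraw_data_to_sign(msg, pack=_pack_data_stub) -> bytes:
    # Shared packing for WithdrawRequest/WithdrawConfirmation/WithdrawExpired.
    return pack(
        (msg.token_network_address, 'address'),
        (msg.chain_id, 'uint256'),
        (msg.message_type, 'uint256'),
        (msg.channel_identifier, 'uint256'),
        (msg.participant, 'address'),
        (msg.total_withdraw, 'uint256'),
        (msg.expiration, 'uint256'),
    )
```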

# --- sca/sca_events/__init__.py (open-power-sdk/source-code-advisor, Apache-2.0) ---
from sca_xml import ScaXml
| 9.333333 | 26 | 0.821429 | 5 | 28 | 4.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178571 | 28 | 2 | 27 | 14 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |

# --- community/models.py (loffle/loffle_back, MIT) ---
from django.db import models
from account.models import User
class CommonManager(models.Manager):
def __init__(self, is_deleted=False):
super().__init__()
self.is_deleted = is_deleted
def get_queryset(self):
# queryset = Post.objects.filter(is_deleted=False)
# queryset = Post.objects.filter(is_deleted=False).prefetch_related('like').select_related('user')
        # TODO: compare the performance and speed of these three querysets
return super().get_queryset().select_related('user').filter(is_deleted=self.is_deleted)
# --------------------------------------------------------
class Post(models.Model):
title = models.CharField(max_length=200)
content = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
modified_at = models.DateTimeField(auto_now=True)
is_deleted = models.BooleanField(default=False, editable=False)
user = models.ForeignKey(User, related_name="posts", on_delete=models.CASCADE)
# file = models.ManyToManyField(File, on_delete=models.SET_NULL, null=True, blank=True) # File
like = models.ManyToManyField(User, related_name="liked_posts", blank=True)
objects = CommonManager()
deleted_objects = CommonManager(is_deleted=True)
class PostComment(models.Model):
content = models.CharField(max_length=200)
created_at = models.DateTimeField(auto_now_add=True)
modified_at = models.DateTimeField(auto_now=True)
is_deleted = models.BooleanField(default=False, editable=False)
post = models.ForeignKey(Post, related_name="comments", on_delete=models.CASCADE)
user = models.ForeignKey(User, related_name="postcomments", on_delete=models.CASCADE)
like = models.ManyToManyField(User, related_name="liked_postcomments", blank=True)
# class Meta:
# db_table = '_'.join((__package__, 'post_comment'))
objects = CommonManager()
deleted_objects = CommonManager(is_deleted=True)
class Review(models.Model):
# title = models.CharField(max_length=200)
content = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
modified_at = models.DateTimeField(auto_now=True)
is_deleted = models.BooleanField(default=False, editable=False)
user = models.ForeignKey(User, related_name="reviews", on_delete=models.CASCADE)
# file = models.ManyToManyField(File, on_delete=models.SET_NULL, null=True, blank=True)
# raffle = models.ForeignKey(Raffle, on_delete=models.CASCADE)
like = models.ManyToManyField(User, related_name="liked_reviews", blank=True)
objects = CommonManager()
deleted_objects = CommonManager(is_deleted=True)
class ReviewComment(models.Model):
content = models.CharField(max_length=200)
created_at = models.DateTimeField(auto_now_add=True)
modified_at = models.DateTimeField(auto_now=True)
is_deleted = models.BooleanField(default=False, editable=False)
review = models.ForeignKey(Review, related_name="comments", on_delete=models.CASCADE)
user = models.ForeignKey(User, related_name="reviewcomments", on_delete=models.CASCADE)
like = models.ManyToManyField(User, related_name="liked_reviewcomments", blank=True)
objects = CommonManager()
deleted_objects = CommonManager(is_deleted=True)
class Notice(models.Model):
title = models.CharField(max_length=200)
content = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
modified_at = models.DateTimeField(auto_now=True)
is_deleted = models.BooleanField(default=False, editable=False)
user = models.ForeignKey(User, related_name="notices", on_delete=models.CASCADE)
# file = models.ManyToManyField(File, on_delete=models.SET_NULL, null=True, blank=True) # File
objects = CommonManager()
deleted_objects = CommonManager(is_deleted=True)
# ================= #
# Question & Answer #
# ================= #
class QuestionType(models.Model):
name = models.CharField(max_length=100, unique=True)
def __str__(self):
return self.name
class Question(models.Model):
title = models.CharField(max_length=200)
content = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
modified_at = models.DateTimeField(auto_now=True)
is_deleted = models.BooleanField(default=False, editable=False)
user = models.ForeignKey(User, related_name="questions", on_delete=models.CASCADE)
# file = models.ManyToManyField(File, on_delete=models.SET_NULL, null=True, blank=True) # File
question_type = models.ForeignKey(QuestionType, related_name="questions", on_delete=models.PROTECT)
objects = CommonManager()
deleted_objects = CommonManager(is_deleted=True)
class Answer(models.Model):
title = models.CharField(max_length=200)
content = models.TextField()
created_at = models.DateTimeField(auto_now_add=True)
modified_at = models.DateTimeField(auto_now=True)
is_deleted = models.BooleanField(default=False, editable=False)
user = models.ForeignKey(User, related_name="answers", on_delete=models.CASCADE)
# file = models.ForeignKey(File, on_delete=models.SET_NULL, null=True, blank=True) # File
question = models.ForeignKey(Question, related_name="answers", on_delete=models.CASCADE)
objects = CommonManager()
deleted_objects = CommonManager(is_deleted=True)
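The manager pair used throughout this module implements a soft-delete split: the default `objects` manager hides deleted rows, while `deleted_objects` returns only them. A plain-Python analogue of that filtering (illustration only; the real code goes through the Django ORM, and `split_by_deleted` is a hypothetical helper):

```python
def split_by_deleted(rows):
    """Partition dict-like rows the way the two CommonManager instances do."""
    live = [r for r in rows if not r['is_deleted']]
    deleted = [r for r in rows if r['is_deleted']]
    return live, deleted
```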

# --- dCC_Python_SodaMachine/backpack.py (JaredMartin0351/SodaMachineDebugging, MIT) ---
class Backpack:
purchased_cans = []
def __init__(self, purchased_cans):
self.purchased_cans = purchased_cans
| 16.5 | 44 | 0.666667 | 14 | 132 | 5.714286 | 0.5 | 0.65 | 0.425 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.257576 | 132 | 7 | 45 | 18.857143 | 0.816327 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |

# --- tests/s3/test_s3_actions.py (farhanangullia/chaostoolkit-aws, Apache-2.0) ---
import json
import os
from unittest.mock import MagicMock, patch
import pytest
from botocore.exceptions import ClientError
from chaoslib.exceptions import FailedActivity
from chaosaws import aws_client
from chaosaws.s3.actions import delete_object, toggle_versioning
data_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "data")
def read_configs(filename: str) -> dict:
config = os.path.join(data_path, filename)
with open(config, "r") as fh:
return json.loads(fh.read())
def mock_client_error(*args, **kwargs) -> ClientError:
return ClientError(
operation_name=kwargs["op"],
error_response={
"Error": {"Code": kwargs["Code"], "Message": kwargs["Message"]}
},
)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_delete_object_true(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_object.return_value = read_configs("get_object_1.json")
client.delete_object.return_value = {}
delete_object(bucket_name="Test-Bucket-1", object_key="path/to/some/file.json")
client.delete_object.assert_called_with(
Bucket="Test-Bucket-1", Key="path/to/some/file.json"
)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_delete_object_false_invalid_bucket(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_object.return_value = read_configs("get_object_1.json")
client.delete_object.return_value = {}
with pytest.raises(FailedActivity) as x:
delete_object(bucket_name="Test-Bucket-99", object_key="path/to/some/file.json")
assert 'Bucket "Test-Bucket-99" does not exist!' in str(x)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_delete_object_version_true(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_object.return_value = read_configs("get_object_1.json")
client.delete_object.return_value = {}
delete_object(
bucket_name="Test-Bucket-1",
object_key="path/to/some/file.json",
version_id="ab_cDefGhiJklMnoPqRsTu.aBcdEfGhi",
)
client.delete_object.assert_called_with(
Bucket="Test-Bucket-1",
Key="path/to/some/file.json",
VersionId="ab_cDefGhiJklMnoPqRsTu.aBcdEfGhi",
)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_toggle_versioning_no_bucket(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_bucket_versioning.return_value = read_configs(
"get_bucket_versioning_1.json"
)
params = {"bucket_name": "Test-Bucket-15", "status": "Enabled"}
with pytest.raises(FailedActivity) as x:
toggle_versioning(**params)
assert 'Bucket "Test-Bucket-15" does not exist!' in str(x)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_toggle_versioning_enable(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_bucket_versioning.return_value = read_configs(
"get_bucket_versioning_1.json"
)
params = {"bucket_name": "Test-Bucket-8", "status": "Enabled"}
toggle_versioning(**params)
client.put_bucket_versioning.assert_called_with(
Bucket="Test-Bucket-8", VersioningConfiguration={"Status": "Enabled"}
)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_toggle_versioning_enable_auto(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_bucket_versioning.return_value = read_configs(
"get_bucket_versioning_1.json"
)
params = {"bucket_name": "Test-Bucket-8"}
toggle_versioning(**params)
client.put_bucket_versioning.assert_called_with(
Bucket="Test-Bucket-8", VersioningConfiguration={"Status": "Enabled"}
)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_toggle_versioning_suspend(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_bucket_versioning.return_value = read_configs(
"get_bucket_versioning_2.json"
)
params = {"bucket_name": "Test-Bucket-8", "status": "Suspended"}
toggle_versioning(**params)
client.put_bucket_versioning.assert_called_with(
Bucket="Test-Bucket-8", VersioningConfiguration={"Status": "Suspended"}
)
@patch("chaosaws.s3.actions.aws_client", autospec=True)
def test_toggle_versioning_suspend_auto(test_client: aws_client):
client = MagicMock()
test_client.return_value = client
client.list_buckets.return_value = read_configs("list_buckets_1.json")
client.get_bucket_versioning.return_value = read_configs(
"get_bucket_versioning_2.json"
)
params = {"bucket_name": "Test-Bucket-8"}
toggle_versioning(**params)
client.put_bucket_versioning.assert_called_with(
Bucket="Test-Bucket-8", VersioningConfiguration={"Status": "Suspended"}
)
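Every test above repeats the same `MagicMock` wiring by hand; a small factory could centralize it (e.g. behind a pytest fixture). `make_s3_mock` is a sketch, not part of the original module, and the inline bucket list stands in for the `list_buckets_1.json` config file.

```python
from unittest.mock import MagicMock

def make_s3_mock(bucket_names):
    """Build the mocked S3 client each test configures by hand."""
    client = MagicMock()
    client.list_buckets.return_value = {
        'Buckets': [{'Name': name} for name in bucket_names]
    }
    client.delete_object.return_value = {}
    return client
```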

# --- examples/x01separated/team_and_user/_lazy.py (Danil-Grigorev/swagger-marshmallow-codegen, MIT) ---
def _useTeam():
from .team import Team
    return Team


def _useUser():
from .user import User
return User

# --- fuzzy_grassmann_numbers/fuzzy_number.py (ly3xqhl8g9/fuzzy-grassmann-numbers, MIT) ---
class FN():
"""Fuzzy Number
"""
def __init__(self, limits, granularity=0.1):
self.lower_limit = limits[0]
self.upper_limit = limits[1]
self.granularity = granularity
def add(self, other):
if isinstance(other, FN):
newFN = []
newFN.append(self.lower_limit + other.lower_limit)
newFN.append(self.upper_limit + other.upper_limit)
return FN(newFN)
else:
print('The number needs to be a Fuzzy Number (FN instance).')
def subtract(self, other):
if isinstance(other, FN):
newFN = []
newFN.append(self.lower_limit - other.lower_limit)
newFN.append(self.upper_limit - other.upper_limit)
return FN(newFN)
else:
print('The number needs to be a Fuzzy Number (FN instance).')
def multiply(self, other):
if isinstance(other, FN):
newFN = []
x1 = self.lower_limit
x2 = self.upper_limit
y1 = other.lower_limit
y2 = other.upper_limit
calculation = [ x1*y1, x1*y2, x2*y1, x2*y2 ]
lower_limit_min = min(calculation)
upper_limit_max = max(calculation)
newFN.append(lower_limit_min)
newFN.append(upper_limit_max)
            return FN(newFN)
        else:
            print('The number needs to be a Fuzzy Number (FN instance).')

    def divide(self, other):
if isinstance(other, FN):
newFN = []
x1 = self.lower_limit
x2 = self.upper_limit
y1 = other.lower_limit
y2 = other.upper_limit
            # NOTE: raises ZeroDivisionError when the divisor interval contains zero
            calculation = [ x1/y1, x1/y2, x2/y1, x2/y2 ]
lower_limit_min = min(calculation)
upper_limit_max = max(calculation)
newFN.append(lower_limit_min)
newFN.append(upper_limit_max)
            return FN(newFN)
        else:
            print('The number needs to be a Fuzzy Number (FN instance).')

    def display(self):
fn = '[' + str(self.lower_limit) + ', ' + str(self.upper_limit) + ']'
return fn
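A standalone check of the interval rule `multiply` implements: the product interval is the min/max over the four endpoint products. `interval_multiply` is an illustrative helper, not part of the class above.

```python
def interval_multiply(x, y):
    """Interval product of [x0, x1] and [y0, y1] via endpoint products."""
    products = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
    return [min(products), max(products)]

# interval_multiply([1, 2], [3, 4])  -> [3, 8]
# interval_multiply([-1, 2], [3, 4]) -> [-4, 8]
```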

# --- projects/CFPN/cfpn/util/__init__.py (Shamazo/detectron2, Apache-2.0) ---
from .util import PatchUtil

# ---- src/dataset.py (tungkw/AEVB, MIT) ----
import numpy as np
import torch
from torch.utils.data import Dataset

from FreyFaceHelper import FreyFaceHelper
from MINSThelper import MINSTHelper


class FreyFaceDataset(Dataset):
    def __init__(self, root_dir, transform=None):
        self.data = FreyFaceHelper(root_dir).data / 255
        self.data_size, h, w = self.data.shape
        self.sample_dim = h * w
        self.data = self.data.reshape(self.data_size, self.sample_dim)
        self.data = torch.from_numpy(self.data).float()
        self.transform = transform

    def __len__(self):
        return self.data_size

    def __getitem__(self, idx):
        if torch.is_tensor(idx):
            idx = idx.tolist()
        sample = self.data[idx]
        if self.transform:
            sample = self.transform(sample)
        return sample


class MINSTDataset(Dataset):
    def __init__(self, root_dir, transform=None):
        self.data = MINSTHelper(root_dir).train_images / 255
        self.data_size, h, w = self.data.shape
        self.sample_dim = h * w
        self.data = self.data.reshape(self.data_size, self.sample_dim)
        self.data = torch.from_numpy(self.data).float()
        self.transform = transform

    def __len__(self):
        return self.data_size

    def __getitem__(self, idx):
        if torch.is_tensor(idx):
            idx = idx.tolist()
        sample = self.data[idx]
        if self.transform:
            sample = self.transform(sample)
        return sample
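Both dataset classes share the same preprocessing: scale byte pixels to [0, 1] and flatten each h×w image into a vector. That step in isolation, on synthetic data (the shapes here are illustrative, not the real Frey Face or MNIST dimensions):

```python
import numpy as np

# 10 fake 28x20 grayscale images with byte-valued pixels.
data = np.random.randint(0, 256, size=(10, 28, 20))
data = data / 255                  # scale to [0, 1]
n, h, w = data.shape
flat = data.reshape(n, h * w)      # one row vector per image
```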

# ---- Utils/DataLoader.py (Felix660/DNNDeepeningPruning, MIT) ----
import os
import matplotlib.pyplot as plt
import pandas
from PIL import Image
import torch
from torchvision import transforms, datasets
import numpy as np
from sklearn.utils import shuffle
from torch.utils.data.sampler import SubsetRandomSampler
class ISIC2016(torch.utils.data.Dataset):
    def __init__(self, df_data, data_dir, transform=None):
        super().__init__()
        self.df = df_data
        self.data_dir = data_dir
        self.transform = transform

    def __len__(self):
        return len(self.df)

    def __getitem__(self, id):
        img_name = self.df['image'][id]
        img_label = self.df['class'][id].astype(float)
        img_path = os.path.join(self.data_dir, img_name + '.jpg')
        image = Image.open(img_path)
        if self.transform is not None:
            image = self.transform(image)
        return image, img_label
def data_loader(dataset_root_path, dataset_name, batch_size):
    if dataset_name == "ISIC2016":
        train_loader, valid_loader, test_loader = load_ISIC2016(dataset_root_path, batch_size)
    elif dataset_name == "ChestXRay":
        train_loader, valid_loader, test_loader = load_ChestXRay(dataset_root_path, batch_size)
    elif dataset_name == "CIFAR10":
        train_loader, valid_loader, test_loader = load_CIFAR10(dataset_root_path, batch_size)
    elif dataset_name == "CIFAR100":
        train_loader, valid_loader, test_loader = load_CIFAR100(dataset_root_path, batch_size)
    else:
        # Avoid an UnboundLocalError on the return below for unknown names.
        raise ValueError(f"Unknown dataset: {dataset_name}")
    return train_loader, valid_loader, test_loader
def load_ISIC2016(dataset_root_path, batch_size):
    data_path = dataset_root_path + "/ISIC2016/"

    # Create train dataframe
    train_df = pandas.read_csv(data_path + "Training_GroundTruth.csv")
    # Create test dataframe
    test_df = pandas.read_csv(data_path + "Test_GroundTruth.csv")

    normalize = transforms.Normalize(mean=[0.72839737, 0.6002146, 0.5401608],
                                     std=[0.15253444, 0.17805147, 0.19754663])
    transform = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(size=(224, 224)),
        transforms.ToTensor(),
        normalize])
    transform_valid = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(size=(224, 224)),
        transforms.ToTensor(),
        normalize])
    transform_test = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(size=(224, 224)),
        transforms.ToTensor(),
        normalize])

    train_path = data_path + "train_images/"  # ISIC 2016
    test_path = data_path + "test_images/"

    train_set = ISIC2016(df_data=train_df, data_dir=train_path, transform=transform)
    valid_set = ISIC2016(df_data=train_df, data_dir=train_path, transform=transform_valid)
    test_set = ISIC2016(df_data=test_df, data_dir=test_path, transform=transform_test)

    dataset_len = len(train_set)
    indices = list(range(dataset_len))

    # Randomly splitting indices:
    val_len = int(np.floor(0.2 * dataset_len))
    validation_idx = np.random.choice(indices, size=val_len, replace=False)
    train_idx = list(set(indices) - set(validation_idx))

    train_sampler = SubsetRandomSampler(train_idx)
    validation_sampler = SubsetRandomSampler(validation_idx)

    train_loader = torch.utils.data.DataLoader(
        dataset=train_set,
        batch_size=batch_size,
        sampler=train_sampler,
        num_workers=4,
        pin_memory=True)
    valid_loader = torch.utils.data.DataLoader(
        dataset=valid_set,
        batch_size=batch_size,
        sampler=validation_sampler,
        num_workers=4,
        pin_memory=True)
    test_loader = torch.utils.data.DataLoader(
        dataset=test_set,
        batch_size=batch_size,
        num_workers=4,
        pin_memory=True)

    return train_loader, valid_loader, test_loader
def load_ChestXRay(dataset_root_path, batch_size):
    data_path = dataset_root_path + "/chest_xray/"

    normalize = transforms.Normalize(mean=[0.58450365, 0.58450365, 0.58450365],
                                     std=[0.16148868, 0.16148868, 0.16148868])
    transform = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(size=(224, 224)),
        transforms.ToTensor(),
        normalize])
    transform_valid = transforms.Compose([
        # transforms.Grayscale(num_output_channels=1),
        transforms.Resize(256),
        transforms.CenterCrop(size=(224, 224)),
        transforms.ToTensor(),
        normalize])
    transform_test = transforms.Compose([
        # transforms.Grayscale(num_output_channels=1),
        transforms.Resize(256),
        transforms.CenterCrop(size=(224, 224)),
        transforms.ToTensor(),
        normalize])

    train_path = data_path + "train/"
    valid_path = data_path + "val/"
    test_path = data_path + "test/"

    train_set = datasets.ImageFolder(root=train_path, transform=transform)
    test_set = datasets.ImageFolder(root=test_path, transform=transform_test)

    dataset_len = len(train_set)
    indices = list(range(dataset_len))

    # Randomly splitting indices:
    val_len = int(np.floor(0.2 * dataset_len))
    validation_idx = np.random.choice(indices, size=val_len, replace=False)
    train_idx = list(set(indices) - set(validation_idx))

    train_sampler = SubsetRandomSampler(train_idx)
    validation_sampler = SubsetRandomSampler(validation_idx)

    train_loader = torch.utils.data.DataLoader(
        dataset=train_set,
        batch_size=batch_size,
        sampler=train_sampler,
        num_workers=4,
        pin_memory=True)
    valid_loader = torch.utils.data.DataLoader(
        dataset=train_set,
        batch_size=batch_size,
        sampler=validation_sampler,
        num_workers=4,
        pin_memory=True)
    test_loader = torch.utils.data.DataLoader(
        dataset=test_set,
        batch_size=batch_size,
        shuffle=True,
        num_workers=4,
        pin_memory=True)

    return train_loader, valid_loader, test_loader
def load_CIFAR10(dataset_root_path, batch_size):
    data_path = dataset_root_path + "/CIFAR10/"
    validation_split = 0.2

    normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                     std=[0.229, 0.224, 0.225])
    transform_train = transforms.Compose([
        transforms.RandomCrop(32, 4),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        normalize])
    transform_test = transforms.Compose([
        transforms.ToTensor(),
        normalize])

    train_set = datasets.CIFAR10(root=data_path, train=True, transform=transform_train, download=True)
    test_set = datasets.CIFAR10(root=data_path, train=False, transform=transform_test, download=False)

    dataset_len = len(train_set)
    indices = list(range(dataset_len))

    # Randomly splitting indices:
    val_len = int(np.floor(validation_split * dataset_len))
    validation_idx = np.random.choice(indices, size=val_len, replace=False)
    train_idx = list(set(indices) - set(validation_idx))

    train_sampler = SubsetRandomSampler(train_idx)
    validation_sampler = SubsetRandomSampler(validation_idx)

    train_loader = torch.utils.data.DataLoader(
        dataset=train_set,
        batch_size=batch_size,
        sampler=train_sampler,
        num_workers=8)
    validation_loader = torch.utils.data.DataLoader(
        dataset=train_set,
        batch_size=batch_size,
        sampler=validation_sampler,
        num_workers=8)
    test_loader = torch.utils.data.DataLoader(
        dataset=test_set,
        batch_size=batch_size,
        num_workers=8)

    return train_loader, validation_loader, test_loader
def load_CIFAR100(dataset_root_path, batch_size):
    data_path = dataset_root_path + "/CIFAR100/"
    validation_split = 0.2

    normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                     std=[0.229, 0.224, 0.225])
    transform_train = transforms.Compose([
        transforms.RandomCrop(32, 4),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        normalize])
    transform_test = transforms.Compose([
        transforms.ToTensor(),
        normalize])

    train_set = datasets.CIFAR100(root=data_path, train=True, transform=transform_train, download=True)
    test_set = datasets.CIFAR100(root=data_path, train=False, transform=transform_test, download=False)

    dataset_len = len(train_set)
    indices = list(range(dataset_len))

    # Randomly splitting indices:
    val_len = int(np.floor(validation_split * dataset_len))
    validation_idx = np.random.choice(indices, size=val_len, replace=False)
    train_idx = list(set(indices) - set(validation_idx))

    train_sampler = SubsetRandomSampler(train_idx)
    validation_sampler = SubsetRandomSampler(validation_idx)

    train_loader = torch.utils.data.DataLoader(
        dataset=train_set,
        batch_size=batch_size,
        sampler=train_sampler,
        num_workers=8)
    validation_loader = torch.utils.data.DataLoader(
        dataset=train_set,
        batch_size=batch_size,
        sampler=validation_sampler,
        num_workers=8)
    test_loader = torch.utils.data.DataLoader(
        dataset=test_set,
        batch_size=batch_size,
        num_workers=8)

    return train_loader, validation_loader, test_loader
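Every loader above builds its validation split the same way: draw 20% of the indices at random for validation and hand the complement to the training sampler. The split logic in isolation, with numpy only:

```python
import numpy as np

dataset_len = 50
indices = list(range(dataset_len))
val_len = int(np.floor(0.2 * dataset_len))    # 20% hold-out
validation_idx = np.random.choice(indices, size=val_len, replace=False)
train_idx = list(set(indices) - set(validation_idx))
```

The two index lists would then be wrapped in `SubsetRandomSampler` instances, exactly as the loaders do.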

# ---- lps/lopops/__init__.py (arup-group/london-pop-synth, MIT) ----
from .lopops import Data

# ---- new_csaf/csaf/utils/__init__.py (yokian/csaf, BSD-3-Clause) ----
from csaf.utils.notebook import *

# ---- multilingual_t5/r_baseline_mr/__init__.py (sumanthd17/mt5, Apache-2.0) ----
"""r_baseline_mr dataset."""
from .r_baseline_mr import RBaselineMr

# ---- train.py (zfar-/icm, MIT) ----
import numpy as np
import argparse
import keras
import keras.backend as K

# ---- python/ql/src/Imports/from_import.py (vadi2/codeql, MIT) ----
from sys import stdout
def main():
    stdout.write("Hello World!")

# ---- utils.py (giova86/Python-LIS, MIT) ----
# methods
import cv2
import time
import mediapipe as mp
import numpy as np
import os
import pandas as pd
mp_holistic = mp.solutions.holistic
mp_drawing = mp.solutions.drawing_utils
def mediapipe_detection(image, model):
    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    image.flags.writeable = False
    results = model.process(image)
    image.flags.writeable = True
    image = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
    return image, results
def draw_landmarks(image, results):
    mp_drawing.draw_landmarks(image, results.face_landmarks, mp_holistic.FACEMESH_TESSELATION)
    mp_drawing.draw_landmarks(image, results.pose_landmarks, mp_holistic.POSE_CONNECTIONS)
    mp_drawing.draw_landmarks(image, results.left_hand_landmarks, mp_holistic.HAND_CONNECTIONS)
    mp_drawing.draw_landmarks(image, results.right_hand_landmarks, mp_holistic.HAND_CONNECTIONS)
def draw_landmarks_custom(image, results):
    mp_drawing.draw_landmarks(image, results.face_landmarks, mp_holistic.FACEMESH_TESSELATION,
                              mp_drawing.DrawingSpec(color=(255, 255, 255), thickness=1, circle_radius=1),
                              mp_drawing.DrawingSpec(color=(255, 255, 255), thickness=1, circle_radius=1),
                              )
    mp_drawing.draw_landmarks(image, results.pose_landmarks, mp_holistic.POSE_CONNECTIONS,
                              mp_drawing.DrawingSpec(color=(80, 110, 10), thickness=2, circle_radius=1),
                              mp_drawing.DrawingSpec(color=(80, 256, 121), thickness=2, circle_radius=1),
                              )
    mp_drawing.draw_landmarks(image, results.left_hand_landmarks, mp_holistic.HAND_CONNECTIONS,
                              mp_drawing.DrawingSpec(color=(0, 0, 255), thickness=3, circle_radius=5),
                              mp_drawing.DrawingSpec(color=(0, 0, 255), thickness=3, circle_radius=5),
                              )
    mp_drawing.draw_landmarks(image, results.right_hand_landmarks, mp_holistic.HAND_CONNECTIONS,
                              mp_drawing.DrawingSpec(color=(255, 0, 0), thickness=3, circle_radius=5),
                              mp_drawing.DrawingSpec(color=(255, 0, 0), thickness=3, circle_radius=5),
                              )

# cv2.rectangle(image, start_point, end_point, color, thickness)
def draw_limit_rh(image, results):
    if results.right_hand_landmarks:
        xMax = max([i.x for i in results.right_hand_landmarks.landmark])
        xMin = min([i.x for i in results.right_hand_landmarks.landmark])
        yMax = max([i.y for i in results.right_hand_landmarks.landmark])
        yMin = min([i.y for i in results.right_hand_landmarks.landmark])
        # Pad the bounding box by roughly 10% on each side.
        xMax = xMax + 0.1*(xMax - xMin)
        yMax = yMax + 0.1*(yMax - yMin)
        xMin = xMin - 0.1*(xMax - xMin)
        yMin = yMin - 0.1*(yMax - yMin)
        h, w, _ = image.shape
        cv2.rectangle(image, (int(xMin*w), int(yMin*h)), (int(xMax*w), int(yMax*h)), (255, 0, 0), 1)
        # Thick corner ticks, one fifth of the box edge long.
        cv2.line(image, (int(xMin*w), int(yMin*h)), (int(xMin*w), int(yMin*h)+int((yMax*h-yMin*h)/5)), (255, 0, 0), 8)
        cv2.line(image, (int(xMin*w), int(yMin*h)), (int(xMin*w)+int((xMax*w-xMin*w)/5), int(yMin*h)), (255, 0, 0), 8)
        cv2.line(image, (int(xMax*w), int(yMax*h)), (int(xMax*w), int(yMax*h)-int((yMax*h-yMin*h)/5)), (255, 0, 0), 8)
        cv2.line(image, (int(xMax*w), int(yMax*h)), (int(xMax*w)-int((xMax*w-xMin*w)/5), int(yMax*h)), (255, 0, 0), 8)
        cv2.line(image, (int(xMin*w), int(yMax*h)), (int(xMin*w), int(yMax*h)-int((yMax*h-yMin*h)/5)), (255, 0, 0), 8)
        cv2.line(image, (int(xMin*w), int(yMax*h)), (int(xMin*w)+int((xMax*w-xMin*w)/5), int(yMax*h)), (255, 0, 0), 8)
        cv2.line(image, (int(xMax*w), int(yMin*h)), (int(xMax*w), int(yMin*h)+int((yMax*h-yMin*h)/5)), (255, 0, 0), 8)
        cv2.line(image, (int(xMax*w), int(yMin*h)), (int(xMax*w)-int((xMax*w-xMin*w)/5), int(yMin*h)), (255, 0, 0), 8)
        cv2.putText(image, 'Right Hand', (int(xMin*w), int(yMin*h-(yMax*h-yMin*h)/20)), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)
def draw_limit_lh(image, results):
    if results.left_hand_landmarks:
        xMax = max([i.x for i in results.left_hand_landmarks.landmark])
        xMin = min([i.x for i in results.left_hand_landmarks.landmark])
        yMax = max([i.y for i in results.left_hand_landmarks.landmark])
        yMin = min([i.y for i in results.left_hand_landmarks.landmark])
        xMax = xMax + 0.1*(xMax - xMin)
        yMax = yMax + 0.1*(yMax - yMin)
        xMin = xMin - 0.1*(xMax - xMin)
        yMin = yMin - 0.1*(yMax - yMin)
        h, w, _ = image.shape
        cv2.rectangle(image, (int(xMin*w), int(yMin*h)), (int(xMax*w), int(yMax*h)), (0, 0, 255), 1)
        cv2.line(image, (int(xMin*w), int(yMin*h)), (int(xMin*w), int(yMin*h)+int((yMax*h-yMin*h)/5)), (0, 0, 255), 8)
        cv2.line(image, (int(xMin*w), int(yMin*h)), (int(xMin*w)+int((xMax*w-xMin*w)/5), int(yMin*h)), (0, 0, 255), 8)
        cv2.line(image, (int(xMax*w), int(yMax*h)), (int(xMax*w), int(yMax*h)-int((yMax*h-yMin*h)/5)), (0, 0, 255), 8)
        cv2.line(image, (int(xMax*w), int(yMax*h)), (int(xMax*w)-int((xMax*w-xMin*w)/5), int(yMax*h)), (0, 0, 255), 8)
        cv2.line(image, (int(xMin*w), int(yMax*h)), (int(xMin*w), int(yMax*h)-int((yMax*h-yMin*h)/5)), (0, 0, 255), 8)
        cv2.line(image, (int(xMin*w), int(yMax*h)), (int(xMin*w)+int((xMax*w-xMin*w)/5), int(yMax*h)), (0, 0, 255), 8)
        cv2.line(image, (int(xMax*w), int(yMin*h)), (int(xMax*w), int(yMin*h)+int((yMax*h-yMin*h)/5)), (0, 0, 255), 8)
        cv2.line(image, (int(xMax*w), int(yMin*h)), (int(xMax*w)-int((xMax*w-xMin*w)/5), int(yMin*h)), (0, 0, 255), 8)
        cv2.putText(image, 'Left Hand', (int(xMin*w), int(yMin*h-(yMax*h-yMin*h)/20)), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
def check_detection(image, results):
    if results.left_hand_landmarks:
        cv2.putText(image, 'Left Hand: DETECTED', (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
    else:
        cv2.putText(image, 'Left Hand: NOT DETECTED', (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
    if results.right_hand_landmarks:
        cv2.putText(image, 'Right Hand: DETECTED', (10, 70), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)
    else:
        cv2.putText(image, 'Right Hand: NOT DETECTED', (10, 70), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)
    if results.face_landmarks:
        cv2.putText(image, 'Face: DETECTED', (10, 110), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 255, 255), 2)
    else:
        cv2.putText(image, 'Face: NOT DETECTED', (10, 110), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 255, 255), 2)
    if results.pose_landmarks:
        cv2.putText(image, 'Pose: DETECTED', (10, 150), cv2.FONT_HERSHEY_SIMPLEX, 1, (80, 256, 121), 2)
    else:
        cv2.putText(image, 'Pose: NOT DETECTED', (10, 150), cv2.FONT_HERSHEY_SIMPLEX, 1, (80, 256, 121), 2)
def points_detection(results):
    # Return the 21 right-hand landmarks, min-max normalized in x and y.
    # Guard first: the original computed the extrema before the fallback,
    # which crashes when no right hand is detected.
    if not results.right_hand_landmarks:
        return np.zeros(21*3)
    xMax = max([i.x for i in results.right_hand_landmarks.landmark])
    xMin = min([i.x for i in results.right_hand_landmarks.landmark])
    yMax = max([i.y for i in results.right_hand_landmarks.landmark])
    yMin = min([i.y for i in results.right_hand_landmarks.landmark])
    rh = np.array([[points.x, points.y, points.z] for points in results.right_hand_landmarks.landmark]).flatten()
    for i in np.arange(0, 63, 3):
        rh[i] = (rh[i] - xMin) / (xMax - xMin)
    for i in np.arange(1, 63, 3):
        rh[i] = (rh[i] - yMin) / (yMax - yMin)
    # lh = np.array([[points.x, points.y, points.z] for points in results.left_hand_landmarks.landmark]).flatten() if results.left_hand_landmarks else np.zeros(21*3)
    # po = np.array([[points.x, points.y, points.z] for points in results.pose_landmarks.landmark]).flatten() if results.pose_landmarks else np.zeros(99)
    # return np.concatenate([lh, rh, po])
    return rh
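The per-index loops in `points_detection` min-max normalize x and y coordinates independently. The same rescaling can be written with strided slices; the coordinates here are made up for illustration:

```python
import numpy as np

# Flattened landmarks: [x0, y0, z0, x1, y1, z1, ...]
pts = np.array([0.2, 0.5, 0.0, 0.6, 0.9, 0.0])
xs, ys = pts[0::3].copy(), pts[1::3].copy()
pts[0::3] = (xs - xs.min()) / (xs.max() - xs.min())
pts[1::3] = (ys - ys.min()) / (ys.max() - ys.min())
```

After normalization each axis spans exactly [0, 1], which makes the feature vector invariant to where the hand sits in the frame.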

# ---- leapy/sklearn/transformers/export/test/test_OneHotEncoderExporter.py (nonabelian/leapy, BSD-3-Clause) ----
import numpy as np
import dask.array as da
import leapy.sklearn
from sklearn.preprocessing import OneHotEncoder
from .. import OneHotEncoderExporter
def test_ohe_export_function():
    ohe = OneHotEncoder()
    X_np = np.array([['a'], ['b']])
    X_act = ohe.fit_transform(X_np)
    ohe_runtime = OneHotEncoderExporter.to_runtime(ohe)
    X_exp = ohe_runtime.transform(X_np)
    # Runtime always outputs np.array
    assert np.all(X_exp == X_act.toarray())

    ohe = OneHotEncoder(sparse=False)
    X_np = np.array([['a'], ['b']])
    X_act = ohe.fit_transform(X_np)
    ohe_runtime = OneHotEncoderExporter.to_runtime(ohe)
    X_exp = ohe_runtime.transform(X_np)
    assert np.all(X_exp == X_act)
def test_add_to_class_export():
    ohe = OneHotEncoder()
    X_np = np.array([['a'], ['b']])
    X_act = ohe.fit_transform(X_np)
    ohe_runtime = ohe.to_runtime()
    X_exp = ohe_runtime.transform(X_np)
    # Runtime always outputs np.array
    assert np.all(X_exp == X_act.toarray())

    ohe = OneHotEncoder(sparse=False)
    X_np = np.array([['a'], ['b']])
    X = da.from_array(X_np, chunks=X_np.shape)
    X_act = ohe.fit_transform(X_np)
    ohe_runtime = ohe.to_runtime()
    X_exp = ohe_runtime.transform(X_np)
    assert np.all(X_exp == X_act)

# ---- environments/simple-road/simple_road/envs/maps.py (KarlRong/Safe-RL-for-Driving, Apache-2.0) ----
class Map:
    def __init__(self):
        pass

    def process(self):
        pass

    def render(self, screen):
        pass

# ---- mit/6-006-fall-2011/contents/readings/python-cost-model/timing.py (andreramosilva/algorithms, MIT) ----
# timing.py
# Author: Ronald L. Rivest
# Date last modified: March 6, 2007
# Routines to help in timing the execution of
# various code fragments or routines, and to
# infer a good formula for the resulting runtimes.
import math
import scipy.linalg
import string
import sys
import timeit
# Parameter generation routines
def lg(x):
    return math.log(x)/math.log(2.0)

def sqrt(x):
    return math.sqrt(x)
def make_param_list(spec_string, growth_factor):
    """
    Generate a list of dictionaries
    given maximum and minimum values for each range.
    Each min and max value is a *string* that can be evaluated;
    each string may depend on earlier variable values
    Values increment by factor of growth_factor from min to max
    Example:
        make_param_list("1<=n<=1000", 2)
        make_param_list("1<=n<=1000;1<=m<=1000;min(n,m)<=k<=max(n,m)", 2)
    """
    var_list = []
    spec_list = string.split(spec_string, ";")
    D = {}
    D['lg'] = lg
    D['sqrt'] = sqrt
    D_list = [D]
    for spec in spec_list:
        spec_parts = string.split(spec, "<=")
        assert len(spec_parts) == 3
        lower_spec = spec_parts[0]
        var_name = spec_parts[1]
        assert len(var_name) == 1
        var_list.append(var_name)
        upper_spec = spec_parts[2]
        new_D_list = []
        for D in D_list:
            new_D = D.copy()
            val = eval(lower_spec, D)
            while val <= eval(upper_spec, D):
                new_D[var_name] = val
                new_D_list.append(new_D.copy())
                val *= growth_factor
        D_list = new_D_list
    # for D in D_list: print D
    return (var_list, D_list)

# e.g. make_param_list("1<=n<=1000;1<=m<=1000;min(n,m)<=k<=max(n,m)", 2)
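For a one-variable spec such as `"1<=n<=1000"` with growth factor 10, the inner `while` loop above produces a geometric sequence of parameter values. The core iteration, isolated from the dict bookkeeping:

```python
# Geometric sweep from the lower bound to the upper bound,
# the value sequence make_param_list records for each variable.
vals = []
val = 1
while val <= 1000:
    vals.append(val)
    val *= 10
```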
def fit(var_list, param_list, run_times, f_list):
    """
    Return matrix A needed for least-squares fit.
    Given:
        list of variable names
        list of sample dicts for various parameter sets
        list of corresponding run times
        list of functions to be considered for fit
            these are *strings*, e.g. "n","n**2","min(n,m)", etc.
    prints:
        coefficients for each function in f_list
    """
    print "var_list", var_list
    print "Function list:", f_list
    print "run times:",
    for i in range(len(param_list)):
        print
        for v in var_list:
            print v, "= %6s" % param_list[i][v],
        print ": %8f" % run_times[i], "microseconds",
        # print " n = %(n)6s" % param_list[i], run_times[i], "microseconds"
    print
    rows = len(run_times)
    cols = len(f_list)
    A = [[0 for j in range(cols)] for i in range(rows)]
    for i in range(rows):
        D = param_list[i]
        for j in range(cols):
            A[i][j] = float(eval(f_list[j], D))
    b = run_times
    # print "A:"
    # print A
    # print "b:"
    # print b
    # (x,resids,rank,s) = scipy.linalg.lstsq(A,b)
    (x, resids, rank, s) = fit2(A, b)
    print "Coefficients as interpolated from data:"
    for j in range(cols):
        sign = ''
        if x[j] > 0 and j > 0:
            sign = "+"
        elif x[j] > 0:
            sign = " "
        print "%s%g*%s" % (sign, x[j], f_list[j])
    print "(measuring time in microseconds)"
    print "Sum of squares of residuals:", resids
    print "RMS error = %0.2g percent" % (math.sqrt(resids/len(A))*100.0)
    # print "Rank:", rank
    # print "SVD:", s
    sys.stdout.flush()
import scipy.optimize

def fit2(A, b):
    """ Relative error minimizer """
    def f(x):
        assert len(x) == len(A[0])
        resids = []
        for i in range(len(A)):
            total = 0.0
            for j in range(len(A[0])):
                total += A[i][j]*x[j]
            relative_error = (total-b[i])/b[i]
            resids.append(relative_error)
        return resids
    ans = scipy.optimize.leastsq(f, [0.0]*len(A[0]))
    # print "ans:", ans
    if len(A[0]) == 1:
        x = [ans[0]]
    else:
        x = ans[0]
    resids = sum([r*r for r in f(x)])
    return (x, resids, 0, 0)
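The test routines below all share one timing pattern: run a snippet `trials` times with `timeit.Timer` and convert the total to microseconds per call. A standalone sketch of that pattern (the timed statement here is arbitrary):

```python
import timeit

trials = 1000
t = timeit.Timer("x * x", "x = 12345")   # statement, setup
per_call_us = t.timeit(trials) * 1e6 / float(trials)
```

Dividing by `trials` averages out per-call noise; multiplying by 1e6 converts timeit's seconds to the microseconds that `fit` reports.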
def test_misc():
    print
    print "Test Misc-1 -- running time should be n+2*m+7+3*n*lg(n)+17*n*m"
    spec_string = "1<=n<=100000;1<=m<=100000"
    growth_factor = 10
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    run_times = [eval("n+2*m+7+3*n*lg(n)+17*n*m", D) for D in param_list]
    f_list = ("(n*m)", "n**2", "n*lg(n)", "n", "m", "1")
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Misc-2: pass"
    spec_string = "10000<=n<=1000000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("pass")
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)
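Every test below follows the same measurement pattern: a parameter dictionary `D` is spliced into the statement/setup strings with `"%(n)s"` formatting, and `timeit` seconds are converted to microseconds per trial. A tiny Python 3 sketch of just that pattern (parameter values chosen small here so it runs quickly):

```python
# The recurring timing idiom from these tests, isolated: format the
# parameter into the setup string, time the statement, and record
# microseconds per trial.
import timeit

param_list = [{"n": 100}, {"n": 200}]
run_times = []
trials = 10
for D in param_list:
    t = timeit.Timer("L[:]", "L=[0]*%(n)s" % D)
    run_times.append(t.timeit(trials) * 1e6 / float(trials))
```

The resulting `run_times` list is what `fit` regresses against the candidate growth functions.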
def test_number():
    print
    print "Test Number-1 -- time to compute int('1'*n)"
    spec_string = "1000<=n<=10000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n","1")
    f_list = ("n**2",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("string.atoi(x)", "import string;x='1'*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-2 -- time to compute repr(2**n)"
    spec_string = "1000<=n<=10000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n","1")
    f_list = ("n**2",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("repr(x)", "x=2**%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-3 -- time to convert (2**n) to hex"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n","1")
    f_list = ("n",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("'%x'%x", "x=2**%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-4 -- time to add 2**n to itself"
    spec_string = "1000<=n<=1000000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n*lg(n)","n","1")
    f_list = ("n",)
    run_times = []
    trials = 10000
    for D in param_list:
        t = timeit.Timer("x+x", "x=2**%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-5 -- time to multiply (2**n/3) by itself"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n*lg(n)","n","1")
    f_list = ("n**1.585",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("x*x", "x=(2**%(n)s)/3" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-6 -- time to divide 2**(2*n) by 2**n"
    spec_string = "1000<=n<=50000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n*lg(n)","n","1")
    f_list = ("n**2",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("w/x", "w=(2**(2*%(n)s));x=(2**(%(n)s))" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-7 -- time to compute remainder of 2**(2*n) by 2**n"
    spec_string = "1000<=n<=50000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n*lg(n)","n","1")
    f_list = ("n**2",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("w%x", "w=(2**(2*%(n)s));x=(2**(%(n)s))" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-8 -- time to compute pow(x,y,z)"
    spec_string = "1000<=n<=5000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n*lg(n)","n","1")
    f_list = ("n**3",)
    run_times = []
    trials = 10
    for D in param_list:
        t = timeit.Timer("pow(x,y,z)", "z=(2**%(n)s)+3;x=y=(2**%(n)s)+1" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Number-9 -- time to compute 2**n"
    spec_string = "1000<=n<=1000000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n**2","n*lg(n)","n","1")
    f_list = ("1",)
    run_times = []
    trials = 10000
    for D in param_list:
        t = timeit.Timer("2**%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)
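The exponent 1.585 used as the candidate growth function in Test Number-5 is not arbitrary: it is lg(3), the exponent of Karatsuba multiplication, which CPython uses for large integers. A one-line check:

```python
# The Karatsuba multiplication exponent: splitting an n-digit product into
# 3 half-size products gives running time Theta(n**log2(3)).
import math

karatsuba_exponent = math.log(3, 2)   # log base 2 of 3, about 1.58496
```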
def test_string():
    print
    print "Test String-1: extract a byte from a string"
    spec_string = "1000<=n<=1000000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("s[500]", "s='0'*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test String-2: concatenate two strings of length n"
    spec_string = "1000<=n<=500000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("s+t", "s=t='0'*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test String-3: extract a string of length n/2"
    spec_string = "1000<=n<=500000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("s[0:%(n)s/2]" % D, "s='0'*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test String-4: translate a string of length n"
    spec_string = "1000<=n<=500000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("string.translate(s,T)",
                         "s='0'*%(n)s;import string;T=string.maketrans('1','2')" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)
def test_list():
    print
    print "Test List-1: create an empty list"
    spec_string = "1<=n<=10"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("x = list()")
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-2: list (array) lookup"
    spec_string = "10000<=n<=1000000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("x=L[5]", "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-3: appending to a list of length n"
    spec_string = "10000<=n<=1000000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("L.append(0)", "L=[0]*%(n)s;L.append(0)" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-4: Pop"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 200
    for D in param_list:
        t = timeit.Timer("L.pop()", "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-5: concatenating two lists of length n"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 2000
    for D in param_list:
        t = timeit.Timer("L+L", "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-6: extracting a slice of length n/2"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 2000
    for D in param_list:
        t = timeit.Timer("L[0:%(n)s/2]" % D, "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-7: copy"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 2000
    for D in param_list:
        t = timeit.Timer("L[:]", "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-8: assigning a slice of length n/2"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 2000
    for D in param_list:
        t = timeit.Timer("L[0:%(n)s/2]=L[1:1+%(n)s/2]" % D, "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-9: Delete first"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 200
    for D in param_list:
        t = timeit.Timer("del L[0]", "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-10: Reverse"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 200
    for D in param_list:
        t = timeit.Timer("L.reverse()", "L=[0]*%(n)s" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test List-11: Sort"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n*lg(n)",)
    run_times = []
    trials = 200
    for D in param_list:
        t = timeit.Timer("L.sort()", "import random;L=[random.random() for i in range(%(n)s)]" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)
def test_dict():
    print
    print "Test Dict-1: create an empty dictionary"
    spec_string = "1<=n<=1"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("x = dict()")
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Dict-2: dictionary lookup"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("1",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("x = d[1]",
                         "d = dict([(i,i) for i in range(%(n)s)])" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Dict-3: dictionary copy"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("d.copy()",
                         "d = dict([(i,i) for i in range(%(n)s)])" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)

    print
    print "Test Dict-4: dictionary list items"
    spec_string = "1000<=n<=100000"
    growth_factor = 2
    print "Spec_string: ", spec_string, "by factors of", growth_factor
    var_list, param_list = make_param_list(spec_string, growth_factor)
    # f_list = ("n","1")
    f_list = ("n*lg(n)",)
    run_times = []
    trials = 1000
    for D in param_list:
        t = timeit.Timer("d.items()",
                         "d = dict([(i,i) for i in range(%(n)s)])" % D)
        run_times.append(t.timeit(trials) * 1e6 / float(trials))
    fit(var_list, param_list, run_times, f_list)
def main():
    test_misc()
    test_number()
    test_string()
    test_list()
    test_dict()

if False:
    import profile
    profile.run("main()")
else:
    main()
# spikeforest/spikeforest_analysis/sfmdaextractors/__init__.py
from .sfmdaextractors import SFMdaRecordingExtractor, SFMdaSortingExtractor
from . import mdaio
# airflow/providers/google/cloud/operators/vertex_ai/endpoint_service.py
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
"""This module contains Google Vertex AI operators.
.. spelling::
undeployed
undeploy
Undeploys
aiplatform
FieldMask
unassigns
"""
from typing import TYPE_CHECKING, Dict, Optional, Sequence, Tuple, Union

from google.api_core.exceptions import NotFound
from google.api_core.retry import Retry
from google.cloud.aiplatform_v1.types import DeployedModel, Endpoint, endpoint_service
from google.protobuf.field_mask_pb2 import FieldMask

from airflow.models import BaseOperator
from airflow.providers.google.cloud.hooks.vertex_ai.endpoint_service import EndpointServiceHook
from airflow.providers.google.cloud.links.vertex_ai import (
    VertexAIEndpointLink,
    VertexAIEndpointListLink,
    VertexAIModelLink,
)
if TYPE_CHECKING:
    from airflow.utils.context import Context
class CreateEndpointOperator(BaseOperator):
    """
    Creates an Endpoint.

    :param project_id: Required. The ID of the Google Cloud project that the service belongs to.
    :param region: Required. The ID of the Google Cloud region that the service belongs to.
    :param endpoint: Required. The Endpoint to create.
    :param retry: Designation of what errors, if any, should be retried.
    :param timeout: The timeout for this request.
    :param metadata: Strings which should be sent along with the request as metadata.
    :param gcp_conn_id: The connection ID to use connecting to Google Cloud.
    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
        if any. For this to work, the service account making the request must have
        domain-wide delegation enabled.
    :param impersonation_chain: Optional service account to impersonate using short-term
        credentials, or chained list of accounts required to get the access_token
        of the last account in the list, which will be impersonated in the request.
        If set as a string, the account must grant the originating account
        the Service Account Token Creator IAM role.
        If set as a sequence, the identities from the list must grant
        Service Account Token Creator IAM role to the directly preceding identity, with first
        account from the list granting this role to the originating account (templated).
    """

    template_fields = ("region", "project_id", "impersonation_chain")
    operator_extra_links = (VertexAIEndpointLink(),)

    def __init__(
        self,
        *,
        region: str,
        project_id: str,
        endpoint: Union[Endpoint, Dict],
        retry: Optional[Retry] = None,
        timeout: Optional[float] = None,
        metadata: Sequence[Tuple[str, str]] = (),
        gcp_conn_id: str = "google_cloud_default",
        delegate_to: Optional[str] = None,
        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.region = region
        self.project_id = project_id
        self.endpoint = endpoint
        self.retry = retry
        self.timeout = timeout
        self.metadata = metadata
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.impersonation_chain = impersonation_chain

    def execute(self, context: 'Context'):
        hook = EndpointServiceHook(
            gcp_conn_id=self.gcp_conn_id,
            delegate_to=self.delegate_to,
            impersonation_chain=self.impersonation_chain,
        )
        self.log.info("Creating endpoint")
        operation = hook.create_endpoint(
            project_id=self.project_id,
            region=self.region,
            endpoint=self.endpoint,
            retry=self.retry,
            timeout=self.timeout,
            metadata=self.metadata,
        )
        result = hook.wait_for_operation(timeout=self.timeout, operation=operation)
        endpoint = Endpoint.to_dict(result)
        endpoint_id = hook.extract_endpoint_id(endpoint)
        self.log.info("Endpoint was created. Endpoint ID: %s", endpoint_id)
        self.xcom_push(context, key="endpoint_id", value=endpoint_id)
        VertexAIEndpointLink.persist(context=context, task_instance=self, endpoint_id=endpoint_id)
        return endpoint
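A configuration sketch of how such an operator is wired into a DAG. This is illustrative only: the DAG id, project, region, and display name below are placeholders, not values from this module, and the snippet assumes an environment with this provider installed.

```python
# Illustrative DAG wiring for CreateEndpointOperator; all values are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.vertex_ai.endpoint_service import (
    CreateEndpointOperator,
)

with DAG(
    dag_id="example_vertex_ai_endpoint",   # placeholder DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    create_endpoint = CreateEndpointOperator(
        task_id="create_endpoint",
        region="us-central1",              # placeholder region
        project_id="my-project",           # placeholder project
        endpoint={"display_name": "my-endpoint"},
    )
```

The created endpoint's ID is pushed to XCom under the key ``endpoint_id``, so downstream tasks (for example a deploy-model task) can pull it from there.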
class DeleteEndpointOperator(BaseOperator):
    """
    Deletes an Endpoint.

    :param project_id: Required. The ID of the Google Cloud project that the service belongs to.
    :param region: Required. The ID of the Google Cloud region that the service belongs to.
    :param endpoint_id: Required. The Endpoint ID to delete.
    :param retry: Designation of what errors, if any, should be retried.
    :param timeout: The timeout for this request.
    :param metadata: Strings which should be sent along with the request as metadata.
    :param gcp_conn_id: The connection ID to use connecting to Google Cloud.
    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
        if any. For this to work, the service account making the request must have
        domain-wide delegation enabled.
    :param impersonation_chain: Optional service account to impersonate using short-term
        credentials, or chained list of accounts required to get the access_token
        of the last account in the list, which will be impersonated in the request.
        If set as a string, the account must grant the originating account
        the Service Account Token Creator IAM role.
        If set as a sequence, the identities from the list must grant
        Service Account Token Creator IAM role to the directly preceding identity, with first
        account from the list granting this role to the originating account (templated).
    """

    template_fields = ("region", "endpoint_id", "project_id", "impersonation_chain")

    def __init__(
        self,
        *,
        region: str,
        project_id: str,
        endpoint_id: str,
        retry: Optional[Retry] = None,
        timeout: Optional[float] = None,
        metadata: Sequence[Tuple[str, str]] = (),
        gcp_conn_id: str = "google_cloud_default",
        delegate_to: Optional[str] = None,
        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.region = region
        self.project_id = project_id
        self.endpoint_id = endpoint_id
        self.retry = retry
        self.timeout = timeout
        self.metadata = metadata
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.impersonation_chain = impersonation_chain

    def execute(self, context: 'Context'):
        hook = EndpointServiceHook(
            gcp_conn_id=self.gcp_conn_id,
            delegate_to=self.delegate_to,
            impersonation_chain=self.impersonation_chain,
        )
        try:
            self.log.info("Deleting endpoint: %s", self.endpoint_id)
            operation = hook.delete_endpoint(
                project_id=self.project_id,
                region=self.region,
                endpoint=self.endpoint_id,
                retry=self.retry,
                timeout=self.timeout,
                metadata=self.metadata,
            )
            hook.wait_for_operation(timeout=self.timeout, operation=operation)
            self.log.info("Endpoint was deleted.")
        except NotFound:
            self.log.info("The Endpoint ID %s does not exist.", self.endpoint_id)
class DeployModelOperator(BaseOperator):
    """
    Deploys a Model into this Endpoint, creating a DeployedModel within it.

    :param project_id: Required. The ID of the Google Cloud project that the service belongs to.
    :param region: Required. The ID of the Google Cloud region that the service belongs to.
    :param endpoint_id: Required. The name of the Endpoint resource into which to deploy a Model. Format:
        ``projects/{project}/locations/{location}/endpoints/{endpoint}``
    :param deployed_model: Required. The DeployedModel to be created within the Endpoint. Note that
        [Endpoint.traffic_split][google.cloud.aiplatform.v1.Endpoint.traffic_split] must be updated for
        the DeployedModel to start receiving traffic, either as part of this call, or via
        [EndpointService.UpdateEndpoint][google.cloud.aiplatform.v1.EndpointService.UpdateEndpoint].
    :param traffic_split: A map from a DeployedModel's ID to the percentage of this Endpoint's traffic
        that should be forwarded to that DeployedModel.

        If this field is non-empty, then the Endpoint's
        [traffic_split][google.cloud.aiplatform.v1.Endpoint.traffic_split] will be overwritten with it. To
        refer to the ID of the just being deployed Model, a "0" should be used, and the actual ID of the
        new DeployedModel will be filled in its place by this method. The traffic percentage values must
        add up to 100.

        If this field is empty, then the Endpoint's
        [traffic_split][google.cloud.aiplatform.v1.Endpoint.traffic_split] is not updated.
    :param retry: Designation of what errors, if any, should be retried.
    :param timeout: The timeout for this request.
    :param metadata: Strings which should be sent along with the request as metadata.
    :param gcp_conn_id: The connection ID to use connecting to Google Cloud.
    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
        if any. For this to work, the service account making the request must have
        domain-wide delegation enabled.
    :param impersonation_chain: Optional service account to impersonate using short-term
        credentials, or chained list of accounts required to get the access_token
        of the last account in the list, which will be impersonated in the request.
        If set as a string, the account must grant the originating account
        the Service Account Token Creator IAM role.
        If set as a sequence, the identities from the list must grant
        Service Account Token Creator IAM role to the directly preceding identity, with first
        account from the list granting this role to the originating account (templated).
    """

    template_fields = ("region", "endpoint_id", "project_id", "impersonation_chain")
    operator_extra_links = (VertexAIModelLink(),)

    def __init__(
        self,
        *,
        region: str,
        project_id: str,
        endpoint_id: str,
        deployed_model: Union[DeployedModel, Dict],
        traffic_split: Optional[Union[Sequence, Dict]] = None,
        retry: Optional[Retry] = None,
        timeout: Optional[float] = None,
        metadata: Sequence[Tuple[str, str]] = (),
        gcp_conn_id: str = "google_cloud_default",
        delegate_to: Optional[str] = None,
        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.region = region
        self.project_id = project_id
        self.endpoint_id = endpoint_id
        self.deployed_model = deployed_model
        self.traffic_split = traffic_split
        self.retry = retry
        self.timeout = timeout
        self.metadata = metadata
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.impersonation_chain = impersonation_chain

    def execute(self, context: 'Context'):
        hook = EndpointServiceHook(
            gcp_conn_id=self.gcp_conn_id,
            delegate_to=self.delegate_to,
            impersonation_chain=self.impersonation_chain,
        )
        self.log.info("Deploying model")
        operation = hook.deploy_model(
            project_id=self.project_id,
            region=self.region,
            endpoint=self.endpoint_id,
            deployed_model=self.deployed_model,
            traffic_split=self.traffic_split,
            retry=self.retry,
            timeout=self.timeout,
            metadata=self.metadata,
        )
        result = hook.wait_for_operation(timeout=self.timeout, operation=operation)
        deploy_model = endpoint_service.DeployModelResponse.to_dict(result)
        deployed_model_id = hook.extract_deployed_model_id(deploy_model)
        self.log.info("Model was deployed. Deployed Model ID: %s", deployed_model_id)
        self.xcom_push(context, key="deployed_model_id", value=deployed_model_id)
        VertexAIModelLink.persist(context=context, task_instance=self, model_id=deployed_model_id)
        return deploy_model
class GetEndpointOperator(BaseOperator):
    """
    Gets an Endpoint.

    :param project_id: Required. The ID of the Google Cloud project that the service belongs to.
    :param region: Required. The ID of the Google Cloud region that the service belongs to.
    :param endpoint_id: Required. The Endpoint ID to get.
    :param retry: Designation of what errors, if any, should be retried.
    :param timeout: The timeout for this request.
    :param metadata: Strings which should be sent along with the request as metadata.
    :param gcp_conn_id: The connection ID to use connecting to Google Cloud.
    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
        if any. For this to work, the service account making the request must have
        domain-wide delegation enabled.
    :param impersonation_chain: Optional service account to impersonate using short-term
        credentials, or chained list of accounts required to get the access_token
        of the last account in the list, which will be impersonated in the request.
        If set as a string, the account must grant the originating account
        the Service Account Token Creator IAM role.
        If set as a sequence, the identities from the list must grant
        Service Account Token Creator IAM role to the directly preceding identity, with first
        account from the list granting this role to the originating account (templated).
    """

    template_fields = ("region", "endpoint_id", "project_id", "impersonation_chain")
    operator_extra_links = (VertexAIEndpointLink(),)

    def __init__(
        self,
        *,
        region: str,
        project_id: str,
        endpoint_id: str,
        retry: Optional[Retry] = None,
        timeout: Optional[float] = None,
        metadata: Sequence[Tuple[str, str]] = (),
        gcp_conn_id: str = "google_cloud_default",
        delegate_to: Optional[str] = None,
        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.region = region
        self.project_id = project_id
        self.endpoint_id = endpoint_id
        self.retry = retry
        self.timeout = timeout
        self.metadata = metadata
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.impersonation_chain = impersonation_chain

    def execute(self, context: 'Context'):
        hook = EndpointServiceHook(
            gcp_conn_id=self.gcp_conn_id,
            delegate_to=self.delegate_to,
            impersonation_chain=self.impersonation_chain,
        )
        try:
            self.log.info("Get endpoint: %s", self.endpoint_id)
            endpoint_obj = hook.get_endpoint(
                project_id=self.project_id,
                region=self.region,
                endpoint=self.endpoint_id,
                retry=self.retry,
                timeout=self.timeout,
                metadata=self.metadata,
            )
            VertexAIEndpointLink.persist(context=context, task_instance=self, endpoint_id=self.endpoint_id)
            self.log.info("Endpoint was retrieved.")
            return Endpoint.to_dict(endpoint_obj)
        except NotFound:
            self.log.info("The Endpoint ID %s does not exist.", self.endpoint_id)


class ListEndpointsOperator(BaseOperator):
    """
    Lists Endpoints in a Location.

    :param project_id: Required. The ID of the Google Cloud project that the service belongs to.
    :param region: Required. The ID of the Google Cloud region that the service belongs to.
    :param filter: The standard list filter.
        Supported fields:

        - ``display_name`` supports = and !=.
        - ``state`` supports = and !=.
        - ``model_display_name`` supports = and !=

        Some examples of using the filter are:

        - ``state="JOB_STATE_SUCCEEDED" AND display_name="my_job"``
        - ``state="JOB_STATE_RUNNING" OR display_name="my_job"``
        - ``NOT display_name="my_job"``
        - ``state="JOB_STATE_FAILED"``
    :param page_size: The standard list page size.
    :param page_token: The standard list page token.
    :param read_mask: Mask specifying which fields to read.
    :param order_by: A comma-separated list of fields to order by, sorted in
        ascending order. Use "desc" after a field name for descending.
        Supported fields:

        - ``display_name``
        - ``create_time``
        - ``update_time``

        Example: ``display_name, create_time desc``.
    :param retry: Designation of what errors, if any, should be retried.
    :param timeout: The timeout for this request.
    :param metadata: Strings which should be sent along with the request as metadata.
    :param gcp_conn_id: The connection ID to use connecting to Google Cloud.
    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
        if any. For this to work, the service account making the request must have
        domain-wide delegation enabled.
    :param impersonation_chain: Optional service account to impersonate using short-term
        credentials, or chained list of accounts required to get the access_token
        of the last account in the list, which will be impersonated in the request.
        If set as a string, the account must grant the originating account
        the Service Account Token Creator IAM role.
        If set as a sequence, the identities from the list must grant
        Service Account Token Creator IAM role to the directly preceding identity, with first
        account from the list granting this role to the originating account (templated).
    """

    template_fields = ("region", "project_id", "impersonation_chain")
    operator_extra_links = (VertexAIEndpointListLink(),)

    def __init__(
        self,
        *,
        region: str,
        project_id: str,
        filter: Optional[str] = None,
        page_size: Optional[int] = None,
        page_token: Optional[str] = None,
        read_mask: Optional[str] = None,
        order_by: Optional[str] = None,
        retry: Optional[Retry] = None,
        timeout: Optional[float] = None,
        metadata: Sequence[Tuple[str, str]] = (),
        gcp_conn_id: str = "google_cloud_default",
        delegate_to: Optional[str] = None,
        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.region = region
        self.project_id = project_id
        self.filter = filter
        self.page_size = page_size
        self.page_token = page_token
        self.read_mask = read_mask
        self.order_by = order_by
        self.retry = retry
        self.timeout = timeout
        self.metadata = metadata
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.impersonation_chain = impersonation_chain

    def execute(self, context: 'Context'):
        hook = EndpointServiceHook(
            gcp_conn_id=self.gcp_conn_id,
            delegate_to=self.delegate_to,
            impersonation_chain=self.impersonation_chain,
        )
        results = hook.list_endpoints(
            project_id=self.project_id,
            region=self.region,
            filter=self.filter,
            page_size=self.page_size,
            page_token=self.page_token,
            read_mask=self.read_mask,
            order_by=self.order_by,
            retry=self.retry,
            timeout=self.timeout,
            metadata=self.metadata,
        )
        VertexAIEndpointListLink.persist(context=context, task_instance=self)
        return [Endpoint.to_dict(result) for result in results]


class UndeployModelOperator(BaseOperator):
    """
    Undeploys a Model from an Endpoint, removing a DeployedModel from it, and freeing all
    resources it's using.

    :param project_id: Required. The ID of the Google Cloud project that the service belongs to.
    :param region: Required. The ID of the Google Cloud region that the service belongs to.
    :param endpoint_id: Required. The name of the Endpoint resource from which to undeploy a Model. Format:
        ``projects/{project}/locations/{location}/endpoints/{endpoint}``
    :param deployed_model_id: Required. The ID of the DeployedModel to be undeployed from the Endpoint.
    :param traffic_split: If this field is provided, then the Endpoint's
        [traffic_split][google.cloud.aiplatform.v1.Endpoint.traffic_split] will be overwritten with it.
        If the last DeployedModel is being undeployed from the Endpoint, the [Endpoint.traffic_split]
        will always end up empty when this call returns. A DeployedModel will be successfully undeployed
        only if it doesn't have any traffic assigned to it when this method executes, or if this field
        unassigns any traffic to it.
    :param retry: Designation of what errors, if any, should be retried.
    :param timeout: The timeout for this request.
    :param metadata: Strings which should be sent along with the request as metadata.
    :param gcp_conn_id: The connection ID to use connecting to Google Cloud.
    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
        if any. For this to work, the service account making the request must have
        domain-wide delegation enabled.
    :param impersonation_chain: Optional service account to impersonate using short-term
        credentials, or chained list of accounts required to get the access_token
        of the last account in the list, which will be impersonated in the request.
        If set as a string, the account must grant the originating account
        the Service Account Token Creator IAM role.
        If set as a sequence, the identities from the list must grant
        Service Account Token Creator IAM role to the directly preceding identity, with first
        account from the list granting this role to the originating account (templated).
    """

    template_fields = ("region", "endpoint_id", "deployed_model_id", "project_id", "impersonation_chain")

    def __init__(
        self,
        *,
        region: str,
        project_id: str,
        endpoint_id: str,
        deployed_model_id: str,
        traffic_split: Optional[Union[Sequence, Dict]] = None,
        retry: Optional[Retry] = None,
        timeout: Optional[float] = None,
        metadata: Sequence[Tuple[str, str]] = (),
        gcp_conn_id: str = "google_cloud_default",
        delegate_to: Optional[str] = None,
        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.region = region
        self.project_id = project_id
        self.endpoint_id = endpoint_id
        self.deployed_model_id = deployed_model_id
        self.traffic_split = traffic_split
        self.retry = retry
        self.timeout = timeout
        self.metadata = metadata
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.impersonation_chain = impersonation_chain

    def execute(self, context: 'Context'):
        hook = EndpointServiceHook(
            gcp_conn_id=self.gcp_conn_id,
            delegate_to=self.delegate_to,
            impersonation_chain=self.impersonation_chain,
        )
        # Use lazy %-formatting so the message is only built when the log level is enabled
        self.log.info("Removing a DeployedModel %s", self.deployed_model_id)
        operation = hook.undeploy_model(
            project_id=self.project_id,
            region=self.region,
            endpoint=self.endpoint_id,
            deployed_model_id=self.deployed_model_id,
            traffic_split=self.traffic_split,
            retry=self.retry,
            timeout=self.timeout,
            metadata=self.metadata,
        )
        hook.wait_for_operation(timeout=self.timeout, operation=operation)
        self.log.info("DeployedModel was removed successfully")
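As the docstring above notes, a DeployedModel can only be undeployed once no traffic is assigned to it, so callers typically pass a `traffic_split` that reroutes the undeployed model's share. A minimal plain-Python sketch of that bookkeeping (the helper name `reroute_traffic` is our own, not part of the Vertex AI client):

```python
def reroute_traffic(traffic_split, deployed_model_id):
    """Drop deployed_model_id from the split and rescale the remaining shares to 100%."""
    remaining = {k: v for k, v in traffic_split.items() if k != deployed_model_id}
    if not remaining:
        # Last model undeployed: the Endpoint's traffic_split ends up empty
        return {}
    total = sum(remaining.values())
    return {k: round(v * 100 / total) for k, v in remaining.items()}

print(reroute_traffic({"a": 60, "b": 40}, "a"))  # {'b': 100}
```

Passing the rerouted dict as the operator's `traffic_split` satisfies the "unassigns any traffic to it" condition before the undeploy call runs.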


class UpdateEndpointOperator(BaseOperator):
    """
    Updates an Endpoint.

    :param project_id: Required. The ID of the Google Cloud project that the service belongs to.
    :param region: Required. The ID of the Google Cloud region that the service belongs to.
    :param endpoint_id: Required. The ID of the Endpoint to update.
    :param endpoint: Required. The Endpoint which replaces the resource on the server.
    :param update_mask: Required. The update mask applies to the resource. See
        [google.protobuf.FieldMask][google.protobuf.FieldMask].
    :param retry: Designation of what errors, if any, should be retried.
    :param timeout: The timeout for this request.
    :param metadata: Strings which should be sent along with the request as metadata.
    :param gcp_conn_id: The connection ID to use connecting to Google Cloud.
    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
        if any. For this to work, the service account making the request must have
        domain-wide delegation enabled.
    :param impersonation_chain: Optional service account to impersonate using short-term
        credentials, or chained list of accounts required to get the access_token
        of the last account in the list, which will be impersonated in the request.
        If set as a string, the account must grant the originating account
        the Service Account Token Creator IAM role.
        If set as a sequence, the identities from the list must grant
        Service Account Token Creator IAM role to the directly preceding identity, with first
        account from the list granting this role to the originating account (templated).
    """

    template_fields = ("region", "endpoint_id", "project_id", "impersonation_chain")
    operator_extra_links = (VertexAIEndpointLink(),)

    def __init__(
        self,
        *,
        project_id: str,
        region: str,
        endpoint_id: str,
        endpoint: Union[Endpoint, Dict],
        update_mask: Union[FieldMask, Dict],
        retry: Optional[Retry] = None,
        timeout: Optional[float] = None,
        metadata: Sequence[Tuple[str, str]] = (),
        gcp_conn_id: str = "google_cloud_default",
        delegate_to: Optional[str] = None,
        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.project_id = project_id
        self.region = region
        self.endpoint_id = endpoint_id
        self.endpoint = endpoint
        self.update_mask = update_mask
        self.retry = retry
        self.timeout = timeout
        self.metadata = metadata
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.impersonation_chain = impersonation_chain

    def execute(self, context: 'Context'):
        hook = EndpointServiceHook(
            gcp_conn_id=self.gcp_conn_id,
            delegate_to=self.delegate_to,
            impersonation_chain=self.impersonation_chain,
        )
        self.log.info("Updating endpoint: %s", self.endpoint_id)
        result = hook.update_endpoint(
            project_id=self.project_id,
            region=self.region,
            endpoint_id=self.endpoint_id,
            endpoint=self.endpoint,
            update_mask=self.update_mask,
            retry=self.retry,
            timeout=self.timeout,
            metadata=self.metadata,
        )
        self.log.info("Endpoint was updated")
        VertexAIEndpointLink.persist(context=context, task_instance=self, endpoint_id=self.endpoint_id)
        return Endpoint.to_dict(result)
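The `update_mask` above is a `google.protobuf.FieldMask`: only the listed field paths are copied from the request body onto the stored resource, so unrelated fields are left untouched. A plain-dict sketch of that semantics for top-level fields (the `apply_field_mask` helper is our own illustration, not part of the client library):

```python
def apply_field_mask(current, update, paths):
    """Copy only the masked top-level fields from `update` onto a copy of `current`."""
    merged = dict(current)
    for path in paths:
        if path in update:
            merged[path] = update[path]
    return merged

endpoint = {"display_name": "old", "description": "keep me"}
patch = {"display_name": "new", "description": "ignored"}
print(apply_field_mask(endpoint, patch, ["display_name"]))
# {'display_name': 'new', 'description': 'keep me'}
```

The real FieldMask additionally supports dotted sub-field paths; this sketch only shows why a partial `endpoint` body plus a mask is safe to send.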
# vnpy/app/chart_wizard/ui/__init__.py (funrunskypalace/vnpy, MIT)
from .widget import ChartWizardWidget
# src/resources/baserouter.py (solnsumei/claims-management, MIT)
from fastapi import APIRouter


class BaseRouter(APIRouter):
    pass
# engine/subsystems/physics_engine.py (Dogeek/game_engine, MIT)
class PhysicsEngine:
    def __init__(self, game_manager):
        self.game_manager = game_manager

    def iterate(self):
        for go in self.game_manager.game_objects:
            if go.has_component("Rigidbody2D"):
                go.get_component("Rigidbody2D").update()
# atcoder/abc/b069.py (tomato-300yen/coding, MIT)
a, *b, c = input()
print(a + str(len(b)) + c)  # str() needed: len(b) is an int, a and c are str
# speaker_model/__init__.py (happylittlecat2333/FastSpeech2, MIT)
from .speaker_embedding import SpeakerEmbedding
# app/main/__init__.py (Jamesmwangi245/Blogg, MIT)
from flask import Blueprint
from .. import views

main = Blueprint('main', __name__)

from . import erro
# SurPyval/node/__init__.py (JakeColtman/SurPyval, MIT)
from SurPyval.node.node import Node, gamma, exponential, gaussian
from SurPyval.node.tree import NodeTree
# dask_cudf/io/__init__.py (quasiben/dask-cudf, Apache-2.0)
from .csv import read_csv
# src/common/utils/__init__.py (TaitoUnited/full-stack-template, MIT)
from . import database  # noqa
from . import format  # noqa
from . import misc  # noqa
from . import powerbi  # noqa
from . import storage  # noqa
from . import validate  # noqa
# Planet_tools/__init__.py (tundeakins/Planet_parameter_conversions, MIT)
from Planet_tools import convert_param, calculate_param, some_stats, estimate_effect, utils, ring
from Planet_tools.__version__ import __version__
# tests/test_utils.py (tecknicaltom/smoketest, MIT)
import unittest


class TestUtilities(unittest.TestCase):

    def test_chunkify(self):
        from smoketest.utils import chunkify
        # ordinary case where 2 < n < len(seq)
        self.assertEqual(
            chunkify(list(range(10)), 3),
            [[0, 1, 2], [3, 4, 5], [6, 7, 8, 9]],
        )
        # n = 1 case
        self.assertEqual(
            chunkify(list(range(10)), 1),
            [list(range(10))],
        )
        # n == len(seq) case
        self.assertEqual(
            chunkify(list(range(10)), 10),
            list(map(lambda x: [x], range(10))),
        )
        # n > len(seq) case
        self.assertEqual(
            chunkify(list(range(1)), 2),
            [[], [0]],
        )

    def test_uncachebust_no_cachebuster(self):
        from smoketest.utils import uncachebust
        expected = 'usnews.com?b=2&a=1&c='
        actual = uncachebust('usnews.com?b=2&a=1&c=')
        self.assertEqual(expected, actual)

    def test_uncachebust_with_cachebuster(self):
        from smoketest.utils import uncachebust
        expected = 'usnews.com?b=2&a=1&c='
        actual = uncachebust('usnews.com?_=123&b=2&a=1&c=')
        self.assertEqual(expected, actual)
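For reference, here is an implementation consistent with the `chunkify` cases asserted above: even chunks of size `len(seq) // n`, with the remainder folded into the last chunk. This is a sketch inferred from the tests; smoketest's actual `chunkify` may be written differently.

```python
def chunkify(seq, n):
    """Split seq into n chunks; the last chunk absorbs the remainder."""
    size = len(seq) // n
    chunks = [seq[i * size:(i + 1) * size] for i in range(n - 1)]
    chunks.append(seq[(n - 1) * size:])
    return chunks

print(chunkify(list(range(10)), 3))  # [[0, 1, 2], [3, 4, 5], [6, 7, 8, 9]]
```

Note the `n > len(seq)` behaviour falls out naturally: with `size == 0`, the leading chunks are empty and the last chunk gets everything.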


class TestTransformUrlBasedOnOptions(unittest.TestCase):

    def test_cachebusting(self):
        from smoketest.utils import transform_url_based_on_options
        from collections import namedtuple
        import re
        Options = namedtuple('Options', ('scheme', 'level', 'port', 'cachebust'))
        url = 'http://www.usnews.com'
        cachebust_pattern = re.compile(r'\?_=\d+$')

        options = Options(None, 'stag', None, True)
        transformed = transform_url_based_on_options(url, options)
        self.assertTrue(cachebust_pattern.search(transformed))

        options = Options(None, 'stag', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertFalse(cachebust_pattern.search(transformed))

    def test_level(self):
        from smoketest.utils import transform_url_based_on_options
        from collections import namedtuple
        Options = namedtuple('Options', ('scheme', 'level', 'port', 'cachebust'))
        url = 'http://www.usnews.com'

        options = Options(None, 'live', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, url)

        options = Options(None, 'stag', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www-stag.usnews.com')

    def test_custom_level(self):
        from smoketest.utils import transform_url_based_on_options
        from collections import namedtuple
        Options = namedtuple('Options', ('scheme', 'level', 'port', 'cachebust'))

        url = 'http://www-{LEVEL}.usnews.com'
        options = Options(None, 'live', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www.usnews.com')
        options = Options(None, 'stag', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www-stag.usnews.com')

        url = 'http://{LEVEL}.usnews.com'
        options = Options(None, 'live', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://usnews.com')
        options = Options(None, 'stag', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://stag.usnews.com')

        url = 'http://{LEVEL}-www.usnews.com'
        options = Options(None, 'live', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www.usnews.com')
        options = Options(None, 'stag', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://stag-www.usnews.com')

        url = 'http://www.usnews.com/{LEVEL}/'
        options = Options(None, 'live', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www.usnews.com/')
        options = Options(None, 'stag', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www.usnews.com/stag/')

        url = 'http://www.usnews{LEVEL}.com'
        options = Options(None, '-staging', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www.usnews-staging.com')
        options = Options(None, '', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www.usnews.com')

    def test_port(self):
        from smoketest.utils import transform_url_based_on_options
        from collections import namedtuple
        Options = namedtuple('Options', ('scheme', 'level', 'port', 'cachebust'))
        url = 'http://www.usnews.com'

        options = Options(None, 'live', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, url)

        options = Options(None, 'live', 8999, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'http://www.usnews.com:8999')

    def test_scheme(self):
        from smoketest.utils import transform_url_based_on_options
        from collections import namedtuple
        Options = namedtuple('Options', ('scheme', 'level', 'port', 'cachebust'))
        url = 'http://www.usnews.com'

        options = Options('https', 'live', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'https://www.usnews.com')

        options = Options('https', 'stag', None, False)
        transformed = transform_url_based_on_options(url, options)
        self.assertEqual(transformed, 'https://www-stag.usnews.com')
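The behaviour pinned down by the two `uncachebust` tests — drop a `_=<timestamp>` cachebusting parameter while preserving the other query parameters, including blank values, in order — can be sketched with the standard library (the real smoketest implementation may differ):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def uncachebust(url):
    scheme, netloc, path, query, fragment = urlsplit(url)
    # keep_blank_values=True so trailing parameters like "c=" survive the round trip
    params = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True) if k != '_']
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))

print(uncachebust('usnews.com?_=123&b=2&a=1&c='))  # usnews.com?b=2&a=1&c=
```

`parse_qsl` preserves parameter order, which is why the round-tripped query matches the expected string byte for byte.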
# -*- coding: utf-8 -*-
# unirep_models.py (Tessier-Lab-UMich/Emi_Pareto_Opt_ML, MIT)
"""
Created on Sun Sep 12 13:13:21 2021
@author: makow
"""
from holdout_utils import *
emi_binding = pd.read_csv("emi_binding.csv", header = 0, index_col = 0)
iso_binding = pd.read_csv("iso_binding.csv", header = 0, index_col = 0)
igg_binding = pd.read_csv("igg_binding.csv", header = 0, index_col = 0)
emi_reps = pd.read_csv("emi_reps.csv", header = 0, index_col = 0)
iso_reps = pd.read_csv("iso_reps.csv", header = 0, index_col = 0)
igg_reps = pd.read_csv("igg_reps.csv", header = 0, index_col = 0)
#%%
lda_ant = LDA()
cv_results = cv(lda_ant, emi_reps, emi_binding.iloc[:,0])
print('Antigen model cross validation average test accuracy: ' + str(np.mean(cv_results['test_score'])))
emi_ant_transform = pd.DataFrame(lda_ant.fit_transform(emi_reps, emi_binding.iloc[:,0])).set_index(emi_binding.index)
emi_ant_predict = pd.DataFrame(lda_ant.predict(emi_reps)).set_index(emi_binding.index)
print('Antigen model accuracy: ' + str(accuracy_score(emi_ant_predict.iloc[:,0], emi_binding.iloc[:,0])))
iso_ant_transform = pd.DataFrame(lda_ant.transform(iso_reps)).set_index(iso_binding.index)
iso_ant_predict = pd.DataFrame(lda_ant.predict(iso_reps)).set_index(iso_binding.index)
igg_ant_transform = pd.DataFrame(lda_ant.transform(igg_reps)).set_index(igg_binding.index)
lda_psy = LDA()
cv_results = cv(lda_psy, emi_reps, emi_binding.iloc[:,1])
print('Specificity model cross validation average test accuracy: ' + str(np.mean(cv_results['test_score'])))
emi_psy_transform = pd.DataFrame(lda_psy.fit_transform(emi_reps, emi_binding.iloc[:,1])).set_index(emi_binding.index)
emi_psy_predict = pd.DataFrame(lda_psy.predict(emi_reps)).set_index(emi_binding.index)
print('Specificity model accuracy: ' + str(accuracy_score(emi_psy_predict.iloc[:,0], emi_binding.iloc[:,1])))
iso_psy_transform = pd.DataFrame(lda_psy.transform(iso_reps)).set_index(iso_binding.index)
iso_psy_predict = pd.DataFrame(lda_psy.predict(iso_reps)).set_index(iso_binding.index)
igg_psy_transform = pd.DataFrame(lda_psy.transform(igg_reps)).set_index(igg_binding.index)
#%%
"""
# sample size elbow plot
emi_data = pd.concat([emi_binding, emi_reps.set_index(emi_binding.index)], axis = 1)
ant_test_acc = []
psy_test_acc = []
for i in np.arange(25, 4000, 25):
    emi_data_subset = emi_data.sample(i)
    emi_data_subset_train, emi_data_subset_test, emi_data_subset_target_train, emi_data_subset_target_test = train_test_split(emi_data_subset.iloc[:,3:8000], emi_data_subset.iloc[:,0:3])
    cv_results = cv(lda_ant, emi_data_subset.iloc[:,3:8000], emi_data_subset.iloc[:,0])
    ant_test_acc.append(np.mean(cv_results['test_score']))
    cv_results = cv(lda_psy, emi_data_subset.iloc[:,3:8000], emi_data_subset.iloc[:,1])
    psy_test_acc.append(np.mean(cv_results['test_score']))
#%%
plt.scatter(np.arange(25,4000,25), ant_test_acc, c = 'blue', edgecolor = 'k', linewidth = 0.25, s = 50)
plt.scatter(np.arange(25,4000,25), psy_test_acc, c = 'red', edgecolor = 'k', linewidth = 0.25, s = 50)
plt.xticks([0,1000,2000,3000,4000], fontsize = 24)
plt.yticks([0.5, 0.6, 0.7,0.8, 0.9, 1.0], [50, 60, 70, 80, 90, 100], fontsize = 24)
#%%
#KNN of sequences
from sklearn.neighbors import KNeighborsClassifier as KNC
emi_data = pd.concat([emi_binding, emi_reps.set_index(emi_binding.index)], axis = 1)
ant_predict_acc = []
psy_predict_acc = []
for j in np.arange(1, 25):
    knc = KNC(n_neighbors = j)
    cv_results = cv(knc, emi_data.iloc[:,3:8000], emi_data.iloc[:,0])
    ant_predict_acc.append(np.mean(cv_results['test_score']))
    cv_results = cv(knc, emi_data.iloc[:,3:8000], emi_data.iloc[:,1])
    psy_predict_acc.append(np.mean(cv_results['test_score']))
#%%
plt.scatter(np.arange(1,25), ant_predict_acc, c = 'blue', edgecolor = 'k', linewidth = 0.25, s = 50)
plt.scatter(np.arange(1,25), psy_predict_acc, c = 'red', edgecolor = 'k', linewidth = 0.25, s = 50)
plt.xticks(fontsize = 24)
plt.yticks([0.8, 0.9, 1.0], [80, 90, 100], fontsize = 24)
"""
#%%
#model accuracy distributions
plt.figure()
sns.distplot(emi_ant_transform.loc[emi_binding['ANT Binding'] == 0, 0], color = 'red')
sns.distplot(emi_ant_transform.loc[emi_binding['ANT Binding'] == 1, 0], color = 'blue')
plt.xticks([-4, -2, 0, 2, 4], [-4, -2, 0, 2, 4], fontsize = 26)
plt.yticks([0.0, 0.2, 0.4, 0.6], [0.0, 0.2, 0.4, 0.6], fontsize = 26)
plt.ylabel('')
plt.xlim(-5,5)
plt.figure()
sns.distplot(emi_psy_transform.loc[emi_binding['OVA Binding'] == 0, 0], color = 'blue')
sns.distplot(emi_psy_transform.loc[emi_binding['OVA Binding'] == 1, 0], color = 'red')
plt.xticks([-4, -2, 0, 2, 4], [-4, -2, 0, 2, 4], fontsize = 26)
plt.yticks([0.0, 0.2, 0.4, 0.6], [0.0, 0.2, 0.4, 0.6], fontsize = 26)
plt.ylabel('')
#%%
#yeast data correlations
plt.figure()
plt.scatter(iso_ant_transform.iloc[:,0], iso_binding.iloc[:,1], c = iso_ant_predict.iloc[:,0], cmap = cmap9r, s = 150, edgecolor = 'k', linewidth = 0.25)
plt.scatter(iso_ant_transform.iloc[125,0], iso_binding.iloc[125,1], c = 'k', s = 250, edgecolor = 'k', linewidth = 0.25)
plt.xticks([-4, -2, 0, 2, 4], [-4, -2, 0, 2, 4], fontsize = 26)
plt.yticks([0.0, 0.4, 0.8, 1.2, 1.6], [0.0, 0.4, 0.8, 1.2, 1.6], fontsize = 26)
plt.ylim(-0.15, 1.85)
print('Antigen model scFab correlation: ' + str(sc.stats.spearmanr(iso_ant_transform.iloc[:,0], iso_binding.iloc[:,1])))
plt.figure()
plt.scatter(iso_psy_transform.iloc[:,0], iso_binding.iloc[:,2], c = iso_psy_predict.iloc[:,0], cmap = cmap9, s = 150, edgecolor = 'k', linewidth = 0.25)
plt.scatter(iso_psy_transform.iloc[125,0], iso_binding.iloc[125,2], c = 'k', s = 250, edgecolor = 'k', linewidth = 0.25)
plt.xticks([-4, -2, 0, 2, 4], [-4, -2, 0, 2, 4], fontsize = 26)
plt.yticks([0.0, 0.4, 0.8, 1.2, 1.6], [0.0, 0.4, 0.8, 1.2, 1.6], fontsize = 26)
plt.ylim(-0.15, 1.85)
print('Specificity model scFab correlation: ' + str(sc.stats.spearmanr(iso_psy_transform.iloc[:,0], iso_binding.iloc[:,2])))
#%%
#pareto plots
plt.figure()
plt.scatter(emi_ant_transform, emi_psy_transform, color = 'white', edgecolor = 'k', s = 40, linewidth = 0.25)
plt.scatter(igg_ant_transform.iloc[0:41,0], igg_psy_transform.iloc[0:41,0], color = cmap(0.15), edgecolor= 'k', s = 80, linewidth = 0.25)
plt.scatter(igg_ant_transform.iloc[41:42,0], igg_psy_transform.iloc[41:42,0], color = 'black', s = 150, edgecolor= 'k', linewidth = 0.25)
plt.xticks([-6, -4, -2, 0, 2, 4, 6], [-6, -4, -2, 0, 2, 4, 6], fontsize = 26)
plt.yticks([-6, -4, -2, 0, 2, 4, 6], [-6, -4, -2, 0, 2, 4, 6], fontsize = 26)
plt.ylabel('')
#print(len(emi_ant_transform))
plt.figure()
plt.scatter(emi_ant_transform, emi_psy_transform, color = 'white', edgecolor = 'k', s = 40, linewidth = 0.25)
plt.scatter(igg_ant_transform.loc[igg_binding['Blosum62'] == 1,0], igg_psy_transform.loc[igg_binding['Blosum62'] == 1,0], color = cmap(0.15), edgecolor= 'k', s = 80, linewidth = 0.25)
plt.scatter(igg_ant_transform.iloc[41:42,0], igg_psy_transform.iloc[41:42,0], color = 'black', s = 150, edgecolor= 'k', linewidth = 0.25)
plt.scatter(igg_ant_transform.iloc[8,0], igg_psy_transform.iloc[8,0], c = 'orange', s = 150, edgecolor = 'k', linewidth = 0.25, zorder = 3)
plt.xticks([-6, -4, -2, 0, 2, 4, 6], [-6, -4, -2, 0, 2, 4, 6], fontsize = 26)
plt.yticks([-6, -4, -2, 0, 2, 4, 6], [-6, -4, -2, 0, 2, 4, 6], fontsize = 26)
plt.ylabel('')
#%%
#in-library IgG correlations
plt.figure()
plt.errorbar(igg_ant_transform.iloc[0:41,0], igg_binding.iloc[0:41,1], yerr = igg_binding.iloc[0:41,3], linewidth = 0, elinewidth = 0.5, ecolor = 'k', capsize = 3, zorder = 1)
plt.scatter(igg_ant_transform.iloc[0:41,0], igg_binding.iloc[0:41,1], c = cmap(0.15), s = 150, edgecolor = 'k', linewidth = 0.25, zorder = 2)
plt.scatter(igg_ant_transform.iloc[41:42,0], 1, color = 'k', s = 250, edgecolor= 'k', linewidth = 0.25, zorder = 3)
plt.xticks([1, 2, 3], [1, 2, 3], fontsize = 26)
plt.yticks([0.0, 0.4, 0.8, 1.2, 1.6], [0.0, 0.4, 0.8, 1.2, 1.6], fontsize = 26)
plt.ylim(-0.05, 1.65)
print('Antigen model in-library IgG correlation: ' + str(sc.stats.spearmanr(igg_ant_transform.iloc[0:42,0], igg_binding.iloc[0:42,1])))
plt.figure()
plt.errorbar(igg_psy_transform.iloc[0:41,0], igg_binding.iloc[0:41,2], yerr = igg_binding.iloc[0:41,4], linewidth = 0, elinewidth = 0.5, ecolor = 'k', capsize = 3, zorder = 1)
plt.scatter(igg_psy_transform.iloc[0:41,0], igg_binding.iloc[0:41,2], c = cmap(0.85), s = 150, edgecolor = 'k', linewidth = 0.25, zorder = 2)
plt.scatter(igg_psy_transform.iloc[41:42,0], 1, color = 'k', s = 250, edgecolor= 'k', linewidth = 0.25)
plt.xticks([0,1, 2, 3], [0,1, 2, 3], fontsize = 26)
plt.yticks([0.0, 0.4, 0.8, 1.2], [0.0, 0.4, 0.8, 1.2], fontsize = 26)
plt.ylim(-0.15, 1.45)
print('Specificity model in-library IgG correlation: ' + str(sc.stats.spearmanr(igg_psy_transform.iloc[0:42,0], igg_binding.iloc[0:42,2])))
#%%
#experimental pareto
plt.figure()
plt.errorbar(igg_binding.iloc[0:41,1], igg_binding.iloc[0:41,2], xerr = igg_binding.iloc[0:41,3], yerr = igg_binding.iloc[0:41,4], linewidth = 0, elinewidth = 0.5, ecolor = 'k', capsize = 3, zorder = 1)
plt.scatter(igg_binding.iloc[0:41,1], igg_binding.iloc[0:41,2], s = 150, c = 'blueviolet', edgecolor = 'k', linewidth = 0.25, zorder = 2)
#plt.scatter(igg_binding.loc[igg_binding['Scaffold'] == 1,'ANT Binding'], igg_binding.loc[igg_binding['Scaffold'] == 1,'OVA Binding'], s = 150, c = cmap(0.65), edgecolor = 'k', linewidth = 0.5, zorder = 3)
plt.scatter(1,1, s = 200, c = 'k', edgecolor = 'k', linewidth = 0.25, zorder = 4)
plt.scatter(1.2,0.51, s = 200, c = cmap(0.85), edgecolor = 'k', linewidth = 0.25, zorder = 4)
plt.xticks([0.0, 0.4, 0.8, 1.2], [0.0, 0.4, 0.8, 1.2], fontsize = 26)
plt.xlim(-0.05, 1.45)
plt.yticks([0.0, 0.4, 0.8, 1.2], [0.0, 0.4, 0.8, 1.2], fontsize = 26)
plt.ylim(-0.15, 1.35)
#%%
#novel IgG correlations
plt.figure()
plt.errorbar(igg_ant_transform.loc[igg_binding['Blosum62'] == 1,0], igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT Binding'], yerr = igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT STDEV'], linewidth = 0, elinewidth = 0.25, ecolor = 'k', capsize = 3, zorder = 1)
plt.scatter(igg_ant_transform.loc[igg_binding['Blosum62'] == 1,0], igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT Binding'], c = cmap(0.15), s = 150, edgecolor = 'k', linewidth = 0.25, zorder = 2)
plt.scatter(igg_ant_transform.iloc[8,0], 1.2, c = 'orange', s = 250, edgecolor = 'k', linewidth = 0.25, zorder = 3)
plt.scatter(igg_ant_transform.iloc[41:42,0], 1, color = 'k', s = 250, edgecolor= 'k', linewidth = 0.25, zorder = 3)
plt.xticks([1, 2, 3,4], [1, 2, 3,4], fontsize = 26)
plt.yticks([0.0, 0.4, 0.8, 1.2, 1.6], [0.0, 0.4, 0.8, 1.2, 1.6], fontsize = 26)
plt.ylim(-0.05, 1.8)
print('Antigen model novel IgG correlation: ' + str(sc.stats.spearmanr(igg_ant_transform.loc[igg_binding['Blosum62'] == 1,0], igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT Binding'])))
plt.figure()
plt.errorbar(igg_psy_transform.loc[igg_binding['Blosum62'] == 1,0], igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA Binding'], yerr = igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA STDEV'], linewidth = 0, elinewidth = 0.25, ecolor = 'k', capsize = 3, zorder = 1)
plt.scatter(igg_psy_transform.loc[igg_binding['Blosum62'] == 1,0], igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA Binding'], c = cmap(0.85), s = 150, edgecolor = 'k', linewidth = 0.25, zorder = 2)
plt.scatter(igg_psy_transform.iloc[8,0], 0.51, c = 'orange', s = 250, edgecolor = 'k', linewidth = 0.25, zorder = 3)
plt.scatter(igg_psy_transform.iloc[41:42,0], 1, color = 'k', s = 250, edgecolor= 'k', linewidth = 0.25, zorder = 3)
plt.xticks([0,1, 2, 3], [0,1, 2, 3], fontsize = 26)
plt.yticks([0.0, 0.4, 0.8, 1.2], [0.0, 0.4, 0.8, 1.2], fontsize = 26)
plt.ylim(-0.15, 1.45)
print('Specificity model novel IgG correlation: ' + str(sc.stats.spearmanr(igg_psy_transform.loc[igg_binding['Blosum62'] == 1,0], igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA Binding'])))
#%%
#novel IgG correlations without Blosum62 filter
print('Antigen model novel IgG correlation: ' + str(sc.stats.spearmanr(igg_ant_transform.iloc[42:100,0], igg_binding.iloc[42:100,1])))
print('Specificity model novel IgG correlation: ' + str(sc.stats.spearmanr(igg_psy_transform.iloc[42:100,0], igg_binding.iloc[42:100,2])))
#%%
#experimental pareto
plt.figure()
plt.errorbar(igg_binding.iloc[0:41,1], igg_binding.iloc[0:41,2], xerr = igg_binding.iloc[0:41,3], yerr = igg_binding.iloc[0:41,4], linewidth = 0, elinewidth = 0.5, ecolor = 'k', capsize = 3, zorder = 1)
plt.scatter(igg_binding.iloc[0:41,1], igg_binding.iloc[0:41,2], s = 150, c = 'blueviolet', edgecolor = 'k', linewidth = 0.5, zorder = 2)
plt.errorbar(igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT Binding'], igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA Binding'], yerr = igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT STDEV'], linewidth = 0, elinewidth = 0.25, ecolor = 'k', capsize = 3, zorder = 1)
plt.scatter(igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT Binding'], igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA Binding'], c = 'mediumspringgreen', s = 150, edgecolor = 'k', linewidth = 0.25, zorder = 2)
#plt.scatter(igg_binding.loc[igg_binding['Scaffold'] == 1,'ANT Binding'], igg_binding.loc[igg_binding['Scaffold'] == 1,'OVA Binding'], s = 150, c = cmap(0.65), edgecolor = 'k', linewidth = 0.5, zorder = 3)
plt.scatter(1,1, s = 250, c = 'k', edgecolor = 'k', linewidth = 0.5, zorder = 4)
plt.scatter(1.2,0.51, s = 250, c = 'orange', edgecolor = 'k', linewidth = 0.5, zorder = 4)
plt.scatter(1.28, 0.3, s = 250, c = 'red', edgecolor = 'k', linewidth = 0.5, zorder = 4)
plt.xticks([0.0, 0.4, 0.8, 1.2], [0.0, 0.4, 0.8, 1.2], fontsize = 26)
plt.xlim(-0.05, 1.45)
plt.yticks([0.0, 0.4, 0.8, 1.2], [0.0, 0.4, 0.8, 1.2], fontsize = 26)
plt.ylim(-0.15, 1.35)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(iso_binding.iloc[:,1], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(iso_binding.iloc[:,1])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(iso_binding.iloc[:,2], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(iso_binding.iloc[:,2])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_binding.iloc[0:42,1], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_binding.iloc[0:42,1])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_binding.iloc[0:42,2], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_binding.iloc[0:42,2])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT Binding'], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_binding.loc[igg_binding['Blosum62'] == 1,'ANT Binding'])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA Binding'], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_binding.loc[igg_binding['Blosum62'] == 1,'OVA Binding'])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(iso_ant_transform.iloc[:,0], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(iso_ant_transform.iloc[:,0])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(iso_psy_transform.iloc[:,0], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(iso_psy_transform.iloc[:,0])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_ant_transform.iloc[:,0], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_ant_transform.iloc[:,0])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_psy_transform.iloc[:,0], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_psy_transform.iloc[:,0])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_ant_transform.loc[igg_binding['Blosum62'] == 1,0], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_ant_transform.loc[igg_binding['Blosum62'] == 1,0])
print(p)
#%%
fig, ax = plt.subplots()
sc.stats.probplot(igg_psy_transform.loc[igg_binding['Blosum62'] == 1,0], dist = "norm", plot=plt)
plt.xticks(fontsize = 20)
plt.xlabel('Theoretical quantiles', fontsize = 24)
plt.yticks(fontsize = 20)
plt.ylabel('Ordered values', fontsize = 24)
plt.tight_layout()
stat, p = sc.stats.shapiro(igg_psy_transform.loc[igg_binding['Blosum62'] == 1,0])
print(p)
# --- file: src/tests/test_group_read.py (repo: devsetgo/test-api, license: MIT) ---
# -*- coding: utf-8 -*-
import unittest
from starlette.testclient import TestClient
from src.main import app
client = TestClient(app)
directory_to__files: str = "data"
# api/v1/groups/list?delay=1&qty=10&offset=1&active=true&groupType=approval
class Test(unittest.TestCase):
    def test_groups_get_list_error_delay(self):
        url = f"/api/v1/groups/list?delay=122"
        response = client.get(url)
        assert response.status_code == 422

    def test_groups_get_list_error_qty(self):
        url = f"/api/v1/groups/list?qty=501"
        response = client.get(url)
        assert response.status_code == 422

    def test_groups_get_list_error_type(self):
        url = f"/api/v1/groups/list?groupType=bob"
        response = client.get(url)
        assert response.status_code == 422

    def test_groups_get_list_all_options(self):
        url = f"/api/v1/groups/list?delay=1&qty=10&offset=1&active=true&groupType=approval"
        response = client.get(url)
        assert response.status_code == 200

    def test_groups_get_list(self):
        url = f"/api/v1/groups/list"
        response = client.get(url)
        assert response.status_code == 200

    def test_groups_get_list_name(self):
        url = f"/api/v1/groups/list?groupName=test"
        response = client.get(url)
        assert response.status_code == 200

    def test_groups_get_list_count(self):
        url = f"/api/v1/groups/list/count"
        response = client.get(url)
        assert response.status_code == 200

    def test_groups_get_list_count_error_delay(self):
        url = f"/api/v1/groups/list/count?delay=122"
        response = client.get(url)
        assert response.status_code == 422

    def test_groups_get_list_count_all_options(self):
        group_type: list = ["approval", "notification"]
        active_state: list = ["true", "false"]
        for g in group_type:
            for a in active_state:
                url = f"/api/v1/groups/list/count?delay=1&active={a}&groupType={g}"
                response = client.get(url)
                assert response.status_code == 200

    def test_groups_get_list_count_invalid_group(self):
        url = f"/api/v1/groups/list/count?groupType=bob"
        response = client.get(url)
        assert response.status_code == 422

    def test_groups_get_list_invalid_group(self):
        # /list/ (not /list/count) with an invalid group type
        url = f"/api/v1/groups/list/?groupType=bob"
        response = client.get(url)
        assert response.status_code == 422
# --- file: old/stoper2.py (repo: Faralaks/the-game, license: MIT) ---
#UTF-8
def stoper(x_object, y_object, side, stop_kords=False):
    """Return True while the object may keep moving; return False once its
    hitbox overlaps any stop rectangle.

    stop_kords is an iterable of "x1 y1 x2 y2" strings; side (0-3) selects
    the hitbox offsets for the direction the object is facing.
    """
    if stop_kords == False:
        return True
    # (left, right, top, bottom) hitbox offsets for each facing side
    side_offsets = {
        0: (48, 2, 52, 28),
        1: (48, 2, 38, 24),
        2: (50, 40, 38, 28),
        3: (45, 0, 38, 28),
    }
    offsets = side_offsets.get(side)
    if offsets is None:  # unknown side: keep moving
        return True
    dx_left, dx_right, dy_top, dy_bottom = offsets
    stop = True
    for kords in stop_kords:
        temp = kords.split(' ')
        x1, y1, x2, y2 = int(temp[0]), int(temp[1]), int(temp[2]), int(temp[3])
        if (x_object + dx_left >= x1 and x_object + dx_right <= x2
                and y_object + dy_top >= y1 and y_object + dy_bottom <= y2):
            stop = False
    return stop
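The overlap test is the same in every branch, only the hitbox offsets differ; a stripped-down restatement of the side-0 case against a hypothetical rectangle string (the "x1 y1 x2 y2" format the function expects):

```python
def overlaps_side0(x, y, rect):
    # Same check stoper applies for side == 0, for a single rectangle.
    x1, y1, x2, y2 = (int(v) for v in rect.split(' '))
    return x + 48 >= x1 and x + 2 <= x2 and y + 52 >= y1 and y + 28 <= y2

print(overlaps_side0(100, 100, "0 0 400 400"))    # True: hitbox inside rect
print(overlaps_side0(1000, 1000, "0 0 400 400"))  # False: far outside
```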
# --- file: gitlint/tests/rules/test_configuration_rules.py (repo: dzhu/gitlint, license: MIT) ---
# -*- coding: utf-8 -*-
from gitlint.tests.base import BaseTestCase
from gitlint import rules
from gitlint.config import LintConfig
class ConfigurationRuleTests(BaseTestCase):
    def test_ignore_by_title(self):
        commit = self.gitcommit(u"Releäse\n\nThis is the secōnd body line")

        # No regex specified -> Config shouldn't be changed
        rule = rules.IgnoreByTitle()
        config = LintConfig()
        rule.apply(config, commit)
        self.assertEqual(config, LintConfig())
        self.assert_logged([])  # nothing logged -> nothing ignored

        # Matching regex -> expect config to ignore all rules
        rule = rules.IgnoreByTitle({"regex": u"^Releäse(.*)"})
        expected_config = LintConfig()
        expected_config.ignore = "all"
        rule.apply(config, commit)
        self.assertEqual(config, expected_config)
        expected_log_message = u"DEBUG: gitlint.rules Ignoring commit because of rule 'I1': " + \
            u"Commit title 'Releäse' matches the regex '^Releäse(.*)', ignoring rules: all"
        self.assert_log_contains(expected_log_message)

        # Matching regex with specific ignore
        rule = rules.IgnoreByTitle({"regex": u"^Releäse(.*)",
                                    "ignore": "T1,B2"})
        expected_config = LintConfig()
        expected_config.ignore = "T1,B2"
        rule.apply(config, commit)
        self.assertEqual(config, expected_config)
        expected_log_message = u"DEBUG: gitlint.rules Ignoring commit because of rule 'I1': " + \
            u"Commit title 'Releäse' matches the regex '^Releäse(.*)', ignoring rules: T1,B2"
        self.assert_log_contains(expected_log_message)

    def test_ignore_by_body(self):
        commit = self.gitcommit(u"Tïtle\n\nThis is\n a relëase body\n line")

        # No regex specified -> Config shouldn't be changed
        rule = rules.IgnoreByBody()
        config = LintConfig()
        rule.apply(config, commit)
        self.assertEqual(config, LintConfig())
        self.assert_logged([])  # nothing logged -> nothing ignored

        # Matching regex -> expect config to ignore all rules
        rule = rules.IgnoreByBody({"regex": u"(.*)relëase(.*)"})
        expected_config = LintConfig()
        expected_config.ignore = "all"
        rule.apply(config, commit)
        self.assertEqual(config, expected_config)
        expected_log_message = u"DEBUG: gitlint.rules Ignoring commit because of rule 'I2': " + \
            u"Commit message line ' a relëase body' matches the regex '(.*)relëase(.*)'," + \
            u" ignoring rules: all"
        self.assert_log_contains(expected_log_message)

        # Matching regex with specific ignore
        rule = rules.IgnoreByBody({"regex": u"(.*)relëase(.*)",
                                   "ignore": "T1,B2"})
        expected_config = LintConfig()
        expected_config.ignore = "T1,B2"
        rule.apply(config, commit)
        self.assertEqual(config, expected_config)
        expected_log_message = u"DEBUG: gitlint.rules Ignoring commit because of rule 'I2': " + \
            u"Commit message line ' a relëase body' matches the regex '(.*)relëase(.*)', ignoring rules: T1,B2"
        self.assert_log_contains(expected_log_message)
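Both rules boil down to scanning commit lines for a configured regex and, on a hit, returning the ignore spec; the core check can be sketched with stdlib re (the function and names here are illustrative, not gitlint internals):

```python
import re

def ignored_rules(lines, regex, ignore="all"):
    """Return the ignore spec if any line matches regex, else None."""
    if not regex:
        return None  # no regex configured -> nothing to ignore
    pattern = re.compile(regex, re.UNICODE)
    for line in lines:
        if pattern.search(line):
            return ignore
    return None

print(ignored_rules(["Release v1"], r"^Release(.*)"))  # all
print(ignored_rules(["Title", " a release body"], r"(.*)release(.*)", "T1,B2"))  # T1,B2
print(ignored_rules(["Fix typo"], r"^Release(.*)"))  # None
```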
# --- file: ProgrammingBasicWithPython-KCL/Chapter-5/queuetest.py (repo: mrmyothet/IPND, license: MIT) ---
from pyqueue import Queue
q = Queue()
q.enqueue(6)
q.enqueue("cat")
q.enqueue(True)
print(q.size())
print(q.dequeue())
print(q.dequeue())
print(q.size())
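pyqueue itself is not shown here; a minimal stand-in with the same enqueue/dequeue/size interface, sketched on collections.deque, behaves as the calls above expect:

```python
from collections import deque

class Queue:
    """FIFO queue exposing the enqueue/dequeue/size interface used above."""

    def __init__(self):
        self._items = deque()

    def enqueue(self, item):
        self._items.append(item)      # add at the tail

    def dequeue(self):
        return self._items.popleft()  # remove from the head (FIFO)

    def size(self):
        return len(self._items)

q = Queue()
q.enqueue(6)
q.enqueue("cat")
q.enqueue(True)
print(q.size())     # 3
print(q.dequeue())  # 6
print(q.dequeue())  # cat
print(q.size())     # 1
```

deque gives O(1) appends and popleft, which a plain list would not.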
# --- file: pygrim/components/utils/__init__.py (repo: ondrejkajinek/pyGrim, license: MIT) ---
# coding: utf8
from .counter import Counter # noqa
from .functions import deep_update # noqa
from .functions import ensure_string, ensure_tuple # noqa
from .functions import fix_trailing_slash, remove_trailing_slash # noqa
from .functions import get_class_name, get_instance_name, get_method_name # noqa
from .functions import is_regex # noqa
from . import json2 # noqa
# --- file: intake/catalog/__init__.py (repo: ah-/intake, license: BSD-2-Clause) ---
from .base import Catalog
# --- file: instances/passenger_demand/pas-20210421-2109-int14000000000000001e/11.py (repo: LHcau/scheduling-shared-passenger-and-freight-transport-on-a-fixed-infrastructure, license: BSD-3-Clause) ---
"""
PASSENGERS
"""
numPassengers = 3220
passenger_arriving = (
(7, 9, 8, 2, 0, 0, 8, 3, 6, 5, 1, 0), # 0
(4, 12, 8, 6, 3, 0, 7, 8, 3, 5, 2, 0), # 1
(6, 10, 9, 3, 2, 0, 3, 15, 5, 7, 1, 0), # 2
(3, 14, 15, 2, 3, 0, 6, 4, 4, 2, 2, 0), # 3
(3, 5, 15, 3, 2, 0, 9, 7, 10, 5, 1, 0), # 4
(4, 3, 3, 2, 1, 0, 3, 5, 6, 3, 0, 0), # 5
(5, 9, 7, 0, 1, 0, 6, 8, 9, 4, 1, 0), # 6
(3, 9, 8, 4, 3, 0, 6, 9, 5, 5, 1, 0), # 7
(8, 8, 11, 5, 1, 0, 7, 12, 3, 4, 4, 0), # 8
(2, 8, 9, 1, 2, 0, 6, 11, 8, 6, 2, 0), # 9
(3, 6, 4, 1, 4, 0, 9, 7, 6, 2, 2, 0), # 10
(2, 12, 6, 3, 2, 0, 11, 11, 4, 4, 6, 0), # 11
(3, 8, 5, 6, 3, 0, 5, 6, 9, 2, 4, 0), # 12
(3, 8, 7, 3, 1, 0, 2, 11, 6, 3, 0, 0), # 13
(5, 6, 8, 2, 1, 0, 10, 5, 7, 2, 1, 0), # 14
(11, 4, 9, 5, 2, 0, 5, 7, 10, 6, 4, 0), # 15
(3, 8, 5, 2, 2, 0, 10, 10, 4, 2, 1, 0), # 16
(8, 13, 8, 5, 3, 0, 9, 7, 6, 0, 2, 0), # 17
(7, 3, 3, 1, 2, 0, 7, 7, 2, 3, 4, 0), # 18
(5, 10, 4, 5, 3, 0, 4, 8, 4, 9, 2, 0), # 19
(4, 7, 10, 4, 4, 0, 2, 11, 7, 6, 2, 0), # 20
(2, 8, 6, 2, 2, 0, 7, 9, 4, 9, 4, 0), # 21
(2, 3, 7, 2, 5, 0, 4, 16, 6, 3, 3, 0), # 22
(2, 7, 4, 2, 0, 0, 7, 17, 3, 1, 4, 0), # 23
(3, 7, 5, 7, 3, 0, 4, 10, 5, 5, 4, 0), # 24
(4, 6, 8, 6, 2, 0, 6, 4, 7, 3, 1, 0), # 25
(5, 8, 11, 5, 1, 0, 9, 13, 5, 7, 3, 0), # 26
(4, 7, 7, 1, 2, 0, 2, 10, 9, 8, 1, 0), # 27
(4, 10, 10, 4, 0, 0, 7, 8, 5, 6, 2, 0), # 28
(5, 7, 10, 9, 2, 0, 6, 11, 8, 3, 2, 0), # 29
(3, 8, 9, 3, 4, 0, 10, 15, 7, 3, 2, 0), # 30
(6, 11, 4, 2, 3, 0, 6, 6, 6, 6, 1, 0), # 31
(1, 13, 3, 3, 0, 0, 5, 8, 7, 6, 4, 0), # 32
(4, 8, 4, 5, 3, 0, 6, 10, 12, 3, 2, 0), # 33
(5, 8, 5, 1, 4, 0, 7, 10, 8, 2, 4, 0), # 34
(6, 5, 12, 6, 4, 0, 8, 4, 9, 8, 5, 0), # 35
(7, 11, 5, 5, 7, 0, 4, 14, 5, 4, 2, 0), # 36
(6, 11, 7, 5, 2, 0, 4, 7, 7, 11, 3, 0), # 37
(10, 5, 8, 7, 0, 0, 2, 10, 5, 3, 3, 0), # 38
(3, 10, 4, 2, 2, 0, 2, 6, 8, 1, 4, 0), # 39
(4, 5, 7, 4, 2, 0, 6, 13, 6, 5, 1, 0), # 40
(3, 5, 8, 1, 3, 0, 8, 14, 5, 4, 1, 0), # 41
(4, 7, 6, 2, 4, 0, 7, 9, 5, 5, 5, 0), # 42
(3, 10, 11, 4, 5, 0, 8, 8, 3, 6, 1, 0), # 43
(4, 10, 6, 4, 3, 0, 8, 12, 3, 7, 4, 0), # 44
(6, 13, 3, 2, 2, 0, 7, 6, 3, 4, 3, 0), # 45
(3, 5, 4, 6, 3, 0, 9, 10, 15, 3, 1, 0), # 46
(7, 5, 10, 2, 2, 0, 6, 8, 9, 6, 2, 0), # 47
(9, 9, 3, 3, 2, 0, 9, 5, 6, 8, 2, 0), # 48
(9, 8, 4, 2, 3, 0, 5, 6, 6, 4, 4, 0), # 49
(5, 15, 9, 1, 2, 0, 3, 11, 5, 3, 3, 0), # 50
(4, 11, 9, 6, 2, 0, 8, 7, 8, 4, 2, 0), # 51
(2, 12, 10, 4, 2, 0, 4, 6, 4, 9, 1, 0), # 52
(1, 13, 11, 4, 4, 0, 4, 8, 8, 6, 0, 0), # 53
(5, 5, 9, 4, 1, 0, 4, 9, 6, 5, 3, 0), # 54
(5, 10, 4, 8, 1, 0, 3, 8, 9, 3, 2, 0), # 55
(6, 9, 3, 3, 0, 0, 8, 8, 9, 7, 2, 0), # 56
(4, 12, 9, 6, 2, 0, 5, 17, 5, 2, 1, 0), # 57
(8, 13, 10, 2, 2, 0, 5, 11, 5, 4, 1, 0), # 58
(0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0), # 59
)
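Each inner tuple appears to be one time period and each of the 12 positions a station (that row/column reading is an assumption, not stated in the file); per-station totals then fall out of zip over the rows, shown here on the first two periods only:

```python
rows = (
    (7, 9, 8, 2, 0, 0, 8, 3, 6, 5, 1, 0),   # period 0
    (4, 12, 8, 6, 3, 0, 7, 8, 3, 5, 2, 0),  # period 1
)
# zip(*rows) transposes periods x stations into per-station columns
per_station = [sum(col) for col in zip(*rows)]
print(per_station)       # [11, 21, 16, 8, 3, 0, 15, 11, 9, 10, 3, 0]
print(sum(per_station))  # 107 passengers over the two periods
```

Summing over the full 60-row matrix the same way would recover the grand total that numPassengers is presumably meant to record.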
station_arriving_intensity = (
(3.7095121817383676, 9.515044981060607, 11.19193043059126, 8.87078804347826, 10.000240384615385, 6.659510869565219), # 0
(3.7443308140669203, 9.620858238197952, 11.252381752534994, 8.920190141908213, 10.075193108974359, 6.657240994867151), # 1
(3.7787518681104277, 9.725101964085297, 11.31139817195087, 8.968504830917876, 10.148564102564103, 6.654901690821256), # 2
(3.8127461259877085, 9.827663671875001, 11.368936576156813, 9.01569089673913, 10.22028605769231, 6.652493274456523), # 3
(3.8462843698175795, 9.928430874719417, 11.424953852470724, 9.061707125603865, 10.290291666666668, 6.6500160628019325), # 4
(3.879337381718857, 10.027291085770905, 11.479406888210512, 9.106512303743962, 10.358513621794872, 6.647470372886473), # 5
(3.9118759438103607, 10.12413181818182, 11.53225257069409, 9.150065217391306, 10.424884615384617, 6.644856521739131), # 6
(3.943870838210907, 10.218840585104518, 11.58344778723936, 9.19232465277778, 10.489337339743592, 6.64217482638889), # 7
(3.975292847039314, 10.311304899691358, 11.632949425164242, 9.233249396135266, 10.551804487179488, 6.639425603864735), # 8
(4.006112752414399, 10.401412275094698, 11.680714371786634, 9.272798233695653, 10.61221875, 6.636609171195653), # 9
(4.03630133645498, 10.489050224466892, 11.72669951442445, 9.310929951690824, 10.670512820512823, 6.633725845410628), # 10
(4.065829381279876, 10.5741062609603, 11.7708617403956, 9.347603336352659, 10.726619391025642, 6.630775943538648), # 11
(4.094667669007903, 10.656467897727273, 11.813157937017996, 9.382777173913043, 10.780471153846154, 6.627759782608695), # 12
(4.122786981757876, 10.736022647920176, 11.85354499160954, 9.416410250603866, 10.832000801282053, 6.624677679649759), # 13
(4.15015810164862, 10.81265802469136, 11.891979791488144, 9.448461352657004, 10.881141025641025, 6.621529951690821), # 14
(4.1767518107989465, 10.886261541193182, 11.928419223971721, 9.478889266304348, 10.92782451923077, 6.618316915760871), # 15
(4.202538891327675, 10.956720710578002, 11.96282017637818, 9.507652777777778, 10.971983974358976, 6.61503888888889), # 16
(4.227490125353625, 11.023923045998176, 11.995139536025421, 9.53471067330918, 11.013552083333336, 6.611696188103866), # 17
(4.25157629499561, 11.087756060606061, 12.025334190231364, 9.560021739130436, 11.052461538461543, 6.608289130434783), # 18
(4.274768182372451, 11.148107267554012, 12.053361026313912, 9.58354476147343, 11.088645032051284, 6.604818032910629), # 19
(4.297036569602966, 11.204864179994388, 12.079176931590974, 9.60523852657005, 11.122035256410259, 6.601283212560387), # 20
(4.318352238805971, 11.257914311079544, 12.102738793380466, 9.625061820652174, 11.152564903846153, 6.597684986413044), # 21
(4.338685972100283, 11.307145173961842, 12.124003499000287, 9.642973429951692, 11.180166666666667, 6.5940236714975855), # 22
(4.358008551604722, 11.352444281793632, 12.142927935768354, 9.658932140700484, 11.204773237179488, 6.590299584842997), # 23
(4.3762907594381035, 11.393699147727272, 12.159468991002571, 9.672896739130437, 11.226317307692307, 6.586513043478261), # 24
(4.393503377719247, 11.430797284915124, 12.173583552020853, 9.684826011473431, 11.244731570512819, 6.582664364432368), # 25
(4.409617188566969, 11.46362620650954, 12.185228506141103, 9.694678743961353, 11.259948717948719, 6.5787538647343), # 26
(4.424602974100088, 11.492073425662877, 12.194360740681233, 9.702413722826089, 11.271901442307694, 6.574781861413045), # 27
(4.438431516437421, 11.516026455527497, 12.200937142959157, 9.707989734299519, 11.280522435897437, 6.570748671497586), # 28
(4.4510735976977855, 11.535372809255753, 12.204914600292774, 9.711365564613528, 11.285744391025641, 6.566654612016909), # 29
(4.4625, 11.55, 12.20625, 9.7125, 11.287500000000001, 6.562500000000001), # 30
(4.47319183983376, 11.56215031960227, 12.205248928140096, 9.712295118464054, 11.286861125886526, 6.556726763701484), # 31
(4.4836528452685425, 11.574140056818184, 12.202274033816424, 9.711684477124184, 11.28495815602837, 6.547834661835751), # 32
(4.493887715792838, 11.585967720170455, 12.197367798913046, 9.710674080882354, 11.281811569148937, 6.535910757121439), # 33
(4.503901150895141, 11.597631818181819, 12.19057270531401, 9.709269934640524, 11.277441843971632, 6.521042112277196), # 34
(4.513697850063939, 11.609130859374998, 12.181931234903383, 9.707478043300654, 11.27186945921986, 6.503315790021656), # 35
(4.523282512787724, 11.62046335227273, 12.171485869565219, 9.705304411764708, 11.265114893617023, 6.482818853073463), # 36
(4.532659838554988, 11.631627805397729, 12.159279091183576, 9.70275504493464, 11.257198625886524, 6.4596383641512585), # 37
(4.5418345268542195, 11.642622727272729, 12.145353381642513, 9.699835947712419, 11.248141134751775, 6.433861385973679), # 38
(4.5508112771739135, 11.653446626420456, 12.129751222826087, 9.696553125000001, 11.23796289893617, 6.40557498125937), # 39
(4.559594789002558, 11.664098011363638, 12.11251509661836, 9.692912581699348, 11.22668439716312, 6.37486621272697), # 40
(4.568189761828645, 11.674575390625, 12.093687484903382, 9.68892032271242, 11.214326108156028, 6.34182214309512), # 41
(4.576600895140665, 11.684877272727276, 12.07331086956522, 9.684582352941177, 11.2009085106383, 6.3065298350824595), # 42
(4.584832888427111, 11.69500216619318, 12.051427732487923, 9.679904677287583, 11.186452083333334, 6.26907635140763), # 43
(4.592890441176471, 11.704948579545455, 12.028080555555556, 9.674893300653595, 11.17097730496454, 6.229548754789272), # 44
(4.600778252877237, 11.714715021306818, 12.003311820652177, 9.669554227941177, 11.15450465425532, 6.188034107946028), # 45
(4.6085010230179035, 11.724300000000003, 11.97716400966184, 9.663893464052288, 11.137054609929079, 6.144619473596536), # 46
(4.616063451086957, 11.733702024147728, 11.9496796044686, 9.65791701388889, 11.118647650709221, 6.099391914459438), # 47
(4.623470236572891, 11.742919602272728, 11.920901086956523, 9.651630882352942, 11.099304255319149, 6.052438493253375), # 48
(4.630726078964194, 11.751951242897727, 11.890870939009663, 9.645041074346407, 11.079044902482272, 6.003846272696985), # 49
(4.6378356777493615, 11.760795454545454, 11.85963164251208, 9.638153594771243, 11.057890070921987, 5.953702315508913), # 50
(4.6448037324168805, 11.769450745738636, 11.827225679347826, 9.630974448529413, 11.035860239361703, 5.902093684407797), # 51
(4.651634942455243, 11.777915625, 11.793695531400965, 9.623509640522876, 11.012975886524824, 5.849107442112278), # 52
(4.658334007352941, 11.786188600852274, 11.759083680555555, 9.615765175653596, 10.989257491134753, 5.794830651340996), # 53
(4.6649056265984665, 11.79426818181818, 11.723432608695653, 9.60774705882353, 10.964725531914894, 5.739350374812594), # 54
(4.671354499680307, 11.802152876420456, 11.686784797705313, 9.599461294934642, 10.939400487588653, 5.682753675245711), # 55
(4.677685326086957, 11.809841193181818, 11.649182729468599, 9.59091388888889, 10.913302836879433, 5.625127615358988), # 56
(4.683902805306906, 11.817331640625003, 11.610668885869565, 9.582110845588236, 10.886453058510638, 5.566559257871065), # 57
(4.690011636828645, 11.824622727272727, 11.57128574879227, 9.573058169934642, 10.858871631205675, 5.507135665500583), # 58
(0.0, 0.0, 0.0, 0.0, 0.0, 0.0), # 59
)
passenger_arriving_acc = (
(7, 9, 8, 2, 0, 0, 8, 3, 6, 5, 1, 0), # 0
(11, 21, 16, 8, 3, 0, 15, 11, 9, 10, 3, 0), # 1
(17, 31, 25, 11, 5, 0, 18, 26, 14, 17, 4, 0), # 2
(20, 45, 40, 13, 8, 0, 24, 30, 18, 19, 6, 0), # 3
(23, 50, 55, 16, 10, 0, 33, 37, 28, 24, 7, 0), # 4
(27, 53, 58, 18, 11, 0, 36, 42, 34, 27, 7, 0), # 5
(32, 62, 65, 18, 12, 0, 42, 50, 43, 31, 8, 0), # 6
(35, 71, 73, 22, 15, 0, 48, 59, 48, 36, 9, 0), # 7
(43, 79, 84, 27, 16, 0, 55, 71, 51, 40, 13, 0), # 8
(45, 87, 93, 28, 18, 0, 61, 82, 59, 46, 15, 0), # 9
(48, 93, 97, 29, 22, 0, 70, 89, 65, 48, 17, 0), # 10
(50, 105, 103, 32, 24, 0, 81, 100, 69, 52, 23, 0), # 11
(53, 113, 108, 38, 27, 0, 86, 106, 78, 54, 27, 0), # 12
(56, 121, 115, 41, 28, 0, 88, 117, 84, 57, 27, 0), # 13
(61, 127, 123, 43, 29, 0, 98, 122, 91, 59, 28, 0), # 14
(72, 131, 132, 48, 31, 0, 103, 129, 101, 65, 32, 0), # 15
(75, 139, 137, 50, 33, 0, 113, 139, 105, 67, 33, 0), # 16
(83, 152, 145, 55, 36, 0, 122, 146, 111, 67, 35, 0), # 17
(90, 155, 148, 56, 38, 0, 129, 153, 113, 70, 39, 0), # 18
(95, 165, 152, 61, 41, 0, 133, 161, 117, 79, 41, 0), # 19
(99, 172, 162, 65, 45, 0, 135, 172, 124, 85, 43, 0), # 20
(101, 180, 168, 67, 47, 0, 142, 181, 128, 94, 47, 0), # 21
(103, 183, 175, 69, 52, 0, 146, 197, 134, 97, 50, 0), # 22
(105, 190, 179, 71, 52, 0, 153, 214, 137, 98, 54, 0), # 23
(108, 197, 184, 78, 55, 0, 157, 224, 142, 103, 58, 0), # 24
(112, 203, 192, 84, 57, 0, 163, 228, 149, 106, 59, 0), # 25
(117, 211, 203, 89, 58, 0, 172, 241, 154, 113, 62, 0), # 26
(121, 218, 210, 90, 60, 0, 174, 251, 163, 121, 63, 0), # 27
(125, 228, 220, 94, 60, 0, 181, 259, 168, 127, 65, 0), # 28
(130, 235, 230, 103, 62, 0, 187, 270, 176, 130, 67, 0), # 29
(133, 243, 239, 106, 66, 0, 197, 285, 183, 133, 69, 0), # 30
(139, 254, 243, 108, 69, 0, 203, 291, 189, 139, 70, 0), # 31
(140, 267, 246, 111, 69, 0, 208, 299, 196, 145, 74, 0), # 32
(144, 275, 250, 116, 72, 0, 214, 309, 208, 148, 76, 0), # 33
(149, 283, 255, 117, 76, 0, 221, 319, 216, 150, 80, 0), # 34
(155, 288, 267, 123, 80, 0, 229, 323, 225, 158, 85, 0), # 35
(162, 299, 272, 128, 87, 0, 233, 337, 230, 162, 87, 0), # 36
(168, 310, 279, 133, 89, 0, 237, 344, 237, 173, 90, 0), # 37
(178, 315, 287, 140, 89, 0, 239, 354, 242, 176, 93, 0), # 38
(181, 325, 291, 142, 91, 0, 241, 360, 250, 177, 97, 0), # 39
(185, 330, 298, 146, 93, 0, 247, 373, 256, 182, 98, 0), # 40
(188, 335, 306, 147, 96, 0, 255, 387, 261, 186, 99, 0), # 41
(192, 342, 312, 149, 100, 0, 262, 396, 266, 191, 104, 0), # 42
(195, 352, 323, 153, 105, 0, 270, 404, 269, 197, 105, 0), # 43
(199, 362, 329, 157, 108, 0, 278, 416, 272, 204, 109, 0), # 44
(205, 375, 332, 159, 110, 0, 285, 422, 275, 208, 112, 0), # 45
(208, 380, 336, 165, 113, 0, 294, 432, 290, 211, 113, 0), # 46
(215, 385, 346, 167, 115, 0, 300, 440, 299, 217, 115, 0), # 47
(224, 394, 349, 170, 117, 0, 309, 445, 305, 225, 117, 0), # 48
(233, 402, 353, 172, 120, 0, 314, 451, 311, 229, 121, 0), # 49
(238, 417, 362, 173, 122, 0, 317, 462, 316, 232, 124, 0), # 50
(242, 428, 371, 179, 124, 0, 325, 469, 324, 236, 126, 0), # 51
(244, 440, 381, 183, 126, 0, 329, 475, 328, 245, 127, 0), # 52
(245, 453, 392, 187, 130, 0, 333, 483, 336, 251, 127, 0), # 53
(250, 458, 401, 191, 131, 0, 337, 492, 342, 256, 130, 0), # 54
(255, 468, 405, 199, 132, 0, 340, 500, 351, 259, 132, 0), # 55
(261, 477, 408, 202, 132, 0, 348, 508, 360, 266, 134, 0), # 56
(265, 489, 417, 208, 134, 0, 353, 525, 365, 268, 135, 0), # 57
(273, 502, 427, 210, 136, 0, 358, 536, 370, 272, 136, 0), # 58
(273, 502, 427, 210, 136, 0, 358, 536, 370, 272, 136, 0), # 59
)
passenger_arriving_rate = (
(3.7095121817383676, 7.612035984848484, 6.715158258354756, 3.5483152173913037, 2.000048076923077, 0.0, 6.659510869565219, 8.000192307692307, 5.322472826086956, 4.476772172236504, 1.903008996212121, 0.0), # 0
(3.7443308140669203, 7.696686590558361, 6.751429051520996, 3.5680760567632848, 2.0150386217948717, 0.0, 6.657240994867151, 8.060154487179487, 5.352114085144928, 4.500952701013997, 1.9241716476395903, 0.0), # 1
(3.7787518681104277, 7.780081571268237, 6.786838903170522, 3.58740193236715, 2.0297128205128203, 0.0, 6.654901690821256, 8.118851282051281, 5.381102898550726, 4.524559268780347, 1.9450203928170593, 0.0), # 2
(3.8127461259877085, 7.8621309375, 6.821361945694087, 3.6062763586956517, 2.044057211538462, 0.0, 6.652493274456523, 8.176228846153847, 5.409414538043478, 4.547574630462725, 1.965532734375, 0.0), # 3
(3.8462843698175795, 7.942744699775533, 6.854972311482434, 3.624682850241546, 2.0580583333333333, 0.0, 6.6500160628019325, 8.232233333333333, 5.437024275362319, 4.569981540988289, 1.9856861749438832, 0.0), # 4
(3.879337381718857, 8.021832868616723, 6.887644132926307, 3.6426049214975844, 2.0717027243589743, 0.0, 6.647470372886473, 8.286810897435897, 5.463907382246377, 4.591762755284204, 2.005458217154181, 0.0), # 5
(3.9118759438103607, 8.099305454545455, 6.919351542416455, 3.660026086956522, 2.084976923076923, 0.0, 6.644856521739131, 8.339907692307692, 5.490039130434783, 4.612901028277636, 2.0248263636363637, 0.0), # 6
(3.943870838210907, 8.175072468083613, 6.950068672343615, 3.6769298611111116, 2.0978674679487184, 0.0, 6.64217482638889, 8.391469871794873, 5.515394791666668, 4.633379114895743, 2.043768117020903, 0.0), # 7
(3.975292847039314, 8.249043919753085, 6.979769655098544, 3.693299758454106, 2.1103608974358976, 0.0, 6.639425603864735, 8.44144358974359, 5.5399496376811594, 4.653179770065696, 2.062260979938271, 0.0), # 8
(4.006112752414399, 8.321129820075758, 7.00842862307198, 3.709119293478261, 2.12244375, 0.0, 6.636609171195653, 8.489775, 5.563678940217391, 4.672285748714653, 2.0802824550189394, 0.0), # 9
(4.03630133645498, 8.391240179573513, 7.03601970865467, 3.724371980676329, 2.134102564102564, 0.0, 6.633725845410628, 8.536410256410257, 5.586557971014494, 4.690679805769779, 2.0978100448933783, 0.0), # 10
(4.065829381279876, 8.459285008768239, 7.06251704423736, 3.739041334541063, 2.145323878205128, 0.0, 6.630775943538648, 8.581295512820512, 5.608562001811595, 4.70834469615824, 2.1148212521920597, 0.0), # 11
(4.094667669007903, 8.525174318181818, 7.087894762210797, 3.7531108695652167, 2.156094230769231, 0.0, 6.627759782608695, 8.624376923076923, 5.6296663043478254, 4.725263174807198, 2.1312935795454546, 0.0), # 12
(4.122786981757876, 8.58881811833614, 7.112126994965724, 3.766564100241546, 2.1664001602564102, 0.0, 6.624677679649759, 8.665600641025641, 5.649846150362319, 4.741417996643816, 2.147204529584035, 0.0), # 13
(4.15015810164862, 8.650126419753088, 7.135187874892886, 3.779384541062801, 2.1762282051282047, 0.0, 6.621529951690821, 8.704912820512819, 5.669076811594202, 4.756791916595257, 2.162531604938272, 0.0), # 14
(4.1767518107989465, 8.709009232954545, 7.157051534383032, 3.7915557065217387, 2.1855649038461538, 0.0, 6.618316915760871, 8.742259615384615, 5.6873335597826085, 4.771367689588688, 2.177252308238636, 0.0), # 15
(4.202538891327675, 8.7653765684624, 7.177692105826908, 3.803061111111111, 2.194396794871795, 0.0, 6.61503888888889, 8.77758717948718, 5.7045916666666665, 4.785128070551272, 2.1913441421156, 0.0), # 16
(4.227490125353625, 8.81913843679854, 7.197083721615253, 3.8138842693236716, 2.202710416666667, 0.0, 6.611696188103866, 8.810841666666668, 5.720826403985508, 4.798055814410168, 2.204784609199635, 0.0), # 17
(4.25157629499561, 8.870204848484848, 7.215200514138818, 3.824008695652174, 2.2104923076923084, 0.0, 6.608289130434783, 8.841969230769234, 5.736013043478262, 4.810133676092545, 2.217551212121212, 0.0), # 18
(4.274768182372451, 8.918485814043208, 7.232016615788346, 3.8334179045893717, 2.2177290064102566, 0.0, 6.604818032910629, 8.870916025641026, 5.750126856884058, 4.8213444105255645, 2.229621453510802, 0.0), # 19
(4.297036569602966, 8.96389134399551, 7.247506158954584, 3.8420954106280196, 2.2244070512820517, 0.0, 6.601283212560387, 8.897628205128207, 5.76314311594203, 4.831670772636389, 2.2409728359988774, 0.0), # 20
(4.318352238805971, 9.006331448863634, 7.261643276028279, 3.8500247282608693, 2.2305129807692303, 0.0, 6.597684986413044, 8.922051923076921, 5.775037092391305, 4.841095517352186, 2.2515828622159084, 0.0), # 21
(4.338685972100283, 9.045716139169473, 7.274402099400172, 3.8571893719806765, 2.2360333333333333, 0.0, 6.5940236714975855, 8.944133333333333, 5.785784057971015, 4.849601399600115, 2.2614290347923682, 0.0), # 22
(4.358008551604722, 9.081955425434906, 7.285756761461012, 3.8635728562801934, 2.2409546474358972, 0.0, 6.590299584842997, 8.963818589743589, 5.79535928442029, 4.857171174307341, 2.2704888563587264, 0.0), # 23
(4.3762907594381035, 9.114959318181818, 7.295681394601543, 3.869158695652174, 2.2452634615384612, 0.0, 6.586513043478261, 8.981053846153845, 5.803738043478262, 4.863787596401028, 2.2787398295454544, 0.0), # 24
(4.393503377719247, 9.1446378279321, 7.304150131212511, 3.8739304045893723, 2.2489463141025636, 0.0, 6.582664364432368, 8.995785256410255, 5.810895606884059, 4.869433420808341, 2.286159456983025, 0.0), # 25
(4.409617188566969, 9.17090096520763, 7.311137103684661, 3.8778714975845405, 2.2519897435897436, 0.0, 6.5787538647343, 9.007958974358974, 5.816807246376811, 4.874091402456441, 2.2927252413019077, 0.0), # 26
(4.424602974100088, 9.193658740530301, 7.31661644440874, 3.880965489130435, 2.2543802884615385, 0.0, 6.574781861413045, 9.017521153846154, 5.821448233695653, 4.877744296272493, 2.2984146851325753, 0.0), # 27
(4.438431516437421, 9.212821164421996, 7.320562285775494, 3.8831958937198072, 2.256104487179487, 0.0, 6.570748671497586, 9.024417948717948, 5.824793840579711, 4.8803748571836625, 2.303205291105499, 0.0), # 28
(4.4510735976977855, 9.228298247404602, 7.322948760175664, 3.884546225845411, 2.257148878205128, 0.0, 6.566654612016909, 9.028595512820512, 5.826819338768117, 4.881965840117109, 2.3070745618511506, 0.0), # 29
(4.4625, 9.24, 7.32375, 3.885, 2.2575000000000003, 0.0, 6.562500000000001, 9.030000000000001, 5.8275, 4.8825, 2.31, 0.0), # 30
(4.47319183983376, 9.249720255681815, 7.323149356884057, 3.884918047385621, 2.257372225177305, 0.0, 6.556726763701484, 9.02948890070922, 5.827377071078432, 4.882099571256038, 2.312430063920454, 0.0), # 31
(4.4836528452685425, 9.259312045454546, 7.3213644202898545, 3.884673790849673, 2.2569916312056737, 0.0, 6.547834661835751, 9.027966524822695, 5.82701068627451, 4.880909613526569, 2.3148280113636366, 0.0), # 32
(4.493887715792838, 9.268774176136363, 7.3184206793478275, 3.8842696323529413, 2.2563623138297872, 0.0, 6.535910757121439, 9.025449255319149, 5.826404448529412, 4.878947119565218, 2.3171935440340907, 0.0), # 33
(4.503901150895141, 9.278105454545454, 7.314343623188405, 3.8837079738562093, 2.2554883687943263, 0.0, 6.521042112277196, 9.021953475177305, 5.825561960784314, 4.876229082125604, 2.3195263636363634, 0.0), # 34
(4.513697850063939, 9.287304687499997, 7.3091587409420296, 3.882991217320261, 2.2543738918439717, 0.0, 6.503315790021656, 9.017495567375887, 5.824486825980392, 4.872772493961353, 2.3218261718749993, 0.0), # 35
(4.523282512787724, 9.296370681818182, 7.302891521739131, 3.8821217647058828, 2.253022978723404, 0.0, 6.482818853073463, 9.012091914893617, 5.823182647058824, 4.868594347826087, 2.3240926704545455, 0.0), # 36
(4.532659838554988, 9.305302244318183, 7.295567454710145, 3.881102017973856, 2.2514397251773044, 0.0, 6.4596383641512585, 9.005758900709218, 5.821653026960784, 4.86371163647343, 2.3263255610795457, 0.0), # 37
(4.5418345268542195, 9.314098181818181, 7.287212028985508, 3.8799343790849674, 2.249628226950355, 0.0, 6.433861385973679, 8.99851290780142, 5.819901568627452, 4.858141352657005, 2.3285245454545453, 0.0), # 38
(4.5508112771739135, 9.322757301136363, 7.277850733695652, 3.87862125, 2.247592579787234, 0.0, 6.40557498125937, 8.990370319148935, 5.817931875, 4.8519004891304345, 2.330689325284091, 0.0), # 39
(4.559594789002558, 9.33127840909091, 7.267509057971015, 3.8771650326797387, 2.245336879432624, 0.0, 6.37486621272697, 8.981347517730496, 5.815747549019608, 4.845006038647344, 2.3328196022727274, 0.0), # 40
(4.568189761828645, 9.3396603125, 7.256212490942029, 3.8755681290849675, 2.2428652216312055, 0.0, 6.34182214309512, 8.971460886524822, 5.813352193627452, 4.837474993961353, 2.334915078125, 0.0), # 41
(4.576600895140665, 9.34790181818182, 7.2439865217391315, 3.8738329411764707, 2.2401817021276598, 0.0, 6.3065298350824595, 8.960726808510639, 5.810749411764706, 4.829324347826088, 2.336975454545455, 0.0), # 42
(4.584832888427111, 9.356001732954544, 7.230856639492753, 3.8719618709150327, 2.2372904166666667, 0.0, 6.26907635140763, 8.949161666666667, 5.80794280637255, 4.820571092995169, 2.339000433238636, 0.0), # 43
(4.592890441176471, 9.363958863636363, 7.216848333333333, 3.8699573202614377, 2.2341954609929076, 0.0, 6.229548754789272, 8.93678184397163, 5.804935980392157, 4.811232222222222, 2.3409897159090907, 0.0), # 44
(4.600778252877237, 9.371772017045453, 7.201987092391306, 3.8678216911764705, 2.230900930851064, 0.0, 6.188034107946028, 8.923603723404256, 5.801732536764706, 4.80132472826087, 2.3429430042613633, 0.0), # 45
(4.6085010230179035, 9.379440000000002, 7.186298405797103, 3.8655573856209147, 2.2274109219858156, 0.0, 6.144619473596536, 8.909643687943262, 5.798336078431372, 4.790865603864735, 2.3448600000000006, 0.0), # 46
(4.616063451086957, 9.386961619318182, 7.16980776268116, 3.8631668055555552, 2.223729530141844, 0.0, 6.099391914459438, 8.894918120567375, 5.794750208333333, 4.77987184178744, 2.3467404048295455, 0.0), # 47
(4.623470236572891, 9.394335681818182, 7.152540652173913, 3.8606523529411763, 2.21986085106383, 0.0, 6.052438493253375, 8.87944340425532, 5.790978529411765, 4.7683604347826085, 2.3485839204545456, 0.0), # 48
(4.630726078964194, 9.401560994318181, 7.134522563405797, 3.8580164297385626, 2.2158089804964543, 0.0, 6.003846272696985, 8.863235921985817, 5.787024644607844, 4.7563483756038645, 2.3503902485795454, 0.0), # 49
(4.6378356777493615, 9.408636363636361, 7.115778985507247, 3.8552614379084966, 2.211578014184397, 0.0, 5.953702315508913, 8.846312056737588, 5.782892156862745, 4.743852657004831, 2.3521590909090904, 0.0), # 50
(4.6448037324168805, 9.415560596590907, 7.096335407608696, 3.852389779411765, 2.2071720478723407, 0.0, 5.902093684407797, 8.828688191489363, 5.778584669117648, 4.73089027173913, 2.353890149147727, 0.0), # 51
(4.651634942455243, 9.4223325, 7.0762173188405795, 3.84940385620915, 2.2025951773049646, 0.0, 5.849107442112278, 8.810380709219858, 5.774105784313726, 4.717478212560386, 2.355583125, 0.0), # 52
(4.658334007352941, 9.428950880681818, 7.055450208333333, 3.8463060702614382, 2.1978514982269504, 0.0, 5.794830651340996, 8.791405992907801, 5.769459105392158, 4.703633472222222, 2.3572377201704544, 0.0), # 53
(4.6649056265984665, 9.435414545454544, 7.034059565217391, 3.843098823529412, 2.192945106382979, 0.0, 5.739350374812594, 8.771780425531915, 5.764648235294119, 4.689373043478261, 2.358853636363636, 0.0), # 54
(4.671354499680307, 9.441722301136364, 7.012070878623187, 3.8397845179738566, 2.1878800975177306, 0.0, 5.682753675245711, 8.751520390070922, 5.759676776960785, 4.674713919082125, 2.360430575284091, 0.0), # 55
(4.677685326086957, 9.447872954545453, 6.989509637681159, 3.8363655555555556, 2.1826605673758865, 0.0, 5.625127615358988, 8.730642269503546, 5.754548333333334, 4.65967309178744, 2.361968238636363, 0.0), # 56
(4.683902805306906, 9.453865312500001, 6.966401331521738, 3.832844338235294, 2.1772906117021273, 0.0, 5.566559257871065, 8.70916244680851, 5.749266507352941, 4.644267554347826, 2.3634663281250003, 0.0), # 57
(4.690011636828645, 9.459698181818181, 6.942771449275362, 3.8292232679738563, 2.1717743262411346, 0.0, 5.507135665500583, 8.687097304964539, 5.743834901960785, 4.628514299516908, 2.3649245454545453, 0.0), # 58
(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0), # 59
)
passenger_allighting_rate = (
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 0
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 1
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 2
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 3
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 4
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 5
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 6
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 7
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 8
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 9
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 10
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 11
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 12
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 13
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 14
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 15
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 16
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 17
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 18
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 19
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 20
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 21
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 22
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 23
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 24
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 25
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 26
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 27
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 28
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 29
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 30
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 31
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 32
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 33
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 34
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 35
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 36
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 37
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 38
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 39
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 40
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 41
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 42
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 43
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 44
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 45
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 46
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 47
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 48
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 49
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 50
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 51
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 52
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 53
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 54
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 55
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 56
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 57
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 58
(0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1, 0, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 0.16666666666666666, 1), # 59
)
"""
parameters for reproducibiliy. More information: https://numpy.org/doc/stable/reference/random/parallel.html
"""
# initial entropy
entropy = 258194110137029475889902652135037600173
# index for seed sequence child
child_seed_index = (
1, # 0
10, # 1
)
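As a sketch of how these reproducibility parameters are meant to be used (following the NumPy parallel-RNG documentation linked above; the names `root`, `children`, and `rngs` are illustrative and not from the original code):

```python
import numpy as np

# Recorded in this module: the root entropy and the indices of the
# spawned child seed sequences that were actually used.
entropy = 258194110137029475889902652135037600173
child_seed_index = (1, 10)

# Spawn enough children from the root sequence, then keep the recorded ones.
root = np.random.SeedSequence(entropy)
children = root.spawn(max(child_seed_index) + 1)
rngs = [np.random.default_rng(children[i]) for i in child_seed_index]
```

Because `SeedSequence.spawn` is deterministic for a fixed entropy, re-running this reconstructs the exact generator streams identified by `child_seed_index`.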
] | null | null | null | import re
import unicodedata
from collections import Counter
from itertools import product
import numpy as np
import pandas as pd
from sklearn.decomposition import TruncatedSVD
from sklearn.model_selection import StratifiedKFold
from sklearn.preprocessing import LabelEncoder
from src import sentence_splitter, data_frame, learn_sklearn, learn_lgb
def rating_dataset():
# target: ["likes", "dislikes"]
all_df = pd.concat(
[pd.read_csv("./data/input/train_data.csv").drop(["y"], axis=1),
pd.read_csv("./data/input/test_data.csv")]
).reset_index(drop=True)
# train = all_df[~all_df["ratings_disabled"] & ~all_df["comments_disabled"]].reset_index(drop=True)
# test = all_df[all_df["ratings_disabled"] & ~all_df["comments_disabled"]].reset_index(drop=True)
train = all_df[~all_df["ratings_disabled"]].reset_index(drop=True)
test = all_df[all_df["ratings_disabled"]].reset_index(drop=True)
test = test.drop(["likes", "dislikes"], axis=1)
train.likes = train.likes.apply(np.log1p)
train.dislikes = train.dislikes.apply(np.log1p)
train.comment_count = train.comment_count.apply(np.log1p)
test.comment_count = test.comment_count.apply(np.log1p)
train["publishedAt"] = pd.to_datetime(train.publishedAt).apply(lambda x: x.value)
test["publishedAt"] = pd.to_datetime(test.publishedAt).apply(lambda x: x.value)
train["title_len"] = train.title.apply(lambda x: len(str(x)))
test["title_len"] = test.title.apply(lambda x: len(str(x)))
train["channelTitle_len"] = train.channelTitle.apply(lambda x: len(str(x)))
test["channelTitle_len"] = test.channelTitle.apply(lambda x: len(str(x)))
train["description_len"] = train.description.apply(lambda x: len(str(x)))
test["description_len"] = test.description.apply(lambda x: len(str(x)))
train["tags_count"] = train.tags.apply(lambda x: str(x).count("|"))
test["tags_count"] = test.tags.apply(lambda x: str(x).count("|"))
    # Check whether the text contains Japanese
train["title_ja_count"] = train.title.apply(data_frame.is_japanese)
test["title_ja_count"] = test.title.apply(data_frame.is_japanese)
train["channelTitle_ja_count"] = train.channelTitle.apply(data_frame.is_japanese)
test["channelTitle_ja_count"] = test.channelTitle.apply(data_frame.is_japanese)
train["description_ja_count"] = train.description.apply(data_frame.is_japanese)
test["description_ja_count"] = test.description.apply(data_frame.is_japanese)
    # Count alphabet characters
train["title_en_count"] = train.title.apply(data_frame.count_alphabet)
test["title_en_count"] = test.title.apply(data_frame.count_alphabet)
train["channelTitle_en_count"] = train.channelTitle.apply(data_frame.count_alphabet)
test["channelTitle_en_count"] = test.channelTitle.apply(data_frame.count_alphabet)
train["description_en_count"] = train.description.apply(data_frame.count_alphabet)
test["description_en_count"] = test.description.apply(data_frame.count_alphabet)
    # Count digits
train["description_num_count"] = train.description.apply(data_frame.count_number)
test["description_num_count"] = test.description.apply(data_frame.count_number)
    # Count URLs
train["description_url_count"] = train.description.apply(lambda x: str(x).count("://"))
test["description_url_count"] = test.description.apply(lambda x: str(x).count("://"))
all_df: pd.DataFrame = pd.concat(
[train.drop(["likes", "dislikes"], axis=1), test], ignore_index=True).reset_index(drop=True)
category = ["channelId", "categoryId", "collection_date"]
target_list = ["comment_count", "title_len", "channelTitle_len", "description_len", "tags_count",
"description_ja_count", "description_en_count", "title_ja_count", "title_en_count",
"publishedAt"]
for col, target in product(category, target_list):
print(col, target)
data_frame.group(train, test, col, target, all_df)
data_frame.TE(train, test, "mean", train.likes, ["categoryId", "collection_date"])
data_frame.TE(train, test, "std", train.likes, ["categoryId", "collection_date"])
data_frame.TE(train, test, "mean", train.dislikes, ["categoryId", "collection_date"])
data_frame.TE(train, test, "std", train.dislikes, ["categoryId", "collection_date"])
return train, test
def comment_dataset():
    # target: ["comment_count"]
all_df = pd.concat(
[pd.read_csv("./data/input/train_data.csv").drop(["y"], axis=1),
pd.read_csv("./data/input/test_data.csv")]
).reset_index(drop=True)
# train = all_df[~all_df["ratings_disabled"] & ~all_df["comments_disabled"]].reset_index(drop=True)
# test = all_df[~all_df["ratings_disabled"] & all_df["comments_disabled"]].reset_index(drop=True)
train = all_df[~all_df["comments_disabled"]].reset_index(drop=True)
test = all_df[all_df["comments_disabled"]].reset_index(drop=True)
test = test.drop(["comment_count"], axis=1)
train.likes = train.likes.apply(np.log1p)
train.dislikes = train.dislikes.apply(np.log1p)
test.likes = test.likes.apply(np.log1p)
test.dislikes = test.dislikes.apply(np.log1p)
train.comment_count = train.comment_count.apply(np.log1p)
train["publishedAt"] = pd.to_datetime(train.publishedAt).apply(lambda x: x.value)
test["publishedAt"] = pd.to_datetime(test.publishedAt).apply(lambda x: x.value)
train["title_len"] = train.title.apply(lambda x: len(str(x)))
test["title_len"] = test.title.apply(lambda x: len(str(x)))
train["channelTitle_len"] = train.channelTitle.apply(lambda x: len(str(x)))
test["channelTitle_len"] = test.channelTitle.apply(lambda x: len(str(x)))
train["description_len"] = train.description.apply(lambda x: len(str(x)))
test["description_len"] = test.description.apply(lambda x: len(str(x)))
train["tags_count"] = train.tags.apply(lambda x: str(x).count("|"))
test["tags_count"] = test.tags.apply(lambda x: str(x).count("|"))
    # Check whether the text contains Japanese
train["title_ja_count"] = train.title.apply(data_frame.is_japanese)
test["title_ja_count"] = test.title.apply(data_frame.is_japanese)
train["channelTitle_ja_count"] = train.channelTitle.apply(data_frame.is_japanese)
test["channelTitle_ja_count"] = test.channelTitle.apply(data_frame.is_japanese)
train["description_ja_count"] = train.description.apply(data_frame.is_japanese)
test["description_ja_count"] = test.description.apply(data_frame.is_japanese)
    # Count alphabet characters
train["title_en_count"] = train.title.apply(data_frame.count_alphabet)
test["title_en_count"] = test.title.apply(data_frame.count_alphabet)
train["channelTitle_en_count"] = train.channelTitle.apply(data_frame.count_alphabet)
test["channelTitle_en_count"] = test.channelTitle.apply(data_frame.count_alphabet)
train["description_en_count"] = train.description.apply(data_frame.count_alphabet)
test["description_en_count"] = test.description.apply(data_frame.count_alphabet)
    # Count digits
train["description_num_count"] = train.description.apply(data_frame.count_number)
test["description_num_count"] = test.description.apply(data_frame.count_number)
    # Count URLs
train["description_url_count"] = train.description.apply(lambda x: str(x).count("://"))
test["description_url_count"] = test.description.apply(lambda x: str(x).count("://"))
all_df: pd.DataFrame = pd.concat(
[train.drop(["likes", "dislikes"], axis=1), test], ignore_index=True).reset_index(drop=True)
category = ["channelId", "categoryId", "collection_date"]
target_list = ["likes", "dislikes", "title_len", "channelTitle_len", "description_len", "tags_count",
"description_ja_count", "description_en_count", "title_ja_count", "title_en_count",
"publishedAt"]
for col, target in product(category, target_list):
data_frame.group(train, test, col, target, all_df)
data_frame.TE(train, test, "mean", train.comment_count, ["categoryId", "collection_date"])
data_frame.TE(train, test, "std", train.comment_count, ["categoryId", "collection_date"])
return train, test
def rating_main():
train, test = rating_dataset()
drop_list = ["id", "video_id", "title", "channelId", "channelTitle",
"tags", "thumbnail_link", "description"]
ids = test.video_id
train_y_likes = train["likes"]
train_y_dislikes = train["dislikes"]
train_x = train.drop(drop_list + ["likes", "dislikes"], axis=1)
test_x = test.drop(drop_list, axis=1)
ensemble(train_x, train_y_likes, test_x, ids, "likes")
ensemble(train_x, train_y_dislikes, test_x, ids, "dislikes")
# return train_x, train_y_likes, train_y_dislikes, test_x, ids
def comment_main():
train, test = comment_dataset()
drop_list = ["id", "video_id", "title", "channelId", "channelTitle",
"tags", "thumbnail_link", "description"]
ids = test.video_id
train_y = train["comment_count"]
train_x = train.drop(drop_list + ["comment_count"], axis=1)
test_x = test.drop(drop_list, axis=1)
ensemble(train_x, train_y, test_x, ids, "comment")
# return train_x, train_y_likes, train_y_dislikes, test_x, ids
params = {
'objective': 'mean_squared_error',
# 'max_depth': 6,
'learning_rate': 0.1,
"boosting_type": "gbdt",
"metric": 'rmse',
'lambda_l1': 2.94393343297745e-08,
'lambda_l2': 0.00010003095098613326,
'num_leaves': 31,
'feature_fraction': 0.5,
'bagging_fraction': 0.8176254967309975,
'bagging_freq': 1,
'min_child_samples': 5,
'random_state': 0,
'early_stopping_rounds': 200,
'n_estimators': 50000,
}
def ensemble(train_x, train_y, test_x, ids, name):
preds_train = []
preds_test = []
drop_null = set(test_x.keys()[test_x.isna().any()].to_list() + train_x.keys()[train_x.isna().any()].to_list())
drop_list = ["publishedAt", "categoryId", "collection_date"] + list(drop_null)
train_x = train_x.drop(drop_list, axis=1)
test_x = test_x.drop(drop_list, axis=1)
for i in range(5):
# pred_train, pred_test = learn_sklearn.main(train_x, train_y, test_x, ids, i)
params["random_state"] = i
pred_train, pred_test = learn_lgb.predict_cv(params, train_x, train_y, test_x, seed=i)
preds_train.append(pred_train)
preds_test.append(pred_test)
pred_train = np.mean(preds_train, axis=0)
pred_test = np.mean(preds_test, axis=0)
learn_lgb.output_metrics(train_y, pred_train)
learn_lgb.output_metrics(np.expm1(train_y), np.expm1(pred_train))
sub = pd.DataFrame()
sub["video_id"] = ids
sub['y'] = np.expm1(pred_test)
sub.to_csv(f'./data/input/complement_{name}.csv', index=False)
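A note on the target transform used throughout this module: `likes`, `dislikes`, and `comment_count` are trained on a `log1p` scale and mapped back with `expm1` before writing the submission. A minimal standalone sketch of that round trip (not part of the original module):

```python
import numpy as np

# log1p compresses heavy-tailed count targets; expm1 is its exact inverse
# (up to floating-point error), which is why the averaged out-of-fold
# predictions are passed through np.expm1 before being written to CSV.
counts = np.array([0.0, 1.0, 10.0, 12345.0])
recovered = np.expm1(np.log1p(counts))
assert np.allclose(recovered, counts)
```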
from spa.static.handlers import Static, StaticHandler, StaticFileHandler
from spa.static.smart import SmartStatic
import os
import tempfile
from unittest.mock import MagicMock
import cloudpickle
import prefect
from prefect.environments import FargateTaskEnvironment
from prefect.environments.storage import Docker
from prefect.utilities.configuration import set_temporary_config
from botocore.exceptions import ClientError
def test_create_fargate_task_environment():
environment = FargateTaskEnvironment()
assert environment
assert environment.labels == set()
assert environment.on_start is None
assert environment.on_exit is None
assert environment.logger.name == "prefect.FargateTaskEnvironment"
def test_create_fargate_task_environment_labels():
environment = FargateTaskEnvironment(labels=["foo"])
assert environment
assert environment.labels == set(["foo"])
def test_create_fargate_task_environment_callbacks():
def f():
pass
environment = FargateTaskEnvironment(labels=["foo"], on_start=f, on_exit=f)
assert environment
assert environment.labels == set(["foo"])
assert environment.on_start is f
assert environment.on_exit is f
def test_fargate_task_environment_dependencies():
environment = FargateTaskEnvironment()
assert environment.dependencies == ["boto3", "botocore"]
def test_create_fargate_task_environment_aws_creds_provided():
environment = FargateTaskEnvironment(
labels=["foo"],
aws_access_key_id="id",
aws_secret_access_key="secret",
aws_session_token="session",
region_name="region",
)
assert environment
assert environment.labels == set(["foo"])
assert environment.aws_access_key_id == "id"
assert environment.aws_secret_access_key == "secret"
assert environment.aws_session_token == "session"
assert environment.region_name == "region"
def test_create_fargate_task_environment_aws_creds_environment(monkeypatch):
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "secret")
monkeypatch.setenv("AWS_SESSION_TOKEN", "session")
monkeypatch.setenv("REGION_NAME", "region")
environment = FargateTaskEnvironment(labels=["foo"])
assert environment
assert environment.labels == set(["foo"])
assert environment.aws_access_key_id == "id"
assert environment.aws_secret_access_key == "secret"
assert environment.aws_session_token == "session"
assert environment.region_name == "region"
def test_parse_task_definition_kwargs():
environment = FargateTaskEnvironment()
kwarg_dict = {
"family": "test",
"taskRoleArn": "test",
"executionRoleArn": "test",
"networkMode": "test",
"containerDefinitions": "test",
"volumes": "test",
"placementConstraints": "test",
"requiresCompatibilities": "test",
"cpu": "test",
"memory": "test",
"tags": "test",
"pidMode": "test",
"ipcMode": "test",
"proxyConfiguration": "test",
"inferenceAccelerators": "test",
}
task_definition_kwargs, task_run_kwargs = environment._parse_kwargs(kwarg_dict)
assert task_definition_kwargs == kwarg_dict
assert task_run_kwargs == {"placementConstraints": "test", "tags": "test"}
def test_parse_task_run_kwargs():
environment = FargateTaskEnvironment()
kwarg_dict = {
"cluster": "test",
"taskDefinition": "test",
"count": "test",
"startedBy": "test",
"group": "test",
"placementConstraints": "test",
"placementStrategy": "test",
"platformVersion": "test",
"networkConfiguration": "test",
"tags": "test",
"enableECSManagedTags": "test",
"propagateTags": "test",
}
task_definition_kwargs, task_run_kwargs = environment._parse_kwargs(kwarg_dict)
assert task_run_kwargs == kwarg_dict
assert task_definition_kwargs == {"placementConstraints": "test", "tags": "test"}
def test_parse_task_definition_and_run_kwargs():
environment = FargateTaskEnvironment()
def_kwarg_dict = {
"family": "test",
"taskRoleArn": "test",
"executionRoleArn": "test",
"networkMode": "test",
"containerDefinitions": "test",
"volumes": "test",
"placementConstraints": "test",
"requiresCompatibilities": "test",
"cpu": "test",
"memory": "test",
"tags": "test",
"pidMode": "test",
"ipcMode": "test",
"proxyConfiguration": "test",
"inferenceAccelerators": "test",
}
run_kwarg_dict = {
"cluster": "test",
"taskDefinition": "test",
"count": "test",
"startedBy": "test",
"group": "test",
"placementConstraints": "test",
"placementStrategy": "test",
"platformVersion": "test",
"networkConfiguration": "test",
"tags": "test",
"enableECSManagedTags": "test",
"propagateTags": "test",
}
kwarg_dict = {
"family": "test",
"taskRoleArn": "test",
"executionRoleArn": "test",
"networkMode": "test",
"containerDefinitions": "test",
"volumes": "test",
"placementConstraints": "test",
"requiresCompatibilities": "test",
"cpu": "test",
"memory": "test",
"tags": "test",
"pidMode": "test",
"ipcMode": "test",
"proxyConfiguration": "test",
"inferenceAccelerators": "test",
"cluster": "test",
"taskDefinition": "test",
"count": "test",
"startedBy": "test",
"group": "test",
"placementStrategy": "test",
"platformVersion": "test",
"networkConfiguration": "test",
"enableECSManagedTags": "test",
"propagateTags": "test",
}
task_definition_kwargs, task_run_kwargs = environment._parse_kwargs(kwarg_dict)
assert task_definition_kwargs == def_kwarg_dict
assert task_run_kwargs == run_kwarg_dict
def test_parse_task_kwargs_invalid_value_removed():
environment = FargateTaskEnvironment()
kwarg_dict = {"test": "not_real"}
task_definition_kwargs, task_run_kwargs = environment._parse_kwargs(kwarg_dict)
assert task_definition_kwargs == {}
assert task_run_kwargs == {}
def test_setup_definition_exists(monkeypatch):
boto3_client = MagicMock()
boto3_client.describe_task_definition.return_value = {}
monkeypatch.setattr("boto3.client", MagicMock(return_value=boto3_client))
environment = FargateTaskEnvironment()
environment.setup(Docker(registry_url="test", image_name="image", image_tag="tag"))
assert boto3_client.describe_task_definition.called
def test_setup_definition_register(monkeypatch):
boto3_client = MagicMock()
boto3_client.describe_task_definition.side_effect = ClientError({}, None)
boto3_client.register_task_definition.return_value = {}
monkeypatch.setattr("boto3.client", MagicMock(return_value=boto3_client))
environment = FargateTaskEnvironment(
family="test",
containerDefinitions=[
{
"name": "flow-container",
"image": "image",
"command": [],
"environment": [],
"essential": True,
}
],
)
environment.setup(Docker(registry_url="test", image_name="image", image_tag="tag"))
assert boto3_client.describe_task_definition.called
assert boto3_client.register_task_definition.called
assert boto3_client.register_task_definition.call_args[1]["family"] == "test"
assert boto3_client.register_task_definition.call_args[1][
"containerDefinitions"
] == [
{
"name": "flow-container",
"image": "test/image:tag",
"command": [
"/bin/sh",
"-c",
"python -c 'import prefect; prefect.Flow.load(prefect.context.flow_file_path).environment.run_flow()'",
],
"environment": [
{
"name": "PREFECT__CLOUD__GRAPHQL",
"value": prefect.config.cloud.graphql,
},
{"name": "PREFECT__CLOUD__USE_LOCAL_SECRETS", "value": "false"},
{
"name": "PREFECT__ENGINE__FLOW_RUNNER__DEFAULT_CLASS",
"value": "prefect.engine.cloud.CloudFlowRunner",
},
{
"name": "PREFECT__ENGINE__TASK_RUNNER__DEFAULT_CLASS",
"value": "prefect.engine.cloud.CloudTaskRunner",
},
{"name": "PREFECT__LOGGING__LOG_TO_CLOUD", "value": "true"},
],
"essential": True,
}
]
def test_setup_definition_register_no_definitions(monkeypatch):
boto3_client = MagicMock()
boto3_client.describe_task_definition.side_effect = ClientError({}, None)
boto3_client.register_task_definition.return_value = {}
monkeypatch.setattr("boto3.client", MagicMock(return_value=boto3_client))
environment = FargateTaskEnvironment(family="test")
environment.setup(Docker(registry_url="test", image_name="image", image_tag="tag"))
assert boto3_client.describe_task_definition.called
assert boto3_client.register_task_definition.called
assert boto3_client.register_task_definition.call_args[1]["family"] == "test"
assert boto3_client.register_task_definition.call_args[1][
"containerDefinitions"
] == [
{
"environment": [
{
"name": "PREFECT__CLOUD__GRAPHQL",
"value": prefect.config.cloud.graphql,
},
{"name": "PREFECT__CLOUD__USE_LOCAL_SECRETS", "value": "false"},
{
"name": "PREFECT__ENGINE__FLOW_RUNNER__DEFAULT_CLASS",
"value": "prefect.engine.cloud.CloudFlowRunner",
},
{
"name": "PREFECT__ENGINE__TASK_RUNNER__DEFAULT_CLASS",
"value": "prefect.engine.cloud.CloudTaskRunner",
},
{"name": "PREFECT__LOGGING__LOG_TO_CLOUD", "value": "true"},
],
"name": "flow-container",
"image": "test/image:tag",
"command": [
"/bin/sh",
"-c",
"python -c 'import prefect; prefect.Flow.load(prefect.context.flow_file_path).environment.run_flow()'",
],
}
]
def test_execute_run_task(monkeypatch):
boto3_client = MagicMock()
boto3_client.run_task.return_value = {}
monkeypatch.setattr("boto3.client", MagicMock(return_value=boto3_client))
with set_temporary_config({"cloud.auth_token": "test"}):
environment = FargateTaskEnvironment(
cluster="test", family="test", taskDefinition="test"
)
environment.execute(
storage=Docker(registry_url="test", image_name="image", image_tag="tag"),
flow_location=".prefect/flows",
)
assert boto3_client.run_task.called
assert boto3_client.run_task.call_args[1]["taskDefinition"] == "test"
assert boto3_client.run_task.call_args[1]["overrides"] == {
"containerOverrides": [
{
"name": "flow-container",
"environment": [
{
"name": "PREFECT__CLOUD__AUTH_TOKEN",
"value": prefect.config.cloud.get("auth_token"),
},
{"name": "PREFECT__CONTEXT__FLOW_RUN_ID", "value": "unknown"},
{"name": "PREFECT__CONTEXT__IMAGE", "value": "test/image:tag"},
{
"name": "PREFECT__CONTEXT__FLOW_FILE_PATH",
"value": ".prefect/flows",
},
],
}
]
}
assert boto3_client.run_task.call_args[1]["launchType"] == "FARGATE"
assert boto3_client.run_task.call_args[1]["cluster"] == "test"
def test_execute_run_task_agent_token(monkeypatch):
boto3_client = MagicMock()
boto3_client.run_task.return_value = {}
monkeypatch.setattr("boto3.client", MagicMock(return_value=boto3_client))
with set_temporary_config({"cloud.agent.auth_token": "test"}):
environment = FargateTaskEnvironment(
cluster="test", family="test", taskDefinition="test"
)
environment.execute(
storage=Docker(registry_url="test", image_name="image", image_tag="tag"),
flow_location=".prefect/flows",
)
assert boto3_client.run_task.called
assert boto3_client.run_task.call_args[1]["taskDefinition"] == "test"
assert boto3_client.run_task.call_args[1]["overrides"] == {
"containerOverrides": [
{
"name": "flow-container",
"environment": [
{
"name": "PREFECT__CLOUD__AUTH_TOKEN",
"value": prefect.config.cloud.agent.get("auth_token"),
},
{"name": "PREFECT__CONTEXT__FLOW_RUN_ID", "value": "unknown"},
{"name": "PREFECT__CONTEXT__IMAGE", "value": "test/image:tag"},
{
"name": "PREFECT__CONTEXT__FLOW_FILE_PATH",
"value": ".prefect/flows",
},
],
}
]
}
assert boto3_client.run_task.call_args[1]["launchType"] == "FARGATE"
assert boto3_client.run_task.call_args[1]["cluster"] == "test"
def test_run_flow(monkeypatch):
environment = FargateTaskEnvironment()
flow_runner = MagicMock()
monkeypatch.setattr(
"prefect.engine.get_default_flow_runner_class",
MagicMock(return_value=flow_runner),
)
with tempfile.TemporaryDirectory() as directory:
with open(os.path.join(directory, "flow_env.prefect"), "w+"):
flow = prefect.Flow("test")
flow_path = os.path.join(directory, "flow_env.prefect")
with open(flow_path, "wb") as f:
cloudpickle.dump(flow, f)
with set_temporary_config({"cloud.auth_token": "test"}):
with prefect.context(
flow_file_path=os.path.join(directory, "flow_env.prefect")
):
environment.run_flow()
assert flow_runner.call_args[1]["flow"].name == "test"
def test_run_flow_calls_callbacks(monkeypatch):
start_func = MagicMock()
exit_func = MagicMock()
environment = FargateTaskEnvironment(on_start=start_func, on_exit=exit_func)
flow_runner = MagicMock()
monkeypatch.setattr(
"prefect.engine.get_default_flow_runner_class",
MagicMock(return_value=flow_runner),
)
with tempfile.TemporaryDirectory() as directory:
with open(os.path.join(directory, "flow_env.prefect"), "w+"):
flow = prefect.Flow("test")
flow_path = os.path.join(directory, "flow_env.prefect")
with open(flow_path, "wb") as f:
cloudpickle.dump(flow, f)
with set_temporary_config({"cloud.auth_token": "test"}):
with prefect.context(
flow_file_path=os.path.join(directory, "flow_env.prefect")
):
environment.run_flow()
assert flow_runner.call_args[1]["flow"].name == "test"
assert start_func.called
assert exit_func.called
def test_entire_environment_process_together(monkeypatch):
boto3_client = MagicMock()
boto3_client.describe_task_definition.side_effect = ClientError({}, None)
boto3_client.register_task_definition.return_value = {}
boto3_client.run_task.return_value = {}
monkeypatch.setattr("boto3.client", MagicMock(return_value=boto3_client))
flow_runner = MagicMock()
monkeypatch.setattr(
"prefect.engine.get_default_flow_runner_class",
MagicMock(return_value=flow_runner),
)
monkeypatch.setenv("AWS_ACCESS_KEY_ID", "id")
monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "secret")
monkeypatch.setenv("AWS_SESSION_TOKEN", "session")
monkeypatch.setenv("REGION_NAME", "region")
with prefect.context({"flow_run_id": "id"}), set_temporary_config(
{"cloud.auth_token": "test"}
):
storage = Docker(registry_url="test", image_name="image", image_tag="tag")
environment = FargateTaskEnvironment(
containerDefinitions=[
{
"name": "flow-container",
"image": "image",
"command": [],
"environment": [],
"essential": True,
}
],
cluster="test",
family="test",
taskDefinition="test",
)
assert environment
assert environment.aws_access_key_id == "id"
assert environment.aws_secret_access_key == "secret"
assert environment.aws_session_token == "session"
assert environment.region_name == "region"
environment.setup(storage=storage)
assert boto3_client.describe_task_definition.called
assert boto3_client.register_task_definition.called
assert boto3_client.register_task_definition.call_args[1]["family"] == "test"
assert boto3_client.register_task_definition.call_args[1][
"containerDefinitions"
] == [
{
"name": "flow-container",
"image": "test/image:tag",
"command": [
"/bin/sh",
"-c",
"python -c 'import prefect; prefect.Flow.load(prefect.context.flow_file_path).environment.run_flow()'",
],
"environment": [
{
"name": "PREFECT__CLOUD__GRAPHQL",
"value": prefect.config.cloud.graphql,
},
{"name": "PREFECT__CLOUD__USE_LOCAL_SECRETS", "value": "false"},
{
"name": "PREFECT__ENGINE__FLOW_RUNNER__DEFAULT_CLASS",
"value": "prefect.engine.cloud.CloudFlowRunner",
},
{
"name": "PREFECT__ENGINE__TASK_RUNNER__DEFAULT_CLASS",
"value": "prefect.engine.cloud.CloudTaskRunner",
},
{"name": "PREFECT__LOGGING__LOG_TO_CLOUD", "value": "true"},
],
"essential": True,
}
]
environment.execute(storage=storage, flow_location=".prefect/flows")
assert boto3_client.run_task.called
assert boto3_client.run_task.call_args[1]["taskDefinition"] == "test"
assert boto3_client.run_task.call_args[1]["overrides"] == {
"containerOverrides": [
{
"name": "flow-container",
"environment": [
{
"name": "PREFECT__CLOUD__AUTH_TOKEN",
"value": prefect.config.cloud.get("auth_token"),
},
{"name": "PREFECT__CONTEXT__FLOW_RUN_ID", "value": "id"},
{"name": "PREFECT__CONTEXT__IMAGE", "value": "test/image:tag"},
{
"name": "PREFECT__CONTEXT__FLOW_FILE_PATH",
"value": ".prefect/flows",
},
],
}
]
}
assert boto3_client.run_task.call_args[1]["launchType"] == "FARGATE"
assert boto3_client.run_task.call_args[1]["cluster"] == "test"
with tempfile.TemporaryDirectory() as directory:
with open(os.path.join(directory, "flow_env.prefect"), "w+"):
flow = prefect.Flow("test")
flow_path = os.path.join(directory, "flow_env.prefect")
with open(flow_path, "wb") as f:
cloudpickle.dump(flow, f)
with set_temporary_config({"cloud.auth_token": "test"}):
with prefect.context(
flow_file_path=os.path.join(directory, "flow_env.prefect")
):
environment.run_flow()
assert flow_runner.call_args[1]["flow"].name == "test"
def test_roundtrip_cloudpickle():
with tempfile.TemporaryDirectory() as directory:
with open(os.path.join(directory, "job.yaml"), "w+") as file:
file.write("job")
environment = FargateTaskEnvironment(cluster="test")
assert environment.task_run_kwargs == {"cluster": "test"}
new = cloudpickle.loads(cloudpickle.dumps(environment))
assert isinstance(new, FargateTaskEnvironment)
assert new.task_run_kwargs == {"cluster": "test"}
from django.urls import reverse
from rest_framework import status
from peering.constants import *
from peering.models import (
AutonomousSystem,
Community,
ConfigurationTemplate,
DirectPeeringSession,
InternetExchange,
InternetExchangePeeringSession,
Router,
RoutingPolicy,
)
from utils.testing import APITestCase
class StaticChoiceTest(APITestCase):
def test_get_static_choice(self):
url = reverse(
"peering-api:field-choice-detail", kwargs={"pk": "router:platform"}
)
response = self.client.get(url, **self.header)
self.assertEqual(len(response.data), 6)
def test_list_static_choices(self):
url = reverse("peering-api:field-choice-list")
response = self.client.get(url, **self.header)
self.assertEqual(len(response.data), 6)


class AutonomousSystemTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.autonomous_system = AutonomousSystem.objects.create(
            asn=201281, name="Guillaume Mazoyer"
        )

    def test_get_autonomous_system(self):
        url = reverse(
            "peering-api:autonomoussystem-detail",
            kwargs={"pk": self.autonomous_system.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["asn"], self.autonomous_system.asn)

    def test_list_autonomous_systems(self):
        url = reverse("peering-api:autonomoussystem-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_autonomous_system(self):
        data = {"asn": 29467, "name": "LuxNetwork S.A."}
        url = reverse("peering-api:autonomoussystem-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(AutonomousSystem.objects.count(), 2)
        autonomous_system = AutonomousSystem.objects.get(pk=response.data["id"])
        self.assertEqual(autonomous_system.asn, data["asn"])

    def test_create_autonomous_system_bulk(self):
        data = [{"asn": 15169, "name": "Google"}, {"asn": 32934, "name": "Facebook"}]
        url = reverse("peering-api:autonomoussystem-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(AutonomousSystem.objects.count(), 3)
        self.assertEqual(response.data[0]["asn"], data[0]["asn"])
        self.assertEqual(response.data[1]["asn"], data[1]["asn"])

    def test_update_autonomous_system(self):
        data = {"asn": 201281, "name": "Guillaume Mazoyer"}
        url = reverse(
            "peering-api:autonomoussystem-detail",
            kwargs={"pk": self.autonomous_system.pk},
        )
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(AutonomousSystem.objects.count(), 1)
        autonomous_system = AutonomousSystem.objects.get(pk=response.data["id"])
        self.assertEqual(autonomous_system.asn, data["asn"])

    def test_delete_autonomous_system(self):
        url = reverse(
            "peering-api:autonomoussystem-detail",
            kwargs={"pk": self.autonomous_system.pk},
        )
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(AutonomousSystem.objects.count(), 0)

    def test_synchronize_with_peeringdb(self):
        url = reverse(
            "peering-api:autonomoussystem-synchronize-with-peeringdb",
            kwargs={"pk": self.autonomous_system.pk},
        )
        response = self.client.post(url, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)

    def test_common_internet_exchanges(self):
        url = reverse(
            "peering-api:autonomoussystem-common-internet-exchanges",
            kwargs={"pk": self.autonomous_system.pk},
        )
        response = self.client.get(url, format="json", **self.header)
        self.assertEqual(response.data["common-internet-exchanges"], [])

    def test_find_potential_ix_peering_sessions(self):
        url = reverse(
            "peering-api:autonomoussystem-find-potential-ix-peering-sessions",
            kwargs={"pk": self.autonomous_system.pk},
        )
        response = self.client.patch(url, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)


class CommunityTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.community = Community.objects.create(
            name="Test", value="64500:1", type=COMMUNITY_TYPE_EGRESS
        )

    def test_get_community(self):
        url = reverse("peering-api:community-detail", kwargs={"pk": self.community.pk})
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["value"], self.community.value)

    def test_list_communities(self):
        url = reverse("peering-api:community-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_community(self):
        data = {"name": "Other", "value": "64500:2", "type": COMMUNITY_TYPE_EGRESS}
        url = reverse("peering-api:community-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(Community.objects.count(), 2)
        community = Community.objects.get(pk=response.data["id"])
        self.assertEqual(community.value, data["value"])

    def test_create_community_bulk(self):
        data = [
            {"name": "Test1", "value": "64500:11", "type": COMMUNITY_TYPE_EGRESS},
            {"name": "Test2", "value": "64500:12", "type": COMMUNITY_TYPE_EGRESS},
        ]
        url = reverse("peering-api:community-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(Community.objects.count(), 3)
        self.assertEqual(response.data[0]["value"], data[0]["value"])
        self.assertEqual(response.data[1]["value"], data[1]["value"])

    def test_update_community(self):
        data = {"name": "Other", "value": "64500:2", "type": COMMUNITY_TYPE_INGRESS}
        url = reverse("peering-api:community-detail", kwargs={"pk": self.community.pk})
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(Community.objects.count(), 1)
        community = Community.objects.get(pk=response.data["id"])
        self.assertEqual(community.value, data["value"])

    def test_delete_community(self):
        url = reverse("peering-api:community-detail", kwargs={"pk": self.community.pk})
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(Community.objects.count(), 0)


class ConfigurationTemplateTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.configuration_template = ConfigurationTemplate.objects.create(
            name="Test", template="test_template"
        )

    def test_get_configuration_template(self):
        url = reverse(
            "peering-api:configurationtemplate-detail",
            kwargs={"pk": self.configuration_template.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertEqual(
            response.data["template"], self.configuration_template.template
        )

    def test_list_configuration_templates(self):
        url = reverse("peering-api:configurationtemplate-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_configuration_template(self):
        data = {"name": "Other", "template": "other_template"}
        url = reverse("peering-api:configurationtemplate-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(ConfigurationTemplate.objects.count(), 2)
        configuration_template = ConfigurationTemplate.objects.get(
            pk=response.data["id"]
        )
        self.assertEqual(configuration_template.template, data["template"])

    def test_create_configuration_template_bulk(self):
        data = [
            {"name": "Test1", "template": "test1_template"},
            {"name": "Test2", "template": "test2_template"},
        ]
        url = reverse("peering-api:configurationtemplate-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(ConfigurationTemplate.objects.count(), 3)
        self.assertEqual(response.data[0]["template"], data[0]["template"])
        self.assertEqual(response.data[1]["template"], data[1]["template"])

    def test_update_configuration_template(self):
        data = {"name": "Test", "template": "updated_template"}
        url = reverse(
            "peering-api:configurationtemplate-detail",
            kwargs={"pk": self.configuration_template.pk},
        )
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(ConfigurationTemplate.objects.count(), 1)
        configuration_template = ConfigurationTemplate.objects.get(
            pk=response.data["id"]
        )
        self.assertEqual(configuration_template.template, data["template"])

    def test_delete_configuration_template(self):
        url = reverse(
            "peering-api:configurationtemplate-detail",
            kwargs={"pk": self.configuration_template.pk},
        )
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(ConfigurationTemplate.objects.count(), 0)


class DirectPeeringSessionTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.autonomous_system = AutonomousSystem.objects.create(
            asn=201281, name="Guillaume Mazoyer"
        )
        self.direct_peering_session = DirectPeeringSession.objects.create(
            autonomous_system=self.autonomous_system,
            relationship=BGP_RELATIONSHIP_PRIVATE_PEERING,
            ip_address="2001:db8::1",
        )

    def test_get_direct_peering_session(self):
        url = reverse(
            "peering-api:directpeeringsession-detail",
            kwargs={"pk": self.direct_peering_session.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertEqual(
            response.data["ip_address"], self.direct_peering_session.ip_address
        )

    def test_list_direct_peering_sessions(self):
        url = reverse("peering-api:directpeeringsession-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_direct_peering_session(self):
        data = {
            "autonomous_system": self.autonomous_system.pk,
            "relationship": BGP_RELATIONSHIP_PRIVATE_PEERING,
            "ip_address": "192.168.0.1",
        }
        url = reverse("peering-api:directpeeringsession-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(DirectPeeringSession.objects.count(), 2)
        direct_peering_session = DirectPeeringSession.objects.get(
            pk=response.data["id"]
        )
        self.assertEqual(direct_peering_session.ip_address, data["ip_address"])

    def test_create_direct_peering_session_bulk(self):
        data = [
            {
                "autonomous_system": self.autonomous_system.pk,
                "relationship": BGP_RELATIONSHIP_PRIVATE_PEERING,
                "ip_address": "10.0.0.1",
            },
            {
                "autonomous_system": self.autonomous_system.pk,
                "relationship": BGP_RELATIONSHIP_PRIVATE_PEERING,
                "ip_address": "10.0.0.2",
            },
        ]
        url = reverse("peering-api:directpeeringsession-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(DirectPeeringSession.objects.count(), 3)
        self.assertEqual(response.data[0]["ip_address"], data[0]["ip_address"])
        self.assertEqual(response.data[1]["ip_address"], data[1]["ip_address"])

    def test_update_direct_peering_session(self):
        data = {
            "autonomous_system": self.autonomous_system.pk,
            "relationship": BGP_RELATIONSHIP_PRIVATE_PEERING,
            "ip_address": "2001:db8::2",
        }
        url = reverse(
            "peering-api:directpeeringsession-detail",
            kwargs={"pk": self.direct_peering_session.pk},
        )
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(DirectPeeringSession.objects.count(), 1)
        direct_peering_session = DirectPeeringSession.objects.get(
            pk=response.data["id"]
        )
        self.assertEqual(direct_peering_session.ip_address, data["ip_address"])

    def test_delete_direct_peering_session(self):
        url = reverse(
            "peering-api:directpeeringsession-detail",
            kwargs={"pk": self.direct_peering_session.pk},
        )
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(DirectPeeringSession.objects.count(), 0)


class InternetExchangeTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.internet_exchange = InternetExchange.objects.create(
            name="Test", slug="test"
        )

    def test_get_internet_exchange(self):
        url = reverse(
            "peering-api:internetexchange-detail",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["slug"], self.internet_exchange.slug)

    def test_list_internet_exchanges(self):
        url = reverse("peering-api:internetexchange-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_internet_exchange(self):
        data = {"name": "Other", "slug": "other"}
        url = reverse("peering-api:internetexchange-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(InternetExchange.objects.count(), 2)
        internet_exchange = InternetExchange.objects.get(pk=response.data["id"])
        self.assertEqual(internet_exchange.slug, data["slug"])

    def test_create_internet_exchange_bulk(self):
        data = [{"name": "Test1", "slug": "test1"}, {"name": "Test2", "slug": "test2"}]
        url = reverse("peering-api:internetexchange-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(InternetExchange.objects.count(), 3)
        self.assertEqual(response.data[0]["slug"], data[0]["slug"])
        self.assertEqual(response.data[1]["slug"], data[1]["slug"])

    def test_update_internet_exchange(self):
        data = {"name": "Test", "slug": "test"}
        url = reverse(
            "peering-api:internetexchange-detail",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(InternetExchange.objects.count(), 1)
        internet_exchange = InternetExchange.objects.get(pk=response.data["id"])
        self.assertEqual(internet_exchange.slug, data["slug"])

    def test_delete_internet_exchange(self):
        url = reverse(
            "peering-api:internetexchange-detail",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(InternetExchange.objects.count(), 0)

    def test_available_peers(self):
        url = reverse(
            "peering-api:internetexchange-available-peers",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertStatus(response, status.HTTP_503_SERVICE_UNAVAILABLE)

    def test_configuration(self):
        url = reverse(
            "peering-api:internetexchange-configuration",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(response.data["configuration"], "")

    def test_import_peering_sessions(self):
        url = reverse(
            "peering-api:internetexchange-import-peering-sessions",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.post(url, **self.header)
        self.assertStatus(response, status.HTTP_503_SERVICE_UNAVAILABLE)

    def test_prefixes(self):
        url = reverse(
            "peering-api:internetexchange-prefixes",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(response.data["prefixes"], [])

    def test_configure_router(self):
        url = reverse(
            "peering-api:internetexchange-configure-router",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertStatus(response, status.HTTP_503_SERVICE_UNAVAILABLE)
        response = self.client.post(url, **self.header)
        self.assertStatus(response, status.HTTP_503_SERVICE_UNAVAILABLE)

    def test_update_peering_sessions(self):
        url = reverse(
            "peering-api:internetexchange-update-peering-sessions",
            kwargs={"pk": self.internet_exchange.pk},
        )
        response = self.client.post(url, **self.header)
        self.assertStatus(response, status.HTTP_503_SERVICE_UNAVAILABLE)


class InternetExchangePeeringSessionTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.autonomous_system = AutonomousSystem.objects.create(
            asn=201281, name="Guillaume Mazoyer"
        )
        self.internet_exchange = InternetExchange.objects.create(
            name="Test", slug="test"
        )
        self.internet_exchange_peering_session = InternetExchangePeeringSession.objects.create(
            autonomous_system=self.autonomous_system,
            internet_exchange=self.internet_exchange,
            ip_address="2001:db8::1",
        )

    def test_get_internet_exchange_peering_session(self):
        url = reverse(
            "peering-api:internetexchangepeeringsession-detail",
            kwargs={"pk": self.internet_exchange_peering_session.pk},
        )
        response = self.client.get(url, **self.header)
        self.assertEqual(
            response.data["ip_address"],
            self.internet_exchange_peering_session.ip_address,
        )

    def test_list_internet_exchange_peering_sessions(self):
        url = reverse("peering-api:internetexchangepeeringsession-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_internet_exchange_peering_session(self):
        data = {
            "autonomous_system": self.autonomous_system.pk,
            "internet_exchange": self.internet_exchange.pk,
            "ip_address": "192.168.0.1",
        }
        url = reverse("peering-api:internetexchangepeeringsession-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(InternetExchangePeeringSession.objects.count(), 2)
        internet_exchange_peering_session = InternetExchangePeeringSession.objects.get(
            pk=response.data["id"]
        )
        self.assertEqual(
            internet_exchange_peering_session.ip_address, data["ip_address"]
        )

    def test_create_internet_exchange_peering_session_bulk(self):
        data = [
            {
                "autonomous_system": self.autonomous_system.pk,
                "internet_exchange": self.internet_exchange.pk,
                "ip_address": "10.0.0.1",
            },
            {
                "autonomous_system": self.autonomous_system.pk,
                "internet_exchange": self.internet_exchange.pk,
                "ip_address": "10.0.0.2",
            },
        ]
        url = reverse("peering-api:internetexchangepeeringsession-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(InternetExchangePeeringSession.objects.count(), 3)
        self.assertEqual(response.data[0]["ip_address"], data[0]["ip_address"])
        self.assertEqual(response.data[1]["ip_address"], data[1]["ip_address"])

    def test_update_internet_exchange_peering_session(self):
        data = {
            "autonomous_system": self.autonomous_system.pk,
            "internet_exchange": self.internet_exchange.pk,
            "ip_address": "2001:db8::2",
        }
        url = reverse(
            "peering-api:internetexchangepeeringsession-detail",
            kwargs={"pk": self.internet_exchange_peering_session.pk},
        )
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(InternetExchangePeeringSession.objects.count(), 1)
        internet_exchange_peering_session = InternetExchangePeeringSession.objects.get(
            pk=response.data["id"]
        )
        self.assertEqual(
            internet_exchange_peering_session.ip_address, data["ip_address"]
        )

    def test_delete_internet_exchange_peering_session(self):
        url = reverse(
            "peering-api:internetexchangepeeringsession-detail",
            kwargs={"pk": self.internet_exchange_peering_session.pk},
        )
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(InternetExchangePeeringSession.objects.count(), 0)


class RouterTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.router = Router.objects.create(
            name="Test", hostname="test.example.com", platform=PLATFORM_JUNOS
        )

    def test_get_router(self):
        url = reverse("peering-api:router-detail", kwargs={"pk": self.router.pk})
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["hostname"], self.router.hostname)

    def test_list_routers(self):
        url = reverse("peering-api:router-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_router(self):
        data = {
            "name": "Other",
            "hostname": "other.example.com",
            "platform": PLATFORM_JUNOS,
        }
        url = reverse("peering-api:router-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(Router.objects.count(), 2)
        router = Router.objects.get(pk=response.data["id"])
        self.assertEqual(router.hostname, data["hostname"])

    def test_create_router_bulk(self):
        data = [
            {
                "name": "Test1",
                "hostname": "test1.example.com",
                "platform": PLATFORM_JUNOS,
            },
            {
                "name": "Test2",
                "hostname": "test2.example.com",
                "platform": PLATFORM_JUNOS,
            },
        ]
        url = reverse("peering-api:router-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(Router.objects.count(), 3)
        self.assertEqual(response.data[0]["hostname"], data[0]["hostname"])
        self.assertEqual(response.data[1]["hostname"], data[1]["hostname"])

    def test_update_router(self):
        data = {
            "name": "Test",
            "hostname": "test.example.com",
            "platform": PLATFORM_IOSXR,
        }
        url = reverse("peering-api:router-detail", kwargs={"pk": self.router.pk})
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(Router.objects.count(), 1)
        router = Router.objects.get(pk=response.data["id"])
        self.assertEqual(router.hostname, data["hostname"])

    def test_delete_router(self):
        url = reverse("peering-api:router-detail", kwargs={"pk": self.router.pk})
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(Router.objects.count(), 0)

    def test_test_napalm_connection(self):
        url = reverse(
            "peering-api:router-test-napalm-connection", kwargs={"pk": self.router.pk}
        )
        response = self.client.get(url, **self.header)
        self.assertStatus(response, status.HTTP_503_SERVICE_UNAVAILABLE)


class RoutingPolicyTest(APITestCase):
    def setUp(self):
        super().setUp()
        self.routing_policy = RoutingPolicy.objects.create(
            name="Test", slug="test", type=ROUTING_POLICY_TYPE_EXPORT
        )

    def test_get_routing_policy(self):
        url = reverse(
            "peering-api:routingpolicy-detail", kwargs={"pk": self.routing_policy.pk}
        )
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["slug"], self.routing_policy.slug)

    def test_list_routing_policies(self):
        url = reverse("peering-api:routingpolicy-list")
        response = self.client.get(url, **self.header)
        self.assertEqual(response.data["count"], 1)

    def test_create_routing_policy(self):
        data = {"name": "Other", "slug": "other", "type": ROUTING_POLICY_TYPE_EXPORT}
        url = reverse("peering-api:routingpolicy-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(RoutingPolicy.objects.count(), 2)
        routing_policy = RoutingPolicy.objects.get(pk=response.data["id"])
        self.assertEqual(routing_policy.slug, data["slug"])

    def test_create_routing_policy_bulk(self):
        data = [
            {"name": "Test1", "slug": "test1", "type": ROUTING_POLICY_TYPE_EXPORT},
            {"name": "Test2", "slug": "test2", "type": ROUTING_POLICY_TYPE_EXPORT},
        ]
        url = reverse("peering-api:routingpolicy-list")
        response = self.client.post(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_201_CREATED)
        self.assertEqual(RoutingPolicy.objects.count(), 3)
        self.assertEqual(response.data[0]["slug"], data[0]["slug"])
        self.assertEqual(response.data[1]["slug"], data[1]["slug"])

    def test_update_routing_policy(self):
        data = {"name": "Test", "slug": "test", "type": ROUTING_POLICY_TYPE_IMPORT}
        url = reverse(
            "peering-api:routingpolicy-detail", kwargs={"pk": self.routing_policy.pk}
        )
        response = self.client.put(url, data, format="json", **self.header)
        self.assertStatus(response, status.HTTP_200_OK)
        self.assertEqual(RoutingPolicy.objects.count(), 1)
        routing_policy = RoutingPolicy.objects.get(pk=response.data["id"])
        self.assertEqual(routing_policy.type, data["type"])

    def test_delete_routing_policy(self):
        url = reverse(
            "peering-api:routingpolicy-detail", kwargs={"pk": self.routing_policy.pk}
        )
        response = self.client.delete(url, **self.header)
        self.assertStatus(response, status.HTTP_204_NO_CONTENT)
        self.assertEqual(RoutingPolicy.objects.count(), 0)